{"product_id":"q2n68a-refurbished-hpe-nvidia-tesla-v100-pcie-16gb-computational-accelerator","title":"Q2N68A - Refurbished - HPE NVIDIA Tesla V100 PCIe 16GB Computational Accelerator","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003eRequires 8P Keyed Cable Kit (871829-B21)\u003c\/p\u003e\n\u003cp\u003eThe \u003cstrong\u003eHPE NVIDIA Tesla V100 PCIe 16GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong\u003eQ2N68A\u003c\/strong\u003e) is a high-performance GPU designed for data center workloads such as AI training, high-performance computing (HPC), and deep learning. It leverages NVIDIA's Volta architecture and High Bandwidth Memory (HBM2) to deliver exceptional computational throughput.\u003c\/p\u003e\n\u003ch3\u003eSpecifications\u003c\/h3\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eGPU Architecture\u003c\/strong\u003e: NVIDIA Volta\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eCUDA Cores\u003c\/strong\u003e: 5,120\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eTensor Cores\u003c\/strong\u003e: 640\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eBase Clock\u003c\/strong\u003e: 1,245 MHz\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eBoost Clock\u003c\/strong\u003e: 1,380 MHz\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory\u003c\/strong\u003e: 16 GB HBM2\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory Interface\u003c\/strong\u003e: 4,096-bit\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory Bandwidth\u003c\/strong\u003e: 900 GB\/s\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eInterface\u003c\/strong\u003e: PCI Express 3.0 x16\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eForm Factor\u003c\/strong\u003e: Dual-slot, full-height (267 mm x 112 mm x 38 mm)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eCooling\u003c\/strong\u003e: Passive (requires adequate system airflow)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003ePower Consumption\u003c\/strong\u003e: 250 W (1x 8-pin PCIe power connector)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eECC Support\u003c\/strong\u003e: Yes (enabled by default)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eFP64 (Double Precision)\u003c\/strong\u003e: 7.0 TFLOPS\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eFP32 (Single Precision)\u003c\/strong\u003e: 14.0 TFLOPS\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eTensor Performance\u003c\/strong\u003e: 112 TFLOPS\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp\u003eThe Tesla V100 requires a PCIe 3.0 x16 slot and sufficient system airflow for its passive cooling. It has been deployed in various HPE server models, including:\u003c\/p\u003e\n\u003cul\u003e\n\u003cli\u003eHPE ProLiant DL380 Gen10\u003c\/li\u003e\n\u003cli\u003eHPE ProLiant XL190r Gen10\u003c\/li\u003e\n\u003cli\u003eHPE ProLiant XL270d Gen9\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363020900,"sku":null,"price":3179.9,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/Q2N68A.jpg?v=1771296040","url":"https:\/\/resilient-tec.com\/products\/q2n68a-refurbished-hpe-nvidia-tesla-v100-pcie-16gb-computational-accelerator","provider":"Resilient Tec, LLC","version":"1.0","type":"link"}