{"title":"GPU","description":"","products":[{"product_id":"q0v80a-refurbished-hpe-nvidia-tesla-p40-24gb-computational-accelerator","title":"Q0V80A\t- Refurbished - HPE NVIDIA Tesla P40 24GB Computational Accelerator","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003eRequires 8P Keyed Cable Kit (871829-B21)\u003c\/p\u003e\n\u003cp\u003e\u003cstrong data-end=\"59\" data-start=\"4\"\u003eHPE NVIDIA Tesla P40 24GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong data-end=\"84\" data-start=\"74\"\u003eQ0V80A\u003c\/strong\u003e) is a high-performance GPU designed for data center applications such as deep learning inference, machine learning, and high-performance computing (HPC). It leverages NVIDIA's Pascal architecture and is optimized for inference workloads.\u003c\/p\u003e\n\u003ch3 class=\"\" data-end=\"158\" data-start=\"133\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-end=\"899\" data-start=\"160\"\u003e\n\u003cli class=\"\" data-end=\"221\" data-start=\"160\"\u003e\n\u003cp class=\"\" data-end=\"221\" data-start=\"162\"\u003e\u003cstrong data-end=\"182\" data-start=\"162\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Pascal (GP102)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"277\" data-start=\"222\"\u003e\n\u003cp class=\"\" data-end=\"277\" data-start=\"224\"\u003e\u003cstrong data-end=\"238\" data-start=\"224\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e3,840\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"333\" data-start=\"278\"\u003e\n\u003cp class=\"\" data-end=\"333\" data-start=\"280\"\u003e\u003cstrong data-end=\"294\" data-start=\"280\"\u003eBase Clock\u003c\/strong\u003e: 
\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,300 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"390\" data-start=\"334\"\u003e\n\u003cp class=\"\" data-end=\"390\" data-start=\"336\"\u003e\u003cstrong data-end=\"351\" data-start=\"336\"\u003eBoost Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,530 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"442\" data-start=\"391\"\u003e\n\u003cp class=\"\" data-end=\"442\" data-start=\"393\"\u003e\u003cstrong data-end=\"403\" data-start=\"393\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e24 GB GDDR5\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"504\" data-start=\"443\"\u003e\n\u003cp class=\"\" data-end=\"504\" data-start=\"445\"\u003e\u003cstrong data-end=\"465\" data-start=\"445\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e384-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"566\" data-start=\"505\"\u003e\n\u003cp class=\"\" data-end=\"566\" data-start=\"507\"\u003e\u003cstrong data-end=\"527\" data-start=\"507\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e346 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"623\" data-start=\"567\"\u003e\n\u003cp class=\"\" data-end=\"623\" data-start=\"569\"\u003e\u003cstrong data-end=\"582\" data-start=\"569\"\u003eInterface\u003c\/strong\u003e: \u003cspan 
class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCI Express 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"682\" data-start=\"624\"\u003e\n\u003cp class=\"\" data-end=\"682\" data-start=\"626\"\u003e\u003cstrong data-end=\"641\" data-start=\"626\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDual-slot\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"737\" data-start=\"683\"\u003e\n\u003cp class=\"\" data-end=\"737\" data-start=\"685\"\u003e\u003cstrong data-end=\"696\" data-start=\"685\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"802\" data-start=\"738\"\u003e\n\u003cp class=\"\" data-end=\"802\" data-start=\"740\"\u003e\u003cstrong data-end=\"761\" data-start=\"740\"\u003ePower Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e250W (1x 8-pin PCIe power connector)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"825\" data-start=\"803\"\u003e\n\u003cp class=\"\" data-end=\"825\" data-start=\"805\"\u003e\u003cstrong data-end=\"820\" data-start=\"805\"\u003eECC Support\u003c\/strong\u003e: Yes\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"899\" data-start=\"826\"\u003e\n\u003cp class=\"\" data-end=\"899\" data-start=\"828\"\u003e\u003cstrong data-end=\"847\" data-start=\"828\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 class=\"\" data-end=\"935\" 
data-start=\"906\"\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul data-end=\"1198\" data-start=\"937\"\u003e\n\u003cli class=\"\" data-end=\"1007\" data-start=\"937\"\u003e\n\u003cp class=\"\" data-end=\"1007\" data-start=\"939\"\u003e\u003cstrong data-end=\"966\" data-start=\"939\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e11.7 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1078\" data-start=\"1008\"\u003e\n\u003cp class=\"\" data-end=\"1078\" data-start=\"1010\"\u003e\u003cstrong data-end=\"1037\" data-start=\"1010\"\u003eFP64 (Double Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e367 GFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1198\" data-start=\"1079\"\u003e\n\u003cp class=\"\" data-end=\"1198\" data-start=\"1081\"\u003e\u003cstrong data-end=\"1111\" data-start=\"1081\"\u003eINT8 Inference Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e47 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 class=\"\" data-end=\"1326\" data-start=\"1292\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp class=\"\" data-end=\"1453\" data-start=\"1328\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla P40 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, 
including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-end=\"1668\" data-start=\"1455\"\u003e\n\u003cli class=\"\" data-end=\"1496\" data-start=\"1455\"\u003e\n\u003cp class=\"\" data-end=\"1496\" data-start=\"1457\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE Apollo pc40\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1538\" data-start=\"1497\"\u003e\n\u003cp class=\"\" data-end=\"1538\" data-start=\"1499\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL380 Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1580\" data-start=\"1539\"\u003e\n\u003cp class=\"\" data-end=\"1580\" data-start=\"1541\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant XL190r Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1668\" data-start=\"1581\"\u003e\n\u003cp class=\"\" data-end=\"1668\" data-start=\"1583\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE Synergy 480 Gen10\u003c\/span\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923362955364,"sku":null,"price":2349.85,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/QOV80A.jpg?v=1771296038"},{"product_id":"r0w29c-refurbished-hpe-nvidia-tesla-t4-16gb-computational-accelerator","title":"R0W29C - Refurbished - HPE NVIDIA Tesla T4 16GB Computational 
Accelerator","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe \u003cstrong data-start=\"4\" data-end=\"58\"\u003eHPE NVIDIA Tesla T4 16GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong data-start=\"73\" data-end=\"83\"\u003eR0W29C\u003c\/strong\u003e) is a versatile, energy-efficient GPU designed for AI inference, machine learning, and virtual desktop infrastructure (VDI) workloads.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eBuilt on NVIDIA's Turing architecture, it offers a balance of performance and efficiency for data center deployments.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 data-start=\"133\" data-end=\"158\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"160\" data-end=\"841\"\u003e\n\u003cli data-start=\"160\" data-end=\"221\" class=\"\"\u003e\n\u003cp data-start=\"162\" data-end=\"221\" class=\"\"\u003e\u003cstrong data-start=\"162\" data-end=\"182\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Turing\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"222\" data-end=\"277\" class=\"\"\u003e\n\u003cp data-start=\"224\" data-end=\"277\" class=\"\"\u003e\u003cstrong data-start=\"224\" data-end=\"238\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e2,560\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"278\" data-end=\"301\" class=\"\"\u003e\n\u003cp data-start=\"280\" data-end=\"301\" class=\"\"\u003e\u003cstrong data-start=\"280\" data-end=\"296\"\u003eTensor 
Cores\u003c\/strong\u003e: 320\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"302\" data-end=\"357\" class=\"\"\u003e\n\u003cp data-start=\"304\" data-end=\"357\" class=\"\"\u003e\u003cstrong data-start=\"304\" data-end=\"318\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e585 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"358\" data-end=\"409\" class=\"\"\u003e\n\u003cp data-start=\"360\" data-end=\"409\" class=\"\"\u003e\u003cstrong data-start=\"360\" data-end=\"370\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e16 GB GDDR6\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"410\" data-end=\"471\" class=\"\"\u003e\n\u003cp data-start=\"412\" data-end=\"471\" class=\"\"\u003e\u003cstrong data-start=\"412\" data-end=\"432\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e256-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"472\" data-end=\"533\" class=\"\"\u003e\n\u003cp data-start=\"474\" data-end=\"533\" class=\"\"\u003e\u003cstrong data-start=\"474\" data-end=\"494\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e320 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"534\" data-end=\"588\" class=\"\"\u003e\n\u003cp data-start=\"536\" data-end=\"588\" class=\"\"\u003e\u003cstrong data-start=\"536\" data-end=\"549\"\u003eInterface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCI Express 3.0 
x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"589\" data-end=\"647\" class=\"\"\u003e\n\u003cp data-start=\"591\" data-end=\"647\" class=\"\"\u003e\u003cstrong data-start=\"591\" data-end=\"606\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eLow-profile, single-slot\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"648\" data-end=\"702\" class=\"\"\u003e\n\u003cp data-start=\"650\" data-end=\"702\" class=\"\"\u003e\u003cstrong data-start=\"650\" data-end=\"661\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"703\" data-end=\"767\" class=\"\"\u003e\n\u003cp data-start=\"705\" data-end=\"767\" class=\"\"\u003e\u003cstrong data-start=\"705\" data-end=\"726\"\u003ePower Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e70W (no external power connector required)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"768\" data-end=\"841\" class=\"\"\u003e\n\u003cp data-start=\"770\" data-end=\"841\" class=\"\"\u003e\u003cstrong data-start=\"770\" data-end=\"789\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"848\" data-end=\"877\" class=\"\"\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul data-start=\"879\" data-end=\"1212\"\u003e\n\u003cli data-start=\"879\" data-end=\"949\" class=\"\"\u003e\n\u003cp data-start=\"881\" data-end=\"949\" class=\"\"\u003e\u003cstrong data-start=\"881\" data-end=\"908\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan 
class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e8.1 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"950\" data-end=\"1018\" class=\"\"\u003e\n\u003cp data-start=\"952\" data-end=\"1018\" class=\"\"\u003e\u003cstrong data-start=\"952\" data-end=\"977\"\u003eFP16 (Half Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e65 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1019\" data-end=\"1092\" class=\"\"\u003e\n\u003cp data-start=\"1021\" data-end=\"1092\" class=\"\"\u003e\u003cstrong data-start=\"1021\" data-end=\"1051\"\u003eINT8 Inference Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e130 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1093\" data-end=\"1212\" class=\"\"\u003e\n\u003cp data-start=\"1095\" data-end=\"1212\" class=\"\"\u003e\u003cstrong data-start=\"1095\" data-end=\"1125\"\u003eINT4 Inference Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e260 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"1306\" data-end=\"1338\" class=\"\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp data-start=\"1340\" data-end=\"1465\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla T4 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has 
been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1467\" data-end=\"1764\"\u003e\n\u003cli data-start=\"1467\" data-end=\"1508\" class=\"\"\u003e\n\u003cp data-start=\"1469\" data-end=\"1508\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL360 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1509\" data-end=\"1550\" class=\"\"\u003e\n\u003cp data-start=\"1511\" data-end=\"1550\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL380 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1551\" data-end=\"1592\" class=\"\"\u003e\n\u003cp data-start=\"1553\" data-end=\"1592\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL385 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1593\" data-end=\"1634\" class=\"\"\u003e\n\u003cp data-start=\"1595\" data-end=\"1634\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL325 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1635\" data-end=\"1676\" class=\"\"\u003e\n\u003cp data-start=\"1637\" data-end=\"1676\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ML350 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1677\" data-end=\"1764\" class=\"\"\u003e\n\u003cp data-start=\"1679\" data-end=\"1764\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 
ease-in-out\"\u003eHPE Synergy 480 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp data-start=\"1766\" data-end=\"1851\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDue to its passive cooling design, it's essential to ensure that the host system provides adequate airflow to maintain optimal operating temperatures.\u003c\/span\u003e\u003c\/p\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923362988132,"sku":null,"price":2200.0,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/ROW29C.jpg?v=1771296039"},{"product_id":"q2n68a-refurbished-hpe-nvidia-tesla-v100-pcie-16gb-computational-accelerator","title":"Q2N68A\t- Refurbished - HPE NVIDIA Tesla V100 PCIe 16GB Computational Accelerator","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003eRequires 8P Keyed Cable Kit (871829-B21)\u003c\/p\u003e\n\u003cp\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe \u003cstrong data-start=\"4\" data-end=\"65\"\u003eHPE NVIDIA Tesla V100 PCIe 16GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong data-start=\"80\" data-end=\"90\"\u003eQ2N68A\u003c\/strong\u003e) is a high-performance GPU designed for data center applications such as AI training, high-performance computing (HPC), and deep learning.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt leverages NVIDIA's Volta architecture and High Bandwidth Memory (HBM2) to deliver exceptional computational capabilities.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 data-start=\"133\" data-end=\"158\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"160\" 
data-end=\"959\"\u003e\n\u003cli data-start=\"160\" data-end=\"221\" class=\"\"\u003e\n\u003cp data-start=\"162\" data-end=\"221\" class=\"\"\u003e\u003cstrong data-start=\"162\" data-end=\"182\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Volta\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"222\" data-end=\"277\" class=\"\"\u003e\n\u003cp data-start=\"224\" data-end=\"277\" class=\"\"\u003e\u003cstrong data-start=\"224\" data-end=\"238\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e5,120\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"278\" data-end=\"301\" class=\"\"\u003e\n\u003cp data-start=\"280\" data-end=\"301\" class=\"\"\u003e\u003cstrong data-start=\"280\" data-end=\"296\"\u003eTensor Cores\u003c\/strong\u003e: 640\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"302\" data-end=\"357\" class=\"\"\u003e\n\u003cp data-start=\"304\" data-end=\"357\" class=\"\"\u003e\u003cstrong data-start=\"304\" data-end=\"318\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,245 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"358\" data-end=\"414\" class=\"\"\u003e\n\u003cp data-start=\"360\" data-end=\"414\" class=\"\"\u003e\u003cstrong data-start=\"360\" data-end=\"375\"\u003eBoost Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,380 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"415\" data-end=\"466\" class=\"\"\u003e\n\u003cp data-start=\"417\" data-end=\"466\" class=\"\"\u003e\u003cstrong 
data-start=\"417\" data-end=\"427\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e16 GB HBM2\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"467\" data-end=\"528\" class=\"\"\u003e\n\u003cp data-start=\"469\" data-end=\"528\" class=\"\"\u003e\u003cstrong data-start=\"469\" data-end=\"489\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e4,096-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"529\" data-end=\"590\" class=\"\"\u003e\n\u003cp data-start=\"531\" data-end=\"590\" class=\"\"\u003e\u003cstrong data-start=\"531\" data-end=\"551\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e900 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"591\" data-end=\"647\" class=\"\"\u003e\n\u003cp data-start=\"593\" data-end=\"647\" class=\"\"\u003e\u003cstrong data-start=\"593\" data-end=\"606\"\u003eInterface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCI Express 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"648\" data-end=\"706\" class=\"\"\u003e\n\u003cp data-start=\"650\" data-end=\"706\" class=\"\"\u003e\u003cstrong data-start=\"650\" data-end=\"665\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDual-slot, full-height (267 mm x 112 mm x 38 mm)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"707\" data-end=\"761\" class=\"\"\u003e\n\u003cp data-start=\"709\" 
data-end=\"761\" class=\"\"\u003e\u003cstrong data-start=\"709\" data-end=\"720\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"762\" data-end=\"826\" class=\"\"\u003e\n\u003cp data-start=\"764\" data-end=\"826\" class=\"\"\u003e\u003cstrong data-start=\"764\" data-end=\"785\"\u003ePower Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e250W (1x 8-pin PCIe power connector)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"827\" data-end=\"885\" class=\"\"\u003e\n\u003cp data-start=\"829\" data-end=\"885\" class=\"\"\u003e\u003cstrong data-start=\"829\" data-end=\"844\"\u003eECC Support\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eYes (enabled by default)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"886\" data-end=\"959\" class=\"\"\u003e\n\u003cp data-start=\"888\" data-end=\"959\" class=\"\"\u003e\u003cstrong data-start=\"888\" data-end=\"907\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"966\" data-end=\"995\" class=\"\"\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul data-start=\"997\" data-end=\"1250\"\u003e\n\u003cli data-start=\"997\" data-end=\"1067\" class=\"\"\u003e\n\u003cp data-start=\"999\" data-end=\"1067\" class=\"\"\u003e\u003cstrong data-start=\"999\" data-end=\"1026\"\u003eFP64 (Double Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e7.0 
TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1068\" data-end=\"1138\" class=\"\"\u003e\n\u003cp data-start=\"1070\" data-end=\"1138\" class=\"\"\u003e\u003cstrong data-start=\"1070\" data-end=\"1097\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e14.0 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1139\" data-end=\"1250\" class=\"\"\u003e\n\u003cp data-start=\"1141\" data-end=\"1250\" class=\"\"\u003e\u003cstrong data-start=\"1141\" data-end=\"1163\"\u003eTensor Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e112 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"1344\" data-end=\"1378\" class=\"\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp data-start=\"1380\" data-end=\"1505\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla V100 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1507\" data-end=\"1678\"\u003e\n\u003cli data-start=\"1507\" data-end=\"1548\" class=\"\"\u003e\n\u003cp data-start=\"1509\" data-end=\"1548\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL380 Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1549\" data-end=\"1590\" 
class=\"\"\u003e\n\u003cp data-start=\"1551\" data-end=\"1590\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant XL190r Gen10\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1591\" data-end=\"1678\" class=\"\"\u003e\n\u003cp data-start=\"1593\" data-end=\"1678\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant XL270d Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363020900,"sku":null,"price":3179.9,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/Q2N68A.jpg?v=1771296040"},{"product_id":"q0e21a-refurbished-hpe-nvidia-tesla-p100-pcie-16gb-computational-accelerator","title":"Q0E21A\t- Refurbished - HPE NVIDIA Tesla P100 PCIe 16GB Computational Accelerator, GPU","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003eRequires 8P Keyed Cable Kit (871829-B21)\u003c\/p\u003e\n\u003cp class=\"\" data-end=\"126\" data-start=\"0\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e\u003cstrong data-end=\"65\" data-start=\"4\"\u003eHPE NVIDIA Tesla P100 PCIe 16GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong data-end=\"90\" data-start=\"80\"\u003eQ0E21A\u003c\/strong\u003e) is a high-performance GPU designed for data center applications such as high-performance computing (HPC), deep learning, and scientific simulations.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt leverages NVIDIA's Pascal architecture and High Bandwidth Memory (HBM2) to deliver exceptional 
computational capabilities.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3\u003eSpecifications\u003c\/h3\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eGPU Architecture\u003c\/strong\u003e: NVIDIA Pascal (GP100)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eCUDA Cores\u003c\/strong\u003e: 3,584\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eBase Clock\u003c\/strong\u003e: 1,189 MHz\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eBoost Clock\u003c\/strong\u003e: 1,328 MHz\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory\u003c\/strong\u003e: 16 GB HBM2\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory Interface\u003c\/strong\u003e: 4,096-bit\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eMemory Bandwidth\u003c\/strong\u003e: Up to 732 GB\/s\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eInterface\u003c\/strong\u003e: PCI Express 3.0 x16\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eForm Factor\u003c\/strong\u003e: Dual-slot, full-height (267 mm x 112 mm x 38 mm)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eCooling\u003c\/strong\u003e: Passive (requires adequate system airflow)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003ePower Consumption\u003c\/strong\u003e: 250W (1x 8-pin PCIe power connector)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eECC Support\u003c\/strong\u003e: Yes (enabled by default)\u003c\/li\u003e\n\u003cli\u003e\u003cstrong\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul\u003e\n\u003cli\u003e\n\u003cp\u003e\u003cstrong\u003eFP64 (Double Precision)\u003c\/strong\u003e: \u003cspan\u003e4.7 
TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1114\" data-start=\"1044\"\u003e\n\u003cp class=\"\" data-end=\"1114\" data-start=\"1046\"\u003e\u003cstrong data-end=\"1073\" data-start=\"1046\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e9.3 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1229\" data-start=\"1115\"\u003e\n\u003cp class=\"\" data-end=\"1229\" data-start=\"1117\"\u003e\u003cstrong data-end=\"1142\" data-start=\"1117\"\u003eFP16 (Half Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e18.7 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 class=\"\" data-end=\"1357\" data-start=\"1323\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp class=\"\" data-end=\"1484\" data-start=\"1359\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla P100 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-end=\"1615\" data-start=\"1486\"\u003e\n\u003cli class=\"\" data-end=\"1527\" data-start=\"1486\"\u003e\n\u003cp class=\"\" data-end=\"1527\" data-start=\"1488\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE Apollo pc40\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1615\" data-start=\"1528\"\u003e\n\u003cp 
class=\"\" data-end=\"1615\" data-start=\"1530\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant XL270d Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp class=\"\" data-end=\"1702\" data-start=\"1617\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDue to its passive cooling design, it's essential to ensure that the host system provides adequate airflow to maintain optimal operating temperatures.\u003c\/span\u003e\u003c\/p\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363053668,"sku":null,"price":1999.8,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/Q0E21A.jpg?v=1771296041"},{"product_id":"q0v79a-refurbished-hpe-nvidia-tesla-p4-8gb-computational-accelerator","title":"Q0V79A - Refurbished - HPE NVIDIA Tesla P4 8GB Computational Accelerator GPU","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e\u003cstrong data-end=\"57\" data-start=\"4\"\u003eHPE NVIDIA Tesla P4 8GB Computational Accelerator\u003c\/strong\u003e (Part Number: \u003cstrong data-end=\"82\" data-start=\"72\"\u003eQ0V79A\u003c\/strong\u003e) is a low-profile, passively cooled GPU designed for AI inference, video processing, and virtual desktop infrastructure (VDI) workloads.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eBuilt on NVIDIA’s Pascal architecture, it offers efficient performance for data center environments.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 class=\"\" data-end=\"158\" data-start=\"133\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul 
data-end=\"876\" data-start=\"160\"\u003e\n\u003cli class=\"\" data-end=\"221\" data-start=\"160\"\u003e\n\u003cp class=\"\" data-end=\"221\" data-start=\"162\"\u003e\u003cstrong data-end=\"182\" data-start=\"162\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Pascal (GP104)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"277\" data-start=\"222\"\u003e\n\u003cp class=\"\" data-end=\"277\" data-start=\"224\"\u003e\u003cstrong data-end=\"238\" data-start=\"224\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e2,560\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"333\" data-start=\"278\"\u003e\n\u003cp class=\"\" data-end=\"333\" data-start=\"280\"\u003e\u003cstrong data-end=\"294\" data-start=\"280\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e886 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"390\" data-start=\"334\"\u003e\n\u003cp class=\"\" data-end=\"390\" data-start=\"336\"\u003e\u003cstrong data-end=\"351\" data-start=\"336\"\u003eBoost Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,114 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"442\" data-start=\"391\"\u003e\n\u003cp class=\"\" data-end=\"442\" data-start=\"393\"\u003e\u003cstrong data-end=\"403\" data-start=\"393\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e8 GB 
GDDR5\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"504\" data-start=\"443\"\u003e\n\u003cp class=\"\" data-end=\"504\" data-start=\"445\"\u003e\u003cstrong data-end=\"465\" data-start=\"445\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e256-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"566\" data-start=\"505\"\u003e\n\u003cp class=\"\" data-end=\"566\" data-start=\"507\"\u003e\u003cstrong data-end=\"527\" data-start=\"507\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e192 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"623\" data-start=\"567\"\u003e\n\u003cp class=\"\" data-end=\"623\" data-start=\"569\"\u003e\u003cstrong data-end=\"582\" data-start=\"569\"\u003eInterface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCI Express 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"682\" data-start=\"624\"\u003e\n\u003cp class=\"\" data-end=\"682\" data-start=\"626\"\u003e\u003cstrong data-end=\"641\" data-start=\"626\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eLow-profile, single-slot (16.7 cm x 5.3 cm)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"737\" data-start=\"683\"\u003e\n\u003cp class=\"\" data-end=\"737\" data-start=\"685\"\u003e\u003cstrong data-end=\"696\" data-start=\"685\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors 
duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"802\" data-start=\"738\"\u003e\n\u003cp class=\"\" data-end=\"802\" data-start=\"740\"\u003e\u003cstrong data-end=\"761\" data-start=\"740\"\u003ePower Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e75W (no external power connector required)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"876\" data-start=\"803\"\u003e\n\u003cp class=\"\" data-end=\"876\" data-start=\"805\"\u003e\u003cstrong data-end=\"824\" data-start=\"805\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 class=\"\" data-end=\"912\" data-start=\"883\"\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul data-end=\"1239\" data-start=\"914\"\u003e\n\u003cli class=\"\" data-end=\"984\" data-start=\"914\"\u003e\n\u003cp class=\"\" data-end=\"984\" data-start=\"916\"\u003e\u003cstrong data-end=\"943\" data-start=\"916\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e5.5 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1055\" data-start=\"985\"\u003e\n\u003cp class=\"\" data-end=\"1055\" data-start=\"987\"\u003e\u003cstrong data-end=\"1014\" data-start=\"987\"\u003eFP64 (Double Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e178.2 GFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1129\" data-start=\"1056\"\u003e\n\u003cp class=\"\" data-end=\"1129\" data-start=\"1058\"\u003e\u003cstrong data-end=\"1088\" data-start=\"1058\"\u003eINT8 Inference 
Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eUp to 22 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli class=\"\" data-end=\"1239\" data-start=\"1130\"\u003e\n\u003cp class=\"\" data-end=\"1239\" data-start=\"1132\"\u003e\u003cstrong data-end=\"1152\" data-start=\"1132\"\u003eVideo Processing\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eCapable of transcoding and inferencing up to 35 HD video streams in real time\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"1333\" data-end=\"1365\" class=\"\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp data-start=\"1367\" data-end=\"1492\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla P4 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1494\" data-end=\"1623\"\u003e\n\u003cli data-start=\"1494\" data-end=\"1535\" class=\"\"\u003e\n\u003cp data-start=\"1496\" data-end=\"1535\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL360 
Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1536\" data-end=\"1623\" class=\"\"\u003e\n\u003cp data-start=\"1538\" data-end=\"1623\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHPE ProLiant DL380 Gen9\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363086436,"sku":null,"price":1295.45,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/Q0V79A.jpg?v=1771296042"},{"product_id":"ng3py-refurbished-dell-nvidia-l4-24gb-passive-gpu","title":"NG3PY - Refurbished - Dell NVIDIA L4 24GB Passive GPU","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003e\u003cstrong data-end=\"39\" data-start=\"4\"\u003eDell NVIDIA L4 24GB Passive GPU\u003c\/strong\u003e (Dell Part Number: \u003cstrong data-end=\"68\" data-start=\"59\"\u003eNG3PY\u003c\/strong\u003e) is a high-efficiency, low-profile accelerator designed for AI inference, video processing, virtual desktops, and graphics-intensive workloads. 
Powered by NVIDIA’s Ada Lovelace architecture, it offers significant performance improvements over its predecessor, the T4, while maintaining a compact and energy-efficient design.\u003c\/p\u003e\n\u003ch3 data-start=\"133\" data-end=\"158\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"160\" data-end=\"877\"\u003e\n\u003cli data-start=\"160\" data-end=\"221\" class=\"\"\u003e\n\u003cp data-start=\"162\" data-end=\"221\" class=\"\"\u003e\u003cstrong data-start=\"162\" data-end=\"182\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Ada Lovelace\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"222\" data-end=\"277\" class=\"\"\u003e\n\u003cp data-start=\"224\" data-end=\"277\" class=\"\"\u003e\u003cstrong data-start=\"224\" data-end=\"238\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e7,424\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"278\" data-end=\"335\" class=\"\"\u003e\n\u003cp data-start=\"280\" data-end=\"335\" class=\"\"\u003e\u003cstrong data-start=\"280\" data-end=\"296\"\u003eTensor Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e232 (4th generation)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"336\" data-end=\"389\" class=\"\"\u003e\n\u003cp data-start=\"338\" data-end=\"389\" class=\"\"\u003e\u003cstrong data-start=\"338\" data-end=\"350\"\u003eRT Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e58 (3rd generation)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"390\" data-end=\"445\" 
class=\"\"\u003e\n\u003cp data-start=\"392\" data-end=\"445\" class=\"\"\u003e\u003cstrong data-start=\"392\" data-end=\"406\"\u003eGPU Memory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e24 GB GDDR6 with ECC\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"446\" data-end=\"507\" class=\"\"\u003e\n\u003cp data-start=\"448\" data-end=\"507\" class=\"\"\u003e\u003cstrong data-start=\"448\" data-end=\"468\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e300 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"508\" data-end=\"562\" class=\"\"\u003e\n\u003cp data-start=\"510\" data-end=\"562\" class=\"\"\u003e\u003cstrong data-start=\"510\" data-end=\"523\"\u003eInterface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCIe Gen 4.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"563\" data-end=\"621\" class=\"\"\u003e\n\u003cp data-start=\"565\" data-end=\"621\" class=\"\"\u003e\u003cstrong data-start=\"565\" data-end=\"580\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eLow-profile, single-slot (169 mm x 69 mm)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"622\" data-end=\"676\" class=\"\"\u003e\n\u003cp data-start=\"624\" data-end=\"676\" class=\"\"\u003e\u003cstrong data-start=\"624\" data-end=\"635\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system 
airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"677\" data-end=\"741\" class=\"\"\u003e\n\u003cp data-start=\"679\" data-end=\"741\" class=\"\"\u003e\u003cstrong data-start=\"679\" data-end=\"700\"\u003ePower Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e72W (no external power connector required)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"742\" data-end=\"769\" class=\"\"\u003e\n\u003cp data-start=\"744\" data-end=\"769\" class=\"\"\u003e\u003cstrong data-start=\"744\" data-end=\"763\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"770\" data-end=\"877\" class=\"\"\u003e\n\u003cp data-start=\"772\" data-end=\"877\" class=\"\"\u003e\u003cstrong data-start=\"772\" data-end=\"790\"\u003eSupported APIs\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDirectX 12 Ultimate, Shader Model 6.6, OpenGL 4.6, Vulkan 1.3, CUDA 12.0, OpenCL 3.0\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"884\" data-end=\"913\" class=\"\"\u003ePerformance Highlights\u003c\/h3\u003e\n\u003cul data-start=\"915\" data-end=\"1307\"\u003e\n\u003cli data-start=\"915\" data-end=\"985\" class=\"\"\u003e\n\u003cp data-start=\"917\" data-end=\"985\" class=\"\"\u003e\u003cstrong data-start=\"917\" data-end=\"944\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e30.3 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"986\" data-end=\"1056\" class=\"\"\u003e\n\u003cp data-start=\"988\" data-end=\"1056\" class=\"\"\u003e\u003cstrong data-start=\"988\" data-end=\"1015\"\u003eTF32 Tensor 
Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e60 TFLOPS (up to 120 TFLOPS with sparsity)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1057\" data-end=\"1127\" class=\"\"\u003e\n\u003cp data-start=\"1059\" data-end=\"1127\" class=\"\"\u003e\u003cstrong data-start=\"1059\" data-end=\"1086\"\u003eFP16 Tensor Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e121 TFLOPS (up to 242 TFLOPS with sparsity)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1128\" data-end=\"1197\" class=\"\"\u003e\n\u003cp data-start=\"1130\" data-end=\"1197\" class=\"\"\u003e\u003cstrong data-start=\"1130\" data-end=\"1156\"\u003eFP8 Tensor Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e242.5 TFLOPS (up to 485 TFLOPS with sparsity)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1198\" data-end=\"1307\" class=\"\"\u003e\n\u003cp data-start=\"1200\" data-end=\"1307\" class=\"\"\u003e\u003cstrong data-start=\"1200\" data-end=\"1220\"\u003eINT8 Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e242.5 TOPS (up to 485 TOPS with sparsity)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"1401\" data-end=\"1435\" class=\"\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp data-start=\"1437\" data-end=\"1602\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe L4's low-profile, single-slot design and passive cooling make it ideal for dense server 
environments.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt is compatible with systems that have a PCIe Gen 4.0 x16 slot and sufficient airflow.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe absence of external power connectors simplifies integration into existing infrastructures.\u003c\/span\u003e\u003c\/p\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363119204,"sku":null,"price":4290.45,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/NG3PY.jpg?v=1771296043"},{"product_id":"vfj45-refurbished-dell-nvidia-v100-16gb-hbm2-passive-gpu","title":"VFJ45 - Refurbished - Dell NVIDIA V100 16GB HBM2 Passive GPU","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp data-start=\"0\" data-end=\"126\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe \u003cstrong data-start=\"4\" data-end=\"52\"\u003eDell NVIDIA Tesla V100 16GB HBM2 Passive GPU\u003c\/strong\u003e (Dell Part Number: \u003cstrong data-start=\"72\" data-end=\"81\"\u003eVFJ45\u003c\/strong\u003e) is a high-performance accelerator designed for intensive workloads such as AI training, deep learning inference, and high-performance computing (HPC).\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eBuilt on NVIDIA’s Volta architecture, it delivers exceptional computational capabilities in a dual-slot, passively cooled form factor.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 data-start=\"133\" data-end=\"158\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"160\" 
data-end=\"1150\"\u003e\n\u003cli data-start=\"160\" data-end=\"221\" class=\"\"\u003e\n\u003cp data-start=\"162\" data-end=\"221\" class=\"\"\u003e\u003cstrong data-start=\"162\" data-end=\"182\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Volta (GV100)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"222\" data-end=\"277\" class=\"\"\u003e\n\u003cp data-start=\"224\" data-end=\"277\" class=\"\"\u003e\u003cstrong data-start=\"224\" data-end=\"238\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e5,120\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"278\" data-end=\"301\" class=\"\"\u003e\n\u003cp data-start=\"280\" data-end=\"301\" class=\"\"\u003e\u003cstrong data-start=\"280\" data-end=\"296\"\u003eTensor Cores\u003c\/strong\u003e: 640\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"302\" data-end=\"357\" class=\"\"\u003e\n\u003cp data-start=\"304\" data-end=\"357\" class=\"\"\u003e\u003cstrong data-start=\"304\" data-end=\"318\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,245 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"358\" data-end=\"414\" class=\"\"\u003e\n\u003cp data-start=\"360\" data-end=\"414\" class=\"\"\u003e\u003cstrong data-start=\"360\" data-end=\"375\"\u003eBoost Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,380 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"415\" data-end=\"466\" class=\"\"\u003e\n\u003cp data-start=\"417\" data-end=\"466\" 
class=\"\"\u003e\u003cstrong data-start=\"417\" data-end=\"427\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e16 GB HBM2 with ECC\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"467\" data-end=\"528\" class=\"\"\u003e\n\u003cp data-start=\"469\" data-end=\"528\" class=\"\"\u003e\u003cstrong data-start=\"469\" data-end=\"489\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e4,096-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"529\" data-end=\"590\" class=\"\"\u003e\n\u003cp data-start=\"531\" data-end=\"590\" class=\"\"\u003e\u003cstrong data-start=\"531\" data-end=\"551\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eUp to 900 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"591\" data-end=\"659\" class=\"\"\u003e\n\u003cp data-start=\"593\" data-end=\"659\" class=\"\"\u003e\u003cstrong data-start=\"593\" data-end=\"618\"\u003ePeak FP32 Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e14 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"660\" data-end=\"728\" class=\"\"\u003e\n\u003cp data-start=\"662\" data-end=\"728\" class=\"\"\u003e\u003cstrong data-start=\"662\" data-end=\"687\"\u003ePeak FP64 Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e7 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"729\" data-end=\"799\" class=\"\"\u003e\n\u003cp 
data-start=\"731\" data-end=\"799\" class=\"\"\u003e\u003cstrong data-start=\"731\" data-end=\"758\"\u003ePeak Tensor Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e112 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"800\" data-end=\"838\" class=\"\"\u003e\n\u003cp data-start=\"802\" data-end=\"838\" class=\"\"\u003e\u003cstrong data-start=\"802\" data-end=\"832\"\u003eTDP (Thermal Design Power)\u003c\/strong\u003e: 250W\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"839\" data-end=\"893\" class=\"\"\u003e\n\u003cp data-start=\"841\" data-end=\"893\" class=\"\"\u003e\u003cstrong data-start=\"841\" data-end=\"852\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"894\" data-end=\"952\" class=\"\"\u003e\n\u003cp data-start=\"896\" data-end=\"952\" class=\"\"\u003e\u003cstrong data-start=\"896\" data-end=\"911\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eFull-height, dual-slot, 10.5 inches (267 mm) in length\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"953\" data-end=\"1015\" class=\"\"\u003e\n\u003cp data-start=\"955\" data-end=\"1015\" class=\"\"\u003e\u003cstrong data-start=\"955\" data-end=\"974\"\u003ePower Connector\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1 × 8-pin PCIe\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1016\" data-end=\"1076\" class=\"\"\u003e\n\u003cp data-start=\"1018\" data-end=\"1076\" 
class=\"\"\u003e\u003cstrong data-start=\"1018\" data-end=\"1035\"\u003eBus Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCIe 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1077\" data-end=\"1150\" class=\"\"\u003e\n\u003cp data-start=\"1079\" data-end=\"1150\" class=\"\"\u003e\u003cstrong data-start=\"1079\" data-end=\"1098\"\u003eDisplay Outputs\u003c\/strong\u003e: None\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"1157\" data-end=\"1191\" class=\"\"\u003eCompatibility \u0026amp; Deployment\u003c\/h3\u003e\n\u003cp data-start=\"1193\" data-end=\"1318\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla V100 (Dell Part Number: VFJ45) is compatible with systems that have a PCIe 3.0 x16 slot and sufficient power delivery.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1320\" data-end=\"1491\"\u003e\n\u003cli data-start=\"1320\" data-end=\"1361\" class=\"\"\u003e\n\u003cp data-start=\"1322\" data-end=\"1361\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell PowerEdge C4140\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1362\" data-end=\"1403\" class=\"\"\u003e\n\u003cp data-start=\"1364\" data-end=\"1403\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell PowerEdge R730, R740xd, 
R7525\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1404\" data-end=\"1491\" class=\"\"\u003e\n\u003cp data-start=\"1406\" data-end=\"1491\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell PowerEdge T430, T440, T560, T630, T640\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp data-start=\"1493\" data-end=\"1578\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eBecause of its passive cooling design, the host system must provide adequate airflow to maintain optimal operating temperatures.\u003c\/span\u003e\u003c\/p\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363151972,"sku":null,"price":1145.9,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/VFJ45.jpg?v=1771296044"},{"product_id":"hckjm-pg183-7ph29-ppgxg-refurbished-dell-nvidia-tesla-t4-16gb-passive-gpu","title":"HCKJM (PG183, 7PH29, PPGXG) - Refurbished - Dell NVIDIA Tesla T4 16GB Passive GPU","description":"\u003cp\u003eCondition: Refurbished\u003c\/p\u003e\n\u003cp\u003eAlternate Part Numbers: PG183, 7PH29, PPGXG\u003c\/p\u003e\n\u003cp\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe \u003cstrong data-start=\"4\" data-end=\"45\"\u003eDell NVIDIA Tesla T4 16GB Passive GPU\u003c\/strong\u003e (Dell Part Number: HCKJM \/ 7PH29) is a high-efficiency, single-slot accelerator designed for AI inference, machine learning, and virtual desktop infrastructure (VDI) workloads.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 
ease-in-out\"\u003eBuilt on NVIDIA’s Turing architecture, it delivers strong performance in a compact, low-power form factor.\u003c\/span\u003e\u003c\/p\u003e\n\u003cp\u003e \u003c\/p\u003e\n\u003ch3 data-start=\"133\" data-end=\"158\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"160\" data-end=\"901\"\u003e\n\u003cli data-start=\"160\" data-end=\"217\" class=\"\"\u003e\n\u003cp data-start=\"162\" data-end=\"217\" class=\"\"\u003e\u003cstrong data-start=\"162\" data-end=\"178\"\u003eArchitecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eNVIDIA Turing (TU104 GPU)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"218\" data-end=\"273\" class=\"\"\u003e\n\u003cp data-start=\"220\" data-end=\"273\" class=\"\"\u003e\u003cstrong data-start=\"220\" data-end=\"234\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e2,560\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"274\" data-end=\"297\" class=\"\"\u003e\n\u003cp data-start=\"276\" data-end=\"297\" class=\"\"\u003e\u003cstrong data-start=\"276\" data-end=\"292\"\u003eTensor Cores\u003c\/strong\u003e: 320\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"298\" data-end=\"353\" class=\"\"\u003e\n\u003cp data-start=\"300\" data-end=\"353\" class=\"\"\u003e\u003cstrong data-start=\"300\" data-end=\"314\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e585 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"354\" data-end=\"410\" class=\"\"\u003e\n\u003cp data-start=\"356\" data-end=\"410\" class=\"\"\u003e\u003cstrong data-start=\"356\" data-end=\"371\"\u003eBoost Clock\u003c\/strong\u003e: 
\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1,590 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"411\" data-end=\"462\" class=\"\"\u003e\n\u003cp data-start=\"413\" data-end=\"462\" class=\"\"\u003e\u003cstrong data-start=\"413\" data-end=\"423\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e16 GB GDDR6 with ECC\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"463\" data-end=\"524\" class=\"\"\u003e\n\u003cp data-start=\"465\" data-end=\"524\" class=\"\"\u003e\u003cstrong data-start=\"465\" data-end=\"485\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e256-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"525\" data-end=\"586\" class=\"\"\u003e\n\u003cp data-start=\"527\" data-end=\"586\" class=\"\"\u003e\u003cstrong data-start=\"527\" data-end=\"547\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eUp to 320 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"587\" data-end=\"648\" class=\"\"\u003e\n\u003cp data-start=\"589\" data-end=\"648\" class=\"\"\u003e\u003cstrong data-start=\"589\" data-end=\"607\"\u003ePCIe Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCIe 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"649\" data-end=\"713\" class=\"\"\u003e\n\u003cp data-start=\"651\" data-end=\"713\" class=\"\"\u003e\u003cstrong data-start=\"651\" data-end=\"672\"\u003ePower 
Consumption\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e70W (no external power connector required)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"714\" data-end=\"768\" class=\"\"\u003e\n\u003cp data-start=\"716\" data-end=\"768\" class=\"\"\u003e\u003cstrong data-start=\"716\" data-end=\"727\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePassive (requires adequate system airflow)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"769\" data-end=\"827\" class=\"\"\u003e\n\u003cp data-start=\"771\" data-end=\"827\" class=\"\"\u003e\u003cstrong data-start=\"771\" data-end=\"786\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eSingle-slot, low-profile (half-height), half-length\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"908\" data-end=\"934\" class=\"\"\u003ePerformance Metrics\u003c\/h3\u003e\n\u003cul data-start=\"936\" data-end=\"1225\"\u003e\n\u003cli data-start=\"936\" data-end=\"1006\" class=\"\"\u003e\n\u003cp data-start=\"938\" data-end=\"1006\" class=\"\"\u003e\u003cstrong data-start=\"938\" data-end=\"965\"\u003eFP32 (Single Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e8.1 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1007\" data-end=\"1075\" class=\"\"\u003e\n\u003cp data-start=\"1009\" data-end=\"1075\" class=\"\"\u003e\u003cstrong data-start=\"1009\" data-end=\"1034\"\u003eFP16 (Half Precision)\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] 
transition-colors duration-100 ease-in-out\"\u003e65 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1076\" data-end=\"1127\" class=\"\"\u003e\n\u003cp data-start=\"1078\" data-end=\"1127\" class=\"\"\u003e\u003cstrong data-start=\"1078\" data-end=\"1086\"\u003eINT8\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e130 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1128\" data-end=\"1225\" class=\"\"\u003e\n\u003cp data-start=\"1130\" data-end=\"1225\" class=\"\"\u003e\u003cstrong data-start=\"1130\" data-end=\"1138\"\u003eINT4\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e260 TOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp data-start=\"1227\" data-end=\"1312\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThese metrics make the Tesla T4 suitable for tasks such as AI inference, real-time analytics, and video processing.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 data-start=\"1319\" data-end=\"1338\" class=\"\"\u003eCompatibility\u003c\/h3\u003e\n\u003cp data-start=\"1340\" data-end=\"1465\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla T4 is compatible with systems that have a PCIe 3.0 x16 slot and sufficient airflow for passive cooling.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in various server models, including:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1467\" data-end=\"1638\"\u003e\n\u003cli data-start=\"1467\" data-end=\"1508\" 
class=\"\"\u003e\n\u003cp data-start=\"1469\" data-end=\"1508\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell PowerEdge R640, R740, R7525\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1509\" data-end=\"1550\" class=\"\"\u003e\n\u003cp data-start=\"1511\" data-end=\"1550\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell C6520, C6525\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1551\" data-end=\"1638\" class=\"\"\u003e\n\u003cp data-start=\"1553\" data-end=\"1638\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eDell XR12\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp data-start=\"1640\" data-end=\"1725\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eEnsure your system supports the card's power and cooling requirements before installation.\u003c\/span\u003e\u003c\/p\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363184740,"sku":null,"price":979.9,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/HCKJM.jpg?v=1771296045"},{"product_id":"ktdch-refurbished-nvidia-tesla-k40-12g-gpu","title":"KTDCH - Refurbished - NVIDIA TESLA K40 12G GPU","description":"\u003cp\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe \u003cstrong data-start=\"4\" data-end=\"33\"\u003eNVIDIA Tesla K40 12GB GPU\u003c\/strong\u003e is a high-performance computing accelerator designed for scientific, engineering, and enterprise applications.\u003c\/span\u003e 
\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt is a dedicated compute accelerator optimized for massively parallel processing tasks.\u003c\/span\u003e\u003c\/p\u003e\n\u003ch3 data-start=\"166\" data-end=\"191\" class=\"\"\u003eSpecifications\u003c\/h3\u003e\n\u003cul data-start=\"193\" data-end=\"909\"\u003e\n\u003cli data-start=\"193\" data-end=\"254\" class=\"\"\u003e\n\u003cp data-start=\"195\" data-end=\"254\" class=\"\"\u003e\u003cstrong data-start=\"195\" data-end=\"215\"\u003eGPU Architecture\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eKepler (GK110B)\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"255\" data-end=\"310\" class=\"\"\u003e\n\u003cp data-start=\"257\" data-end=\"310\" class=\"\"\u003e\u003cstrong data-start=\"257\" data-end=\"271\"\u003eCUDA Cores\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e2,880\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"311\" data-end=\"366\" class=\"\"\u003e\n\u003cp data-start=\"313\" data-end=\"366\" class=\"\"\u003e\u003cstrong data-start=\"313\" data-end=\"327\"\u003eBase Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e745 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"367\" data-end=\"423\" class=\"\"\u003e\n\u003cp data-start=\"369\" data-end=\"423\" class=\"\"\u003e\u003cstrong data-start=\"369\" data-end=\"384\"\u003eBoost Clock\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eUp to 875 MHz\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli 
data-start=\"424\" data-end=\"475\" class=\"\"\u003e\n\u003cp data-start=\"426\" data-end=\"475\" class=\"\"\u003e\u003cstrong data-start=\"426\" data-end=\"436\"\u003eMemory\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e12 GB GDDR5 ECC\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"476\" data-end=\"537\" class=\"\"\u003e\n\u003cp data-start=\"478\" data-end=\"537\" class=\"\"\u003e\u003cstrong data-start=\"478\" data-end=\"498\"\u003eMemory Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e384-bit\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"538\" data-end=\"601\" class=\"\"\u003e\n\u003cp data-start=\"540\" data-end=\"601\" class=\"\"\u003e\u003cstrong data-start=\"540\" data-end=\"560\"\u003eMemory Bandwidth\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e288 GB\/s\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"602\" data-end=\"682\" class=\"\"\u003e\n\u003cp data-start=\"604\" data-end=\"682\" class=\"\"\u003e\u003cstrong data-start=\"604\" data-end=\"641\"\u003ePeak Single-Precision Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e4.29 TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"683\" data-end=\"763\" class=\"\"\u003e\n\u003cp data-start=\"685\" data-end=\"763\" class=\"\"\u003e\u003cstrong data-start=\"685\" data-end=\"722\"\u003ePeak Double-Precision Performance\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003e1.43 
TFLOPS\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"764\" data-end=\"824\" class=\"\"\u003e\n\u003cp data-start=\"766\" data-end=\"824\" class=\"\"\u003e\u003cstrong data-start=\"766\" data-end=\"783\"\u003eBus Interface\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003ePCI Express 3.0 x16\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"825\" data-end=\"909\" class=\"\"\u003e\n\u003cp data-start=\"827\" data-end=\"909\" class=\"\"\u003e\u003cstrong data-start=\"827\" data-end=\"857\"\u003eTDP (Thermal Design Power)\u003c\/strong\u003e: 235W\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003ch3 data-start=\"916\" data-end=\"952\" class=\"\"\u003eCompatibility \u0026amp; Installation\u003c\/h3\u003e\n\u003cul data-start=\"954\" data-end=\"1219\"\u003e\n\u003cli data-start=\"954\" data-end=\"1012\" class=\"\"\u003e\n\u003cp data-start=\"956\" data-end=\"1012\" class=\"\"\u003e\u003cstrong data-start=\"956\" data-end=\"971\"\u003eForm Factor\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eFull-height, dual-slot card\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1013\" data-end=\"1067\" class=\"\"\u003e\n\u003cp data-start=\"1015\" data-end=\"1067\" class=\"\"\u003e\u003cstrong data-start=\"1015\" data-end=\"1026\"\u003eCooling\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eAvailable in both active (with fan) and passive (requires chassis airflow) cooling solutions\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1068\" data-end=\"1219\" class=\"\"\u003e\n\u003cp data-start=\"1070\" data-end=\"1219\" class=\"\"\u003e\u003cstrong data-start=\"1070\" 
data-end=\"1092\"\u003ePower Requirements\u003c\/strong\u003e: \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eTypically requires one 6-pin and one 8-pin PCIe power connector\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp data-start=\"1221\" data-end=\"1251\" class=\"\"\u003e\u003cstrong data-start=\"1221\" data-end=\"1250\"\u003eMotherboard Compatibility\u003c\/strong\u003e:\u003c\/p\u003e\n\u003cp data-start=\"1253\" data-end=\"1378\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eThe Tesla K40 is compatible with motherboards that have a PCIe 3.0 x16 slot and sufficient power delivery.\u003c\/span\u003e \u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eIt has been used in systems such as:\u003c\/span\u003e\u003c\/p\u003e\n\u003cul data-start=\"1380\" data-end=\"1671\"\u003e\n\u003cli data-start=\"1380\" data-end=\"1461\" class=\"\"\u003e\n\u003cp data-start=\"1382\" data-end=\"1461\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eHP Workstations: Z420, Z620, Z820\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1462\" data-end=\"1543\" class=\"\"\u003e\n\u003cp data-start=\"1464\" data-end=\"1543\" class=\"\"\u003e\u003cspan class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eLenovo ThinkStation P510\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003cli data-start=\"1544\" data-end=\"1671\" class=\"\"\u003e\n\u003cp data-start=\"1546\" data-end=\"1671\" class=\"\"\u003e\u003cspan 
class=\"relative -mx-px my-[-0.2rem] rounded px-px py-[0.2rem] transition-colors duration-100 ease-in-out\"\u003eSupermicro Socket 2011 boards\u003c\/span\u003e\u003c\/p\u003e\n\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"ShopITgear","offers":[{"title":"Default Title","offer_id":47923363250276,"sku":null,"price":124.95,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/KTDCH.jpg?v=1771296046"},{"product_id":"dell-ymd9w-nvidia-rtx-6000-ada-48gb-gddr6-pcie-ref","title":"Dell YMD9W, NVIDIA RTX 6000 ADA 48GB GDDR6 PCIe GPU, Refurbished","description":"\u003cp dir=\"auto\"\u003e\u003cstrong\u003eDell Part Number:\u003c\/strong\u003e YMD9W \u003cstrong\u003eManufacturer:\u003c\/strong\u003e NVIDIA (Dell-validated \/ OEM version) \u003cstrong\u003eModel:\u003c\/strong\u003e RTX 6000 Ada Generation (Lovelace Architecture)\u003c\/p\u003e\n\u003cp dir=\"auto\"\u003e\u003cstrong\u003eKey Specifications:\u003c\/strong\u003e\u003c\/p\u003e\n\u003cul dir=\"auto\"\u003e\n\u003cli\u003e\n\u003cstrong\u003eGPU Memory:\u003c\/strong\u003e 48GB GDDR6 with ECC\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eMemory Interface:\u003c\/strong\u003e 384-bit\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eMemory Bandwidth:\u003c\/strong\u003e 960 GB\/s\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eCUDA Cores:\u003c\/strong\u003e 18,176\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eTensor Cores:\u003c\/strong\u003e 568 (4th Generation)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eRT Cores:\u003c\/strong\u003e 142 (3rd Generation)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eSingle-Precision Performance:\u003c\/strong\u003e 91.1 TFLOPS\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eTensor Performance:\u003c\/strong\u003e Up to 1,457 TFLOPS\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eInterface:\u003c\/strong\u003e PCIe 4.0 x16\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003ePower 
Consumption (TDP):\u003c\/strong\u003e 300W\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003ePower Connector:\u003c\/strong\u003e 1x 16-pin (PCIe CEM5)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eOutputs:\u003c\/strong\u003e 4x DisplayPort 1.4a\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eForm Factor:\u003c\/strong\u003e Full Height, Full Length (FHFL), Dual-slot, Active cooling\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eMax Displays:\u003c\/strong\u003e 4x (supports up to 8K resolutions)\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp dir=\"auto\"\u003eThis is the \u003cstrong\u003eDell-branded\/validated version\u003c\/strong\u003e (YMD9W) of the professional NVIDIA RTX 6000 Ada, optimized for Dell PowerEdge servers (including the R7525 with GPU Enablement Kit) and Precision workstations. It delivers exceptional performance for:\u003c\/p\u003e\n\u003cul dir=\"auto\"\u003e\n\u003cli\u003eAI \/ Machine Learning inference and training\u003c\/li\u003e\n\u003cli\u003e3D rendering and visualization\u003c\/li\u003e\n\u003cli\u003eScientific simulations and data science\u003c\/li\u003e\n\u003cli\u003eProfessional graphics and CAD\u003c\/li\u003e\n\u003cli\u003eLarge dataset workloads (thanks to 48GB ECC memory)\u003c\/li\u003e\n\u003c\/ul\u003e","brand":"Resilient Tec, LLC","offers":[{"title":"Default Title","offer_id":48225201979492,"sku":null,"price":7495.0,"currency_code":"USD","in_stock":true}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/RTX6000.jpg?v=1775654297"},{"product_id":"dell-575tk","title":"Dell 575TK NVIDIA A100 80GB PCIe GPU, Refurbished","description":"\u003cp dir=\"auto\"\u003e\u003cstrong\u003eCondition: Refurbished\u003c\/strong\u003e\u003c\/p\u003e\n\u003cp dir=\"auto\"\u003e \u003c\/p\u003e\n\u003cp dir=\"auto\"\u003e\u003cstrong\u003eNVIDIA OEM Part Number:\u003c\/strong\u003e 900-21001-0020-000 (or similar 900-21001-XXXX variants) \u003cstrong\u003eModel:\u003c\/strong\u003e NVIDIA A100 80GB PCIe 
(Ampere Architecture)\u003c\/p\u003e\n\u003cp dir=\"auto\"\u003e\u003cstrong\u003eKey Specifications:\u003c\/strong\u003e\u003c\/p\u003e\n\u003cul dir=\"auto\"\u003e\n\u003cli\u003e\n\u003cstrong\u003eGPU Memory:\u003c\/strong\u003e 80GB HBM2e with ECC\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eMemory Bandwidth:\u003c\/strong\u003e 1,935 GB\/s\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eCUDA Cores:\u003c\/strong\u003e 6,912\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eTensor Cores:\u003c\/strong\u003e 432 (3rd Generation)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eRT Cores:\u003c\/strong\u003e N\/A (compute-focused)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eFP64 Performance:\u003c\/strong\u003e 9.7 TFLOPS\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eTensor Float-32 (TF32) Performance:\u003c\/strong\u003e 156 TFLOPS (312 TFLOPS with sparsity)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eFP16 \/ BF16 Tensor Core:\u003c\/strong\u003e Up to 312 TFLOPS (624 TFLOPS with sparsity)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eINT8 Tensor Core:\u003c\/strong\u003e Up to 624 TOPS (1,248 TOPS with sparsity)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eInterface:\u003c\/strong\u003e PCIe 4.0 x16\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003ePower Consumption (TDP):\u003c\/strong\u003e 300W\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003ePower Connector:\u003c\/strong\u003e 1x 8-pin EPS auxiliary power\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eForm Factor:\u003c\/strong\u003e Full Height, Full Length (FHFL), Dual-slot, Passive cooling (requires high airflow server environment)\u003c\/li\u003e\n\u003cli\u003e\n\u003cstrong\u003eFeatures:\u003c\/strong\u003e Multi-Instance GPU (MIG) support – up to 7 isolated instances @ 10GB each, NVLink support (for 2 GPUs)\u003c\/li\u003e\n\u003c\/ul\u003e\n\u003cp dir=\"auto\"\u003eThe \u003cstrong\u003eNVIDIA A100 80GB PCIe\u003c\/strong\u003e is a powerhouse data 
center GPU designed for demanding AI, machine learning, deep learning training\/inference, high-performance computing (HPC), data analytics, and large-scale simulations. The massive 80GB HBM2e memory makes it ideal for handling enormous models and datasets that smaller GPUs can't accommodate.\u003c\/p\u003e","brand":"Resilient Tec, LLC","offers":[{"title":"Default Title","offer_id":48225210073188,"sku":null,"price":22500.0,"currency_code":"USD","in_stock":false}],"thumbnail_url":"\/\/cdn.shopify.com\/s\/files\/1\/0767\/3482\/4548\/files\/575TK.jpg?v=1775654714"}],"url":"https:\/\/resilient-tec.com\/collections\/gpu.oembed","provider":"Resilient Tec, LLC","version":"1.0","type":"link"}