Accelerating FPGA Adoption for AI Inference with the Inspur TF2 – Intel on AI – Episode 13

Intel on AI - A podcast by Intel Corporation

In this Intel on AI podcast episode: FPGA (field-programmable gate array) technology can offer a very high level of flexibility and performance with low latency. Yet, with steep software development requirements, limited performance optimization, and difficult power control, FPGA solutions can also be challenging to implement. Bob Anderson, General Manager of Sales for Strategic Accounts at Inspur, joins Intel on AI to talk about the Inspur TensorFlow-supported FPGA Compute Acceleration Engine (TF2). Bob illustrates how the TF2 helps customers deploy FPGA solutions more easily and take advantage of the customization and performance of FPGAs for AI inference applications. He also describes how the TF2 is especially well suited to image-based AI applications with demanding real-time requirements. To learn more, visit: inspursystems.com Visit Intel AI Builders at: builders.intel.com/ai