Amazon takes on big guns in machine learning

A new contender has entered the machine learning arena. While heavyweights such as Google, Intel and NVIDIA have been slugging it out, Amazon has thrown its hat into the ring with its newly launched Inferentia chip.

Just like Google, which introduced its Tensor Processing Unit (TPU) in 2016, Amazon now has its own chip to power its cloud computing services. It may no longer need to rely on chips from Intel and NVIDIA.

Inferentia is a machine learning inference chip designed to deliver high performance at low cost. It will support the TensorFlow, Apache MXNet, and PyTorch deep learning frameworks, as well as models that use the ONNX format.
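For readers unfamiliar with ONNX, the sketch below shows roughly what "models that use the ONNX format" means in practice: a model built in one framework (PyTorch here) is serialized to a framework-neutral file that other runtimes can load. The toy model and file name are illustrative assumptions, not part of Amazon's tooling.

```python
# A minimal, illustrative sketch (not Amazon's API): exporting a small
# PyTorch model to the ONNX interchange format that Inferentia is said
# to accept. The model and file name here are hypothetical examples.
import torch
import torch.nn as nn

# A toy classifier standing in for any trained PyTorch model.
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.eval()

# ONNX export works by tracing the model with a dummy input of the
# expected shape, then serializing the traced graph to a .onnx file.
dummy_input = torch.randn(1, 784)
torch.onnx.export(model, dummy_input, "classifier.onnx")
```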

According to Amazon, some inference workloads require an entire graphics processing unit (GPU) or have extremely low latency requirements; meeting those demands at low cost, the company argues, calls for a dedicated inference chip.

In April, e-commerce giant Alibaba announced plans to design its own artificial intelligence (AI) processors.

These moves, though costly for the companies involved, point to the growing impact of AI in the years ahead.
