On June 13, Intel set the GPU world abuzz when it tweeted “Intel’s first GPU coming in 2020”. The media were quick to post stories of the incoming new GPU, which would add interesting competition to a market dominated by NVIDIA, with AMD a distant second.
NVIDIA has announced the availability of NVIDIA Isaac, a new platform to power the next generation of autonomous machines, bringing artificial intelligence (AI) capabilities to robots for manufacturing, logistics, agriculture, construction, and many other industries.
Taiwan is going big on artificial intelligence (AI) and its Ministry of Science and Technology (MOST) will be collaborating with NVIDIA on AI initiatives.
Power-efficient chipsets are set to be the main driver as artificial intelligence (AI) makes a significant shift from the cloud to the edge, according to ABI Research.
NVIDIA researchers have demonstrated how robots can be trained to observe and repeat human actions — a “first of its kind” capability powered by deep learning.
Dubbed “the brains behind the bots”, NVIDIA researchers will be heading to the International Conference on Robotics and Automation (ICRA) at Brisbane Convention and Exhibition Centre in Australia from May 21 to 25.
Tokyo-based startup incubator DEEPCORE is partnering with NVIDIA to support AI startups and promote university research programmes across Japan.
Adobe and NVIDIA have formed a strategic partnership to rapidly enhance their industry-leading artificial intelligence (AI) and deep learning technologies.
The GPU Technology Conference (GTC) has hit new highs with a record attendance of more than 8,000 participants, filling the entire San Jose McEnery Convention Center.
Cryptocurrency mining has been given a boost with the revelation that Samsung is working on a chip just for that purpose.
Think artificial intelligence (AI) and the advent of powerful thinking machines, and images of Arnold Schwarzenegger in The Terminator come to mind.
China carmaker Chery has adopted the new ZP ProAI system, powered by NVIDIA Drive AI self-driving technology, for its autonomous vehicles. The move will bring Level 3 autonomous driving to the world’s biggest auto market.
In a week where the world’s eyes were supposed to be focusing on the exciting new gadgets and technologies coming out at CES 2018, it was news from past technologies that had the world reeling.
Entelechy Asia turns five today. So much has changed since we launched in November 2012.
The need for deep learning skills is increasing as more and more companies and industries hop on the bandwagon. Launched a little more than a year ago, NVIDIA’s Deep Learning Institute (DLI) has already trained tens of thousands of students, developers and data scientists.
And the company is expanding its DLI offerings with:
- New partnerships: Teaming up with Booz Allen Hamilton and deeplearning.ai to train thousands of students, developers and government specialists in artificial intelligence (AI).
- New University Ambassador Program: Instructors worldwide can teach students critical job skills and practical applications of AI at no cost.
- New courses: More courses are added to teach domain-specific applications of deep learning for finance, natural language processing, robotics, video analytics, and self-driving cars.
With artificial intelligence (AI) being a hot topic this year, NVIDIA is organising its first AI-focused regional conference in Singapore on October 23 and 24.
The event will be held in two parts: the first day focuses on a Deep Learning Institute (DLI) workshop where participants will receive hands-on training on deep learning, while the second day is filled with keynote addresses, panel discussions and three tracks. It is targeted at data scientists and senior decision makers in the field of AI in both the public and private sectors.
“Singapore is aiming to be the world’s first smart nation and AI is playing a critical role. NVIDIA is well positioned to help drive the government’s Smart Nation initiative with the development of solutions based on AI. Our GPUs are making headlines across the world by enabling many breakthroughs in various industries using deep learning,” said Raymond Teh, Vice President of APAC sales and marketing at NVIDIA.
NVIDIA has teamed up with BINUS University and Kinetica to establish the first artificial intelligence (AI) research and development (R&D) centre in Indonesia.
Located at the university’s Anggrek Campus, the centre will support BINUS University’s aim to be the premier R&D hub for AI in Indonesia. Leveraging the power of NVIDIA’s GPUs, it will be a showcase of the commercial potential of GPU-accelerated deep learning applications.
“Today, we stand at the beginning of the AI computing era, ignited by a new computing model, GPU deep learning. This new model — where deep neural networks are trained to recognise patterns from massive amounts of data — has proven to be ‘unreasonably’ effective at solving some of the most complex problems in computer science. In this era, software writes itself and machines learn. Soon, hundreds of billions of devices will be infused with intelligence. AI will revolutionise every industry. NVIDIA provides the products and solutions to power this revolution,” said Raymond Teh, Vice President of APAC Sales and Marketing of NVIDIA.
NVIDIA is among a group of investors led by Chinese social media company Sina investing more than US$20 million in Chinese startup TuSimple.
Formed in 2015, TuSimple has more than 100 employees in R&D centres in Beijing and San Diego developing technology for autonomous long-distance freight delivery. It uses NVIDIA GPUs, NVIDIA DRIVE PX 2, Jetson TX2, CUDA, TensorRT, and cuDNN to develop its autonomous driving solution.
In June, the company successfully completed a 200-mile Level 4 test drive from San Diego to Yuma, Arizona, using NVIDIA GPUs and cameras as the primary sensor.
“I’m amazed at the quality of the papers presented. The project teams’ line of thinking and breakthrough concepts are refreshing,” exclaimed a leading artificial intelligence (AI) scientist at the International Conference on Machine Learning (ICML) in Sydney.
International Convention Centre Sydney was a massive hive of activity as 3,000 of the world’s top researchers, developers and students in AI gathered for ICML. The participants moved rapidly from one workshop to another and took great interest in the exhibition booths of top deep learning proponents such as NVIDIA, Google and Facebook.
With so many bright young talents, the event proved to be a good fishing ground for vendors, who held recruitment interviews at their booths as well as posted openings on the board.
At the inaugural Audi Summit in Spain, Audi revealed that its flagship 2018 A8 features a multitude of high-tech wizardry powered by NVIDIA.
“The car of the future will make its occupants’ life easier with the help of artificial intelligence (AI),” declared Rupert Stadler, Chairman of the Board of Audi, as he introduced such A8 features as Audi AI Traffic Jam Pilot, Remote Park Pilot, Natural Voice Control and Swarm Intelligence.
The A8 is packed with NVIDIA powered systems, including revolutionary new user interfaces, a new infotainment system, a new virtual cockpit, and new rear seat entertainment options.
NVIDIA is investing in Deep Instinct, an Israel-based startup that uses deep learning to thwart cyber attacks.
Deep Instinct uses a GPU-based neural network and CUDA to achieve 99 percent detection rates, compared with about 80 percent detection from conventional cyber security software. Its software can automatically detect and defeat the most advanced cyber attacks.
“Deep Instinct is an emerging leader in applying GPU-powered AI through deep learning to address cybersecurity, a field ripe for disruption as enterprise customers migrate away from traditional solutions. We’re excited to work together with Deep Instinct to advance this important field,” said Jeff Herbst, Vice President of Business Development of NVIDIA.
NVIDIA and Baidu have teamed up to bring artificial intelligence (AI) technology to cloud computing, self-driving vehicles and AI home assistants.
Baidu will deploy NVIDIA HGX architecture with Tesla Volta V100 and Tesla P4 GPU accelerators for AI training and inference in its data centres. Combined with Baidu’s PaddlePaddle deep learning framework and NVIDIA’s TensorRT deep learning inference software, researchers and companies can harness state-of-the-art technology to develop products and services with real-time understanding of images, speech, text and video.
To accelerate AI development, the companies will work together to optimise Baidu’s open-source PaddlePaddle deep learning framework on NVIDIA’s Volta GPU architecture.
NVIDIA is among six technology companies to receive a total of US$258 million in funding from the US Department of Energy’s Exascale Computing Project (ECP).
The funding is to accelerate the development of next-generation supercomputers with the delivery of at least two exascale computing systems, one of which is targeted by 2021.
Such systems would be about 50 times more powerful than the US’ fastest supercomputer, Titan, located at Oak Ridge National Laboratory.
SoftBank Group has taken a US$4 billion stake in NVIDIA, according to Bloomberg.
This dovetails nicely with SoftBank founder Masayoshi Son’s aim to become the biggest investor in technology over the next decade.
NVIDIA’s stock tripled last year and is surging again this year. The chipmaker is banking on driving the artificial intelligence (AI) trend with its powerful graphics processing units (GPUs).
With all the rage over artificial intelligence (AI) powering driverless cars, the same technology can also be used to keep drivers safe. It can act like a guardian angel, looking out for danger (watch video above).
NVIDIA’s AI Co-Pilot technology uses sensor data from the cameras and microphone inside and outside a car to track the environment around the driver. When the AI notices a problem — such as the driver looking away from an approaching pedestrian — it could sound an alert.
Hundreds of thousands of computers in 150 countries have been hit by the WannaCry ransomware. While users are scrambling to fix their computers, the top-of-mind question is whether this could have been avoided, and whether artificial intelligence could have predicted and prevented such an attack.
At GPU Technology Conference (GTC) in San Jose, California last week, Israeli firm Deep Instinct won the Most Disruptive Startup category in NVIDIA’s Inception Award. The firm is the first to use AI to predict and prevent malware attacks.
According to David Eli, Chief Technology Officer (CTO) of Deep Instinct, more than a million new malware threats are released daily, but most antivirus software focuses on known threats.
His firm’s graphics processing unit (GPU)-accelerated deep learning software detects malware in real time. Trained on hundreds of millions of files, the neural network learns to detect more threats and then uses its experience to predict new attacks.
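To make the idea concrete, here is a toy sketch of static-file classification in the spirit of what the article describes. The feature extraction and weights below are invented purely for illustration; Deep Instinct’s actual system is a large deep neural network trained on hundreds of millions of files, not this simple linear scorer.

```python
# Toy sketch only: score a file's raw bytes for "suspiciousness".
# All names and weights here are hypothetical illustrations.

def byte_histogram(blob: bytes) -> list[float]:
    """Normalised 256-bin byte-frequency histogram of a file's contents."""
    counts = [0] * 256
    for b in blob:
        counts[b] += 1
    total = max(len(blob), 1)
    return [c / total for c in counts]

def score(features: list[float], weights: list[float], bias: float) -> float:
    """Linear score; a real deep model would stack many nonlinear layers."""
    return sum(f * w for f, w in zip(features, weights)) + bias

# Hypothetical weights that flag files dominated by high-value bytes,
# a crude stand-in for the packed/obfuscated payloads malware often uses.
weights = [0.0] * 128 + [1.0] * 128
benign = byte_histogram(b"hello world" * 10)
packed = byte_histogram(bytes(range(128, 256)) * 10)
assert score(packed, weights, 0.0) > score(benign, weights, 0.0)
```

The point of the sketch is the pipeline, not the model: extract features from raw file bytes, score them with learned parameters, and flag anything above a threshold without needing a signature for that specific threat.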
“Winning this prize is the ultimate recognition from the deep learning industry because deep learning and NVIDIA are synonymous,” said Eli.
This is the inaugural year of NVIDIA’s Inception Awards, which recognise startups in three categories – Hottest Emerging, Most Disruptive and Social Innovation. Winners received significant cash prizes and graphics processing unit (GPU) hardware to further accelerate their activities.
Emerging out of stealth mode in November 2015, Deep Instinct’s patent-pending application of deep learning to cybersecurity results in cutting-edge capabilities of unmatched accurate detection and real-time prevention.
Leveraging the capabilities associated with deep learning, Deep Instinct provides instinctive protection on any device, platform, and operating system. Zero-day and APT attacks are immediately detected and blocked before any harm can happen to the enterprise’s endpoints, servers, and mobile devices.
“Deep Instinct relies on end-to-end deep learning for all its advanced malware detection and prevention capabilities. The deep neural network is trained on hundreds of millions of malicious and legitimate files. To handle such large-scale training, Deep Instinct developed its proprietary deep learning infrastructure directly on NVIDIA’s GPU machines,” said Eli.
“The powerful capabilities of NVIDIA GPUs enable us to perform our training at a substantially faster speed compared to CPUs: while training the Deep Instinct brain on NVIDIA’s GPUs takes a little over a single day of training, the same task on CPUs would take more than three months,” he added.
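As a back-of-envelope check of the speedup Eli describes, taking “more than three months” as roughly 90 days against a single day on GPUs gives a speedup on the order of 90x:

```python
# Rough arithmetic behind the quoted GPU-vs-CPU training comparison.
# The 90-day figure is an assumption for "more than three months".
gpu_days = 1.0
cpu_days = 3 * 30
speedup = cpu_days / gpu_days
print(f"approx. {speedup:.0f}x faster on GPUs")  # approx. 90x faster on GPUs
```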
“We are thrilled to be recognised by NVIDIA for what we believe is a groundbreaking application of GPUs. Being able to leverage powerful technological capabilities to apply deep learning to cybersecurity empowers enterprises with unprecedented, real-time protection from the next unexpected attack,” said Guy Caspi, Chief Executive Officer of Deep Instinct.
As a sign of its coming of age, the GPU Technology Conference (GTC), held annually in San Jose, California, since 2009, is no longer a niche event but one that draws the who’s who of the technology industry.
NVIDIA’s shift of focus from being a visual computing company to an AI company has certainly played a big part in the expansion of the conference. It has attracted around 50 sponsors and 150 exhibitors on top of more than 7,000 participants.
However, it’s not the number of sponsors and exhibitors but rather the quality that is worthy of attention. The line-up of technology firms includes luminaries such as Adobe, Alibaba, Amazon, Autodesk, Cisco, Cray, Dell EMC, DreamWorks Animation, IBM Watson, Lenovo, Microsoft, Samsung Electronics, Verizon Labs, VMware, and Yahoo Research.
With AI being such a prime mover of autonomous vehicles, it is also not surprising that leading names in the automotive industry were also present — BMW Group, Chevron, ExxonMobil, Ford Motor Company, General Motors, Honda Research Institute, and Mercedes-Benz R&D North America.
Amid the various booths showcasing VR technologies was one by NASA Ames Research Center, which showed a VR demonstration on Mars.
Artificial intelligence (AI) is not new. In fact, it has had so many false starts over the past 60 years that the term went into hibernation for a long time.
Research into AI began way back at Dartmouth College in 1956, and the field was repeatedly touted as the next frontier in the 1980s, when mainframe computers ruled and supercomputers were a ginormous investment that very few could afford.
Despite the research put in over the years, the technology never quite took off and fell flat in many instances.