Deep learning articles
Number Representations in Computer Hardware, Explained
Welcome to one of the most fundamental areas of computer design: how numbers are represented in hardware! Modern computers use binary and are very efficient at it, but that wasn't always the case.
This is what comes next for Amazon's Alexa
Nvidia DLSS upscaling is now available in over 200 games
Nvidia pursuing multi chip module architecture to meet evolving data needs
Researchers develop deep-learning method for translating vocal signals from the brain to text
GAN Theft Auto is a neural network's attempt to recreate GTA5
Canvas from Nvidia can turn your digital scribbles into near photo-real images
Proposed EU regulations aim to restrict AI use based on its risk to public safety or liberty
UCSD scientists developed a technique that fools deepfake detection systems
Explainer: What Are Tensor Cores?
Nvidia has been making graphics chips that feature extra cores, beyond the normal ones used for shaders. Known as tensor cores, these mysterious units can be found in thousands of systems, but what exactly are they and what are they used for? Today we'll explain what a tensor is and how tensor cores are used in the world of graphics and deep learning.
Explainer: What Is Machine Learning?
Machine learning (ML) is the study of computer systems that automatically improve with experience. It has been a hot topic in recent years, but it's a concept that's been around for decades. IBM researcher and AI pioneer Arthur Samuel coined the term "machine learning" in 1959.
Nvidia reveals why it chose rival AMD over Intel for its deep learning system
Microsoft and Intel are working on a project that converts malware into images for easier identification
Learn AI and deep learning with this bundle, for a price you choose
Nvidia's GauGAN AI turns rough sketches into photorealistic images in real-time
Google makes speech-to-text available completely offline in Gboard
DeepSqueak is a deep-learning algorithm used to study ultrasonic rat chatter
AI demonstrates competency as a physician assistant
3DMark's Port Royal benchmark now supports Nvidia's DLSS
Intel announces the Neural Compute Stick 2 for deep learning acceleration
Nvidia DLSS: An Early Investigation
Today we're addressing one of the most frequent discussion topics surrounding the new RTX 2080 and RTX 2080 Ti graphics cards: is it worth buying an RTX 2080 for DLSS? Is DLSS the killer feature for the RTX cards? As with ray tracing, we won't really know until we have more to test with, but today we're doing an early investigation into DLSS using the demos currently within reach.