August 14, 2018
The Shifting Nature of Technology at Work
August 7, 2018
The Beauty of 4K
July 31, 2018
The Future of End User Computing
July 24, 2018
5G Complexity to Test Standards
July 17, 2018
California Data Privacy Law Highlights Growing Frustration with Tech Industry
July 10, 2018
Dual Geographic Paths to the Tech Future
July 3, 2018
The Changing Relationship Between People and Technology
June 12, 2018
The Business of Business Software
June 5, 2018
Siri Shortcuts Highlights Evolution of Voice-Based Interfaces
May 29, 2018
Virtual Travel and Exploration Apps Are Key to Mainstream VR Adoption
May 22, 2018
The World of AI Is Still Taking Baby Steps
May 15, 2018
Device Independence Becoming Real
May 8, 2018
Bringing Vision to the Edge
May 1, 2018
The Shifting Enterprise Computing Landscape
April 24, 2018
The "Not So" Late, "And Still" Great Desktop PC
April 17, 2018
The Unseen Opportunities of AR and VR
April 10, 2018
The New Security Reality
April 3, 2018
Making AI Real
March 27, 2018
Will IBM Apple Deal Let Watson Replace Siri For Business Apps?
March 20, 2018
Edge Servers Will Redefine the Cloud
March 13, 2018
Is it Too Late for Data Privacy?
March 6, 2018
The Hidden Technology Behind Modern Smartphones
February 27, 2018
The Surprising Highlight of MWC: Audio
February 20, 2018
The Blurring Lines for 5G
February 13, 2018
The Modern State of WiFi
February 6, 2018
Wearables to Benefit from Simplicity
January 30, 2018
Smartphone Market Challenges Raise Major Questions
January 23, 2018
Hardware-Based AI
January 16, 2018
The Tech Industry Needs Functional Safety
January 9, 2018
Will AI Power Too Many Smart Home Devices?
January 2, 2018
Top Tech Predictions for 2018
NVIDIA RTX Announcement Highlights AI Influence on Computer Graphics
August 21, 2018
By Bob O'Donnell
Sometimes it takes more than brute horsepower to achieve the most challenging computing tasks. At the Gamescom 2018 press event hosted by Nvidia yesterday, the company’s CEO Jensen Huang hammered this point home with the release of the new line of RTX 2070 and RTX 2080 graphics cards. Based on the company’s freshly announced Turing architecture, these cards are the first consumer-priced products to offer real-time ray tracing, a long-sought goal in the world of computer graphics and visualization. Achieving that goal, however, took advancements in both graphics technology and deep learning-based AI.
Ray tracing essentially involves the realistic creation of digital images by following, or tracing, the paths that light rays would take as they hit and bounce off objects in a scene, taking into consideration the material properties of those objects, such as reflectivity, light absorption, color and much more. It’s a very computationally intensive task that previously could only be done offline, not in real time.
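To make the idea concrete, here is a minimal sketch of the ray-tracing principle described above: cast one ray from the camera through each pixel, test it against a single sphere, and shade any hit point by how directly its surface faces a light source. This is an illustrative toy, not Nvidia's implementation; the scene values (sphere position, light direction, image size) are made-up examples.

```python
# Toy ray tracer: one primary ray per pixel, one sphere, one light.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest intersection, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Trace one ray per pixel; '#' marks a brightly lit hit, '+' a dim one."""
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = (0.577, 0.577, 0.577)  # unit vector pointing toward the light
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1,
            # then normalize to get the ray direction.
            px = 2 * (x + 0.5) / width - 1
            py = 1 - 2 * (y + 0.5) / height
            norm = math.sqrt(px * px + py * py + 1)
            d = (px / norm, py / norm, -1 / norm)
            t = ray_sphere_hit((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row += "."  # background
                continue
            hit = tuple(t * di for di in d)
            # Surface normal of a sphere points from its center to the hit.
            n = tuple((h - c) / radius for h, c in zip(hit, center))
            brightness = max(0.0, sum(a * b for a, b in zip(n, light)))
            row += "#" if brightness > 0.5 else "+"
        rows.append(row)
    return rows
```

Even at this tiny scale, every pixel requires an intersection test and a shading calculation; real scenes multiply that by millions of rays bouncing recursively, which is why real-time performance took dedicated hardware.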
What was particularly interesting about the announcement was how Nvidia ended up solving the real-time ray tracing problem—a challenge the company said it had worked on for roughly a decade. As part of their RTX work, the company created new compute subsystems inside their GPUs, called RT Cores, that are dedicated to accelerating the ray tracing process. While different in function, these are conceptually similar to programmable shaders and other more traditional graphics rendering elements that Nvidia, AMD, and others have created in the past, because they focus purely on the raw graphics aspects of the task.
Rather than simply using these new ray tracing elements, however, the company realized that they could leverage other work they had done for deep learning and artificial intelligence applications. Specifically, they incorporated several of the Tensor cores they had originally created for neural network workloads into the new RTX boards to help speed the process. The basic concept is that certain aspects of the ray tracing image rendering process can be sped up by applying algorithms developed through deep learning.
In other words, rather than using the brute-force method of rendering every pixel in an image purely through ray tracing, AI-inspired techniques like denoising are used to speed up the process. Not only is this a clever implementation of machine learning, but I believe it’s likely a great example of how AI is going to influence technological developments in other areas as well. While AI and machine learning are often thought of as delivering capabilities and benefits in and of themselves, they’re more likely to provide enhancements and advancements to existing technology categories by accelerating certain key aspects of those technologies, just as they have with computer graphics in this particular application.
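The denoising trade-off can be sketched in a few lines: a renderer that takes only a few samples per pixel produces a noisy estimate of the true image, and a filter then smooths that noise away instead of paying for many more samples. Nvidia's actual denoiser is a trained neural network; the simple 3x3 box filter below is only a stand-in to show the principle, and all of the names and values here are illustrative.

```python
# Illustration of "few samples + denoise" vs. "many samples":
# a noisy low-sample render is cleaned up by spatial filtering.
import random

def noisy_render(width, height, true_value=0.5, samples=4):
    """Simulate a few-samples-per-pixel render: each pixel averages a handful
    of random samples whose expected value is the true pixel value."""
    rng = random.Random(42)  # fixed seed so the result is reproducible
    return [[sum(rng.uniform(0.0, 2.0 * true_value) for _ in range(samples)) / samples
             for _ in range(width)]
            for _ in range(height)]

def box_denoise(img):
    """Replace each pixel with the mean of its 3x3 neighborhood,
    standing in for the learned denoiser."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def mean_error(img, true_value=0.5):
    """Average absolute deviation from the true pixel value."""
    return sum(abs(p - true_value) for row in img for p in row) / (len(img) * len(img[0]))
```

Running `mean_error` on the image before and after `box_denoise` shows the filtered version is closer to the true image, which is the whole point: cheap post-processing substitutes for expensive additional ray samples.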
It’s also important to remember that ray tracing is not the only image creation technique used on the new family of RTX cards, which will range in price from $499 to $1,199. Like all other major graphics cards, the RTX line will also support more traditional shader-based rasterization technologies, allowing products based on the architecture to work with existing games and other applications. In fact, to leverage the new ray tracing capabilities, games will have to be specifically designed to tap into the ray tracing features—they won’t simply show up on their own. Thankfully, it appears that Nvidia has already lined up some big-name titles and game publishers to support their efforts. PC gamers will also have to think carefully about the types of systems that can support these new cards, as they are very power hungry, demanding up to 250W on their own (and a minimum 650W power supply for the full desktop system).
For Nvidia, the RTX line is important for several reasons. First, achieving real-time ray tracing is a significant milestone for a company that’s been highly focused on computer graphics for 25 years. More importantly, though, it allows the company to combine what some industry observers had started to see as two distinct business focus areas—graphics and AI/deep learning/machine learning—into a single coherent story. Finally, the fact that it’s their first major gaming-focused GPU upgrade in some time can’t be overlooked either.
For the tech industry as a whole, the announcement likely represents one of the first of what will be many examples of companies leveraging AI/machine learning technologies to enhance their existing products rather than creating completely new ones.
Here's a link to the column: https://techpinions.com/nvidia-rtx-announcement-highlights-ai-influence-on-computer-graphics/53493
Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.