charlesarthur : nvidia   6

Nvidia shrugs off crypto-mining crash, touts live ray-tracing GPUs, etc • The Register
Katyanna Quach:
The demand for GPUs grew 40 per cent from last year to account for $2.66bn in sales, we're told. Popular online titles such as Fortnite and PUBG have helped Nvidia in the gaming department, which grew 52% in terms of revenue to $1.8bn. The boom in deep learning is also accelerating its data center business by 83%, to $760m, where its graphics cards are used as math accelerators. Nvidia’s automotive area is smaller with $161m in revenues, up 13% year-over-year. Its professional visualization arm grew 20% to $281m.

It was weakest in cryptocurrency mining. People just aren't buying Nvidia cards for crafting digital fun bucks any more, relatively speaking, and won't for a while, it seems. So that's good news for folks unable to get hold of an Nvidia card due to hoarding by crypto-coin nerds.

“Our revenue outlook had anticipated cryptocurrency-specific products declining to approximately $100 million, while actual crypto-specific product revenue was $18 million, and we now expect a negligible contribution going forward,” the biz reported during its earnings call with analysts on Thursday.

A few months back, CEO Jensen Huang said a shortage of its chips – particularly the GeForce series – was down to Ethereum mining. Prices skyrocketed for a brief period, have since been declining, and are returning to normal levels. Huang previously said Nvidia was not targeting the crypto industry, and wanted to reserve GeForce parts for gamers.


Basically, Nvidia expects zero revenue from people buying for mining in future. The candle burned bright, but it burnt out.
nvidia  crypto 
august 2018 by charlesarthur
How the cryptocurrency gold rush could backfire on NVIDIA and AMD • Tech.pinions
Ryan Shrout:
With all that is going right for AMD and NVIDIA because of this repurposed use of current graphics card product lines, there is a significant risk at play for all involved. Browse any gaming forum or subreddit and you’ll find just as many people unhappy with the cryptocurrency craze as you will happy with its potential for profit. The PC gamers of the world who simply want to buy the most cost-effective product for their own machines are no longer able to do so, with inventory snapped up the instant it shows up. And when they can find a card for sale, it is at a significantly higher price. A look at Amazon.com today for Radeon RX 580 cards shows starting prices at the $499 mark but stretching to as high as $699. This product launched with an expected MSRP of just $199-$239, making the current prices a more than 2x increase.

As AMD was the first target of this most recent coin mining boom, the Radeon brand is seeing a migration of its gaming ecosystem to NVIDIA and the GeForce brand. A gamer who decides a $250 card is in their budget for a new PC would find that the Radeon RX 580 is no longer available to them. The GeForce GTX 1060, with similar performance levels and price points, is on the next (virtual) shelf over, so that becomes the de facto selection. This brings the consumer into NVIDIA’s entire ecosystem: using its software like GeForce Experience, looking at drivers, game optimizations and free game codes, and inviting research into GeForce-specific technology like G-Sync. For Radeon, it has not lost a sale this generation (as the original graphics card that consumer would have bought has been purchased for mining) but it may have lost a long-term customer to its competitor.


Weird if cryptocurrencies squeeze PC gaming so much that it migrates elsewhere. And meanwhile, what is this rush to GPUs doing to big companies' machine learning efforts?
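
A quick check on that "more than 2x increase", using only the MSRP and Amazon figures Shrout quotes; the short Python sketch below is mine, not from the article.

```python
# Markup implied by the figures quoted above: launch MSRP vs Amazon street price
# for the Radeon RX 580 (the numbers are from the article, the arithmetic is mine).
msrp_low, msrp_high = 199, 239        # expected launch MSRP range, in USD
street_low, street_high = 499, 699    # Amazon listing range at the time, in USD

print(f"smallest markup: {street_low / msrp_high:.1f}x MSRP")   # ~2.1x
print(f"largest markup:  {street_high / msrp_low:.1f}x MSRP")   # ~3.5x
```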
nvidia  amd  gaming  cryptocurrency 
june 2017 by charlesarthur
Intel on the outside: the rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel • The Economist
This unipolar world [of Intel processors] is starting to crumble. Processors are no longer improving quickly enough to be able to handle, for instance, machine learning and other AI applications, which require huge amounts of data and hence consume more number-crunching power than entire data centres did just a few years ago. Intel’s customers, such as Google and Microsoft together with other operators of big data centres, are opting for more and more specialised processors from other companies and are designing their own to boot.

Nvidia’s GPUs are one example. They were created to carry out the massive, complex computations required by interactive video games. GPUs have hundreds of specialised “cores” (the “brains” of a processor), all working in parallel, whereas CPUs have only a few powerful ones that tackle computing tasks sequentially. Nvidia’s latest processors boast 3,584 cores; Intel’s server CPUs have a maximum of 28.

The company’s lucky break came in the midst of one of its near-death experiences during the 2008-09 global financial crisis. It discovered that hedge funds and research institutes were using its chips for new purposes, such as calculating complex investment and climate models. It developed a coding language, called CUDA, that helps its customers program its processors for different tasks. When cloud computing, big data and AI gathered momentum a few years ago, Nvidia’s chips were just what was needed.

Every online giant uses Nvidia GPUs to give their AI services the capability to ingest reams of data from material ranging from medical images to human speech. The firm’s revenues from selling chips to data-centre operators trebled in the past financial year, to $296m.
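
To make the parallelism point concrete: the sketch below is mine, not The Economist's, and it assumes an Nvidia GPU plus the Numba package (one of several ways to reach CUDA from Python). It launches one lightweight thread per array element, the kind of work a GPU's thousands of cores are built for.

```python
# Minimal CUDA-style kernel via Numba: one GPU thread per array element.
# Assumes an Nvidia GPU and `pip install numba` (an assumption, not from the article).
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # this thread's global index
    if i < out.size:          # guard against the last, partly-filled block
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)   # ~1M threads spread across the GPU's cores
```

A CPU would work through those million elements a handful at a time; the GPU spreads them across its cores in a single launch, which is why the same hardware maps so well onto the matrix arithmetic at the heart of deep learning.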
intel  nvidia  hardware 
february 2017 by charlesarthur
CES proves carmakers still confused about autonomous driving • The Information
Amir Efrati:
Mr. Hafner’s [of Mercedes, which has teamed up with Nvidia] comments are interesting given a view among traditionalists in the self-driving field—including people who work at Waymo (formerly Google), Baidu and Ford—that Nvidia’s approach, which is sometimes called “end-to-end deep learning,” either won’t work or is outright dangerous.

Coincidentally, a day before the Mercedes-Nvidia announcement, a primitive version of Nvidia’s “AI-trained” car being demonstrated in a parking lot outside the exhibition hall veered off course. It would have crashed into a portable wall if Nvidia engineers hadn’t remotely stopped it, according to a person who saw the incident.

Danny Shapiro, senior director at Nvidia’s automotive business, said in an interview that the car’s self-driving system, called “pilot net,” had been “trained” earlier in the week during cloudy conditions so when the sun came out on Thursday, the system was unprepared. He added that the car is not representative of Nvidia-powered autonomous driving systems because it was making driving decisions based on data from just one camera. Nvidia’s latest system supports vehicles with many more cameras and other sensors.

But how long will it take to train them in every conceivable weather, road and other condition?
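
For a sense of what "end-to-end deep learning" means here: the whole pipeline from camera pixels to a steering command is one network trained against recorded human driving, with no hand-written perception or planning layers in between. The toy sketch below is a PyTorch assumption of mine, not Nvidia's actual PilotNet; layer sizes are illustrative only.

```python
# Toy end-to-end driving model: camera frame in, steering angle out.
# A sketch assuming PyTorch; sizes are illustrative, not Nvidia's PilotNet.
import torch
import torch.nn as nn

class TinyPilotNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),   # infers the flattened size at first call
            nn.Linear(64, 1),               # single output: steering angle
        )

    def forward(self, frames):
        return self.head(self.features(frames))

model = TinyPilotNet()
frames = torch.randn(8, 3, 66, 200)   # stand-in dashcam frames (all "cloudy", say)
angles = torch.randn(8, 1)            # the human driver's recorded steering angles

loss = nn.MSELoss()(model(frames), angles)
loss.backward()   # one loss trains the entire pixels-to-steering pipeline
```

If every training frame was captured under cloud, nothing in that loss ever penalises the model for misreading bright sunlight, which is the failure mode Shapiro describes.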
Mercedes  selfdrivingcar  nVidia  ai 
january 2017 by charlesarthur
Nvidia creates a 15bn-transistor chip for deep learning » VentureBeat
Dean Takahashi:
Nvidia chief executive Jen-Hsun Huang announced that the company has created a new chip, the Tesla P100, with 15 billion transistors for deep-learning computing. It’s the biggest chip ever made, Huang said.

Huang made the announcement during his keynote at the GPUTech conference in San Jose, California. He unveiled the chip after he said that deep-learning artificial intelligence chips have already become the company’s fastest-growing business.

“We are changing so many things in one project,” Huang said. “The Tesla P100 has five miracles.”

Nvidia previously launched its Tesla M4 and Tesla M40 deep-learning chips, and those chips are selling fast. The Tesla P100 is in volume production today, Huang said.

“We decided to go all-in on A.I.,” Huang said. “This is the largest FinFET chip that has ever been done.”


Maybe Intel could focus on GPUs instead of CPUs? Seems to be where the business is heading.
nvidia  ai 
april 2016 by charlesarthur
Samsung sues Nvidia for faking benchmarks comparing the Tegra K1 to the Exynos 5433 >> SamMobile
The Tegra K1 has been touted by Nvidia to be a “desktop-class” SoC, so powerful that it’s only meant for tablets (though nothing is stopping a manufacturer from using it on a phone). However, Samsung is now alleging that the Tegra K1 is not as powerful as those benchmark scores indicate, and has sued Nvidia for misleading consumers in benchmark figures that compare the SHIELD Tablet with the Galaxy Note 4.

The suit is, in fact, a countersuit against Nvidia’s lawsuit against Samsung earlier this year, which said the latter had infringed on some of Nvidia’s graphics-related patents in its mobile chips (and which has caused the US ITC to investigate some Samsung devices). Samsung’s lawsuit against the popular GPU manufacturer alleges that Nvidia also used six Samsung patents without licensing them; Samsung is also suing Velocity Micro, a company that uses Nvidia’s graphics cards and hence is accused of using two of the Korean manufacturer’s patents.

(1) I'm shocked, shocked that people might seek ways to fiddle benchmarks; (2) does anyone with any sense actually care about benchmarks on mobile processors, when some companies tie boat anchors around them by skinning Android?
samsung  nvidia 
november 2014 by charlesarthur
