The GPU — Scoobi LAND Sale — 7th Wave

SCooBi-Doge
13 min read · Mar 19, 2022
Starting on Monday March 21st, 2022 at 1pm GMT

The sixth wave ended last Monday, and the woofish seventh wave launches on Monday, March 21st, 2022 at 1 PM GMT. Be ready to purchase and own one of the 32 Lands that will be listed. As scheduled, the price will increase to 0.24 ETH per parcel.

The 7th wave is on the stove, and we are introducing The Spaceship as a LAND NFT along with a new unique perk, The GPU, an EXTRA NFT that will give you access to many more perks in the Scoobiverse over time.

Sale starts Monday March 21st 2022 on Opensea: https://opensea.io/collection/scoobi-lands

The Spaceship

Somewhere in the far end of the galaxy there are spaceships running on ether as gasoline. One of those ships, obviously the most prestigious one, has a captain named “Vitalik”. The principal bases for his ship are on moons, as he has loved their powerful gleam since he was a baby. He often carries heavy loads of artifacts to Mars and to Earth to satiate the devouring desire of the entities he meets.

The Spaceship just landed on a moon

Enveloped or pushed by the grayness of moon rocks, and pulled by the reddish color of the lava or, more precisely, the IMPs (“Irregular Mare Patches”) showing up all over the landscape, this area is located near Vitalik’s main repository. If you start digging there, you can find tons of commit messages buried in a very special lunar rare earth.

Vitalik the Alien sitting on Scoobi and Snoop’s Pink Cadillac with Shib and Dogira behind him.

Every first-class warship has a lot of storage space, and so does his vessel, but it’s not unlimited, so he repeatedly has to empty his trash; the code name is GC, for “Garbage Collect”. Deep inside, Vitalik is a green ecologist and really cares about the worldwide community of living beings. Refilling his tanker with fruits and living animals is one of his favorite tasks, and evidently he needs a large amount of clear, drinkable water that he calls “Gas” to sustain his body and his inventions during his long trips, which never last less than 4.2069 years.

32 x The Spaceship LAND NFTs available on OpenSea

Anyway, now you get it, and soon you will know what his final plan is and what will happen to Draculon Musk, who secretly signed a smart contract on an ambigram-and-palindrome day, which makes it last for a minimum of 50 years…

The GPU

An EXTRA NFT that comes along with the purchase of a LAND NFT.

Sale starts Monday March 21st 2022 on Opensea: https://opensea.io/collection/scoobi-lands

God damn GPUs, they are everywhere: in cars, airplanes, boats, drones, radars, antennas, satellites, buildings and many more. They are a big part of our hurried evolution, the base that carries the artificial intelligence we are creating. Used everywhere to render the virtual world we are heading straight into, Graphics Processing Units, or GPUs, are the backbone of the democratization of computational power.

Let’s dive into it. A supercomputer is a network of computers (nodes) housed in a pretty well-secured warehouse. One of the most powerful is IBM’s, supported and sponsored by the U.S. Department of Energy and code-named “Summit”: it has 4,608 nodes, each with 2 IBM POWER9 CPUs and 6 Nvidia Tesla V100 GPUs, over 600 GB of coherent memory addressable by all CPUs and GPUs, plus 800 GB of non-volatile RAM per node. This is obviously a huge amount of computational power; we can definitely call it gold 3.0!

The supercomputing Center of Excellence established by IBM and NVIDIA at the Lawrence Livermore National Laboratory in Livermore, California.

A unique masterpiece built by deeply devoted human beings who spent countless hours to accomplish their mission. So, 6 GPUs multiplied by 4,608 nodes makes 27,648 GPUs… How much do you think one of these top-notch, world-class V100 GPUs costs? About 13k USD, the public consumer price for people like you and me. You can buy one on Amazon if you want :) here. I’ll let you do the calculation to find out the exact cost of buying those 27,648 Nvidia V100 GPUs to build this kind of tiny personal computer… because I’m getting really too old for that and tired of doing it all the time :P
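If you would rather let a machine do the arithmetic, here is a minimal back-of-the-envelope sketch in Python; the roughly 13k USD unit price is the estimate quoted above, not an official quote:

```python
# Back-of-the-envelope cost of Summit's GPUs, using the figures quoted above.
NODES = 4_608             # nodes in the Summit supercomputer
GPUS_PER_NODE = 6         # Nvidia Tesla V100 GPUs per node
UNIT_PRICE_USD = 13_000   # rough consumer price per V100 (the estimate above)

total_gpus = NODES * GPUS_PER_NODE        # 27,648 GPUs
total_cost = total_gpus * UNIT_PRICE_USD  # roughly 359 million USD

print(f"{total_gpus:,} GPUs x ${UNIT_PRICE_USD:,} each = ${total_cost:,}")
```

At those prices the GPUs alone come to roughly 359 million USD, before counting the CPUs, memory, networking, cooling and the building around them.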

But what can you do with that? Supercomputers were originally developed for code cracking, as well as ballistics. They were designed to perform an enormous number of calculations at a time, which was a big improvement over, say, 20 mathematics graduate students in a room, hand-scratching operations.

The first supercomputer, the CDC 6600, released in 1964 and installed at CERN (Europe). more info

Currently, those massive, hulking, overheating machines that were the world’s introduction to computing are used to simulate nuclear weapons capability. And whenever you check the weather app on your phone, the National Oceanic and Atmospheric Administration (NOAA) is using a supercomputer called the Weather and Climate Operational Supercomputing System to forecast the weather. Not only that, but if a million people are playing WoW (World of Warcraft) at a time, graphics and speed are of utmost importance. Enter the supercomputers, used to make the endless calculations that help the game go global. Thank you for your service, dear GPU.

Speaking of games, we won’t forget Deep Blue, the supercomputer that beat chess champion Garry Kasparov in a six-game match in 1997. And then there’s Watson in 2011, the IBM supercomputer that famously beat Ken Jennings in an intense game of Jeopardy. Currently, Watson is being used by a health insurer to predict patient diagnoses and treatments.

Supercomputer Deep Blue on the left and Watson on the right

But, like the canonists in the thirteenth century when the feudal system was dying, we live in a time of tremendous social and technological change. Consider Go, a complicated game of Chinese origin. In 2016, a London laboratory called DeepMind developed the first computer program to defeat a world champion at Go. The program was trained on thirty million moves played by human experts, and it had some capacity to learn.

Last fall, a new version of DeepMind’s AlphaGo program was released: a computer program that did not use any moves from human experts to train. It learned by playing millions of games against itself. After that training, it took on its predecessor program, already the strongest player in the world, and defeated it one hundred games to zero. And then, using something called “transfer learning,” it was also able to defeat chess computers at chess.

Pause to think about that for a moment. A computer program that used no human data beat other machines at a game it was not programmed to play.

It’s not hard to imagine how machine learning like this will change our lives dramatically. If you are interested in knowing more about AI, read this or watch this video:

The history of the GPU

Can you really imagine that 50 years ago some dudes were going freaking crazy and jumping all over the room when they managed, for the first time, to play the simplest video game ever, “Pong”, a table tennis–themed sports video game originally designed and built by Atari engineer Allan Alcorn as a training exercise? It was a huge success, and they didn’t know it at the time, but they had created what would become a multi-billion-dollar market.

I’m still trying to put myself in their shoes, but it’s hard to imagine: who would nowadays spend hours all night playing Pong? Who can really conceptualize the day when, at the beginnings of the public internet connection, they were able to play against someone living on the other side of the planet in real time!? We guess that for those of us born into it, who grew up with it since childhood, it’s very hard to feel what they really went through at that particular moment.

On the left “Pong” in 1972 and on the right “Soul Calibur VI”

The GPU (Graphics Processing Unit) or VPU (Visual Processing Unit) handles the display and acceleration of graphics (especially 3D). It consists of one or more processors and also has its own RAM. Modern GPUs also handle more complex calculations (physics) and can be used for heavy parallel processing tasks. This covers pretty much everything, so in other words, it’s worthwhile to talk about GPU history.

As early as 1951, MIT built the Whirlwind, a flight simulator for the Navy. Although it may be considered the first 3D graphics system, the base of today’s GPUs was formed in the mid-70s with so-called video shifters and video address generators. They carried information from the central processor to the display. Specialized graphics chips were widely used in arcade system boards. Jay Miner was a brilliant integrated circuit designer who moved to Atari during the late 1970s. One of his first breakthroughs was the design of the TIA circuit, which was the display hardware of the famous Atari 2600 gaming console. In 1976, RCA built the “Pixie” video chip, which was able to output a video signal at 62×128 resolution.

Tomb Raider 3D evolution from 1996 to present

In 1981, IBM started using monochrome and color display adapters (MDA/CGA) in its PCs. Not a modern GPU yet, it was a particular computer component designed for one purpose: to display video. At first, it handled 80 columns by 25 lines of text characters or symbols. The iSBX 275 Video Graphics Controller Multimodule Board, released by Intel in 1983, was the next revolutionary device. It was able to display eight colors at a resolution of 256x256, or monochrome at 512x512.

The evolution of some other famous games over time.

IBM Professional Graphics Adapter

In 1984 IBM introduced its first GPU called Professional Graphics Controller (PGC) or Professional Graphics Adapter (PGA). In essence, it was an expansion card that could accelerate 3D graphics as well as 2D graphics. It consisted of three separate boards that were connected together, and it had its own CPU along with dedicated RAM (an Intel 8088 CPU and 320KB RAM).

The PGC supported resolutions of up to 640 x 480 pixels, with 256 colors simultaneously shown on the display and a refresh rate of 60 frames per second. Its price was $4,290 when it was first introduced. This specific GPU didn’t manage to achieve notable commercial success, not only because of its high price, but mainly because of the lack of adequate software support. However, the PGC is still considered an important milestone in the history of GPUs.

ATI will consolidate into AMD

In 1985, three Hong Kong immigrants in Canada formed Array Technology Inc, soon renamed ATI Technologies (now part of AMD). This company would lead the market for years with its Wonder line of graphics boards and chips. In 2000, ATI switched to the Radeon branding still in use today. ATI continued to operate independently until 2006, and even after its acquisition by AMD, products were sold under the ATI brand for several more years.

Customers should come first, at every stage of a company’s activities. — AMD’s customer driven approach since 1969

Here’s the evolution of AMD Radeon GPUs

S3 Graphics introduced the S3 86C911, named after the Porsche 911, in 1991. The name was meant to indicate the performance increase. This card spawned a crowd of imitators: by 1995, all major makers of graphics cards had added 2D acceleration support to their chips. Throughout the 1990s, the level of integration of video cards was significantly improved with the addition of application programming interfaces (APIs).

Overall, the early 1990s was the time when a lot of graphics hardware companies were founded, and then acquired or forced out of business. Among the winners founded during this time was NVIDIA. By the end of 1997, this company had nearly 25 percent of the graphics market. In the following graphic you can see who is still the leader nowadays.

Nvidia was still leading semiconductor suppliers in global revenue in 2017

3D Revolution (1995–2006)

The history of modern GPUs starts in 1995 with the introduction of the first 3D add-in cards, and later the adoption of 32-bit operating systems and affordable personal computers. Previously, the industry was focused on 2D and non-PC architectures, and graphics cards were mostly known by alphanumeric names and huge price tags.

3DFx’s Voodoo graphics card, launched in late 1996, took over about 85% of the market. Cards that could only render 2D became obsolete very fast. The Voodoo1 steered clear of 2D graphics entirely; users had to run it together with a separate 2D card. But it still was a godsend for gamers. The company’s next product, Voodoo2 (1998), had three onboard chips and was one of the first video cards ever to support parallel operation of two cards within a single computer.

The first series of 3DFx’s Voodoo GPUs

With the progress of manufacturing technology, video, 2D GUI acceleration and 3D functionality were all integrated into one chip. Rendition’s Verite chipsets were among the first to do this well. 3D accelerator cards were not just rasterizers any more.

Finally, the “world’s first GPU” came in 1999! This is how Nvidia promoted its GeForce 256. Nvidia defined the term graphics processing unit as “a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.”

Evolution of gaming consoles from 1980 to 2012

The rivalry between ATI and Nvidia was the highlight of the early 2000s. Over this time, the two companies went head to head and delivered graphics cards with features that are now commonplace: for example, the capability to perform specular shading, volumetric explosions, waves, refraction, shadow volumes, vertex blending, bump mapping and elevation mapping.

General Purpose GPUs (2006–present day)

The era of general purpose GPUs began in 2007. Both Nvidia and ATI (since acquired by AMD) had been packing their graphics cards with ever more capabilities.

However, the two companies took different tracks toward general-purpose computing on GPUs (GPGPU). In 2007, Nvidia released its CUDA development environment, the earliest widely adopted programming model for GPU computing. Two years later, OpenCL became widely supported. This framework allows for the development of code for both GPUs and CPUs, with an emphasis on portability. Thus, GPUs became more generalized computing devices.
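To give a rough feel for what GPGPU programming looks like in practice, here is a minimal sketch of a data-parallel vector addition written in Python using the Numba library’s CUDA support. It assumes a CUDA-capable GPU plus the numba and numpy packages, and it illustrates the general idea rather than any official CUDA or OpenCL sample:

```python
import numpy as np
from numba import cuda  # assumes the numba package and a CUDA-capable GPU


@cuda.jit
def add_vectors(a, b, out):
    # Each GPU thread computes one element of the result in parallel.
    i = cuda.grid(1)
    if i < out.shape[0]:
        out[i] = a[i] + b[i]


n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

# Launch enough blocks of 256 threads to cover all n elements.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_vectors[blocks, threads_per_block](a, b, out)

print(out[:5])  # matches a[:5] + b[:5]
```

Every element is handled by its own GPU thread, which is exactly the kind of massive parallelism that made GPUs attractive far beyond graphics.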

In 2010, Nvidia collaborated with Audi, using Tegra GPUs to power the cars’ dashboards and improve navigation and entertainment systems. These advancements in graphics cards in vehicles pushed self-driving technology forward.

NVIDIA Titan V, 2017

Pascal is the newest generation of graphics cards by Nvidia, released in 2016. Its 16 nm manufacturing process improves upon previous microarchitectures. AMD released the Polaris 11 and Polaris 10 GPUs, featuring a 14 nm process, which resulted in a robust increase in performance per watt. However, the energy consumption of modern GPUs has increased as well.

Cryptocurrency mining rig with 6 Radeon R7 370 GPUs and 2 power supplies.

Today, graphics processing units are not only for graphics. They have found their way into fields as diverse as machine learning, oil exploration, scientific image processing, statistics, linear algebra, 3D reconstruction, medical research and even stock option pricing. GPU technology keeps adding more programmability and parallelism to a core architecture that is ever evolving towards a general purpose, CPU-like core.

Hopefully you are now all set and know way more about how this kind of money has been fabricated. We have decided to mint this particular NFT to exemplify the symbolic importance of Graphics Processing Units in the last century of human history.

Sale starts Monday March 21st 2022 on Opensea: https://opensea.io/collection/scoobi-lands

An EXTRA NFT that comes along with the purchase of a LAND NFT.

This post was inspired by several sources: https://www.investopedia.com/articles/07/roots_of_money.asp
https://medium.com/altumea/a-brief-history-of-gpu-47d98d6a0f8a
https://blog.aarons.com/lifestyle/history-of-gaming-consoles
https://www.tomshardware.com/picturestory/735-history-of-amd-graphics-2.html

💡 Reminder: We will release 2 to 4 acts per year. Some of the NFTs created for each act will be airdropped to reward long-term holders, diamond hands who never sell their Scoobi tokens, or at least not their whole bag. The last episode of each act will only be available through airdrop. You will need to hold at least 10M Scoobi to be eligible for the Scoobi NFT drops. The weight of each holder is calculated by a custom-made algorithm which takes into account the amount of Scoobi held, the LP provided (which gives a big boost to your weight) and the amount of time you have been holding those tokens. If you sell all the Scoobi you are holding, your weight will not be reset to zero right away; instead it will decrease slowly and gradually, as illustrated in the toy sketch below.
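For illustration only, here is a toy Python sketch of how a holder-weight function of this general shape could look. The function, coefficients and decay rate below are invented for the example; they are not the team’s actual algorithm, which is custom-made and not public:

```python
from dataclasses import dataclass

# All numbers below are hypothetical; the real algorithm and parameters are not public.
LP_BOOST = 3.0        # assumed multiplier rewarding liquidity providers
DECAY_PER_DAY = 0.02  # assumed gradual decay applied after selling the whole bag


@dataclass
class Holder:
    scoobi_held: float       # amount of Scoobi tokens currently held
    lp_provided: float       # liquidity provided, in Scoobi-equivalent terms
    days_holding: float      # how long the tokens have been held
    days_since_sellout: int  # days since the holder sold everything (0 = never)


def holder_weight(h: Holder) -> float:
    """Toy weight: balance plus boosted LP, scaled by holding time, fading after a sell-off."""
    base = h.scoobi_held + LP_BOOST * h.lp_provided
    weight = base * (1.0 + h.days_holding / 365.0)
    if h.scoobi_held == 0 and h.days_since_sellout > 0:
        # Not reset to zero immediately; it decreases slowly and gradually instead.
        weight *= max(0.0, 1.0 - DECAY_PER_DAY * h.days_since_sellout)
    return weight


print(holder_weight(Holder(scoobi_held=15_000_000, lp_provided=5_000_000,
                           days_holding=200, days_since_sellout=0)))
```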

The SCooBi Doge Team

Twitter : https://twitter.com/ScoobiDoge
Telegram : https://t.me/scoobidoge
Github : https://github.com/Scoobi-doge/Scoobi-doge.github.io
Website : https://scoobidoge.com/
Discord : https://discord.gg/zdnWZgPTEH

⚠️ Disclaimer: We do not guarantee anything. Past performance is not a reliable indicator of future results and investors may not recover the full amount invested. The value of this collection can greatly fluctuate as a result of Scoobi’s investment policy and it is not guaranteed. The above references an opinion and it is for information purposes only. It is not intended to be investment advice. Seek a duly licensed professional for investment advice.

SCooBi-Doge

Scoobi-Doge is a Comic NFT MEME DAO building a collection and gaming ecosystem with a governance token based upon a decentralized voting system. scoobi.space