
IBM announced its intent to build an exaflop system.



#1

Buyck Ruben

  • >25 posts
  • 47 posts
  • User

Posted on 21 September 2009 - 14:40

A quintillion operations per second? It's possible

No computer yet built has performed 1,000,000,000,000,000,000 operations in a single second. That would be a thousand times faster than today's fastest supercomputer. But that's exactly the challenge that IBM has set for itself, the next "moon shot" in high-performance computing: on June 23, 2009, IBM announced its intent to build an exaflop system, capable of unimagined power and benefit to the world.

"Exascale computing"—the performance of one million trillion calculations, an exaflop, in a single second by a single computer—isn't possible today. (That's the equivalent of the combined performance of 50 million laptop computers, a stack that would be 1,000 miles high and weigh over 100,000 tons.) But given its research capabilities and the next-generation of architectures it is developing, IBM engineers know it could be. What might be possible with such computing power that isn't possible today? From pharmaceutical and genetic research to assessing financial risk with pinpoint accuracy or modeling the effects of climate change over the course of a century, breaking the exaflop barrier will enable advances in solving many of the world's present challenges.
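IBM's laptop comparison can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope check in Python; the per-laptop speed (~20 gigaflops) and laptop thickness (~3.2 cm) are assumptions, roughly right for a 2009 machine:

```python
# Back-of-envelope check of the "50 million laptops" comparison.
# Assumed figures: ~2e10 FLOPS and ~3.2 cm thickness per 2009 laptop.
EXAFLOP = 1e18              # operations per second
LAPTOP_FLOPS = 2e10         # assumed per-laptop rate (~20 gigaflops)
LAPTOP_THICKNESS_M = 0.032  # assumed per-laptop thickness

laptops = EXAFLOP / LAPTOP_FLOPS
stack_miles = laptops * LAPTOP_THICKNESS_M / 1609.34  # metres -> miles

print(f"{laptops:,.0f} laptops, stack ~{stack_miles:,.0f} miles high")
```

With these assumptions the numbers come out as quoted: 50 million laptops, stacked roughly 1,000 miles high.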
Exascale possibilities for a smarter planet

Computers capable of operating at the rate of one exaflop could unlock the following possibilities and much more:

More than doubling the world's oil reserves
Today's oil recovery techniques—the finding and drilling of oil deposits—have a success rate of only 30%. Tomorrow's exascale computing could predict with incredible accuracy the location of oil deposits, increasing those recovery rates to as high as 70%.

Predicting and fighting pandemics in real-time
Today's supercomputers allow scientists to simulate incredibly complex biological functions—identifying the origin of diseases and discovering new treatments. However, these complex tasks can take weeks, even with the help of today's most powerful computers. In the hands of tomorrow's scientists, exascale systems can turn around disease prediction, identification and cures in real time, allowing doctors to outrun the epidemics of tomorrow.

Real-time analysis of oceans of financial services data
Today's supercomputers—like those used by TD Securities—can speed advanced financial calculations by 2000% when compared to traditional methods. Tomorrow's exascale financial calculations will include real-time, intelligent analysis of important factors such as investor profile data, live market trading dynamics, RSS news feeds and social networks—helping control financial risk and provide more accurate valuations of assets and investments.

Link: http://www-07.ibm.co...ers/index3.html

source: IBM


Square Kilometre Array (SKA) telescope project.
Another link: http://www.computerw...orth_processing


#2

Buyck Ruben

  • >25 posts
  • 47 posts
  • User

Posted on 28 September 2009 - 16:13

To clarify: this computer would be 1000× more powerful than the current supercomputer Roadrunner, also built by IBM.
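That factor of 1000 checks out against Roadrunner's published Linpack result, roughly 1.026 petaflops in 2008 (treat the exact figure as approximate):

```python
# Sanity check: how many Roadrunners make an exaflop system?
ROADRUNNER_PFLOPS = 1.026   # Roadrunner's 2008 Linpack figure, petaflops
EXAFLOP_PFLOPS = 1000.0     # 1 exaflop = 1000 petaflops

speedup = EXAFLOP_PFLOPS / ROADRUNNER_PFLOPS
print(f"~{speedup:.0f}x Roadrunner")  # roughly a factor of 1000
```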

Possibilities in science for an exaflop system, as the Blue Brain Project also envisions:



Gathering and Testing 100 Years of Data

The most immediate benefit is to provide a working model into which the past 100 years' knowledge about the microstructure and workings of the neocortical column can be gathered and tested. The Blue Column will therefore also produce a virtual library to explore in 3D the microarchitecture of the neocortex and access all key research relating to its structure and function.

Cracking the Neural Code

The Neural Code refers to how the brain builds objects using electrical patterns. In the same way that the neuron is the elementary cell for computing in the brain, the NCC (neocortical column) is the elementary network for computing in the neocortex. Creating an accurate replica of the NCC which faithfully reproduces the emergent electrical dynamics of the real microcircuit is an absolute requirement to revealing how the neocortex processes, stores and retrieves information.
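The idea of neurons turning input into electrical patterns can be illustrated with a deliberately toy model. The sketch below is not the Blue Brain approach (which simulates morphologically detailed neurons on Blue Gene hardware); it is a minimal leaky integrate-and-fire neuron in Python, with all parameter values chosen purely for illustration:

```python
# Toy leaky integrate-and-fire (LIF) neuron -- a vastly simplified
# stand-in for the detailed compartmental neurons the Blue Brain Project
# simulates. All parameter values below are illustrative assumptions.
def simulate_lif(input_current=1.5, v_rest=0.0, v_thresh=1.0,
                 tau=10.0, dt=0.1, steps=1000):
    """Euler integration of dV/dt = (-(V - v_rest) + I) / tau.
    Returns spike times (the 'electrical pattern' of this one cell)."""
    v = v_rest
    spikes = []
    for step in range(steps):
        v += dt * (-(v - v_rest) + input_current) / tau
        if v >= v_thresh:      # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_rest         # reset the membrane after spiking
    return spikes

spikes = simulate_lif()
print(f"{len(spikes)} spikes in 100 ms")
```

With a constant supra-threshold input the cell fires regularly; the real NCC couples thousands of far richer neurons through synapses, which is what demands supercomputer-scale hardware.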

Understanding Neocortical Information Processing

The power of an accurate simulation lies in the predictions that can be generated about the neocortex. Indeed, iterations between simulations and experiments are essential to build an accurate copy of the NCC. These iterations are therefore expected to reveal the function of individual elements (neurons, synapses, ion channels, receptors), pathways (mono-synaptic, disynaptic, multisynaptic loops) and physiological processes (functional properties, learning, reward, goal-oriented behavior).

A Novel Tool for Drug Discovery for Brain Disorders

Understanding the functions of different elements and pathways of the NCC will provide a concrete foundation to explore the cellular and synaptic bases of a wide spectrum of neurological and psychiatric diseases. The impact of receptor, ion channel, cellular and synaptic deficits could be tested in simulations and the optimal experimental tests can be determined.

A Global Facility

A software replica of an NCC will allow researchers to explore hypotheses of brain function and dysfunction, accelerating research. Simulation runs could determine which parameters should be used and measured in the experiments. An advanced 2D, 3D and 3D immersive visualization system will allow "imaging" of many aspects of neural dynamics during processing, storage and retrieval of information. Such imaging experiments may be impossible in reality or may be prohibitively expensive to perform.

A Foundation for Whole Brain Simulations

With current and envisageable future computer technology it seems unlikely that a mammalian brain can be simulated with full cellular and synaptic complexity (above the molecular level). An accurate replica of an NCC is therefore required in order to generate reduced models that retain critical functions and computational capabilities, which can be duplicated and interconnected to form neocortical brain regions. Knowledge of the NCC architecture can be transferred to facilitate reconstruction of subcortical brain regions.

A Foundation for Molecular Modeling of Brain Function

An accurate cellular replica of the neocortical column will provide the first and essential step to a gradual increase in model complexity moving towards a molecular level description of the neocortex with biochemical pathways being simulated. A molecular level model of the NCC will provide the substrate for interfacing gene expression with the network structure and function. The NCC lies at the interface between the genes and complex cognitive functions. Establishing this link will allow predictions of the cognitive consequences of genetic disorders and allow reverse engineering of cognitive deficits to determine the genetic and molecular causes. This level of simulation will become a reality with the most advanced phase of Blue Gene development.

Source: BLUE BRAIN PROJECT

link: http://bluebrain.epf.../page18926.html



It does look like they will achieve this, given the enormously rapid developments in the chip industry.

#3

Vladimir Lenin

  • >250 posts
  • 829 posts
  • Experienced user

Posted on 28 September 2009 - 16:48

What is the point of building such a computer when, in the five years or so it takes to finish the machine, the first quantum computers will probably arrive? These non-deterministic machines will blow microprocessors away in terms of speed, that much is certain. A machine built from microprocessors will therefore probably die a quiet death by the time it is operational.
"If you do not live as you think, you will soon think as you live."
--Vladimir Lenin-- (Владимир Ильич Ульянов)

#4

Cycloon

  • >1k posts
  • 4810 posts
  • VIP

Posted on 28 September 2009 - 17:11

What is the point of building such a computer when, in the five years or so it takes to finish the machine, the first quantum computers will probably arrive? These non-deterministic machines will blow microprocessors away in terms of speed, that much is certain. A machine built from microprocessors will therefore probably die a quiet death by the time it is operational.


Within about 5 years strikes me as quite a gross overestimate, if it ever comes to that at all.

And standing still is going backwards; that says it all, I think.

Edited by Cycloon, 28 September 2009 - 17:11


#5

Buyck Ruben

  • >25 posts
  • 47 posts
  • User

Posted on 29 September 2009 - 14:51

@Vladimir Lenin: 5 years is tight... I would say ten years, and that at an affordable level, because there is still a difference between making a quantum chip and making one at an affordable price. At the moment the classical processes in the chip industry are advancing very quickly: in 2011 the transition to EUV lithography will be made. Then Intel, and in its wake other companies, will arrive with 22 nm nodes. In 2013, 15 nm follows (twice as many transistors per chip). Intel says it can reach 5 nm with this technique, and of course the competition is not standing still either. I expect completely new technologies such as quantum computers around 2020.
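The "twice as many transistors" claim follows from feature-size scaling: transistor density goes roughly with the inverse square of the node size. A quick check in Python:

```python
# Density check for the 22 nm -> 15 nm shrink mentioned above:
# transistor density scales roughly as 1 / (feature size)^2.
node_2011 = 22.0  # nm
node_2013 = 15.0  # nm

density_gain = (node_2011 / node_2013) ** 2
print(f"~{density_gain:.2f}x the transistors per chip")
```

The ratio comes out just above 2, matching the "2x as many transistors per chip" figure in the post.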




