💾 Archived View for gmi.noulin.net › mobileNews › 1649.gmi captured on 2023-01-29 at 20:25:28. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
2009-12-04 11:06:43
A vast network of computers is being harnessed to design components for the
next generation of silicon chips.
Simulations of transistors smaller than 30 nanometres (billionths of a metre)
are being run on the UK e-science grid, which links thousands of computers.
The results will help designers cope with the physical constraints that occur
when working at such tiny scales.
About 20 years' worth of processing time has been used to simulate hundreds of
thousands of tiny transistors.
Atomic impurity
The researchers hope to get a sense of how such tiny components vary, in order
to work out the best way to produce future generations of chips with even
smaller features.
"What we do in these simulations is try to predict the behaviour of these
devices in the presence of atomic scale effects," said Professor Asen Asenov,
head of the device modelling group at the University of Glasgow, which is
leading the NanoCMOS simulation project.
The increasing power of silicon chips is largely dictated by the size of the
components that chip makers can cram on to each chunk of silicon. The basic
building block of a chip is the transistor, a tiny switch that can be either
"on" or "off".
The current generation of chips use transistors with features around 32
nanometres in size, but many manufacturers will move to 22 nanometres soon.
"These problems started to appear a couple of generations ago but right now
it's one of the most serious problems," said Prof Asenov.
"What's happening at such dimensions is that the atomic structure of the
transistor cannot be precisely controlled," he said. "In order to make them
work we have to put in impurities to define different regions."
Prof Asenov and his team are not seeking the perfect design for a transistor;
instead, they are working out how best to lay down materials so that
transistors perform consistently.
It used to be the case, said Prof Asenov, that silicon chips were identical and
could be relied on to work in the same way. But as components shrink to 30
nanometres and beyond such certainty disappears.
No longer can designers expect to lay down crisp ranks of perfectly formed
transistors during manufacturing.
"Instead," he said, "designers have to introduce redundancy, self-organisation
and self-testing."
The design and testing was done using a grid, essentially a piece of software
that unites tens of thousands of PCs scattered across different sites.
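The pattern the grid exploits is that each device simulation is independent, so batches can be farmed out to many machines and only the summary statistics pooled. A minimal sketch of that idea is below, assuming a toy model in which each transistor's threshold voltage is shifted by random atomic-scale variability; the nominal voltage, the spread, and the function names are all illustrative assumptions, not details of the NanoCMOS software, and a local thread pool stands in for real grid middleware.

```python
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def simulate_transistor(seed: int) -> float:
    """Hypothetical stand-in for one device simulation: return a
    threshold voltage perturbed by random dopant placement."""
    rng = random.Random(seed)
    nominal_vt = 0.30   # nominal threshold voltage in volts (assumed)
    sigma = 0.03        # spread from atomic-scale variability (assumed)
    return nominal_vt + rng.gauss(0.0, sigma)

def run_batch(n_devices: int) -> dict:
    # Each simulation is independent ("embarrassingly parallel"),
    # which is what lets a grid spread batches across many PCs;
    # here a local pool plays that role.
    with ThreadPoolExecutor() as pool:
        vts = list(pool.map(simulate_transistor, range(n_devices)))
    return {
        "mean_vt": statistics.mean(vts),
        "stdev_vt": statistics.stdev(vts),
    }

if __name__ == "__main__":
    print(run_batch(1000))
```

Aggregating the spread of the threshold voltage across a large batch is one way a designer could judge whether devices "perform consistently", as the article describes.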
Richard Sinnott, technical director at the National E-Science Centre in
Glasgow, which brokers the grid resources for projects such as NanoCMOS, said
the team needed to use hundreds of thousands of hours of computer time.
"Prof Asenov wanted access to as much high performance computing as we could
give them," he said.
His team gave them access to the number-crunching power they needed and helped
them manage the huge amount of data being produced.
Over the course of a few weeks, he said, the project racked up around 20 years'
worth of processing time as batches of hundreds of transistors were simulated.
"It's the biggest project we are involved with right now," said Mr Sinnott.