Archive for April, 2012


The computer industry is nearing a crisis: microchips get smaller and faster but they struggle to transfer data at sufficient speeds. Electrons flowing through standard chip connections are just too slow. Now EU-funded researchers have shown how chips with built-in lasers which use multiple wavelengths of light could in the future transmit data at terabit speeds.


Lasers are great for transmitting information. Every time you use the Internet or make a telephone call, data, in the form of light pulses (photons), travels hundreds of kilometres through the optical fibre networks that crisscross the continent.

But the insides of computers still stick to old-fashioned electronics. Microprocessors do their calculations with electrons, and they transfer data within and between chips using electrons too.

‘Electronics is fast approaching a crunch point,’ explains Dries Van Thourhout from the Department of Information Technology at Ghent University, an associated lab of imec, in Belgium. ‘Up to now we have been trying to increase the speed of transistors, but that performance has stopped increasing; now it is just a question of packing more into a smaller space. But the biggest hindrance to performance is the speed of the connections between chips and devices. We call it the “interconnectivity bottleneck.”’

Imagine a sweet factory which makes thousands of sweets per second, but the plant can only bag the sweets and dispatch them to the shops at a rate of a few hundred per second. Unless you slow down production you will end up with sweets piling up, rolling over the floor and clogging the system.

The powerful microprocessors in computers today use vast quantities of data and perform millions of calculations per second. You need to transfer this data around your computer (or your mobile phone, for that matter). But the connections can’t keep up; they simply cannot shift electrons fast enough. The only way to cope is to slow down data production.

This is where light comes in: you can use lasers to send photons down silicon ‘wires’ (light at infrared wavelengths travels remarkably well through silicon, says Mr Van Thourhout) instead of electrons. But the speed of light is not why optical interconnects are better. The real trick is that light can be ‘multiplexed’; basically you can send photons of different wavelengths through your interconnect at the same time. Use three wavelengths and you effectively triple the speed of data transmission.
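The arithmetic behind multiplexing is simple enough to sketch in a few lines of code. The sketch below is illustrative only; the channel count and per-channel rate are made-up numbers, not Wadimos figures.

```python
# A minimal sketch of the wavelength-division multiplexing (WDM) arithmetic:
# each wavelength is an independent channel on the same waveguide, so the
# aggregate capacity grows linearly with the number of wavelengths carried.
# Channel counts and rates here are illustrative, not project figures.

def aggregate_capacity_gbps(num_wavelengths: int, per_channel_gbps: float) -> float:
    """Total capacity of a WDM link whose channels run independently."""
    return num_wavelengths * per_channel_gbps

print(aggregate_capacity_gbps(1, 25.0))  # 25.0 -- a single-wavelength link
print(aggregate_capacity_gbps(3, 25.0))  # 75.0 -- three wavelengths triple the rate
```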

Divide and conquer

With this in mind, the ‘Wavelength division multiplexed photonic layer on CMOS’ (Wadimos) project set out to develop a demonstration chip with multiplexing optical interconnects. The chip was based on technology developed in a predecessor project (PICMOS), which created the first ever microchip with integrated microlaser light sources, thanks to a unique bonding ‘glue’ developed by the PICMOS partners.

‘The PICMOS project was a great success. We showed that optical interconnects could be manufactured and that they would work,’ says Mr Van Thourhout. ‘But it is one thing to make and demonstrate something in the lab. You won’t get chips like these into the mainstream or solve that interconnectivity bottleneck unless you can manufacture them at the industrial scale, making millions of them. PICMOS demonstrated the principle of optical interconnects. Wadimos is proving that multiplexing is possible and that the chips can be made in a standard CMOS fabrication plant.’

Europe’s largest chip manufacturer, STMicroelectronics, has worked in collaboration with universities and research institutions from France and Italy, and with a Dutch SME that specialises in lithography (etching) for electronic components. Together these partners have extended the results of PICMOS and adapted them to more commercial manufacturing processes.

One of the biggest challenges was to replace the gold connections on the microlasers in the PICMOS prototype. ‘You can’t have gold in a chip fabrication plant,’ explains Mr Van Thourhout. ‘Gold is a contaminant, so partner CEA-LETI developed a process that would mean the integrated lasers mounted on the chips could be connected using metals commonly used in chip manufacturing such as aluminium, titanium and titanium nitride.’

Belgian project partner imec has also worked to optimise the passive router structures in silicon and investigated the feasibility of their industrial production. Other project partners have contributed their expertise: the Lyon Institute of Nanotechnology (INL) in France demonstrated a new type of ‘microsource’ whose output wavelength can be controlled. INL also worked with STMicroelectronics to develop a way to simulate the optical network on a chip. Finally, the University of Trento, Italy, designed and demonstrated a new type of silicon router which could be used to ‘switch’ photons down particular optical pathways.

Bringing these developments together, the Wadimos team has produced a network of eight fully interconnected silicon blocks. The researchers have demonstrated successful multiplexing across these connections and the feasibility of optical filtering to direct and control the passage of photons through the silicon interconnects and their subsequent detection.

There is still plenty of research to do, however, especially to keep the lasers working in the high temperature environment of a chip’s surface. Mr Van Thourhout says that they will need to find new materials that can cope with the heat.

‘Nevertheless, we are very hopeful that this approach will prove very successful in the long term,’ he asserts. ‘We are taking an exploratory approach.’ He explains that other research groups, especially those in the US, have developed optical interconnects that use an ‘off chip’ laser source; the laser beam is split and redirected for each interconnect.

‘These chips are more advanced and will soon be used in supercomputers,’ says Mr Van Thourhout, ‘and may eventually trickle down to mainstream computing, but in the long run it will be more efficient to have chips with integrated laser sources.

‘We expect the Wadimos interconnects to allow computer processing power to continue to increase and overcome the data transmission bottleneck. Our goal is to make optical interconnects a standard technology that will support the development of yet more powerful, smaller microprocessors capable of transferring data at rates of 100 terabits per second.’

 

HD 10180 planetary system (artist’s impression)
Wikimedia Commons

A star 127 light-years away, which stunned the world in 2010 when it was found to host the largest planetary system beyond our own, with five, possibly seven, alien worlds, is back in the headlines: it may actually have nine exoplanets orbiting it.

HD 10180 is a yellow dwarf star very much like the sun, so this discovery has drawn many parallels with our own Solar System.

It is a multi-planetary system surrounding a sun-like star. But it is also a very alien place with an assortment of worlds spread over wildly different orbits.

It is believed that one of HD 10180’s exoplanets is small, although astronomers only know the planets’ masses, not their physical sizes or compositions. The smallest world weighs in at 1.4 times the mass of Earth, making it a “super-Earth”.

When it was first revealed that HD 10180 was a multi-planetary system, astronomers at the European Southern Observatory (ESO) detected six exoplanets gravitationally “tugging” on their host star.

Using the “radial velocity” exoplanet detection method, the astronomers watched the star’s wobble to decipher up to seven worlds measuring between 1.4 and 65 times the mass of Earth.
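For background, the radial-velocity method works because an orbiting planet makes its star trace a small orbit of its own, shifting the starlight’s Doppler signature. A standard form of the relation (textbook background, not quoted from the article) links the star’s velocity semi-amplitude K to the planet’s minimum mass:

$$K = \left(\frac{2\pi G}{P}\right)^{1/3} \frac{m_p \sin i}{(M_\star + m_p)^{2/3}} \frac{1}{\sqrt{1 - e^2}}$$

where P is the orbital period, e the orbital eccentricity and i the inclination of the orbit to our line of sight. Because only the product m_p sin i can be extracted, the method yields masses (strictly, minimum masses) but says nothing about a planet’s radius or composition.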

Five exoplanets were found to be 12 to 25 times the mass of Earth, “Neptune-like” masses, while another, a “Saturn-like” world of 65 Earth masses, was detected orbiting in the outermost reaches of the system, taking around 2,200 days to complete one orbit.

But now, in addition to verifying the signal of the small 1.4 Earth-mass world, there appear to be another two small alien worlds.

“In addition to these seven signals, we report two additional periodic signals that are, according to our model probabilities…statistically significant and unlikely to be caused by noise or data sampling or poor phase coverage of the observations,” Discovery News quoted Mikko Tuomi from the University of Hertfordshire as saying.

This basically means that Tuomi has reanalysed the data from previous observations made by the HARPS spectrograph (attached to the ESO’s 3.6-meter telescope at La Silla, Chile), confirmed signals relating to the seven exoplanets discovered in 2010 and uncovered two new worlds in the process.

What’s more, these two new signals represent another two super-Earths, says Tuomi. One is 1.9 times the mass of Earth and the other is 5.1 Earth masses.

Although these may be “super-Earths”, the only similarity to Earth is their mass, so don’t go getting excited that we may have spotted the much sought-after Earth analogs.

The 1.4 Earth-mass exoplanet has an orbital period of only 1.2 days. The two new super-Earths also have very tight orbits, where their “years” last only 10 and 68 days.

Therefore, any question of life existing on these worlds is moot: they will likely be hellishly hot, with no chance of liquid water existing on their surfaces. It’s debatable whether these worlds could hold onto any kind of atmosphere, as they would be constantly sandblasted by intense stellar winds.

Beltelecom, the Belarusian state-owned fixed line and broadband operator, has employed Huawei to build out its 100G optical network.

The 1,200 km WDM-based network, which leverages the vendor’s optical coherent detection technology, will transit Belarus via a link from Grodno on the Polish border to Vitebsk near the Russian border.

This deployment follows a 100G transmission trial that Beltelecom conducted with Huawei on its live national backbone network over the Grodno-Vitebsk route. Over a distance of 900 km, the trial tested the ability to concurrently transmit 10G and 40G services over the same fiber without regeneration.

For the live 100G long-haul network transmission test, Huawei used Polarization Division Multiplexing Quadrature Phase-Shift Keying (ePDM-QPSK) modulation. Beltelecom will use the same ePDM-QPSK modulation format when the network is commercially launched, providing up to 8 Tbps of capacity.
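The headline figure is consistent with a standard long-haul DWDM line-up; assuming an 80-channel grid with each wavelength carrying 100 Gbps (the article does not spell out the channel plan), the arithmetic is simply

$$80 \text{ channels} \times 100 \text{ Gbps} = 8 \text{ Tbps.}$$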

Given all of the recent customer announcements for 100G by Huawei’s key competitors, Alcatel-Lucent (NYSE: ALU) and Ciena (Nasdaq: CIEN), the win with Beltelecom gives the Chinese vendor its own proof point that its 100G systems are up to the task in large networks.

Overall, the ongoing deployments by Beltelecom reflect the larger trend in which 40G and 100G deployments are driving 19 percent growth in the DWDM market, as seen in a recent Dell’Oro report. Led by the trio of Huawei, Ciena and Alcatel-Lucent, the 40G and 100G wavelength market grew over 60 percent, contributing almost one-third of the DWDM equipment segment’s revenues in 2011.

Apple Inc. filed its plans with the N.C. Utilities Commission last week to build the 4.8-megawatt project in Maiden, about 40 miles northwest of Charlotte, N.C. That’s where Cupertino, Calif.-based Apple has built a data center to support the company’s iCloud online service and its Siri voice-recognition software.

The fuel cell project, the nation’s largest such project not built by an electric utility company, will be developed this year. It will be located at the same data-center complex that will host a planned 20-megawatt solar farm, the biggest ever proposed in the state.


But it’s the fuel cell project that’s generating buzz, eclipsing anything ever dreamed of in California, the nation’s epicenter for fuel cell projects.

“That’s a huge vote of confidence in fuel cells,” said James Warner, policy director of the Fuel Cell and Hydrogen Energy Association.

Fuel cells generate electricity through an electrochemical process; like batteries, they give out power for as long as they have a source of hydrogen.

They are exorbitantly expensive and in the past have been used only in experimental realms, such as NASA moon launches. The federal government offers a 30 percent tax credit, but no state incentive is available for fuel cells in North Carolina, making Apple’s project all the more intriguing. Apple is also developing miniature fuel cells to power portable devices.

According to a recent report by the U.S. Energy Information Administration, fuel cells are among the world’s most expensive forms of electricity, costing $6.7 million per megawatt, which would put Apple’s project in the $30 million range.
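That estimate follows directly from scaling the per-megawatt cost to the project’s size:

$$4.8 \text{ MW} \times \$6.7 \text{ million/MW} \approx \$32 \text{ million.}$$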

North Carolina’s fuel cell exposure is limited to demonstration projects that are a tiny fraction of the size of Apple’s fuel cells. Microcell Corp. is the Raleigh, N.C., company behind the demos.

Like quantum physics? What about quantum computers? Or quantum computers in a diamond? Then you should know that researchers at the Max Planck Institute of Quantum Optics have appropriately devised a way to create a quantum network in which a photon is exchanged between two atoms. Future!

According to Time, the two atoms exchange the photon over a 60-meter fiber-optic cable, and the system is said to be the first to send, receive and store information without failure.

Professor Ignacio Cirac, a director at MPQ, proposed the framework for the experiment. In his team’s quantum network, individual rubidium atoms were lodged between two highly reflective mirrors placed less than a millimeter apart, a setup referred to as an “optical cavity.” The team then fired a laser at one of the atoms, calibrated so as not to disturb it and instead cause it to emit a photon, which then traversed the 60-meter fiber-optic cable to be absorbed by the second atom, transferring the first atom’s quantum information.

Because quantum bits can encode 0s and 1s at the same time, the bits need only exchange the status of their quantum state, which researchers say is a faster and more elegant way of transferring data. They even suggest an entire quantum internet could be possible. I want the future now.

Is this a record for a quantum computer? A group of physicists in China has used a process called adiabatic computing to find the prime factors of the number 143, beating the previous quantum-computing record of 21. However, there are doubts about the quantum nature of this method, and about its potential to scale up any further.

Rather than bits, quantum computers use quantum bits, or qubits, which can exist in multiple states at once. In theory, these superpositions should allow the machines to complete some calculations, including factorisation, much faster than conventional or classical computers.

That could be a boon to some types of computation, but it could also pose a threat: encryption schemes rely on the fact that factorising large numbers is hard for classical computers. As yet, though, no one has built a quantum machine big enough to harness this power.

The Chinese experiment, led by Jiangfeng Du at the University of Science and Technology of China in Hefei, is based on well-established technology called liquid-phase NMR (nuclear magnetic resonance). The qubits in this set-up are the spins of hydrogen nuclei in molecules of 1-bromo-2-chlorobenzene. Each spin is a quantum magnet that can be manipulated using bursts of radio waves.

Qubit pool

While this hardware is conventional, its use in this case is not. Adiabatic quantum computing does away with the circuits and separate components found in other quantum and classical computers, which are based on switches and logic gates. Instead, a pool of qubits is encouraged to find the answer collectively.

The technique relies on a pool of qubits always seeking its lowest overall energy state; the trick is to adjust the system so that the lowest-energy state gives the answer to the problem. It is a bit like putting a lightweight ball in the middle of a stretched out blanket, and then moving the blanket’s edges to manoeuvre the ball into the right spot. Only the right moves will coax it into position; similarly, only the right quantum algorithm will enable a pool of qubits to solve a given problem.

In 2006, Ralf Schützhold and Gernot Schaller at the Dresden Technical University in Germany worked out an adiabatic algorithm to factorise a number using a pool of qubits. Now Du and colleagues have simplified that algorithm, and used it to factorise the number 143. (It’s 13 x 11, in case you were wondering.)
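The cost-function idea is easy to illustrate classically. The toy sketch below is not Du’s quantum algorithm; it simply shows how N = p × q can be recast as an ‘energy’ landscape whose minimum sits exactly at the factors, which is the landscape an adiabatic quantum computer would settle into.

```python
# A classical toy illustration of the adiabatic formulation of factoring:
# define a non-negative "energy" that is zero exactly when p * q == n,
# so the ground state of the landscape encodes the factors. A quantum
# adiabatic machine relaxes into this minimum; here we scan by brute force.

def energy(n: int, p: int, q: int) -> int:
    """Non-negative cost; zero exactly when p * q == n."""
    return (n - p * q) ** 2

n = 143
minima = [(p, q)
          for p in range(3, n, 2)      # odd trial factors only
          for q in range(p, n, 2)
          if energy(n, p, q) == 0]
print(minima)  # [(11, 13)]
```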

Though that is a leap of an order of magnitude, it is still a small enough number that ordinary computers can do the calculation in a flash.

The researchers’ computer has just four qubits, and it is hard to scale up liquid-phase NMR much further. That means that to factorise larger numbers, different hardware would be required, such as trapped ions or superconducting circuits, or perhaps a hybrid of the two. But “the algorithm could be used in other quantum-computing architectures”, says Du.

Entanglement clue

Most groups are taking a different approach, trying to beat classical computers at factorisation via another quantum algorithm called Shor’s algorithm.

There is mathematical proof that Shor’s algorithm will be much faster than any classical algorithm for factorising large numbers, but will the same be true of the adiabatic algorithm? Maybe not, says Scott Aaronson, a quantum physicist at the Massachusetts Institute of Technology.

“It doesn’t ‘know’ about the special mathematical properties of factoring that make that an easy problem for quantum computers,” Aaronson says. “In contrast to Shor’s algorithm, we have no reason to think it would be able to factor 10,000-digit numbers in less than astronomical amounts of time.”

Du acknowledges that there is no mathematical proof for his team’s algorithm, but he says there is strong evidence from numerical simulations that it will be fast for large numbers.

Aaronson also questions the quantum nature of the calculation. “I didn’t see any evidence that quantum behaviour played a role in finding the factors of 143,” he says. Rather, the experiment might have reached the conclusion in a classical way, he suggests.

Du responds that the system starts from a superposition of all possible quantum states, and that qubits are linked in a uniquely quantum manner called entanglement. “No classical algorithms could handle the computing task in this way.”

Scientists at TU Delft’s Kavli Institute and the Foundation for Fundamental Research on Matter (FOM Foundation) have succeeded for the first time in detecting a Majorana particle. In the 1930s, the brilliant Italian physicist Ettore Majorana deduced from quantum theory the possibility of the existence of a very special particle, a particle that is its own anti-particle: the Majorana fermion. That ‘Majorana’ would be right on the border between matter and anti-matter.

Nanoscientist Leo Kouwenhoven already caused great excitement among scientists in February by presenting the preliminary results at a scientific congress. Today, the scientists have published their research in Science. The research was financed by the FOM Foundation and Microsoft.

Quantum computer and dark matter

Majorana fermions are very interesting, not only because their discovery opens up a new and uncharted chapter of fundamental physics; they may also play a role in cosmology. A proposed theory assumes that the mysterious ‘dark matter’, which forms the greatest part of the universe, is composed of Majorana fermions. Furthermore, scientists view the particles as fundamental building blocks for the quantum computer. Such a computer would be far more powerful than the best supercomputer, but so far exists only in theory. Unlike an ‘ordinary’ quantum computer, a quantum computer based on Majorana fermions would be exceptionally stable and barely sensitive to external influences.

Nanowire

For the first time, scientists in Leo Kouwenhoven’s research group managed to create a nanoscale electronic device in which a pair of Majorana fermions ‘appear’ at either end of a nanowire. They did this by combining an extremely small nanowire, made by colleagues from Eindhoven University of Technology, with a superconducting material and a strong magnetic field. “The measurements of the particle at the ends of the nanowire cannot otherwise be explained than through the presence of a pair of Majorana fermions,” says Leo Kouwenhoven.

Particle accelerators

It is theoretically possible to detect a Majorana fermion with a particle accelerator such as the one at CERN. The current Large Hadron Collider appears to be insufficiently sensitive for that purpose but, according to physicists, there is another possibility: Majorana fermions can also appear in properly designed nanostructures. “What’s magical about quantum mechanics is that a Majorana particle created in this way is similar to the ones that may be observed in a particle accelerator, although that is very difficult to comprehend,” explains Kouwenhoven. “In 2010, two different groups of theorists came up with a solution using nanowires, superconductors and a strong magnetic field. We happened to be very familiar with those ingredients here at TU Delft through earlier research.” Microsoft approached Leo Kouwenhoven to help lead a special FOM programme in search of Majorana fermions, resulting in a successful outcome.

Ettore Majorana

The Italian physicist Ettore Majorana was a brilliant theorist who showed great insight into physics at a young age. He discovered a hitherto unknown solution to the equations from which quantum scientists deduce elementary particles: the Majorana fermion. Practically all theoretical particles predicted by quantum theory have been found in recent decades, with just a few exceptions, including the enigmatic Majorana particle and the well-known Higgs boson. But Ettore Majorana the person is every bit as mysterious as the particle. In 1938 he withdrew all his money and disappeared during a boat trip from Palermo to Naples. Whether he killed himself, was murdered or lived on under a different identity is still not known. No trace of Majorana was ever found.



It’s a shame about Pluto. If it still counted as a planet, our sun would still be among the record-holders in the planet stakes.

Instead, that crown may have just been stolen by HD 10180, a star 130 light years away that has mass, temperature, brightness and chemistry similar to the sun. A new report sees evidence for up to nine planets in HD 10180’s family, all of which are more massive than the Earth. The finding is one of two suggesting our solar system may not be as weird as we thought.

After our sun, we used to think the stars with the most planets were Kepler-11 and HD 10180: each appeared to have six orbiting worlds. Then Mikko Tuomi, an astronomer at the University of Hertfordshire, UK, re-examined observations of HD 10180 from the HARPS (High Accuracy Radial velocity Planet Searcher) spectrograph at the La Silla observatory in Chile. He confirmed the presence of a suspected seventh planet and found new evidence of two more, which would bring the total up to nine.

The suggestion that other solar systems have similar numbers of planets to our own fits with growing evidence that ours is not as freakish as earlier evidence suggested.

The early days of exoplanet hunting mostly turned up bizarre and exotic beasts like hot Jupiters, behemoths many times larger than Jupiter that orbit scorchingly close to their stars, often in single-planet families. Sometimes their orbits were askew, tilted at crazy angles with respect to the star’s axis of rotation.

The new zoo of planets threw doubt on conventional models of planet formation. Based only on our own crowded but orderly family of planets, astronomers had assumed that planets coalesce calmly out of a flat disc of gas and dust that circles the star like a record. Hot Jupiters are too massive to have formed as close to their stars as they are now, implying a history of planet-on-planet violence in which bigger planets tossed smaller ones out in order to migrate inward.

Now, though, it seems our orderly orbits might not be so odd. A second study out this week from the EXOEarths collaboration compared data from HARPS, which is sensitive to planetary systems regardless of their orientation with respect to Earth, and the Kepler Space Telescope, which can see planets only if they transit, or cross in front of, their star as seen from Earth. If Kepler sees multiple transits across the same star, there must be two or more planets in more or less the same orbital plane.
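As background to such estimates, the chance that a randomly oriented planet transits is set by simple geometry: for a circular orbit of radius a around a star of radius R⋆, the transit probability is roughly

$$p_{\text{transit}} \approx \frac{R_\star}{a},$$

a standard relation (not necessarily the team’s exact machinery) that lets planet frequencies measured by radial velocity be converted into expected numbers of Kepler transit detections.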

EXOEarths team member Pedro Figueira of the University of Porto, Portugal, and colleagues calculated how many such systems Kepler should see, given the frequency of all planetary systems seen by HARPS. The results matched what Kepler actually sees.

That means that planets’ orbits are probably more often aligned than not, and suggests that planets often form in a disc without much violent jostling.

“These results show us that the way our solar system formed must be common,” Figueira said, according to a press release. “Its structure is the same as the other planetary systems we studied, with all planets orbiting roughly in the same plane.”


The future potential to build and realize the concepts of the human mind lies just there, within the potential of the human mind. For years the architectural world has been struggling to keep up with the ability of pen-to-paper and the recent advents in NURBS surface computer modeling, algorithmic and parametric architecture. This in turn has led to the building and technology industry playing catch-up with the recent advances in 3D architectural visualizations. In fact, as computer-aided design invaded these practices in the 1980s, radically transforming their generative foundations and productive capacities, architecture found itself most out-of-step and least alert, immersed in ideological and tautological debates and adrift in a realm of referents severed from material production.

The clear disconnect between how/what we design and the tangible manifestation of tectonic form has stalled the future of architecture. A beauty and suspense that came with the “paper architect” was the hope of one day being able to build and realize that which the mind created long before it was possible to build. One reason for this disconnect is the continued separation between building and structure; another is the lack of emphasis on and research in materials science, and the search for a means to break away from traditional building methods.

© Enrico Dini

The architectural field’s current use of the parametric has been superficial and skin-deep. Despite the contemporary collective desire to forget postmodern semiotic signification, everything visual eventually devolves into symbolic imagery. Michael Meredith, an Associate Professor at the Harvard Graduate School of Design, goes so far as to say that the “parametric work” being produced today fits within an evolution of so-called postmodernism, concerning the image and referent, although the parametric is the tautological, modulated image of quantity. To the extent the profession has utilized parametrics today, there is very little instigating complexity other than a mind-numbing image of complexity, falling far short of its rich potential to correlate multivalent processes or typological transformations, parallel meanings, complex functional requirements, site-specific problems or collaborative networks.

© Enrico Dini

For this reason it is refreshing, and provides a new sense of hope for the future of architecture, that Enrico Dini, an Italian inventor, and Markus Kayser, a young German-born furniture and product designer, have searched out methods that combine building, structure and material, creating and setting up new ways for architects, engineers and designers to finally plan and realize the future that they have long promised.

© Enrico Dini

Enrico Dini dreamt of buildings, construction and impossible shapes. He has long been inspired by Antoni Gaudi’s architecture and loves the ambition with which Gaudi practiced. Dini became a civil engineer and later branched out into making machines, all the while dreaming of impossible shapes. For Dini, thinking about having to build with concrete and brick and the required use of scaffolding and manpower seemed outdated and inefficient. Rather than accept the constraints of the current building methods, in 2004 he invented and patented a full scale 3D printing method that used epoxy to bind sand. Enrico could now 3D print buildings.

© Enrico Dini

In 2007, Enrico went back to his invention, did away with the messy and sticky epoxy, and obtained a new patent for a system using an inorganic binding material combined with sand to once more 3D print buildings. The new process had lower maintenance costs, was easier to use and was more cost-effective. Enrico is currently working on further improving the accuracy and will 3D print a full-sized roundabout sculpture in Pisa, Italy. Enrico Dini calls his real-scale printing machine D_Shape. As of now the D_Shape technology can easily 3D print 6m x 6m x 1m parts that can then be either shipped or assembled in place. The goal of literally 3D printing an entire building is not far off. The parts made by D_Shape resemble ‘sandstone.’ They are comparable in strength to reinforced concrete, and the ingredients are the binding material and any type of sand. D_Shape’s materials cost more than regular concrete, but much less manpower is needed for construction. No scaffolding needs to be constructed, so overall building cost should be lower than with traditional building methods.

© Enrico Dini

The system works with a rig that is suspended over the buildable part. The system deposits the sand and then the inorganic binding ink. No water is necessary. Because the two components meet outside the nozzle, the machine does not clog up and can keep up its accuracy of 25 DPI. Enrico and D_Shape are currently talking to many construction and engineering companies and architects about their technology. Enrico has also partnered with Norman Foster Architects to incorporate the use of moon dust as the building element, with the ambition of one day being able to transport the 3D printing machine to the moon and build structures made of the same sand/dust found on the surface of the moon.
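The layer-by-layer logic of such a binder-jetting printer is easy to sketch. The pseudocode below is a conceptual illustration under our own assumptions, not D_Shape’s actual control software; the function names are hypothetical.

```python
# A conceptual sketch of a binder-jetting build loop like D_Shape's:
# spread a uniform layer of sand, then deposit inorganic binder only
# where the current horizontal slice of the 3D model is solid, and
# repeat slice by slice until the full part has been printed.

from typing import Callable, List

Slice = List[List[bool]]  # 2D mask: True where this layer should solidify

def print_part(slices: List[Slice],
               spread_sand: Callable[[int], None],
               deposit_binder: Callable[[int, int, int], None]) -> None:
    for z, layer in enumerate(slices):
        spread_sand(z)                       # rig lays down a fresh sand bed
        for y, row in enumerate(layer):
            for x, solid in enumerate(row):
                if solid:
                    deposit_binder(x, y, z)  # binder hardens sand into 'sandstone'
```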

© Markus Kayser

Markus Kayser was born near Hannover, Germany, in 1983. He studied 3D Furniture and Product Design at London Metropolitan University from 2004 to 2008 and continued in 2009 with the study of Product Design at the Royal College of Art, gaining his Master’s in 2011. Markus Kayser Studio was set up in London, UK, in 2011. From early works of furniture and lights in his father’s farm workshop through to today, Markus Kayser has developed an understanding of materials, processes and technologies which he sees as key in combination with the natural given. He wants to engage by producing objects that one can relate to, that speak about something other than just their utilitarian qualities. The layers to be discovered, as well as one’s associations with objects, interest him.

Experimentation plays a central part in developing his designs. Kayser’s recent work demonstrates the exploration of hybrid solutions linking technology and natural energy to show the great opportunities, to question current methodologies in manufacturing and to test new scenarios of production. In his process it is important that behind the thorough research and the theory there is a realistic proof of concept, which elucidates the real potential of a given subject. He tries to tell a story and to balance the seriousness with a sense of humour. This kind of storytelling makes his products, as well as his experimental works, digestible without losing their depth of content.

© Markus Kayser

In August 2010 he took his first solar machine, the Sun-Cutter, to the Egyptian desert in a suitcase. This was a solar-powered, semi-automated, low-tech laser cutter that used the power of the sun to drive it, directly harnessing its rays through a glass ball lens to ‘laser’-cut 2D components using a cam-guided system. The Sun-Cutter produced components in thin plywood with an aesthetic quality that was a curious hybrid of machine-made and “nature craft”, due to the crudeness of its mechanism and cutting-beam optics, alongside variations in solar intensity caused by weather fluctuations.

© Markus Kayser

In the deserts of the world two elements dominate: sun and sand. The former offers a vast energy source of huge potential, the latter an almost unlimited supply of silica in the form of quartz. The experience of working in the desert with the Sun-Cutter led him directly to the idea of a new machine that could bring together these two elements. Silica sand, when heated to melting point and allowed to cool, solidifies as glass. This process of converting a powdery substance via a heating process into a solid form is known as sintering and has in recent years become a central process in design prototyping known as 3D printing or SLS (selective laser sintering).

© Markus Kayser

These 3D printers use laser technology to create very precise 3D objects from a variety of powdered plastics, resins and metals – the objects being the exact physical counterparts of the computer-drawn 3D designs inputted by the designer. By using the sun’s rays instead of a laser and sand instead of resins, he had the basis of an entirely new solar-powered machine and production process for making glass objects that taps into the abundant supplies of sun and sand to be found in the deserts of the world.

© Markus Kayser

His first manually operated solar-sintering machine was tested in February 2011 in the Moroccan desert, with encouraging results that led to the development of the current larger and fully automated computer-driven version, the Solar-Sinter. The Solar-Sinter was completed in mid-May, and later that month he took this experimental machine to the Sahara desert near Siwa, Egypt, for a two-week testing period. The machine and the results of these first experiments presented here represent the initial significant steps towards what he envisages as a new solar-powered production tool of great potential.

© Markus Kayser

In a world increasingly concerned with questions of energy production and raw-material shortages, this project explores the potential of desert manufacturing, where energy and material occur in abundance. In this experiment sunlight and sand are used as raw energy and material to produce glass objects using a 3D printing process that combines natural energy and material with high-tech production technology. Solar-sintering aims to raise questions about the future of manufacturing and triggers dreams of fully utilising the production potential of the world’s most efficient energy resource: the sun. Whilst not providing definitive answers, this experiment aims to provide a point of departure for fresh thinking.

© Markus Kayser

This should be an especially loud call to arms to architects, who hold within their own minds the ability to think forward and manifest their own destiny. While it is great that these technologies are being pursued, it should be architects who pursue them directly. With the recent passing of Steve Jobs, whom many considered the Frank Lloyd Wright of the technology industry, we as architects should aim to pursue the same level of great and ambitious work, and the same desire to unify architecture as he had to unify his products. Steve Jobs created the need for a product, then designed the product and the technology to realize it. Likewise, architecture and architects should aim to unify the building, the experience, the structure, the material and the technology that makes it all possible.

Digital Chocolatier

While all the recent chatter about the way that 3D-printing technology has advanced is pretty cool, there’s a new prototype that has us thinking about printing out custom chocolates instead of Companion Cubes. Called the Digital Chocolatier Prototype, it allows you to quickly design and assemble different kinds of chocolate candies via a touchscreen interface. Using a chocolate extruder, the device puts the delicious confection into a thermoelectric cup that rapidly cools the candy, making it ready for consumption.

User interface

The mastermind behind this awesomeness is a man named Marcelo Coelho, a Canadian designer and researcher who is currently finishing his doctorate at the MIT Media Lab. When he’s not getting smarter in Cambridge, he runs a studio called Zigelbaum + Coelho in Montreal. His goal in tackling food-based printing is to continue the convergence that’s going on between technology and everyday life. What’s more a part of our culture and our identities than food?

As mentioned above, this machine is simply a prototype. It consists of a carousel of four different delicious items that you can layer to your heart’s content. The pictured user interface is simply a concept of how the device would work. There are no plans to build a fully working unit for mass production, so you can view this as adding to the body of work that Coelho is developing for his doctorate. Of course, if a working prototype emerges, we’ll be first in line to try it.

Using chocolate or other confections as the material for 3D printing is not a new concept. Rather than extrudable plastics, you simply replace the building material with molten chocolate to create a layered object that you can actually eat.

How long before an established chocolate company gets wind of this and commercializes it?