
This Catapult-Like Exoskeleton Could Make Humans Run 50% Faster

 

With the right mechanical device attached, human running speeds could rival those of cyclists, according to new research – getting runners up to 20.9 metres per second, or more than 46 miles per hour.
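As a quick sanity check on those headline numbers, the metres-per-second figure converts to miles per hour as follows. The sprinter comparison speed below is an assumed ballpark for context, not a value from the paper:

```python
# Quick unit-conversion check on the study's headline speed figure.
# The sprinter comparison speed is an assumed ballpark, not from the paper.
MPS_TO_MPH = 3600 / 1609.344   # metres per second -> miles per hour

device_speed = 20.9   # m/s, theoretical top speed with the exoskeleton
sprint_peak = 12.3    # m/s, rough peak speed of an elite sprinter (assumed)

print(f"{device_speed * MPS_TO_MPH:.1f} mph")                    # ~46.8 mph
print(f"vs elite sprinter: {device_speed / sprint_peak:.2f}x")   # ~1.70x
```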

 

It's worth noting that these are theoretical calculations for now, though the team behind them hopes to have a prototype ready next year. The figures are based on the potential that would be unlocked by a spring-loaded, catapult-like exoskeleton attachment.

We've seen these kinds of exoskeletons used in the past to help people with paralysis, and a working device could have plenty of potential uses – though any Olympic records set while using one would be unlikely to stand.

"Our results could lead to a new generation of augmentation devices developed for sports, rescue operations and law enforcement, where humans could benefit from an increased speed of motion," the researchers write in their published paper.

The mechanics themselves are actually based on bicycles: bike pedals are so efficient because they help propel us forward as our feet come back up, as well as when they push down. That's energy that gets wasted while we run.

Even the fastest runners on Earth gain no speed once their feet leave the ground – only when their feet strike the track and push off. And as runners get faster, their feet spend more time in the air.

So how could that 'lost' time be put to better use? The researchers behind the new study looked at a variety of concepts, landing on a spring-loaded attachment that works somewhat like a catapult to fire a runner forward.

"The disparity between the mechanics of cycling and the mechanics of running gave us the idea of hypothesising a device that allows the legs to do work in the air," mechanical engineer David Braun, from Vanderbilt University, told Emma Betuel at Inverse.

The idea of attaching springs to legs has been around for more than a century, but on its own it isn't particularly effective. In this case, the team used computer modelling to hit upon a device in which the spring stores energy until the foot touches the ground again.

How the device works. (Sutrisno & Braun, Science Advances, 2020)

That energy would come from the legs flexing and swinging forward through the air, the researchers suggest. The spring could then release the energy on contact with the ground, adding extra push to the body.

To reach top speeds, the analysis shows the spring would need to capture energy for 96 percent of the stride and be able to transfer it entirely into forward acceleration. The 20.9-metres-per-second figure is also based on the energy output of a world-class cyclist, so the rest of us would be somewhat slower.

Plenty of challenges remain – not least the extra impact each stride would have coming down from a greater height – but the researchers suggest the spring could be programmed to adapt to different speeds, like the gears on a bicycle.

As well as helping emergency services, first responders and just about anyone who needs to get somewhere fast, an exoskeleton like this could even inspire a whole new sport (much as bicycles did), the researchers say. In time, they hope the entire device could even fit inside a shoe.

"It tells us how far we can push the limits, and which key features we need to focus on to develop the new technology," Braun told Ian Sample at The Guardian.

The research has been published in Science Advances.

The Data Is Finally In: Electric Cars Really Do Produce Less CO2 Pollution

 

Electric cars definitely produce less CO2 than gas guzzlers, a new study has confirmed – countering claims that the carbon emissions involved in manufacturing electric cars and generating the electricity they run on outweigh the savings made out on the road.

 

Crunching the numbers on data collected across 59 different regions of the world, representing 95 percent of global demand for transport and heating, the researchers found that electric cars lead to an overall reduction in CO2 in the vast majority of places.

As electricity generation becomes less carbon-intensive, the balance will tip even further away from petrol cars. In countries like Sweden and France, where much of the electricity comes from renewable and nuclear sources, electric cars can produce 70 percent less in lifetime carbon emissions than their petrol equivalents, once all factors are included.
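The shape of that 70 percent figure can be sketched with per-kilometre lifecycle numbers. The values below are illustrative assumptions chosen to mirror the result, not data from the study:

```python
# Illustrative lifetime-emissions comparison. All per-kilometre figures are
# assumptions chosen to mirror the ~70% result, not values from the study.
LIFETIME_KM = 150_000

petrol_g_per_km = 250   # assumed lifecycle gCO2/km, petrol car
ev_g_per_km = 75        # assumed lifecycle gCO2/km, EV on a low-carbon grid

petrol_tonnes = petrol_g_per_km * LIFETIME_KM / 1e6
ev_tonnes = ev_g_per_km * LIFETIME_KM / 1e6
saving = 1 - ev_tonnes / petrol_tonnes

print(f"petrol: {petrol_tonnes:.1f} t CO2, EV: {ev_tonnes:.1f} t CO2, "
      f"saving: {saving:.0%}")
```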

The team also looked at electric heat pumps as a low-carbon option for home heating, and found they were likewise better for the environment overall, with electricity generation taken into account.

"The idea that electric vehicles or electric heat pumps could increase emissions is essentially a myth," says environmental scientist Florian Knobloch, from Radboud University in the Netherlands.

"We've seen a lot of discussion of this recently, with lots of disinformation going around. Here is a definitive study that can dispel those myths. We have run the numbers for all around the world, looking at a whole range of cars and heating systems."

"Even in our worst-case scenario, there would be a reduction in emissions in almost all cases. This insight should be very useful for policymakers."

The researchers concluded that, as things currently stand, electric cars are better for the climate than petrol cars in 95 percent of the world. The small number of exceptions are countries such as Poland, where electricity generation still depends heavily on coal. (You can look up your own country in the team's supplementary tables, published here.)

With the world now shifting towards renewable fuels, there should soon be no exceptions at all. By 2050, if governments put the right policies in place, one in every two cars on the road could be electric, saving around 1.5 gigatonnes of carbon emissions per year.

"Taking into account emissions from manufacturing and ongoing energy use, it's clear that we should encourage the switch to electric cars and home heat pumps without any regrets," says Knobloch.

The researchers are calling on governments and policymakers not to delay in flipping the switch to electric – bringing the transition forward by even a year or two could keep a huge amount of CO2 out of the atmosphere.

With that in mind, it's good to see renewable energy use expanding across the world, along with technological improvements in how electric cars are designed and run.

Plenty of challenges remain, in terms of producing electric cars, generating electricity from renewable sources, improving charging infrastructure – and getting more people to give up the car altogether. But we know which direction we need to be heading in.

"The answer is clear: to reduce carbon emissions, we should choose electric cars and home heat pumps over fossil fuel alternatives," says researcher Jean-Francois Mercure, from the University of Exeter in the UK.

The research has been published in Nature Sustainability.

Crystals Have Been Used to Generate Truly Random Numbers For The Very First Time

Randomness is not always as random as you think. It’s actually very challenging for computers to generate true randomness, because algorithms introduce subtle patterns that can be detected, meaning the numbers they come up with are pseudorandom, and not ultimately unpredictable.

 

Which is not to say machines can’t play a part. What if we took something, like a robot, and combined it with a truly random process? Scientists have made just such a thing, harnessing the innate unpredictability of chemistry in a way that’s never been done before: in this case, watching crystals grow.

Crystallisation is not actually a chemical reaction, but a physical change that happens when crystal solids form from the products of a reaction, and researchers say the randomisation possibilities provided by the crystallisation process may be endless.

“In a chemical system, each time a reaction is performed there is an almost infinite number of energetically equivalent ways for particular reagents to combine, resulting in both high uncertainty and entropy, and the exact pathway undertaken will never be repeated,” a team from the University of Glasgow explains in a new study.

“As such, the entropy of such a chemical system is extraordinarily high, and may therefore serve as a very good entropy pool for application of random number generation.”

(Lee et al., Matter, 2020)

In the new work, the researchers exploited this seemingly endless potential for randomness by building a robotic system to prepare, initiate, and monitor hundreds of parallel chemical reactions in a vast array of chemistry vials.

As crystals grew randomly in each vial, the robot observed the formations via camera, detecting and recording the myriad resulting variables, including crystal location, size, shape, orientation, and colour.

 

Snapshots of the vial array were captured every 10 minutes, and the images then converted into binary sequences. In subsequent encryption-cracking tests, the output of the crystallisation robot satisfied randomness tests specified by the National Institute of Standards and Technology, beating the results of conventional computer-based pseudorandom number generators.
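The NIST suite mentioned here is a battery of statistical tests applied to a bit sequence. Its simplest member, the frequency (monobit) test, fits in a few lines; this is a generic illustration of that standard test, not the paper's own code:

```python
import math

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: is the balance of ones
    and zeros consistent with a truly random source? Returns a p-value;
    p >= 0.01 passes at NIST's default significance level."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # +1 for each 1, -1 for each 0
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# A balanced sequence passes; a heavily biased one fails.
print(monobit_test([0, 1] * 500) >= 0.01)          # True
print(monobit_test([1] * 900 + [0] * 100) < 0.01)  # True
```

The full suite layers many more tests (runs, longest-run, spectral, and so on) on top of this basic balance check.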

“We found our messages encoded with the genuinely random numbers took longer to crack than those encoded with an algorithm, because with an algorithm you can guess the algorithm and then just brute force it,” one of the team, chemist Leroy Cronin, told Vice.

Of course, while it’s a remarkable proof of concept – the first example of generating true random numbers using the stochasticity of chemistry, the team claims – it might not be the most practical way of achieving randomness.

After all, not everybody may have the physical space to host a crystallisation robot running hundreds of chemical experiments in tandem.

Fortunately, the researchers suggest the same kind of system might be capable of miniaturisation in the future – somehow sealing all those infinite possibilities within the body of conventional electronic computers.

“This is a bit of a crazy idea, but this is a way of searching chemical space,” Cronin told Vice. “Because chemical space is just too big to explore. There’s a lot to be said for going in a random direction.”

The findings are reported in Matter.

 

Nuclear Fusion Startup Claims It’s on The Way to Providing ‘Unlimited’ Energy

An Australian fusion startup called HB11, a spin-off from the University of New South Wales, claims to have found a way to revolutionize current nuclear fusion technology, potentially laying the groundwork for a new era of power generation — without running the risk of a nuclear meltdown.

 

The startup’s leadership doesn’t mince words.

“We are sidestepping all of the scientific challenges that have held fusion energy back for more than half a century,” director Warren McKenzie told New Atlas.

Fusion energy, as its name suggests, harnesses the energy released from when atomic nuclei fuse together, as opposed to fission, which splits nuclei apart to generate electricity.

Fusion has been the holy grail of energy production for decades, but scientists have yet to achieve a reaction that spits out more energy than it needs to get going — though they’re starting to get close.

If that sounds too good to be true, it’s worth noting that there does appear to be drama around the claims. A press release about the technology on the University of New South Wales site disappeared — though a backup copy appears to still be online. Futurism has reached out to the university to ask about the missing release.

The backup release makes extraordinary claims. It says HB11 has found a new way that does away with the current fusion energy approach that requires inordinately high temperatures and pressure levels to work.

 

In theory — right now it’s not much more than a theory — HB11’s approach is extremely simplified and significantly cheaper. The technique relies on hydrogen and a boron B-11 isotope — instead of extremely rare and expensive radioactive isotopes such as tritium — and employs a specialized set of lasers to get the reaction going.

Inside a “largely empty metal sphere,” pellets of the hydrogen-boron fuel are shot at with two lasers to trigger an “avalanche” fusion chain reaction, as the company describes it in a statement.
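For scale, the hydrogen-boron reaction the company is targeting (p + B-11 → 3 He-4) releases about 8.7 MeV per event, a standard Q-value for this reaction. The sketch below is a generic back-of-envelope conversion, not HB11's own figures:

```python
# Back-of-envelope numbers for the hydrogen-boron reaction p + B-11 -> 3 He-4.
# The ~8.7 MeV released per reaction is the standard Q-value; everything
# else here is unit conversion, not HB11's own figures.
MEV_TO_J = 1.602176634e-13   # joules per MeV

q_value_mev = 8.7
energy_per_reaction = q_value_mev * MEV_TO_J   # joules per fusion event

# Reactions per second needed to sustain one megawatt of fusion power:
reactions_per_mw = 1e6 / energy_per_reaction

print(f"{energy_per_reaction:.3e} J per reaction")
print(f"{reactions_per_mw:.2e} reactions/s for 1 MW")
```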

“You could say we’re using the hydrogen as a dart, and hoping to hit a boron, and if we hit one, we can start a fusion reaction,” McKenzie told New Atlas. “That’s the essence of it.”

“Creating fusion using temperature is essentially randomly moving atoms around and hoping they’ll hit one another. Our approach is much more precise,” he added.

The process even skips the “need for a heat exchanger or steam turbine generator” and can feed an electrical flow “almost directly into an existing power grid,” according to the company’s statement.

No nuclear waste, no steam, zero chance of a nuclear meltdown. It almost sounds too good to be true — but the startup still has a lot to prove. McKenzie admitted himself he doesn’t know if or when the startup’s idea could be turned into a commercial reality.

“I don’t want to be a laughing stock by promising we can deliver something in 10 years, and then not getting there,” he told New Atlas.

This article was originally published by Futurism. Read the original article.

 

All Those Low-Cost Satellites in Orbit Could Be Weaponized by Hackers, Warns Expert

Last month, SpaceX became the operator of the world’s largest active satellite constellation. As of the end of January, the company had 242 satellites orbiting the planet with plans to launch 42,000 over the next decade.

 

This is part of its ambitious project to provide internet access across the globe. The race to put satellites in space is on, with Amazon, UK-based OneWeb and other companies chomping at the bit to place thousands of satellites in orbit in the coming months.

These new satellites have the potential to revolutionise many aspects of everyday life – from bringing internet access to remote corners of the globe to monitoring the environment and improving global navigation systems.

Amid all the fanfare, a critical danger has flown under the radar: the lack of cybersecurity standards and regulations for commercial satellites, in the US and internationally.

As a scholar who studies cyber conflict, I’m keenly aware that this, coupled with satellites’ complex supply chains and layers of stakeholders, leaves them highly vulnerable to cyberattacks.

If hackers were to take control of these satellites, the consequences could be dire. At the mundane end of the scale, hackers could simply shut satellites down, denying access to their services.

Hackers could also jam or spoof the signals from satellites, creating havoc for critical infrastructure. This includes electric grids, water networks and transportation systems.

Some of these new satellites have thrusters that allow them to speed up, slow down and change direction in space. If hackers took control of these steerable satellites, the consequences could be catastrophic. Hackers could alter the satellites’ orbits and crash them into other satellites or even the International Space Station.

 

Commodity parts open a door

Makers of these satellites, particularly small CubeSats, use off-the-shelf technology to keep costs low. The wide availability of these components means hackers can analyse them for vulnerabilities.

In addition, many of the components draw on open-source technology. The danger here is that hackers could insert back doors and other vulnerabilities into satellites’ software.

The highly technical nature of these satellites also means multiple manufacturers are involved in building the various components. The process of getting these satellites into space is also complicated, involving multiple companies.

Even once they are in space, the organisations that own the satellites often outsource their day-to-day management to other companies. With each additional vendor, the vulnerabilities increase as hackers have multiple opportunities to infiltrate the system.

Hacking some of these CubeSats may be as simple as waiting for one of them to pass overhead and then sending malicious commands using specialised ground antennas. Hacking more sophisticated satellites might not be that hard either.

Satellites are typically controlled from ground stations. These stations run computers with software vulnerabilities that can be exploited by hackers. If hackers were to infiltrate these computers, they could send malicious commands to the satellites.

 

A history of hacks

This scenario played out in 1998 when hackers took control of the US-German ROSAT X-Ray satellite. They did it by hacking into computers at the Goddard Space Flight Center in Maryland.

The hackers then instructed the satellite to aim its solar panels directly at the Sun. This effectively fried its batteries and rendered the satellite useless. The defunct satellite eventually crashed back to Earth in 2011.

Hackers could also hold satellites for ransom, as happened in 1999 when hackers took control of the UK’s SkyNet satellites.

Over the years, the threat of cyberattacks on satellites has gotten more dire. In 2008, hackers, possibly from China, reportedly took full control of two NASA satellites, one for about two minutes and the other for about nine minutes.

In 2018, another group of Chinese state-backed hackers reportedly launched a sophisticated hacking campaign aimed at satellite operators and defense contractors. Iranian hacking groups have also attempted similar attacks.

Although the US Department of Defense and National Security Agency have made some efforts to address space cybersecurity, the pace has been slow. There are currently no cybersecurity standards for satellites and no governing body to regulate and ensure their cybersecurity.

Even if common standards could be developed, there are no mechanisms in place to enforce them. This means responsibility for satellite cybersecurity falls to the individual companies that build and operate them.

 

Market forces work against space cybersecurity

As they compete to be the dominant satellite operator, SpaceX and rival companies are under increasing pressure to cut costs. There is also pressure to speed up development and production. This makes it tempting for the companies to cut corners in areas like cybersecurity that are secondary to actually getting these satellites in space.

Even for companies that make a high priority of cybersecurity, the costs associated with guaranteeing the security of each component could be prohibitive. This problem is even more acute for low-cost space missions, where the cost of ensuring cybersecurity could exceed the cost of the satellite itself.

To compound matters, the complex supply chain of these satellites and the multiple parties involved in their management means it’s often not clear who bears responsibility and liability for cyber breaches.

This lack of clarity has bred complacency and hindered efforts to secure these important systems.

Regulation is required

Some analysts have begun to advocate for strong government involvement in the development and regulation of cybersecurity standards for satellites and other space assets.

Congress could work to adopt a comprehensive regulatory framework for the commercial space sector. For instance, it could pass legislation that requires satellite manufacturers to develop a common cybersecurity architecture.

They could also mandate the reporting of all cyber breaches involving satellites. There also needs to be clarity on which space-based assets are deemed critical in order to prioritise cybersecurity efforts.

Clear legal guidance on who bears responsibility for cyberattacks on satellites will also go a long way to ensuring that the responsible parties take the necessary measures to secure these systems.

Given the traditionally slow pace of congressional action, a multi-stakeholder approach involving public-private cooperation may be warranted to ensure cybersecurity standards. Whatever steps government and industry take, it is imperative to act now.

It would be a profound mistake to wait for hackers to gain control of a commercial satellite and use it to threaten life, limb and property – here on Earth or in space – before addressing this issue.

William Akoto, Postdoctoral Research Fellow, University of Denver.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 

Scientists Have Created Bionic Jellyfish And Successfully Controlled Their Movements

Scientists have ‘puppeteered’ the movements of a jellyfish and made it even faster than the real thing.

Taking artificial control with a microelectronic implant, researchers have increased the natural swimming speed of a live moon jellyfish (Aurelia aurita) by nearly threefold.

 

What’s more, they achieved this with only a little bit of external power and twice the amount of metabolic effort from the animal.

“Thus,” the authors conclude, “this biohybrid robot uses 10 to 1,000 times less external power per mass than other aquatic robots reported in literature.”

Jellyfish are known to be incredibly efficient swimmers, much more so than any machine we humans have created, so their low cost of transport makes them an ideal “natural scaffold”.

While it’s true that certain underwater vehicles can travel much faster than a jellyfish, robots that try to mimic jellyfish behaviour have so far required orders of magnitude more energy and are usually tethered to an external power supply.

The real things, on the other hand, are slow and steady, unencumbered explorers, capable of self-healing. If we can properly control them, some think they could be an intriguing new way to expand ocean monitoring.

“Because jellyfish are naturally found in a wide range of salinities, temperatures, oxygen concentrations, and depths (including 3,700 m [12,100 feet] or deeper in the Mariana Trench),” the authors of the new study propose, “these biohybrid robots also have the potential to be deployed throughout the world’s oceans.”

 

Of course, that would require a lot more control than we currently have. So far, the team has merely shown they can enhance jellyfish swimming without undue cost to the metabolism or health of the animal.

The key to this small but significant step is a portable microelectronic swim controller, which, when attached to the jellyfish, can generate pulse waves and stimulate muscle contractions.

Through this technology, scientists can speed up a jellyfish’s propulsion until it hits an optimal point, where the greatest speed is achieved with the smallest energy output.

By hijacking the metabolism and muscles of the jellyfish in this way, researchers got the creature moving 2.8 times faster than its natural swimming speed.
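Swimming 2.8 times faster for roughly twice the metabolic power means each metre travelled costs the animal less energy. A toy cost-of-transport calculation makes this concrete; only the two ratios come from the study, and the baseline values are arbitrary units for illustration:

```python
# Toy cost-of-transport comparison for the biohybrid jellyfish.
# Only the 2.8x speed-up and roughly 2x metabolic cost come from the study;
# the baseline values are arbitrary units for illustration.
base_speed = 1.0
base_power = 1.0

enhanced_speed = 2.8 * base_speed
enhanced_power = 2.0 * base_power

# Cost of transport scales as power / speed: energy spent per unit distance.
cot_base = base_power / base_speed
cot_enhanced = enhanced_power / enhanced_speed

print(f"relative cost per metre: {cot_enhanced / cot_base:.2f}")  # ~0.71
```

In other words, under these assumptions the stimulated jellyfish spends roughly 30 percent less energy per unit distance than it does swimming naturally.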

The team hopes their work can lead to newer underwater vehicles that can some day explore for longer periods of time, while also making minimal disturbances wherever they may roam.

With some more tweaking, there’s a chance we might even be able to use real jellyfish to study distant corners of the ocean, similar to the way we currently use tagged mammals.

“Moreover, because jellyfish do not have a swim bladder, they can reach 3,700-metre [12,100-foot] depths in the ocean,” the authors write.

“Only the microelectronics will require hardening for operation at high pressures.”

Who knows, perhaps it will be an army of biohybrids that one day reveal the vast untold mysteries of our oceans.

The study was published in Science Advances.

 

Helix of an Elusive Rare Earth Metal Could Help Push Moore’s Law to The Next Level

To cram ever more computing power into your pocket, engineers need to come up with increasingly ingenious ways to add transistors to an already crowded space.

Unfortunately there’s a limit to how small you can make a wire. But a twisted form of rare earth metal just might have what it takes to push the boundaries a little further.

 

A team of researchers funded by the US Army have discovered a way to turn twisted nanowires of one of the rarest of rare earth metals, tellurium, into a material with just the right properties that make it an ideal transistor at just a couple of nanometres across.

“This tellurium material is really unique,” says Peide Ye, an electrical engineer from Purdue University.

“It builds a functional transistor with the potential to be the smallest in the world.”

Transistors are the workhorses of anything that computes information, using tiny changes in charge to prevent or allow larger currents to flow.

Typically made of semiconducting materials, they can be thought of as traffic intersections for electrons. A small voltage change in one place opens the gate for current to flow, serving as both a switch and an amplifier.

Combinations of open and closed switches are the physical units representing the binary language underpinning logic in computer operations. As such, the more you have in one spot, the more operations you can run.

Ever since the first chunky transistor was prototyped a little more than 70 years ago, a variety of methods and novel materials have led to regular downsizing of the transistor.

 

In fact, the shrinking was so regular that Intel co-founder Gordon Moore famously observed in 1965 that transistors were doubling in density at a steady pace – a trend later pegged at a doubling roughly every two years.
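Moore's observation is just compound doubling, which is easy to sketch (the two-year doubling period is the commonly cited figure):

```python
# Moore's observation as compound doubling: relative transistor density
# after a given number of years, assuming a doubling every two years.
def transistor_density(years_elapsed, doubling_period=2.0, start=1.0):
    return start * 2 ** (years_elapsed / doubling_period)

print(transistor_density(10))  # 32.0 – a decade compounds to a 32x increase
```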

Today, that trend has slowed considerably. For one thing, more transistors in one spot means more heat building up.

But there are also only so many ways you can shave atoms from a material and still have it function as a transistor. Which is where tellurium comes in.

Though not exactly a common element in Earth’s crust, it’s a semi-metal in high demand, finding a place in a variety of alloys to improve their hardness and corrosion resistance.

It also has the properties of a semiconductor, carrying a current under some circumstances and acting as a resistor under others.

Curious about its characteristics on the nanoscale, engineers grew one-dimensional chains of the element and took a close look at them under an electron microscope. Surprisingly, the super-thin ‘wire’ wasn’t exactly a neat line of atoms.

“Silicon atoms look straight, but these tellurium atoms are like a snake. This is a very original kind of structure,” says Ye.

 

On closer inspection, they worked out that the chain was made of pairs of tellurium atoms bonded strongly together, which then stack into a crystal that is pulled into a helix by weaker van der Waals forces.

Building any kind of electronics from a crinkly nanowire is just asking for trouble, so to give the material some structure the researchers went on the hunt for something to encapsulate it in.

The solution, they found, was a nanotube of boron nitride. Not only did the tellurium helix slip neatly inside, the tube acted as an insulator, ticking all the boxes that would make it suit life as a transistor.

Most importantly, the whole semiconducting wire was a mere 2 nanometres across, putting it in the same league as the 1 nanometre record set a few years ago.

Time will tell if the team can squeeze it down further with fewer chains, or even if it will function as expected in a circuit.

If it works as hoped, it could contribute to the next generation of miniaturised electronics, potentially halving the size of current cutting edge microchips.

 

“Next, the researchers will optimise the device to further improve its performance, and demonstrate a highly efficient functional electronic circuit using these tiny transistors, potentially through collaboration with ARL researchers,” says Joe Qiu, program manager for the Army Research Office.

Even if the concept pans out, there’s a variety of other challenges for shrinking technology to overcome before we’ll find it in our pockets.

While tellurium isn’t currently considered a scarce resource in spite of its relative rarity, demand for it could spike as it finds use in future electronics such as solar cells.

This research was published in Nature Electronics.

 

There’s a Simple Way to Store Renewable Energy, And We Already Have The Technology

The effect that fossil fuels are having on the climate emergency is driving an international push to use low-carbon sources of energy. At the moment, the best options for producing low-carbon energy on a large scale are wind and solar power.

 

But despite improvements over the last few years to both their performance and cost, a significant problem remains: the wind doesn’t always blow, and the sun doesn’t always shine.

A power grid that relies on these fluctuating sources struggles to constantly match supply and demand, and so renewable energy sometimes goes to waste because it’s not produced when needed.

One of the main solutions to this problem is large-scale electricity storage technologies. These work by accumulating electricity when supply exceeds demand, then releasing it when the opposite happens. However, one issue with this method is that it involves enormous quantities of electricity.

Existing storage technologies like batteries wouldn’t be good for this kind of process, due to their high cost per unit energy. Currently, over 99 percent of large-scale electricity storage is handled by pumped hydro dams, which move water between two reservoirs through a pump or turbine to store or produce power.

However, there are limits to how much more pumped hydro can be built due to its geographical requirements.

One promising storage option is pumped thermal electricity storage. This relatively new technology has been around for about ten years, and is currently being tested in pilot plants.

The conversion of electricity to heat happens in the central circuit; the heat is then stored in hot and cold tanks. (Pau Farres Antunez)

Pumped thermal electricity storage works by turning electricity into heat using a large-scale heat pump. This heat is then stored in a hot material, such as water or gravel, inside an insulated tank.

When needed, the heat is then turned back into electricity using a heat engine. These energy conversions are done with thermodynamic cycles, the same physical principles used to run refrigerators, car engines or thermal power plants.

 

Known technology

Pumped thermal electricity storage has many advantages. The conversion processes mostly rely on conventional technology and components (such as heat exchangers, compressors, turbines, and electrical generators) that are already widely used in the power and processing industries.

This will shorten the time required to design and build pumped thermal electricity storage, even on a large scale.

The storage tanks can be filled with abundant and inexpensive materials such as gravel, molten salts or water. And, unlike batteries, these materials pose no threat to the environment.

Large molten salt tanks have been successfully used for many years in concentrated solar power plants, which is a renewable energy technology that has seen rapid growth during the last decade.

Concentrated solar power and pumped thermal electricity storage share many similarities, but while concentrated solar power plants produce energy by storing sunlight as heat (and then converting it to electricity), pumped thermal electricity storage plants store electricity that may come from any source – solar, wind or even nuclear energy, among others.

Easy to deploy and compact

Pumped thermal electricity storage plants can be installed anywhere, regardless of geography. They can also easily be scaled up to meet the grid’s storage needs.

Other forms of bulk energy storage are limited by where they can be installed. For example, pumped hydro storage requires mountains and valleys where substantial water reservoirs can be built. Compressed air energy storage relies on large subterranean caverns.

 

Pumped thermal electricity storage has a higher energy density than pumped hydro dams (it can store more energy in a given volume). For example, ten times more electricity can be recovered from 1 kilogram of water stored at 100°C (212°F), compared to 1 kilogram of water stored at a height of 500 metres in a pumped hydro plant.
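That comparison is easy to check with a back-of-envelope calculation. The recovery efficiencies below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope energy-density comparison for 1 kg of water.
# Assumptions (illustrative): heated from 20 °C to 100 °C and recovered
# by a low-grade heat engine at ~15 %, vs. raised 500 m and recovered
# by pumped hydro at ~80 % round trip.
c_p = 4186   # J/(kg*K), specific heat of water
g = 9.81     # m/s^2

heat_stored = c_p * (100 - 20)       # ≈ 335 kJ of heat per kg
elec_thermal = heat_stored * 0.15    # assumed heat-engine efficiency

pe_stored = g * 500                  # ≈ 4.9 kJ of potential energy per kg
elec_hydro = pe_stored * 0.80        # assumed round-trip efficiency

print(round(elec_thermal / elec_hydro, 1))   # roughly 13x
```

Even with a pessimistic heat-engine efficiency, the thermal route recovers an order of magnitude more electricity per kilogram, which is where the "ten times" figure comes from.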

This means that less space is required for a given amount of energy stored, so the environmental footprint of the plant is smaller.

Long life

The components of pumped thermal electricity storage typically last for decades. Batteries, on the other hand, degrade over time and need to be replaced every few years – most electric car batteries are typically only guaranteed for about five to eight years.

However, even though there are many things that make pumped thermal electricity storage well-suited for large-scale storage of renewable energy, it does have its downsides.

Possibly the biggest disadvantage is its relatively modest efficiency – meaning how much electricity is returned during discharge, compared to how much was put in during charge. Most pumped thermal electricity storage systems aim for 50-70 percent efficiency, compared to 80-90 percent for lithium-ion batteries or 70-85 percent for pumped hydro storage.
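The reason the round trip can approach, but never reach, 100 percent follows from the thermodynamics: an ideal heat pump's coefficient of performance and an ideal heat engine's efficiency exactly cancel, and real losses on each conversion erode the product. A minimal sketch, with assumed temperatures and loss factors:

```python
# Round trip: electricity -> heat (heat pump) -> electricity (heat engine).
# Temperatures and loss factors are assumed for illustration.
T_hot, T_cold = 773.0, 293.0   # K: assumed storage and ambient temperatures

cop_ideal = T_hot / (T_hot - T_cold)   # ideal (Carnot) heat-pump COP
eta_ideal = 1 - T_cold / T_hot         # ideal heat-engine efficiency

# Ideally the two cancel exactly: a lossless round trip returns 100 %.
assert abs(cop_ideal * eta_ideal - 1.0) < 1e-9

# Real machines achieve only a fraction of the ideal in each direction:
second_law_fraction = 0.78             # assumed per-conversion quality
round_trip = (second_law_fraction ** 2) * cop_ideal * eta_ideal
print(round(round_trip, 2))            # ~0.61, inside the 50-70 % range
```

Because the loss factor applies twice (once charging, once discharging), even modest per-conversion imperfections compound into the 50-70 percent figure.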

But what arguably matters most is cost: the lower it is, the faster society can move towards a low carbon future. Pumped thermal electricity storage is expected to be competitive with other storage technologies – though this won’t be known for certain until the technology matures and is fully commercialised.

As it stands, several organisations already have working, real-world prototypes. The sooner we test and start deploying pumped thermal electricity storage, the sooner we can use it to help transition to a low-carbon energy system.

Antoine Koen, PhD Candidate in Pumped Thermal Energy Storage, University of Cambridge and Pau Farres Antunez, Postdoctoral researcher in Energy Storage, University of Cambridge.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 

Australian Team Claims They're 5 Years From Fusion Energy. Here's Where We're Really At

Recent reports from scientists pursuing a new kind of nuclear fusion technology are encouraging, but we are still some distance away from the “holy grail of clean energy”.

The technology developed by Heinrich Hora and his colleagues at the University of NSW uses powerful lasers to fuse together hydrogen and boron atoms, releasing high-energy particles that can be used to generate electricity.

 

As with other kinds of nuclear fusion technology, however, the difficulty is in building a machine that can reliably initiate the reaction and harness the energy it produces.

What is fusion?

Fusion is the process that powers the Sun and the stars. It occurs when the nuclei of two atoms are forced so close to one another that they combine into one, releasing energy in the process.

If the reaction can be tamed in the laboratory, it has the potential to deliver near-limitless baseload electricity with virtually zero carbon emissions.

The easiest reaction to initiate in the laboratory is the fusion of two different isotopes of hydrogen: deuterium and tritium. The product of the reaction is a helium ion and a fast-moving neutron. Most fusion research to date has pursued this reaction.

Deuterium-tritium fusion works best at a temperature of about 100,000,000℃. Confining a plasma that hot – plasma being the name for the flame-like state of matter at such temperatures – is no mean feat.

The leading approach to harnessing fusion power is called toroidal magnetic confinement. Superconducting coils are used to create a field about a million times stronger than Earth’s magnetic field to contain the plasma.

 

Scientists have already achieved deuterium-tritium fusion at experiments in the US (the Tokamak Fusion Test Reactor) and the UK (the Joint European Torus). Indeed, a deuterium-tritium fusion campaign will happen in the UK experiment this year.

These experiments initiate a fusion reaction using massive external heating, and it takes more energy to sustain the reaction than the reaction produces itself.

The next phase of mainstream fusion research will involve an experiment called ITER (“the way” in Latin) being built in the south of France. At ITER, the confined helium ions created by the reaction will produce as much heating as the external heating sources. As the fast neutron carries four times as much energy as the helium ion, the power gain is a factor of five.
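The factor of five follows from simple arithmetic on how the D-T reaction's 17.6 MeV is split between its two products (roughly 14.1 MeV to the neutron and 3.5 MeV to the helium ion):

```python
# Arithmetic behind the "power gain of five" at ITER's target condition.
# Energy split of the D-T reaction products (standard values, in MeV):
E_neutron = 14.1   # carried by the fast neutron
E_alpha = 3.5      # carried by the helium ion (alpha particle)

# The neutron carries about four times the helium ion's energy:
assert round(E_neutron / E_alpha, 1) == 4.0

# At ITER's target, helium-ion heating equals the external heating
# (P_alpha = P_ext), so total fusion power is P_alpha + P_neutron:
P_ext = 1.0
P_alpha = P_ext
P_fusion = P_alpha * (1 + E_neutron / E_alpha)
print(round(P_fusion / P_ext))   # gain factor ≈ 5
```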

ITER is a proof of concept before the construction of a demonstration power plant.

What’s different about using hydrogen and boron?

The technology reported by Hora and colleagues suggests using a laser to create a very strong confining magnetic field, and a second laser to heat a hydrogen-boron fuel pellet to reach the point of fusion ignition.

When a hydrogen nucleus (a single proton) fuses with a boron-11 nucleus, it produces three energetic helium nuclei. Compared with the deuterium-tritium reaction, this has the advantage of not producing any neutrons, which are hard to contain.

 

However, the hydrogen-boron reaction is much more difficult to trigger in the first place. Hora’s solution is to use a laser to heat a small fuel pellet to ignition temperature, and another laser to heat up metal coils to create a magnetic field that will contain the plasma.

The technology uses very brief laser pulses, lasting only nanoseconds. The magnetic field required would be extremely strong, about 1,000 times as strong as the one used in deuterium-tritium experiments. Researchers in Japan have already used this technology to create a weaker magnetic field.

Hora and colleagues claim their process will create an “avalanche effect” in the fuel pellet that means a lot more fusion will occur than would otherwise be expected.

While there is experimental evidence to support some increase in fusion reaction rate by tailoring laser beam and target, to compare with deuterium-tritium reactions the avalanche effect would need to increase the fusion reaction rate by more than 100,000 times at 100,000,000℃. There is no experimental evidence for an increase of this magnitude.

Where to from here?

The experiments with hydrogen and boron have certainly produced fascinating physical results, but projections by Hora and colleagues of a five-year path to realising fusion power seem premature. Others have attempted laser-triggered fusion. The National Ignition Facility in the US, for example, has attempted to achieve hydrogen-deuterium fusion ignition using 192 laser beams focused on a small target.

These experiments reached one-third of the conditions needed for ignition for a single experiment. The challenges include precise placement of the target, non-uniformity of the laser beam, and instabilities that occur as the target implodes.

 

These experiments were conducted at most twice per day. By contrast, estimates suggest that a power plant would require the equivalent of 10 experiments per second.

The development of fusion energy is most likely to be realised by the mainstream international program, with the ITER experiment at its core. Australia has international engagement with the ITER project in fields of theory and modelling, materials science and technology development.

Much of this is based at the ANU in collaboration with the Australian Nuclear Science and Technology Organisation, which is the signatory to a cooperation agreement with ITER. That said, there is always room for smart innovation and new concepts, and it is wonderful to see all kinds of investment in fusion science.

Matthew Hole, Senior Research Fellow, Mathematical Sciences Institute, Australian National University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 

Tiny Microbe May Be The Perfect Time Capsule For Messaging Far Future Civilisations

If humanity fails to find a way through this train wreck of a climate crisis, we might want a form of record-keeping that doesn’t decay after a generation or two. You know, just to let future intelligences know we had some redeeming qualities.

 

Artist Joe Davis from Harvard University thinks he has a solution. If you really want to leave your 30th-century descendants a work of art, embed it in the genes of the salt-loving microbe, Halobacterium salinarum.

Davis’s suggestion, described in a new paper, isn’t the first to propose using nucleic acid as a memory solution. Three years ago, Microsoft also announced plans for devising technology that would see them store information in DNA data banks.

It’s not hard to see the appeal of this kind of data storage. Depending on who you ask and how you crunch the numbers, all of the chromosomes in a single human cell could match a pair of CDs for storage space.

That extreme compactness means the sum of all the world’s data could, in theory, fit into a double-garage as strings of A, G, T, and C.
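The arithmetic behind the CD comparison is straightforward. The genome size and CD capacity below are rough assumed figures:

```python
# Rough capacity check: a diploid human cell holds ~6.4 billion bases
# (both chromosome sets), and each base encodes at most 2 bits (A/C/G/T).
# Figures are approximate and for illustration only.
bases = 6.4e9
bits = bases * 2
gigabytes = bits / 8 / 1e9       # convert bits -> GB

cd_pair = 2 * 0.7                # two ~700 MB CDs, in GB

print(round(gigabytes, 1))       # ~1.6 GB per cell
print(gigabytes > cd_pair)       # comparable to a pair of CDs
```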

Given the rate at which that library of data grows every year, we might want to think about efficient ways to contain it.

It’s not just a nice theory. Researchers have previously crammed books, images, and even movies into a format that life has been using for billions of years.

 

Unfortunately, if you wanted every Marvel film in the MCU translated into a droplet-sized DVD, you’d still need to have plenty of patience. Companies are making headway on automating and speeding it up, but it’s not exactly the future of Netflix.

But in addition to its small size, the idea has one other appealing factor.

Unlike that stash of VHS tapes at the top of your closet, or even the CD-ROM games buried in your drawer, a properly stored DNA memory bank can be relied upon to preserve its information for the long haul.

That’s all well and good if you have a deep freezer and aren’t expecting a major interruption to electricity in the next thousand years or so, but how do we ensure our data survives in the long term?

Researchers suggest the best method could be to find a hardy little archivist that ensures the data is kept in check the old-fashioned way without our help.

“If all other life is destroyed on Earth, and this is the only thing left, maybe that information could propagate on its own,” biological engineer Jeff Nivala from the University of Washington tells Steve Nadis at Science Magazine.

 

In Davis’s study, his nomination for the best bug for the job isn’t a bacterium, strictly speaking, but a salt-tolerant microorganism called an archaeon.

H. salinarum is right at home in highly salty environments, so it’s proven it can put up with the stress of a hostile wasteland.

Buried in salt and deprived of nutrients, it simply stays put, shutting down and refusing to reproduce until conditions improve.

Whereas oxygen and radiation would corrupt a CD in short order, H. salinarum would use its talent for repairing oxidative damage to ensure all of the data was kept in relatively pristine condition for centuries or more.

Davis doesn’t have a background in biology, himself, but hasn’t let that stop him working with a more qualified team of researchers to demonstrate the microbe’s potential over other candidates.

To produce a code worth preserving, Davis created two 3D art pieces (seen below) inspired by a rather apt Russian folktale, "Koschei the Deathless". These were reduced to a string of coordinates and translated into base codes before being inserted into a spot in H. salinarum's genome.

Egg and needle 3D art pieces. (Davis et al., bioRxiv.org, 2020)

Even after the altered cells copied themselves several times over, their precious message remained stable.

While the intent may sound more sci-fi than practical, there’s still much to learn about H. salinarum’s metabolism that this kind of research could reveal.

The next phase is to store them away in salt for a number of years before checking in on the code once again, and perhaps tag some proteins to see if they move around while the tiny archivist sleeps.

The study is available on the pre-print website bioRxiv.org.