Archive for the ‘Physics’ Category

Atomic Hard Drive

August 16, 2016

It is easy to become depressed about the state of the world right now. We have what are probably the worst two candidates running for president in this election. The whole world seems to be falling apart, terrorist attacks are becoming a daily occurrence in Europe and America, and our leaders express confusion over their motives; obviously Islam has nothing to do with the Islamic State. The economy seems stagnant, with the 1% getting ever richer and the rest of us struggling to stay in place.

But this kind of thinking is misleading. We do have problems, yet in so many ways, life in the twenty-first century is better than it has ever been. Our lives are far more comfortable in almost every material sense than those of the people who lived a century ago, thanks to the enormous progress we have made in science and technology. The day-to-day bad news, which tends to depress us, is really a distraction from all the amazing discoveries and inventions that will be changing our lives over the rest of the century.

Here’s a story I read in the Wall Street Journal about one of these discoveries.

By manipulating the interactions between individual atoms, scientists report they have created a device that can pack hundreds of times more information per square inch than the best currently available data-storage technologies.

The working prototype is part of a decades-long attempt to shrink electronics down to the atomic level, a feat scientists believe would allow them to store information much more efficiently, in less space and more cheaply. By comparison, tech companies today build warehouse-sized data centers to store the billions of photos, videos and posts consumers upload to the internet daily. Corporations including International Business Machines Corp. and Hewlett Packard Enterprise Co. also have explored research to reduce such space needs.

The so-called atomic-scale memory, described in a paper published on Monday in the scientific journal Nature Nanotechnology, can hold one kilobyte, the equivalent of roughly a paragraph of text.

It may not sound “very impressive,” said Franz Himpsel, a professor emeritus of physics at the University of Wisconsin, Madison, who wasn’t involved in the study. But “I would call it a breakthrough.”

Most previous attempts at encoding information with atoms, including his own, managed roughly one byte, Dr. Himpsel said. And data could be stored only once. To store new information, the “disk” had to be re-formatted, like CD-Rs popular in the ’90s.

With the new device, “we can rewrite it as often as we like,” said Sander Otte, an experimental physicist at Delft University of Technology in the Netherlands and the lead author on the new paper.

They can actually arrange individual atoms. When I was growing up, no one had ever seen an atom. They were too small to be imaged, even by electron microscopes. The scanning tunneling microscope, which allows individual atoms to be “seen” and manipulated, was not invented until the 1980s.

Scanning tunnelling microscope

To build their prototype, the scientists peppered a flat copper bed with about 60,000 chlorine atoms scattered at random, purposely leaving roughly 8,000 empty spaces among them. A mapping algorithm guided the tiny, copper-coated tip of a high-tech microscope to gently pull each chlorine atom to a predetermined location, creating a precise arrangement of atoms and neighboring “holes.”

The team also crafted a language for their device. The stored information is encoded in the patterns of holes between atoms. The atom-tugging needle reads them as ones and zeros, turning them into regular binary code.

The researchers marked up the grid with instructions that cued the software where it should direct the needle to write and read data. For instance, a three-hole diagonal line marked the end of a file.
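To make the idea of storing bits in atom positions concrete, here is a toy sketch in Python. The pairing scheme below is my own illustration of the general approach, not the Delft team's actual encoding.

```python
# Toy model: a bit is a pair of lattice sites holding one atom ('A') and one
# hole ('.'); which of the two sites the atom occupies encodes 0 or 1.
# This illustrates the general idea, not the Delft team's exact scheme.
def read_bits(row):
    """Read a row of (atom, hole) site pairs as bits."""
    bits = []
    for i in range(0, len(row) - 1, 2):
        bits.append(0 if row[i:i + 2] == "A." else 1)  # ".A" encodes 1
    return bits

print(read_bits("A..AA..A"))  # [0, 1, 0, 1]
```

Rewriting a bit in this picture just means dragging the atom to the other site of its pair, which is exactly the sort of move the microscope tip performs.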

They still have a lot of work to do before our computers come equipped with an atomic hard drive.

Writing the initial data to the device took about a week, though the rewriting process takes just a few hours, Dr. Otte said.

“It’s automated, so it’s 10 times faster than previous examples,” said Christopher Lutz, a staff scientist at IBM Research-Almaden in San Jose, Calif. Still, “this is very exploratory. It’s important not to see this one-kilobyte memory result as something that can be taken directly to a product.”

Reading the stored data is much too slow to have practical applications soon. Plus, the device is stable for only a few hours at extremely low temperatures. To be competitive with today’s hard drives, the memory would have to persist for years and work in warmer temperatures, said Victor Zhirnov, chief scientist at the Semiconductor Research Corp., a research consortium based in Durham, N.C.

When Dr. Otte’s team took the memory out of the extremely low-temperature environment in which it was built and stored, the information it held was lost. Next, his team will explore other metal surfaces as well as elements similar to, but heavier than, chlorine, to see if that improves the device’s stability.

But, maybe it will happen sooner than we think.

We truly live in a brave new world. If only we can keep ourselves from messing everything up.


Living in God’s Matrix

June 22, 2016

Physicist Michio Kaku has come to believe that we live in a universe designed by some sort of intelligence, God for lack of a better word, at least according to this article by Mark Ellis in Godreports, which I found via this post by Walter Hudson at PJMedia.

I wish I knew what those equations meant.

Theoretical physicist, futurist, and bestselling author Michio Kaku has developed a theory that points to the existence of God using string theory.

String theory assumes that seemingly specific material particles are actually “vibrational states.”

His view about intelligent design has riled the scientific community because Dr. Kaku is considered one of its most respected and prominent voices. He is the co-creator of string field theory, a branch of string theory.

“I have concluded that we are in a world made by rules created by an intelligence,” he stated, according to the Geophilosophical Association of Anthropological and Cultural Studies.

Dr. Kaku has continued Einstein’s search for a “Theory of Everything,” seeking to unify the four fundamental forces of the universe—the strong force, the weak force, gravity and electromagnetism.

The very purpose of physics, says Kaku, is “to find an equation … which will allow us to unify all the forces of nature and allow us to read the mind of God.”

Because string theory may provide a unified description of gravity and particle physics, it is considered a candidate for a Theory of Everything.

To reach his conclusions about intelligence behind the universe, Dr. Kaku made use of what he calls “primitive semi-radius tachyons.”

A tachyon is a particle that always moves faster than light. Many physicists believe such particles cannot exist because they are not consistent with the known laws of physics.

As noted by Einstein and others, special relativity implies that faster-than-light particles, if they existed, could be used to communicate backwards in time.

Dr. Kaku used a technology created in 2005 that allowed him to analyze the behavior of matter at the subatomic scale, relying on a primitive tachyon semi-radius.

When he observed the behavior of these tachyons in several experiments, he concluded that humans live in a “matrix,” a world governed by laws and principles conceived by an intelligent architect.

“I have concluded that we are in a world made by rules created by an intelligence, not unlike a favorite computer game, but of course, more complex and unthinkable,” he said.

“By analyzing the behavior of matter at the subatomic scale affected by the semi tachyon pitch radius, what we call chance no longer makes sense, because we are in a universe governed by established rules and not determined by universal chances plane.”

Dr. Michio Kaku seems to be moving into the Intelligent Design camp, at least insofar as he believes that the order and established rules that we observe in the universe imply a mind behind the material reality. This is more of a Deist, or Theist, concept of divinity than the traditional Judeo-Christian God. The idea that the reality we know is some kind of simulation run by an ultra-advanced computer is another explanation for the apparent order and logic we see in the universe, one which has gained a sort of vogue lately. Perhaps the two explanations, intelligent design and computer simulation, are the same, since an intelligence that can create a simulation of the entire universe might be a god to the people living in the simulation. For all we know, any player of a game like The Sims would be a god to the sims in the game, if the sims had intelligence and free will.

The problem with these sorts of speculations is that they cannot be considered scientific explanations or hypotheses because there is no way to prove or disprove the ideas. We only know the one universe we are living in and have no way, at present, of learning about any other universes. We have no way to compare our, possibly designed or simulated, universe with one that has not been designed or is not a computer simulation. This matter belongs properly to metaphysics rather than to physics, and while Dr. Kaku's views on string theory ought to be respected as belonging to an expert in the field, in his views on whether God or an Intelligent Designer or Programmer is ultimately behind the world he studies, he shares with the rest of us a common ignorance. On these matters, we cannot know truths with the same certainty we know about the behavior of atoms. We can only have faith.

 

Four New Elements

January 16, 2016

Despite all the problems and tumult in the world, the progress of science marches on. According to this article in Yahoo News, the International Union of Pure and Applied Chemistry has officially added four recently discovered elements to the periodic table.

Four new elements have been permanently added to the periodic table, after their discoveries were verified by the global chemistry organization that oversees the table. The International Union of Pure and Applied Chemistry (IUPAC) last week announced that elements 113, 115, 117, and 118 have met its criteria for discovery, making them the first elements to be added to the periodic table since 2011. Their addition also completes the seventh row of the periodic table.

All four man-made elements currently have placeholder names, and will be officially named over the next few months. Elements 115, 117, and 118 were discovered by a team of scientists from the Joint Institute for Nuclear Research in Dubna, Russia and Lawrence Livermore National Laboratory in California. The Russian-American team had also claimed discovery of element 113, currently known as ununtrium, but IUPAC credited a team from the Riken institute in Japan. Element 113 will therefore be the first element to be named by researchers in Asia.

“Greater value than an Olympic gold medal.”

Discovering superheavy elements has proven difficult because they rapidly decay. But research has revealed slightly longer lifetimes for more recent superheavy elements, raising hopes that scientists may eventually discover the so-called “island of stability” — a group of elements that are both superheavy and stable. Kosuke Morita, who led research on element 113 at Riken, said in a statement that his team will now “look to the uncharted territory of element 119 and beyond.”

“To scientists, this is of greater value than an Olympic gold medal,” Ryoji Noyori, the former president of Riken and Nobel laureate in chemistry, tells The Guardian.

 

As the article states, the IUPAC gives temporary, or placeholder, names to newly discovered elements until there is some consensus on the official names. This naming can be contentious if more than one team makes a credible claim to be the discoverer of the element. The placeholder name is simply the element's atomic number spelled out digit by digit in Latin and Greek numerical roots, with “-ium” added. Thus, element 113 is ununtrium. Elements 115, 117, and 118 are ununpentium, ununseptium, and ununoctium.
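The scheme is mechanical enough to fit in a few lines of Python. This sketch builds the placeholder names from the atomic number, using the IUPAC digit roots (nil, un, bi, tri, quad, pent, hex, sept, oct, enn) and the standard elision of doubled letters:

```python
ROOTS = ["nil", "un", "bi", "tri", "quad", "pent", "hex", "sept", "oct", "enn"]

def placeholder_name(atomic_number):
    """Spell an atomic number digit by digit in IUPAC roots, then add -ium."""
    name = "".join(ROOTS[int(d)] for d in str(atomic_number)) + "ium"
    # Elision rules: bi/tri + ium drops an i; enn + nil drops an n.
    return name.replace("iium", "ium").replace("nnn", "nn")

for z in (113, 115, 117, 118):
    print(z, placeholder_name(z))
# 113 ununtrium, 115 ununpentium, 117 ununseptium, 118 ununoctium
```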


These elements are spoken of as being discovered, but it would be more accurate to say that they have been created or synthesized, since no element with an atomic number higher than 92, uranium, is found in nature. Elements with higher atomic numbers are radioactive with half-lives too short to have survived since the creation of the Earth and solar system. Every element has unstable, or radioactive, isotopes, but every element with an atomic number up to 82, lead, has at least one stable isotope, with the exceptions of technetium, atomic number 43, and promethium, atomic number 61. I am not sure if scientists know precisely why some isotopes of some elements are stable while others are unstable, but it seems to have something to do with the proportions of protons and neutrons in an atomic nucleus. Thus carbon-12 with 6 protons and 6 neutrons is stable, while carbon-14 with 6 protons and 8 neutrons is radioactive. Uranium-238 with 92 protons and 146 neutrons is weakly radioactive with a half-life of over four billion years, but uranium-234 with 92 protons and 142 neutrons is considerably more radioactive, with a half-life of 246,000 years.

Periodic table with elements colored according to the half-life of their most stable isotope.   Stable elements: Elements which contain at least one stable isotope;   Slightly radioactive elements: the most stable isotope is very long-lived, with half-life of over two million years;   Moderately radioactive elements: the most stable isotope has half-life between 800 and 34,000 years;   Highly radioactive elements: the most stable isotope has half-life between one day and 103 years;   Significantly radioactive elements: the most stable isotope has half-life between one minute and one day;   Extremely radioactive elements: the most stable isotope has half-life less than a minute. Very little is known about these elements due to their extreme instability and radioactivity.

For elements with atomic numbers higher than uranium's, there is a tendency to be more radioactive, with shorter half-lives. Americium, with an atomic number of 95, has a half-life of around 7,370 years. Fermium, atomic number 100, has a half-life of 100 days. Dubnium, atomic number 105, has a half-life of about 28 hours. Elements with higher atomic numbers have half-lives measured in hours, minutes, or seconds. The most stable isotopes of the four new elements, 113, 115, 117, and 118, have half-lives of 20 seconds, 220 milliseconds, 51 milliseconds, and 0.89 milliseconds, respectively. Strangely, this decline in the length of the half-life does not seem to be as great as expected, and it is hoped that at some point there will be the Island of Stability mentioned in the article, where larger atomic nuclei will have the right configuration of protons and neutrons to permit some degree of stability, though whether such atoms will last for seconds, days, or years is unknown. Since the processes by which these superheavy elements are created generally make only a few atoms at a time, and these decay quickly, the hope is that the elements in the island of stability will be stable enough to permit some research into the chemical and physical properties of superheavy elements.
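Half-lives like these translate into survival fractions through the usual exponential decay law; here is a quick sketch in Python, using the 0.89 millisecond figure quoted above for element 118:

```python
def fraction_remaining(t, half_life):
    """Exponential decay: fraction of atoms left after time t is (1/2)^(t/half_life)."""
    return 0.5 ** (t / half_life)

# Element 118's most stable isotope, half-life ~0.89 ms (figure quoted above):
for t_ms in (0.89, 5.0, 10.0):
    print(f"after {t_ms} ms: {fraction_remaining(t_ms, 0.89):.6f} remaining")
# After 10 ms, less than 0.05% of the original atoms survive.
```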

I don’t suppose there is much practical use for these discoveries, though you never know, but it is a refreshing change to read about people who are adding to humanity’s store of knowledge about the world as opposed to those intent on tearing everything down.

Galileo Was Wrong

May 31, 2015

That is the idea behind an odd website that I found which promotes the theory of geocentrism, or the idea that the Earth is the center of the Solar System and that the Sun revolves around the Earth. Geocentrism was, of course, the idea held by every astronomer and scientist up until 1543, when Nicolaus Copernicus proposed his heliocentric, or Sun-centered, model of the Solar System. For about a century there was a fierce debate among scientists and philosophers over the true structure of the universe. Heliocentrism won out, of course, and no educated person of the twenty-first century believes that the Earth is at the center of the universe. Because of this, those who historically had supported Copernicus's model, such as Galileo, are held to be on the right side of science and history, while those who clung to the older geocentrism, such as many officials of the Roman Catholic Church, seem to have been backwards and on the wrong side. This website contends that in the controversy between Galileo and the Catholic Church, the Church was, in fact, in the right and Galileo was in the wrong, hence the title. The strange thing is that the website is actually correct, in a funny sort of way. Galileo really was in the wrong, and the Church was right to be skeptical of Copernicus's theories.

The controversy between Galileo and the Church has often been depicted as part of the never-ending battle between the light of science and religious ignorance. It is generally accepted by historians today that Galileo's troubles had far less to do with an alleged anti-science position taken by the Catholic Church and more to do with contemporary Italian politics and Galileo's own irascible personality. What is generally less well known is that the Church had good scientific reasons to oppose Galileo. Neither Copernicus nor Galileo had any way to prove that the Earth moves around the Sun. With his telescope Galileo did discover the four largest moons of Jupiter and the fact that Venus shows phases, which indicates that it orbits the Sun. These discoveries were certainly suggestive in that they showed that not everything in the sky directly orbited the Earth, but it was possible that while Venus and the Galilean satellites orbited the Sun and Jupiter, the Sun and Jupiter revolved around the motionless Earth. In fact, there wouldn't be any conclusive proof that the Earth moves until eighty years after Galileo's death, when the astronomer James Bradley discovered the aberration of light caused by the Earth's motion through space. By this time, there was hardly any doubt about the Earth orbiting the Sun.

Why were astronomers so quick to discard the millennia-old and common-sense idea that the Earth rests motionless at the center of the universe without any direct proof? The answer is that a heliocentric Solar System better accounted for the motion of the planets. In order to understand this, we will have to go back to the origins of the science of astronomy.

No one knows where or when people began to really observe the night sky and take note of the motions of the heavenly bodies. It must have seemed obvious that the Earth was a flat surface with the sky a dome enclosing it. The Sun, Moon, and stars rose into the sky, moved from East to West across it, and then set beneath the Earth. At some point, these early observers noticed that not all of the objects in the sky moved along with the background of stars. These objects seemed to wander about the sky, so the Greeks referred to them as planetes, or wanderers. There were seven of these “planets”: the Sun and Moon, and five star-like objects that were named Mercury, Venus, Mars, Jupiter, and Saturn. It may seem strange to refer to the Sun and Moon as planets, but they, like the other planets, moved across the sky against the background of the fixed stars.

The Greeks and the Romans knew that the Earth is round and the Greek philosophers such as Plato and Aristotle held that the circle was the perfect shape. Because they believed that the Heavens were perfect and unchanging, as opposed to our corrupt and changing Earth, they believed that the seven planets orbited the Earth in perfect circles. This was the model proposed by the ancient Greek astronomers, especially the last and greatest of the Hellenistic astronomers, Claudius Ptolemy, who lived in the second century AD. For this reason the geocentric model is often called the Ptolemaic model.

There was a major problem with the model proposed by the ancient Greeks: the planets do not move in perfect circles across the sky from West to East against the background of the stars. The planets move in different paths across the sky and at different speeds. Sometimes they seem to move backwards, in what is called retrograde motion.

The retrograde motion of Mars (Photo credit: Wikipedia)

As Earth passes Mars, the latter planet will temporarily appear to reverse its motion across the sky. (Photo credit: Wikipedia)

 

This apparent retrograde motion is observed because the planets revolve at varying distances from the Sun and so orbit at varying velocities around the Sun. The Earth, being closer to the Sun than Mars, travels faster than Mars and so occasionally overtakes the other planet. Venus and Mercury travel faster than Earth and overtake us occasionally. You might be able to get an idea of how this works by considering a group of cars travelling on an Interstate. If I am driving at 60 miles per hour and pass a car that is going 55 miles per hour, that other car will seem to be going backwards even though we are both going in the same direction. As a car travelling 65 miles per hour passes me, I will seem to be moving backwards to its driver.
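You can reproduce this overtaking effect with a few lines of Python. This is a deliberately crude model, circular coplanar orbits with round-number radii and periods, but it is enough to make Mars's apparent longitude reverse direction around opposition:

```python
import math

R_EARTH, R_MARS = 1.0, 1.52      # orbital radii in astronomical units
T_EARTH, T_MARS = 365.25, 687.0  # orbital periods in days

def mars_longitude(t):
    """Apparent longitude of Mars as seen from Earth at day t (degrees)."""
    a_e = 2 * math.pi * t / T_EARTH
    a_m = 2 * math.pi * t / T_MARS
    dx = R_MARS * math.cos(a_m) - R_EARTH * math.cos(a_e)
    dy = R_MARS * math.sin(a_m) - R_EARTH * math.sin(a_e)
    return math.degrees(math.atan2(dy, dx))

retrograde = False
prev = mars_longitude(0)
for t in range(1, 1000):
    lon = mars_longitude(t)
    backward = ((lon - prev + 180) % 360 - 180) < 0  # wrapped daily change
    if backward != retrograde:
        retrograde = backward
        print(f"day {t}: Mars turns {'retrograde' if retrograde else 'prograde'}")
    prev = lon
```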

 

In order to account for these discrepancies between Aristotelian theory and astronomical observations, Greek astronomers hypothesized that while the planets move in perfect circles, they also move in smaller circles, called epicycles, carried along those larger circles. Thus, the heavens were full of wheels within wheels. It was Claudius Ptolemy who developed this system into the form that was used in Medieval astronomy.


It is easy to disparage Ptolemy for developing such a cumbersome system of circles upon circles, but remember that he did not have the telescope or the many instruments for observing the motions of the planets that came into use in later years. He certainly cannot be blamed for assuming that the Earth is motionless. After all, we cannot feel the Earth move, and if we had to go by our own personal observations, we could only conclude the same. In fact, Ptolemy's system was able to predict the motions of the planets with a high degree of accuracy, and this was what the ancient and medieval astronomers were most concerned with.

There is much more to say about how later generations of Islamic and European astronomers refined and improved Ptolemy’s model as better astronomical instruments were invented and how Ptolemy came to be at last dethroned, but I am afraid that will have to wait for another post.

 

Below Zero

January 6, 2014

As I write this, the temperature here in Madison, Indiana is -4° Fahrenheit with a wind chill down to around -27°. It is cold outside. It is even cooler than I would like inside, even with the heat on. To distract myself from this winter horror, I will try to think warm thoughts and write a little about temperature. What does it mean to say the temperature is 30 degrees or 100 degrees? What exactly are we measuring? Shouldn't zero degrees be the coldest possible temperature?

People have known that some days are hotter or colder than other days since time immemorial. Before the invention of the thermometer, though, it was not possible to measure just how much hotter or colder. People could not quantify temperature, except by personal perception, which is subjective. A person might feel that it was getting warmer, but he could not be sure whether the environment was actually getting warmer or he was simply feeling warmer, perhaps because he was exerting himself. There was also no way of determining just how much warmer today was than yesterday. Notice that I am talking about temperature rather than heat. The two concepts are related but are not the same thing. In any substance, the atoms and molecules that make up that substance are not standing still but are moving about. In a liquid or a gas, the atoms can move about freely, while in a solid, they are held in place but still vibrate back and forth. The temperature of an object is, in a sense, a measure of the average kinetic energy of the atoms in that object. Heat is defined by physicists as the transfer of thermal energy from a warmer body to a colder body. Heat is measured not in degrees but in joules or calories. (The calories on food labels are actually kilocalories.)
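For an ideal monatomic gas the connection is exact: the average translational kinetic energy per atom is (3/2)kT, where k is the Boltzmann constant and T is the absolute temperature in kelvins, a scale I will get to below. A quick sketch:

```python
k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin

def mean_kinetic_energy(temp_kelvin):
    """Average translational kinetic energy per atom of an ideal gas: (3/2)kT."""
    return 1.5 * k_B * temp_kelvin

# Room temperature, about 293 K (68 degrees Fahrenheit):
print(mean_kinetic_energy(293))  # ~6.1e-21 joules per atom
```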


The basic principle on which the thermometer works was actually discovered in ancient times. Hero of Alexandria knew that air expanded or contracted based on the temperature and invented a thermometer of sorts by placing a tube, closed at one end, with its open end in a container of water. The water would move up or down in the tube according to the temperature. Galileo constructed a similar device, as did several other Renaissance scientists. None of these devices had a scale, however, so it was still not possible to quantify temperature with them. They were also sensitive to air pressure.

The first thermometer with a scale was invented by either Francesco Sagredo or Santorio Santorio around 1611-1613.

Santorio Santorio

In 1714, Daniel Gabriel Fahrenheit invented a thermometer which used mercury in a glass tube.

Gabriel Daniel Fahrenheit (Photo credit: Wikipedia)

Once it became possible to manufacture thermometers on a standard design, it was also possible to develop a standard scale. Fahrenheit developed such a scale in 1724. He used three points to calibrate it. The temperature of a mixture of water, ice, and ammonium chloride was designated as zero. The temperature of water just as ice began to form was set at 32, and human body temperature at exactly 96. Later, it was discovered that there are about 180 of Fahrenheit's degrees between the melting and boiling points of water, so the scale was recalibrated to make it exactly 180 degrees, putting the boiling point of water on the Fahrenheit scale at 212°. The Fahrenheit scale is the one most used in the United States and is still widely used in Britain and Canada.

In 1742 Anders Celsius developed a scale in which there were one hundred degrees between the melting and boiling points of water. Curiously, he designated the boiling point of water as 0 and the melting point as 100, so the temperature measurement got lower as it got hotter. The Celsius scale was later reversed and adopted as part of the metric system. This scale, sometimes called centigrade, is used worldwide, especially by scientists. Conversion between the two scales is easy enough. Because there are 180 Fahrenheit degrees between the melting and boiling points of water, but only 100 Celsius degrees, each degree Celsius is 9/5 of a degree Fahrenheit. Since Fahrenheit put the melting point of water at 32°, to convert from Fahrenheit to Celsius you subtract 32 and then multiply by 5/9. To convert from Celsius to Fahrenheit, multiply by 9/5 and then add 32.
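Those two rules are easy to check in a few lines of Python, using the fixed points of both scales:

```python
def f_to_c(f):
    """Fahrenheit to Celsius: subtract 32, then multiply by 5/9."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius to Fahrenheit: multiply by 9/5, then add 32."""
    return c * 9 / 5 + 32

print(f_to_c(32), f_to_c(212))  # 0.0 100.0  (melting and boiling points of water)
print(c_to_f(0), c_to_f(100))   # 32.0 212.0
print(f_to_c(-4))               # -20.0, the temperature outside as I write this
```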

Anders Celsius

The coldest possible temperature, at which atomic motion stops, is called absolute zero. This is -459.67° Fahrenheit, or -273.15° Celsius. It is not actually possible to reach absolute zero, but scientists have come close. The lowest temperature ever recorded in a laboratory is around 0.000000001 degrees above absolute zero. In 1848, the British physicist William Thomson, later to be Lord Kelvin, proposed a temperature scale using Celsius degrees which began at absolute zero. The Kelvin scale is slightly different from other scales in that it does not rely on the physical properties of any particular material, being based on absolute zero. Temperatures on the Kelvin scale are measured in “kelvins” rather than degrees, so that you may say that the melting point of water is about 273 K. The Kelvin scale is also extensively used by scientists, especially those who work with very low temperatures.

Lord Kelvin

It’s not working. All of this writing about absolute zero is just making me feel colder.


Fusion Breakthrough

October 21, 2013

This is a story from earlier this month that I have been meaning to write about, but somehow didn’t get around to it until now. It may be that there has been a breakthrough in the efforts to produce a controlled thermonuclear reaction on Earth which produces more energy than it consumes. If there is anything at all to this story and fusion power becomes practical, we could be on the verge of a golden age of unlimited energy. If.

I read this report from the BBC.

Harnessing fusion – the process that powers the Sun – could provide an unlimited and cheap source of energy.

But to be viable, fusion power plants would have to produce more energy than they consume, which has proven elusive.

Now, a breakthrough by scientists at the National Ignition Facility (NIF) could boost hopes of scaling up fusion.

NIF, based at Livermore in California, uses 192 beams from the world’s most powerful laser to heat and compress a small pellet of hydrogen fuel to the point where nuclear fusion reactions take place.

The BBC understands that during an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel – the first time this had been achieved at any fusion facility in the world.

This is a step short of the lab’s stated goal of “ignition”, where nuclear fusion generates as much energy as the lasers supply. This is because known “inefficiencies” in different parts of the system mean not all the energy supplied through the laser is delivered to the fuel.

But the latest achievement has been described as the single most meaningful step for fusion in recent years, and demonstrates NIF is well on its way towards the coveted target of ignition and self-sustaining fusion.

For half a century, researchers have strived for controlled nuclear fusion and been disappointed. It was hoped that NIF would provide the breakthrough fusion research needed.

In 2009, NIF officials announced an aim to demonstrate nuclear fusion producing net energy by 30 September 2012. But unexpected technical problems ensured the deadline came and went; the fusion output was less than had originally been predicted by mathematical models.

Soon after, the $3.5bn facility shifted focus, cutting the amount of time spent on fusion versus nuclear weapons research – which was part of the lab's original mission.

However, the latest experiments agree well with predictions of energy output, which will provide a welcome boost to ignition research at NIF, as well as encouragement to advocates of fusion energy in general.

If this story has anything to it, then it could be the biggest story of the year, far more important than all this idiotic political theater about shutdowns and defaulting. I will look forward to hearing about any additional progress this team makes.
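The distinction the BBC draws between “energy out of the fuel” and true ignition is just bookkeeping, and a toy ledger makes it clear. The 1.8 megajoule figure below is NIF's published laser energy; the coupling fraction and fusion yield are made-up placeholders, not the experiment's actual numbers:

```python
# Toy energy ledger for laser fusion. Only laser_energy reflects a real figure
# (NIF's ~1.8 MJ laser); coupling and fusion_yield are hypothetical placeholders.
laser_energy = 1.8e6   # joules fired by the laser
coupling = 0.01        # hypothetical fraction of laser energy absorbed by the fuel
fusion_yield = 2.5e4   # hypothetical joules released by fusion reactions

energy_into_fuel = laser_energy * coupling
print(fusion_yield > energy_into_fuel)  # True: the kind of milestone reported here
print(fusion_yield > laser_energy)      # False: ignition proper is still far off
```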

 

G is for Gravitational Constant

September 24, 2013

In the “science is never settled” department, the generally accepted measurement of the Gravitational Constant may be off by a little bit.

A trio of researchers working in France, along with a colleague from the U.K., has re-measured the gravitational constant using the same apparatus they built 12 years ago and found a small change. In their paper published in Physical Review Letters, the team describes how they reconfigured their original equipment to re-measure the gravitational constant and this time came up with a slightly higher number than before.

The gravitational constant, denoted by G in math equations, has proven to be far more elusive than scientists imagined after it was first measured by Henry Cavendish approximately 200 years ago. The problem is that gravity is far weaker than other forces such as electromagnetism. Fluctuating stronger forces acting on measurement equipment can cause changes to readings, leading to an inaccurate result. For that reason, scientists have been striving to come up with a way to definitively measure G. In this new effort, the research team went back to the same apparatus they constructed 12 years ago—one that simultaneously measures G in two different ways. This time around, however, they reconfigured their device in ways they believed would make it more accurate—and in so doing found a slightly different result, but now aren't sure which of their results is actually more accurate.

Modern researchers use two main types of methods to try to measure G. The first is a more advanced way to do the same thing Cavendish did two centuries ago, using lasers instead of candlelight—it's based on measuring the amount of twist applied to a thin ribbon set between heavy balls. The other involves applying voltage to a wire, using a servo to counteract twisting due to G. In this renewed effort, the researchers ran both types of measurements in their device and averaged the results. In so doing, they discovered the measurements revealed a value of 6.67545(18) × 10⁻¹¹ m³ kg⁻¹ s⁻², with 27 ppm standard uncertainty. This value is 21 ppm lower than the last time they ran the experiment (measurements by others have ranged as far as 241 ppm lower). The team is unable to explain why they found a difference, and cannot say with confidence which of their measurements is likely closer to G's actual value.

Research into ways to better measure G will continue of course, with the hope that one day a method will be devised that will not be subject to other more powerful forces, or interpretation.

What is the Gravitational Constant? Basically, the Gravitational Constant, or G, is the constant in the equation used to calculate the gravitational force between two objects. It would probably be best to begin with the man who discovered gravity, Sir Isaac Newton.

Before his time people just sort of floated around

Well, he didn't actually discover it. People had known since the beginning that there was a force which held them to the ground. Newton was the first to realize that this was the same force that caused the planets to orbit the Sun and the Moon to orbit the Earth. In other words, Newton realized that gravitation was a universal force that caused every object in the universe to attract every other object. He also realized, with some help from the English scientist and architect Robert Hooke, that gravity diminishes in strength in inverse proportion to the square of the distance between the objects. The equation that Isaac Newton developed to express the force of gravitational attraction was:

F = G m₁m₂ / r²

where F is the gravitational force between the two objects, m₁ and m₂ are their masses, and r² is the square of the distance between them. G is, of course, the Gravitational Constant. The metric unit of force is the newton. One newton is the amount of force needed to accelerate one kilogram by one meter per second per second. If the distance is measured in meters and the masses in kilograms, then G is approximately 6.674 × 10⁻¹¹ N·m²/kg².
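Plugging numbers into Newton's equation shows both how the formula works and how feeble gravity is; a quick sketch in Python:

```python
G = 6.674e-11  # gravitational constant, N·m²/kg²

def gravitational_force(m1, m2, r):
    """Newton's law of universal gravitation: F = G·m1·m2 / r²."""
    return G * m1 * m2 / r ** 2

# Two 1 kg masses 1 meter apart attract with only ~6.7e-11 newtons,
# which is why measuring G is so difficult:
print(gravitational_force(1, 1, 1))
# The whole Earth (5.97e24 kg) pulling on a 70 kg person at its surface
# (r = 6.371e6 m) produces about 686 newtons, the person's weight:
print(gravitational_force(5.97e24, 70, 6.371e6))
```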

Newton was not able to measure G. As the article states, the first person to actually determine G was Henry Cavendish. Cavendish was a shy and somewhat eccentric man whose social life consisted almost entirely of attending meetings of the Royal Society. He was so shy around women that he could only communicate with his female servants by writing notes, and he supposedly had a back staircase added to his house to avoid meeting them. Despite these eccentricities, Cavendish was able to make several important discoveries. He discovered the element hydrogen and helped disprove the long-standing phlogiston theory in chemistry. He studied the transfer of heat and helped to develop the science of thermodynamics. He also attempted to determine the average density of the Earth so that he could estimate its mass. It was during the latter experiment that he determined the gravitational constant, though he expressed it in terms of the Earth's density. Cavendish's apparatus was a torsion balance made of a six-foot-long wooden rod on which were suspended two lead weights of around 1.6 pounds each. Two larger lead weights of around 350 pounds were suspended on a separate balance next to the smaller balls. With this device, Cavendish was able to measure the very small gravitational attraction between the weights.

The Cavendish experiment

His result was almost identical to the present measurement of G. The problem is that the force of gravity is so weak that it is very difficult to get a really accurate measurement of the gravitational constant and as a result, scientists have not been able to improve very much on Cavendish’s results. Even with more accurate equipment and more advanced technology, the best measurements often get differing results.

You may wonder how gravity could be considered a very weak force. After all, it is the force that holds us to the ground, keeps the Earth in its orbit around the Sun, and creates black holes. The truth is that gravity is considerably weaker than the other fundamental physical forces: electromagnetism and the strong and weak nuclear forces. Think of it this way: a small magnet can hold a piece of metal against the gravitational force of the entire Earth. Gravity is only about 10⁻³⁶ times as strong as the strong nuclear force, that's 0.000000000000000000000000000000000001 times as strong. By contrast, electromagnetism is only about 100 times weaker than the strong force. Why gravity is so weak is one of the unanswered questions in physics, though it is a good thing for us that it is so weak. If gravity were very much stronger, the universe would have collapsed in on itself shortly after the beginning, and the Big Bang would have been a big fizzle.


Helium Shortage

August 19, 2013

We may be facing a shortage of helium in the not too distant future. This may seem like a trivial problem. We can live without balloons, right? Actually, helium is used in a number of industrial processes, and a shortage, with a corresponding increase in prices, could be serious. As you might expect, government has helped to create the problem. Here is the story in the Washington Post.

Earlier this spring, there was a rare bipartisan flurry of activity around something almost every legislator could agree on: Avoiding a sudden lapse in the national supply of helium.

After years of warnings about rising worldwide demand, Congress remembered that a 1996 law demanded the shutdown of the Federal Helium Reserve–a vast underground lake of gas that stretches from Texas to Kansas–just as soon as it paid off the cost of its creation. That will happen at the end of this fiscal year, October 1. If nothing changes, the rest of the 10 billion cubic feet would have to stay underground, cutting off 40 percent of U.S. consumption, while the cost goes through the roof.

In April, the House made short work of a bill that would keep the program operating. The Senate Committee on Energy and Natural Resources followed suit in June. And then: Nothing. Congress leaves for recess today, and no vote is scheduled; the Senate leadership office didn’t return calls for confirmation on whether the bill would be brought to the floor.

And that’s making the folks who run the helium reserve very nervous.

“We are contingency planning for a shutdown of the Amarillo facility,” said regional Bureau of Land Management spokeswoman Donna Hummel, referring to the program’s 47-person office. “We will be providing notices to employees of Amarillo, private refiners and storage contract holders–companies that store their helium in our reservoir. If we shut this down, you can imagine some consequences there.”

Yeah, no kidding. Helium isn’t just a party gas–it’s also used in a wide range of advanced manufacturing processes, like making computer chips and optical fibers, as well as research and medical procedures, like cooling magnets for MRIs and visualizing lung tissue. That’s why corporations like Intel lined up to push for the helium reserve’s continued operation, along with private refiners that use pieces of the federal infrastructure. Then there are all the government users–scores of universities and military agencies that get a special rate on helium for things like rocket systems and chemical warfare testing. Most of us owe some piece of our daily lives to helium, without even realizing it.

That wasn’t the case in the mid-1990s, when Congress passed the Helium Privatization Act, giving the Bureau of Land Management a date certain for when it would have to get out of the business. Technically, that isn’t until 2015, but the reserve ended up selling off enough helium to pay back the $1.3 billion loan at a faster-than-anticipated clip.

“Our good work is being punished,” sighs Hummel. “We should’ve dragged our feet a little bit, because we really had two years.”

The Senate bill does solve the problem at least in the short term, allowing the Amarillo office to live off its own revenue selling helium until it gets down to 3 billion cubic feet, which will be retained for federal use. After that, Amarillo will be reduced to a skeleton staff (an earlier reduction in force got rid of most of the younger employees, so most are nearing retirement anyway). And then, the feds will manage helium extraction on government-owned land just like any other natural resource, like natural gas (of which helium is actually a byproduct).

My understanding is that by forcing the sale of so much helium, Congress has helped to push the price below market levels, encouraging increased sales and waste. No matter what happens with the helium reserve, the price will almost certainly increase.

You might wonder why there could possibly be a shortage of helium, since it is the second most common element in the universe. Helium is common throughout the universe, but not here on Earth. Hydrogen and helium have the lightest atoms, and the Earth's gravity is not strong enough to hold them, so there has been a steady leakage of these elements from the Earth's atmosphere. Hydrogen is very reactive and its atoms combine readily with other atoms to form compounds, so most of our hydrogen is still on Earth, in water, rocks, etc. Helium, on the other hand, is the most noble of the noble gases. Helium atoms do not combine with any other atoms, so whatever helium was present at the Earth's creation is mostly long gone. Most of the helium present today is the result of the alpha decay of radioactive elements like uranium, and it is recovered as a byproduct of natural gas production.

Maybe I should start hoarding helium balloons and canisters. There is no telling how valuable each balloon will be twenty years from now.

Helium canister

It could make my fortune. (Photo credit: Get Folksy)

 

Seeing Atomic Bonds

June 2, 2013

That last post left me a little depressed, so I think I will go ahead and write about something fascinating to cheer myself up. I found this story in Wired, courtesy of Instapundit, which I think is absolutely amazing.

For the first time, scientists have visually captured a molecule at single-atom resolution in the act of rearranging its bonds. The images look startlingly similar to the stick diagrams in chemistry textbooks.

Until now, scientists were only able to infer molecular structures. Using atomic force microscopy, the individual atomic bonds — each a few ten-millionths of a millimeter long – that connect the carbon molecule’s 26 carbon and 14 hydrogen atoms are clearly visible. The results are reported online May 30 in Science.

The team initially set out to precisely assemble nanostructures made from graphene, a single-layer material in which carbon atoms are arranged in repeating, hexagonal patterns. Building the carbon honeycombs required rearranging atoms from a linear chain into the six-sided shapes; the reaction can produce several different molecules. UC Berkeley chemist Felix Fischer and his colleagues wanted to visualize the molecules to make sure they’d done it right.

To document the graphene recipe, Fischer needed a powerful imaging device, and he turned to the atomic force microscope housed in physicist Michael Crommie’s UC Berkeley lab. Non-contact atomic force microscopy uses a very fine, sharp point to read the electrical forces produced by molecules; as the tip is moved near a molecule’s surface, it’s deflected by different charges, producing an image of how the atoms and bonds are aligned.

With it, the team managed to visualize not only the carbon atoms but the bonds between them, created by shared electrons. They placed a ringed carbon structure on a silver plate and heated it until the molecule rearranged. Subsequent cooling trapped the reaction products, which as it turned out, contained three unexpected products and one molecule the scientists had predicted.

Here are the pictures that came with the article. I hope they don’t mind if I copy them.

(Images from the Wired article: reactant1, product2, product3, each pairing an atomic force microscope image with its structural diagram.)

The images on the left are the result of the new technique. There is more of a resemblance to the pictures you would find in a chemistry textbook, such as the images on the right, than I would have expected.

What a brave new world that has such marvels in it, and how lucky I am to have lived to see it.

 

A Boy and His Atom

May 4, 2013

This has to be the coolest thing I have seen for a long time. This video is a stop motion film created by moving individual atoms around.

Here is more information about this project.

We’re having a hard time getting our heads around just how astoundingly small the scale is, here. Each frame of the IBM video measures a paltry 45 x 25 nanometers. A single inch measures 25 million nanometers across. Putting that into perspective, one nanometer is a thousandth of a thousandth of the size of a piece of rice. So, it would take about 1,000 frames of the film laid side-by-side to extend across a single human hair. Needless to say, this video is HUGELY magnified.

In light of the achievement, Guinness World Records has certified the 250 frame film as the “Smallest Stop-Motion Film.” The project showcases IBM’s efforts to design advanced data storage solutions based on single atoms.

IBM did it by moving atoms with a scanning tunneling microscope (STM). The computer-controlled device weighs two tons, operates at a temperature of -268 degrees Celsius (to make the atoms hold still), and magnifies surfaces over 100 million times. The microscope allows scientists to control temperature, pressure, and vibrations at extremely exact levels, thus making it possible to move atoms with great precision.

When making the stop-motion film, the researchers used the STM to control a super-sharp electrically charged needle along a copper surface. The needle was positioned a mere one nanometer away from the surface, from where it could physically pull atoms and molecules to an exact location. At such a close distance to the surface, the charge can “jump the gap” — an effect in quantum physics called tunnelling.

Interestingly, the atoms made a unique sound when they were moved, allowing the scientists to know how many positions they actually moved.

As the process moved along, the researchers rendered still images of the individually arranged atoms, creating the remarkable 242 frame movie. It took the IBM team two weeks of 18-hour days to complete.

“This movie is a fun way to share the atomic-scale world,” said IBM’s Andreas Heinrich. “The reason we made this was not to convey a scientific message directly, but to engage with students, to prompt them to ask questions.”

I don’t know if there is any practical use for this technology, but then as Benjamin Franklin said when asked what use was electricity, “What use is a newborn baby?” I think that the fact that the scientists at IBM have learned to move individual atoms around like blocks is absolutely amazing.

