The Most Dangerous Technology Ever Invented – Part One & Two

The Most Dangerous Technology Ever Invented – Part One
Published on October 21, 2021
Written by Arthur Firstenberg

In 1995, the telecommunications industry was preparing to introduce a dangerous new product to the United States: the digital cell phone. Existing cell phones were analog and expensive, owned mostly by the wealthy and used for only a few minutes at a time.

Many were car phones whose antennas were outside the car, not held in one’s hand and not next to one’s brain. Cell phones worked only in or near large cities. The few cell towers that existed were mostly on hilltops, mountaintops, or skyscrapers, not close to where people lived.
The problem for the telecommunications industry in 1995 was liability. Microwave radiation was harmful. Cell phones were going to damage everyone’s brain, make people obese, and give millions of people cancer, heart disease and diabetes. And cell towers were going to damage forests, wipe out insects, and torture and kill birds and wildlife.
This was all known. Extensive research had already been done in the United States, Canada, the Soviet Union, Eastern Europe, and elsewhere. Biologist Allan Frey, under contract with the U.S. Navy, was so alarmed by the results of his animal studies that he refused to experiment on humans. “I have seen too much,” he told colleagues at a symposium in 1969. “I very carefully avoid exposure myself, and I have for quite some time now. I do not feel that I can take people into these fields and expose them and in all honesty indicate to them that they are going into something safe.”
Frey discovered that microwave radiation damages the blood-brain barrier — the protective barrier that keeps bacteria, viruses and toxic chemicals out of your brain and keeps the inside of your head at a constant pressure, preventing you from having a stroke. He discovered that both people and animals can hear microwaves.
He discovered that he could stop a frog’s heart by timing microwave pulses at a precise point in the heart’s rhythm. The power level he used for that experiment was only 0.6 microwatts per square centimeter, thousands of times lower than the radiation from today’s cell phones.
Ophthalmologist Milton Zaret, who had contracts with the U.S. Army, Navy and Air Force, as well as with the Central Intelligence Agency, discovered in the 1960s that low-level microwave radiation causes cataracts. In 1973, he testified before the Commerce Committee of the United States Senate. “There is a clear, present and ever-increasing danger,” he told the senators, “to the entire population of our country from exposure to the entire non-ionizing portion of the electromagnetic spectrum. The dangers cannot be overstated…”
Zaret told the committee about patients who not only had cataracts caused by exposure to microwaves, but also malignant tumors, cardiovascular disease, hormonal imbalance, arthritis and mental illness, as well as neurological problems in children born to them. These patients ranged from military personnel exposed to radar to housewives exposed to their microwave ovens.
“The microwave oven leakage standard set by the Bureau of Radiological Health,” he told the committee, “is approximately 1 billion times higher than the total entire microwave spectrum given off by the Sun. It is appalling for these ovens to be permitted to leak at all, let alone for the oven advertisements to encourage our children to have fun learning to cook with them!”
The microwave oven leakage standard, today in 2021, is the same as it was in 1973: 5 milliwatts per square centimeter at a distance of 5 centimeters. And the microwave exposure levels to the brain from every cell phone in use today are higher than that.
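To see why phone exposure is comparable to that leakage standard, here is a rough free-space estimate. The assumed numbers (a time-averaged radiated power of 0.25 watts, typical of an older 2G handset, and a distance of 2 centimeters from antenna to head) are illustrative only, and real near-field exposure is more complex, so treat this as an order-of-magnitude sketch:

```latex
% Rough power density at the head (assumed values, for illustration only):
%   P = 0.25 W  -- assumed time-averaged handset output
%   r = 2 cm    -- assumed antenna-to-head distance
\[
  S = \frac{P}{4\pi r^{2}}
    = \frac{250\ \text{mW}}{4\pi \times (2\ \text{cm})^{2}}
    \approx \frac{250\ \text{mW}}{50.3\ \text{cm}^{2}}
    \approx 5\ \text{mW/cm}^{2}
\]
```

On that rough estimate, the power density at the head of a transmitting phone is of the same order as the 5 milliwatts per square centimeter that ovens are permitted to leak.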
The Navy, at that time, was exposing soldiers to low-level microwave radiation in research being conducted in Pensacola, Florida. Echoing Frey, Zaret said these experiments were unethical. “I don’t believe it is possible,” he told the Senate committee, “to get informed, untainted consent from any young adult who agrees to be exposed to irradiation where you are not sure of what the end result is going to be… Also, that any children that he has at some future time may suffer from this irradiation.” He reemphasized the ethical problems with this research: “I think if it was explained fully to them and they still volunteered for this project, one would question their mental capacity right off the start.”
Scientists experimenting on birds were just as alarmed by their results. Their warnings about the environmental effects of the radiation our society was unleashing on the world were just as dire as those delivered to Congress by Milton Zaret and to the Navy by Allan Frey.
In the late 1960s and continuing through the 1970s, John Tanner and his colleagues at Canada’s National Research Council exposed chickens, pigeons and seagulls to microwave radiation, and found frightening effects at every level of exposure.
Chickens exposed to between 0.19 and 360 microwatts per square centimeter for nine months developed tumors of the central nervous system, and avian leukosis (also a type of tumor) of ovaries, intestines and other organs, which in some birds reached “massive proportions,” on “a scale never seen before by veterinarians experienced with avian diseases.” Mortality was high in the irradiated birds. All the exposed birds, at every power level, had deteriorated plumage, with feathers lost, broken, or with twisted and brittle shafts.
In other experiments, in which these researchers irradiated birds at higher power, the birds collapsed in pain within seconds. This occurred not only when the whole bird was irradiated but also when only its tail feathers were irradiated and the rest of the bird was carefully shielded. In further experiments, they proved that bird feathers make fine receiving aerials for microwaves, and speculated that migratory birds may use their feathers to obtain directional information.
These scientists warned that increasing levels of ambient microwaves would cause wild birds distress and might interfere with their navigation.
Maria Sadchikova, working in Moscow; Václav Bartoníček and Eliška Klimková-Deutschová, working in Czechoslovakia; and Valentina Nikitina, who examined officers of the Russian Navy, found, as early as 1960, that the majority of people exposed to microwave radiation on the job — even people who had ceased such employment five to ten years previously — had elevated blood sugar or had sugar in their urine.
Animal experiments showed that the radiation directly interferes with metabolism, and that it does so rapidly. In 1962, V.A. Syngayevskaya, in Leningrad, exposed rabbits to low-level radio waves and found that the animals’ blood sugar rose by one-third in less than an hour. In 1982, Vasily Belokrinitskiy, in Kiev, reported that the amount of sugar in the urine was in direct proportion to the dose of radiation and the number of times the animal was exposed.
Mikhail Navakatikian and Lyudmila Tomashevskaya reported in 1994 that insulin levels decreased by 15 percent in rats exposed for just half an hour, and by 50 percent in rats exposed for twelve hours, to pulsed radiation at a power level of 100 microwatts per square centimeter. This level is comparable to the radiation a person receives today sitting directly in front of a wireless computer, and considerably less than what a person’s brain receives from a cell phone.
These were just a few of the thousands of studies being performed all over the world at that time that found profound effects of microwave radiation on every human organ, and on the functioning and reproduction of every plant and animal. Lieutenant Zory Glaser, commissioned by the U.S. Navy in 1971 to catalogue the world’s literature on the health effects of microwave and radio-frequency radiation, had collected 5,083 studies, textbooks and conference proceedings by 1981. He managed to find about half of the literature existing at that time, meaning that some 10,000 studies had already found microwave and RF radiation to be dangerous to all life before 1981.

Cooking Your DNA and Roasting Your Nerves

In the early 1980s Mays Swicord, working at the National Center for Devices and Radiological Health at the Food and Drug Administration, decided to test his conjecture that DNA resonantly absorbs microwave radiation, and that even a very low level of radiation, although producing no measurable heat in the human body as a whole, may nevertheless heat your DNA.
He exposed a solution containing a small amount of DNA to microwave radiation, and found that the DNA itself was absorbing 400 times as much radiation as the solution that it was in, and that different lengths of DNA strands resonantly absorb different frequencies of microwave radiation.
So even though the overall temperature of your cells may not be raised to any detectable degree by the radiation, the DNA inside your cells may be heated tremendously. Swicord’s later research confirmed that this damages DNA, causing both single- and double-strand DNA breakage.
Professor Charles Polk of the University of Rhode Island reported essentially the same thing at the twenty-second annual meeting of the Bioelectromagnetics Society in June 2000 in Munich, Germany. Direct measurements had recently shown that DNA is much more electrically conductive than anyone had suspected: it has a conductivity of at least 10⁵ siemens per meter, about one-tenth the conductivity of mercury!
A cell phone held to your head may irradiate your brain at a specific absorption rate (SAR) of about 1 watt per kilogram, which produces little overall heating. Polk calculated, however, that this level of radiation would raise the temperature in the interior of your DNA by 60 degrees Celsius per second! He said that the tissues cannot dissipate heat that rapidly, and that such heating would rupture the bonds between complementary strands of DNA, and would explain the DNA breakage reported in various studies.
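Polk’s figure can be put in context with the basic relation between absorbed power and heating rate. The following back-of-envelope check assumes tissue has roughly the specific heat of water, about 4,200 joules per kilogram per degree Celsius:

```latex
% Heating rate from absorbed power (assumption: c = 4200 J/(kg·°C), water):
\[
  \frac{dT}{dt} = \frac{\mathrm{SAR}}{c}
                = \frac{1\ \text{W/kg}}{4200\ \text{J/(kg}\cdot{}^{\circ}\text{C)}}
                \approx 2.4 \times 10^{-4}\ {}^{\circ}\text{C/s}
\]
```

A whole-tissue SAR of 1 watt per kilogram thus warms bulk tissue by only about 0.00024 degrees Celsius per second. For the interior of the DNA to heat at 60 degrees per second, as Polk calculated, the absorbed power would have to be concentrated there roughly 250,000-fold relative to the bulk average, which is the kind of resonant concentration Swicord’s measurements pointed toward.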
And in 2006, Markus Antonietti, at Germany’s Max Planck Institute, wondered whether a similar type of resonant absorption occurs in the synapses of our nerves. Cell phones are designed so the radiation they emit will not heat your brain more than one degree Celsius. But what happens in the tiny environment of a synapse, where electrically charged ions are involved in transmitting nerve impulses from one neuron to another?
Antonietti and his colleagues simulated the conditions in nerve synapses with tiny fat droplets in salt water and exposed the emulsions to microwave radiation at frequencies between 10 MHz and 4 GHz. The resonant absorption frequencies, as expected, depended on the size of the droplets and other properties of the solution. But it was the size of the absorption peaks that shocked Antonietti.
“And now comes the tragedy,” said Antonietti. “Exactly where we are closest to the conditions in the brain, we see the strongest heating. There is a hundred times as much energy absorbed as previously thought. This is a horror.”

Efforts by the EPA to Protect Americans

Faced with a barrage of alarming scientific results, the U.S. Environmental Protection Agency (EPA) established its own microwave radiation research laboratory which operated from 1971 until 1985 with up to 30 full-time staff exposing dogs, monkeys, rats and other animals to microwaves.
The EPA was so disturbed by the results of its experiments that it proposed, as early as 1978, to develop guidelines for human exposure to microwave radiation for adoption and enforcement by other federal agencies whose activities were contributing to a rapidly thickening fog of electromagnetic pollution throughout our nation. But those agencies pushed back.
The Food and Drug Administration did not want the proposed exposure limits to apply to microwave ovens or computer screens. The Federal Aviation Administration did not want to have to protect the public from air traffic control and weather radars. The Department of Defense did not want the limits to apply to military radars. The CIA, NASA, Department of Energy, Coast Guard, and Voice of America did not want to have to limit public exposure to their own sources of radiation.
Finally, in June 1995, with the telecommunications industry planning to put microwave radiation devices into the hands and next to the brains of every man, woman and child, and to erect millions of cell towers and antennas in cities, towns, villages, forests, wildlife preserves and national parks throughout the country in order to make those devices work, the EPA announced that it was going to issue Phase I of its exposure guidelines in early 1996.
The Federal Communications Commission would have been required to enforce those guidelines, cell phones and cell towers would have been illegal, and even if they were not illegal, telecommunications companies would have been exposed to unlimited liability for all the suffering, disease and mortality they were about to cause.
But it was not to be. The Electromagnetic Energy Association, an industry lobbying group, succeeded in preventing the EPA’s exposure guidelines from being published. On September 13, 1995, the Senate Committee on Appropriations stripped the $350,000 that had been budgeted for EPA’s work on its exposure guidelines and wrote in its report, “The Committee believes EPA should not engage in EMF activities.”
The Personal Communications Industry Association (PCIA), another industry group, also lobbied Congress, which was drafting a bill called the Telecommunications Act, and a provision was added to the Act prohibiting states and local governments from regulating “personal wireless service facilities” on the basis of their “environmental effects.”
That provision shielded the telecommunications industry from any and all liability for injury from both cell towers and cell phones and permitted that industry to sell the most dangerous technology ever invented to the American public. People were no longer allowed to tell their elected officials about their injuries at public hearings.
Scientists were no longer allowed to testify in court about the dangers of this technology. Every means for the public to find out that wireless technology was killing them was suddenly prohibited.
The telecommunications industry has done such a good job selling this technology that today the average American household contains 25 different devices that emit microwave radiation and the average American spends five hours per day on their cell phone, has it in their pocket next to their body the rest of the day, and sleeps with it all night in or next to their bed.
Today almost every man, woman and child holds a microwave radiation device in their hand or against their brain or body all day, every day, completely unaware of what they are doing to themselves, their family, their pets, their friends, their neighbors, the birds in their yard, their ecosystem, and their planet. Even those who are aware there is a problem at all view only the towers as a threat, and their phone as a friend.
Header image: Mobile Phone History

The Most Dangerous Technology Ever Invented – Part Two
Published on October 29, 2021
Written by Arthur Firstenberg

The selling of cell phones is, and always has been, based on lies and deception. The biggest lie is that they are “low power” devices and that this makes them safe.

That is a double lie. It is a lie because they are not low power. If you put a cell phone — any cell phone — in your hand or next to your body, you are being blasted by more microwave radiation from your phone than you are getting from any cell tower, and by ten billion times as much microwave radiation as you are getting from the sun, the Milky Way, or any other natural sources.
The exposure guidelines established by the Federal Communications Commission reflect this reality: cell towers are permitted to expose your body at a specific absorption rate of 0.08 watts per kilogram, averaged over the whole body, while cell phones are allowed to expose your brain at a localized specific absorption rate of 1.6 watts per kilogram, which is twenty times higher.
And it is a lie because low power devices are not any safer than high power devices. The reason for this is that electromagnetic fields are not toxins in the ordinary sense, and the rule in toxicology that a lower dose is a safer dose does not apply to microwave radiation. As Allan Frey wrote in 1990:

“Electromagnetic fields are not a foreign substance to living beings like lead or cyanide. With foreign substances, the greater the dose, the greater the effect — a dose-response relationship. Rather, living beings are electrochemical systems that use low frequency EMFs in everything from protein folding through cellular communication to nervous system function. To model how EMFs affect living beings, one might compare them to the radio we use to listen to music…
If you impose on the radio an appropriately tuned EMF or harmonic, even if it is very weak, it will interfere with the music. Similarly, if we impose a very weak EMF signal on a living being, it has the possibility of interfering with normal function if it is properly tuned. That is the model that much biological data and theory tell us to use, not a toxicological model.”

The most thorough investigation of the blood-brain barrier effect, which Frey discovered in 1975, was done at Lund University in Sweden, beginning in the late 1980s with various sources of microwave radiation and later, in the 1990s and 2000s, with actual cell phones. The Lund researchers found not only that there is no dose response, but that there is an inverse dose response for this type of injury.
They exposed laboratory rats to what is now called 2G cell phone radiation, and then they reduced the power level of the radiation ten-fold, a hundred-fold, a thousand-fold, and ten thousand-fold. And they found, to their surprise, that the greatest damage to the blood-brain barrier occurred not in the rats that were exposed at full power, but in the rats that were exposed to phones whose radiation was reduced by a factor of ten thousand!
This was the equivalent of holding a cell phone more than one meter away from your body. The leader of the research team, neurosurgeon Leif Salford, warned that non-users of cell phones were being damaged by their neighbors’ cell phones, and that this technology was “the world’s largest biological experiment ever.”
And in a further set of experiments, published in 2003, Salford’s team exposed young rats to what is now called a 2G cell phone, just once for two hours, either at full power or at two different levels of reduced power, and sacrificed them 50 days later to examine their brains. They found that a single exposure to an ordinary cell phone, operating at normal power, had permanently destroyed up to 2 percent of the brain cells in almost all of the exposed rats.
Damaged neurons dominated the picture in some areas of their brains. When the power of the phone was reduced ten-fold it caused brain damage in every rat. When the power of the phone was reduced one hundred-fold, this type of permanent brain damage was observed in half of the exposed animals.
And in still further experiments, published in 2008, they exposed rats to a cell phone for two hours once a week for a year, still using what is now called a 2G cell phone. The exposed rats suffered from impaired memory, regardless of whether they were exposed at an SAR level of 60 milliwatts per kilogram or 0.6 milliwatts per kilogram. In other words, reducing the power level by a factor of one hundred did not make the cell phone less dangerous.
The lack of a dose response has been reported over and over. Physicist Carl Blackman spent much of his career at the Environmental Protection Agency figuring out why not only particular frequencies but also particular power levels of RF radiation cause calcium to flow out of brain cells. Ross Adey at UCLA, Jean-Louis Schwartz at the National Research Council of Canada, and Jitendra Behari at Jawaharlal Nehru University in India reported the same thing.
Geneticist Sisir Dutta, studying the same phenomenon at Howard University in 1986, found peaks of calcium flow at SAR levels of 2 W/kg and 1 W/kg, and also at 0.05, 0.0028, 0.001, 0.0007, and 0.0005 W/kg, with some effect all the way down to 0.0001 W/kg. The effect at 0.0007 W/kg was quadruple the effect at 2.0 W/kg; in other words, a roughly 3,000-fold reduction in power level resulted in a 4-fold increase in calcium disturbance. The frequency was 915 MHz, the same frequency that was later used for cell phones.
Maria Sadchikova and her Soviet colleagues, in the 1960s and 1970s, examined hundreds of workers exposed to microwave radiation on the job, and consistently found that the sickest workers were the ones who were exposed to the lowest, not the highest power levels.
Igor Belyaev, at Stockholm University, found that genetic effects occurred at specific frequencies and that the magnitude of the effect did not change with power level over 16 orders of magnitude, all the way down to 0.000000000000000001 (10⁻¹⁸) watts per square centimeter, a level one quadrillion times lower than what a cell phone delivers to one’s brain.
Dimitris Panagopoulos, at the University of Athens, found that fruit flies exposed to a cell phone for just one minute a day for five days produced 36 percent fewer offspring than flies that were not exposed at all. When he exposed them to the phone for six minutes a day for five days, it reduced the number of their offspring by 50 to 60 percent.
And the maximum effect occurred when the cell phone was about one foot away from the flies, not when it was touching the vial that the flies were in. In further research, he showed that the effect is due to DNA damage and consequent cell death caused by the radiation.
In another experiment, Panagopoulos’s colleague, Lukas Margaritis, exposed fruit flies to various frequencies of RF radiation at exposure levels ranging from 0.0001 watts per kilogram to 0.04 watts per kilogram, and found that even a single exposure to any of these frequencies at any of these power levels for just 6 minutes caused a significant amount of ovarian cell death.
And in further research, Margaritis’s team exposed fruit flies to a cell phone either once for 6 minutes, once for 12 minutes, 6 minutes a day for 3 days, or 12 minutes a day for 3 days. Under each condition, the phone increased ovarian cell death three- to six-fold.
And then this team tried other sources of microwave radiation for between 10 and 30 minutes per day for up to 9 days, and found that each of them reduced the number of offspring by between 11 and 32 percent. The cell phone and the cordless phone had the greatest effect, but WiFi, the baby monitor, Bluetooth, and the microwave oven also substantially reduced the fecundity of the flies.
The effects on insects are so obvious that even a high school student can easily demonstrate them. In 2004, Alexander Chan, a sophomore at Benjamin Cardozo High School in Queens, New York, exposed fruit fly larvae daily to a loudspeaker, a computer monitor, and a cell phone for a science fair project and observed their development. The flies that were exposed to the cell phone failed to develop wings.

What Are We Doing to Nature?

We are distressing and disorienting not only birds, but also, as is being discovered, insects. It appears that all little creatures that have antennae use them to send and receive communications electronically — communications that are being interfered with and drowned out by the much more powerful communications of our wireless devices.
When honey bees perform their waggle dance to inform one another of the location of food sources, it is not only a visual dance but an electromagnetic one. During the dance they generate electromagnetic signals with a modulation frequency between 180 and 250 Hz. And they send another kind of signal, which has been called the “stop” signal, up to 100 milliseconds long, at a frequency of 320 Hz.
The stop signal is used when the colony already has too much food, and it causes the dancers to stop dancing and leave the dance floor. Uwe Greggers, at Freie Universität Berlin, discovered that bees will start walking and actively moving their antennae in response to artificially generated electromagnetic fields that imitate these natural signals, even in the absence of any visual or auditory cues. Bees whose antennae he had removed or coated with wax did not respond to these signals.
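The signal parameters reported in this work are concrete enough to sketch in code. The following is a minimal, illustrative synthesis of the two signals described above: a 100-millisecond burst at 320 Hz (the stop signal) and a tone at 230 Hz, a value inside the reported 180 to 250 Hz waggle range. The sample rate, the function name, and the choice of 230 Hz are assumptions for illustration; this is not Greggers’ actual apparatus.

```python
import numpy as np

FS = 44_100  # sample rate in samples/s (assumed; anything well above 2 x 320 Hz works)

def tone_burst(freq_hz: float, duration_s: float, fs: int = FS) -> np.ndarray:
    """Return a sine burst at freq_hz lasting duration_s seconds."""
    t = np.arange(int(fs * duration_s)) / fs
    return np.sin(2 * np.pi * freq_hz * t)

# "Stop" signal: up to 100 ms long, at about 320 Hz.
stop_signal = tone_burst(320.0, 0.100)

# Waggle-dance component: modulation reported between 180 and 250 Hz;
# 230 Hz is an arbitrary illustrative value inside that range.
waggle_component = tone_burst(230.0, 1.0)

print(f"stop signal: {stop_signal.size} samples,"
      f" waggle component: {waggle_component.size} samples")
```

Driving a pair of electrodes (or a small speaker) with bursts like these is, in outline, how artificial versions of the bees’ own signals can be presented to walking bees.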
Pollination is also dependent on electromagnetic communication — between bees and flowers. Bees carry a positive charge on their bodies from flying in the global atmospheric electric field, while flowers, being connected to the earth, carry a negative charge. Dominic Clarke, at the University of Bristol, has shown not only that this facilitates pollen transfer from flowers to bees, but also that bees sense and are attracted to the distinct patterns of flowers’ electric fields, just as they are to their colors.
The electric field of a flower diminishes immediately after being visited by a bee, and other bees “see” this and only visit flowers whose electric field is robust. While honey bees see the fields with their antennae, bumble bees see the fields more with the hairs that cover their bodies, which not only make them such distinctive creatures but also function as a kind of antenna.
In 2007, German biologist Ulrich Warnke published an important booklet in both English and German titled Bees, Birds and Mankind: Destroying Nature by “Elektrosmog” (Bienen, Vögel und Menschen: Die Zerstörung der Natur durch ‚Elektrosmog’). In it, he reminded us that there are only two long-range forces — gravity and electromagnetism — that shape everything in the universe including our bodies, and that we ignore that fact at our peril.
Electricity is the foundation of life, he warned, and “this destruction of the foundation of life has already wiped out many species forever.” We cannot immerse our world, he said, in a sea of electromagnetic radiation that is up to 10,000,000,000 times as strong as the natural radiation that we evolved with without destroying all of life. He summarized the research that he and others had done with honey bees. It is no wonder, wrote Warnke, that bees are disappearing all over the world.
They began disappearing at the dawn of the radio age. On the small island lying off England’s southern coast where Guglielmo Marconi sent the world’s first long-distance radio transmission in 1901, the honey bees began to vanish. By 1906, the island, then host to the greatest density of radio transmissions in the world, was almost empty of bees. Thousands, unable to fly, were found crawling and dying on the ground outside their hives.
Healthy bees imported from the mainland began dying within a week of arrival. In the following decades, Isle of Wight disease spread along with radio broadcasting to the rest of Great Britain, and to Italy, France, Switzerland, Germany, Brazil, Australia, Canada, South Africa, and the United States. In the 1960s and 1970s its name changed to “disappearing disease.”
It became urgent in the late 1990s with the wireless revolution, and became a worldwide emergency by 2006, when it was renamed “colony collapse disorder.” Today not only domestic bees, but all wild bees, are in danger of extinction.
Amphibians are not only disappearing, but large numbers of amphibian species have already gone extinct, even in the most remote, pristine areas of the world — pristine, that is, except for communication towers and radar stations emitting microwave radiation. Amphibians are the most vulnerable of all classes of animals on the planet to electromagnetic radiation, and they have been dwindling and going extinct since the 1980s.
When I looked into this in 1996, every species of frog and toad in Yosemite National Park was disappearing. In the Monteverde Cloud Forest Preserve of Costa Rica, the famous and highly protected golden toad had gone extinct. Eight of thirteen frog species in a Brazilian rainforest preserve had gone extinct. The famous gastric-brooding frog of Australia was extinct.
Seventy-five species of the colorful harlequin frogs that once graced streams in the tropics of the Western Hemisphere were extinct. Today, more than half of all known kinds of frogs, salamanders and caecilians (snake-like amphibians), amounting to 4,300 species, are either extinct or in danger of extinction.
In 1996, when cell towers marched into remote areas of the United States, mutant frogs began turning up by the thousands in lakes, streams and forests all across the American Midwest. Their deformed legs, extra legs, missing eyes, misplaced eyes, and other genetic mistakes were frightening school children out on field trips.
In 2009, wildlife biologist Alfonso Balmori did a simple, obvious experiment on the balcony of an apartment in Valladolid, Spain, not far from a cell tower, an experiment that proved what was happening. He raised tadpoles in two identical tanks, except that over one of them he draped a thin layer of fabric woven with metallic fibers, which admitted air and light but kept out radio waves. The results shocked even Balmori: over a period of two months, 90 percent of the tadpoles in the unshielded tank died, versus only 4 percent in the shielded tank.
Similar shielding experiments have confirmed, in spades, what is happening to birds, and what is happening to our forests.
Scientists at the University of Oldenburg in Germany were shocked to find, beginning in 2004, that the migratory songbirds they had been studying were no longer able to orient themselves toward the north in spring and toward the southwest in autumn.
Suspecting that electromagnetic pollution might be responsible, they did for their birds what Balmori did for his tadpoles a few years later: they shielded the aviary from radio waves during the winter with aluminum sheeting. “The effect on the birds’ orientation capabilities was profound,” wrote the scientists. The birds all oriented toward the north the following spring.
And in 2007, in a backyard laboratory in the foothills of Colorado’s Rocky Mountains, Katie Haggerty decided to do the same experiment with aspen seedlings. She wanted to find out if radio waves were responsible for the decline of aspen trees all over Colorado that had begun in 2004. She grew 27 aspen trees — nine without any screening, nine with aluminum window screening around their pots which kept out radio waves, and nine with fiberglass screening which kept out just as much light but let in all the radio waves.
After two months, the new shoots of the radio-shielded aspens were 74 percent longer, and their leaves 60 percent larger, than those of either the mock-shielded or the unshielded aspens. And in the fall, the shielded trees had large, healthy leaves in the brilliant fall colors that aspens are famous for: bright orange, yellow, green, dark red, and black. The mock-shielded and unshielded trees had small leaves in drab yellow and green, covered with gray and brown areas of decay.
The only thing that had changed in Colorado’s Rocky Mountains in 2004 was the installation of a new emergency communication system called the Digital Trunked Radio System composed of 203 radio towers whose transmissions covered every square inch of the state.
Header image: The Telegraph

https://principia-scientific.com/the-most-dangerous-technology-ever-invented-part-two/?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+psintl+%28Principia+Scientific+Intl+-+Latest+News%29
Thanks to: https://principia-scientific.com
