14-Million-Year-Old Vehicle Tracks

According to a Russian geologist, these traces were left by vehicles that belonged to an advanced ancient civilization that inhabited our planet 14 million years ago.

We all know that numerous religious texts speak of giants that roamed the Earth in the distant past.

Experts in different fields disagree about this possibility, but some believe that ancient giants did exist and that many traces of their existence can still be found today.

Geologist Alexander Koltypin believes that the mysterious markings that extend along the Phrygian Valley, in central Turkey, were made by an intelligent race between 12 and 14 million years ago.

“We can assume that ancient vehicles with ‘wheels’ were driven into the soft ground, perhaps a wet surface,” said the geologist.

“Because of the great weight of these vehicles, they left behind very deep grooves which eventually petrified and turned into evidence.”

Geologists are familiar with such phenomena as they have found petrified footprints of dinosaurs that were preserved in the same way.

Together with three colleagues, Dr Koltypin, director of the Natural Science Scientific Research Centre at Moscow’s International Independent Ecological-Political University, traveled to the site in Anatolia, Turkey, where these markings can be found.

Upon returning from his trip, he described what he had observed as ‘petrified tracking ruts in rocky tuffaceous [made from compacted volcanic ash] deposits’.

Dr Koltypin said: ‘All these rocky fields were covered with the ruts left some millions of years ago… we are not talking about human beings.’

‘We are dealing with some kind of cars or all-terrain vehicles. The pairs of ruts are crossing each other from time to time and some ruts are more deep than the others,’ he added.

According to Dr Koltypin, these mysterious ruts were left behind by vehicles between 12 and 14 million years ago.

‘The methodology of specifying the age of volcanic rocks is very well-studied and worked out,’ he said.

Dr Koltypin is one of the few experts who believe that science needs to change its approach to such matters. He believes that many archaeologists avoid the subject because it would undermine the classical theories.

‘As a geologist, I can certainly tell you that unknown antediluvian [pre-Biblical] all-terrain vehicles drove around Central Turkey some 12-to-14 million years ago,’ said Dr Koltypin.

He said: ‘I think we are seeing the signs of the civilisation which existed before the classic creation of this world. Maybe the creatures of that pre-civilisation were not like modern human beings.’

According to Dr Koltypin and other archaeologists and experts who have adopted new ways of thinking, these ancient “car tracks” are among the best-preserved pieces of evidence proving the existence of highly advanced ancient civilizations that inhabited our planet in the distant past.

Many researchers believe that there are several pieces of evidence pointing towards the existence of highly advanced ancient civilizations that existed on Earth millions of years ago.

‘There was no comprehensible system for the tracks, but the distance between each pair of tracks is always the same,’ added Dr Koltypin.

‘The maximum depth of a rut is about three feet (one metre). On the sides of the ruts there can be seen horizontal scratches; it looks like they were left by the ends of the axles used for ancient wheels.

‘We found many ruts with such scratches,’ he said.

Is it possible that Dr Koltypin is right? And is it possible that mainstream scientists have ignored these giant pieces of evidence in order to preserve their classical ways of thinking?

Is it possible that mainstream experts are afraid of adopting a new approach to ancient history?

There are many who believe that with a classic approach, science is becoming less objective.

 

Source:  humansarefree.com

Scientists grow 5-week-old human brain

Growing brain tissue in a dish has been done before, but bold new research announced this week shows that scientists’ ability to create human brains in laboratory settings has come a long way quickly.

Researchers at the Ohio State University in the US claim to have developed the most complete laboratory-grown human brain ever, creating a model with the brain maturity of a 5-week-old foetus. The brain, which is approximately the size of a pencil eraser, contains 99 percent of the genes that would be present in a natural human foetal brain.

“It not only looks like the developing brain, its diverse cell types express nearly all genes like a brain,” Rene Anand, professor of biological chemistry and pharmacology at Ohio State and lead researcher on the brain model, said in a statement.

“We’ve struggled for a long time trying to solve complex brain disease problems that cause tremendous pain and suffering. The power of this brain model bodes very well for human health because it gives us better and more relevant options to test and develop therapeutics other than rodents.”

Anand turned to stem cell engineering four years ago after his specialized field of research – examining the relationship between nicotinic receptors and central nervous system disorders – ran into complications using rodent specimens. Despite having limited funds, Anand and his colleagues succeeded with their proprietary technique, which they are in the process of commercializing.

The brain they have developed is a virtually complete recreation of a human foetal brain, missing only a vascular system – in other words, the blood vessels. But everything else (spinal cord, major brain regions, multiple cell types, signalling circuitry) is there. What’s more, it’s functioning, with high-resolution imaging of the brain model showing active neurons and brain cells.

The researchers say that it takes 15 weeks to grow a lab-developed brain to the equivalent of a 5-week-old foetal human brain, and the longer the maturation process, the more complete the organoid will become.

“If we let it go to 16 or 20 weeks, that might complete it, filling in that 1 percent of missing genes. We don’t know yet,” said Anand.

The scientific benefit of growing human brains in laboratory settings is that it enables high-end research into human diseases that cannot be completed using rodents.

“In central nervous system diseases, this will enable studies of either underlying genetic susceptibility or purely environmental influences, or a combination,” said Anand. “Genomic science infers there are up to 600 genes that give rise to autism, but we are stuck there. Mathematical correlations and statistical methods are insufficient in themselves to identify causation. You need an experimental system – you need a human brain.”

The research was presented this week at the Military Health System Research Symposium.

 

Source:  sciencealert.com

Summing up Fukushima

About 60 people died during the actual evacuations in Fukushima Prefecture in March 2011. Between 2011 and 2015, an additional 1,867 people in Fukushima Prefecture died as a result of the evacuations that followed the nuclear disaster. These deaths were from ill health and suicides.

From the UNSCEAR estimate of 48,000 person-Sv, it can be reliably estimated (using a fatal cancer risk factor of 10% per Sv) that about 5,000 fatal cancers will occur in Japan in future from Fukushima’s fallout. This estimate from official data agrees with my own personal estimate using a different methodology.
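
For clarity, the arithmetic behind this estimate is simply the collective dose multiplied by the quoted risk factor (a restatement of the figures above, not an independent calculation):

\[
48{,}000~\text{person-Sv} \times 0.10~\text{fatal cancers per person-Sv} = 4{,}800 \approx 5{,}000~\text{fatal cancers}
\]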

In sum, the health toll from the Fukushima nuclear disaster is horrendous. At a minimum:

  • Over 160,000 people were evacuated, most of them permanently.
  • Many cases of post-traumatic stress disorder (PTSD), depression, and anxiety disorders arising from the evacuations.
  • About 12,000 workers exposed to high levels of radiation, some up to 250 mSv.
  • An estimated 5,000 fatal cancers from radiation exposures in future.
  • Plus similar (unquantified) numbers of radiogenic strokes, cardiovascular diseases and hereditary diseases.
  • Between 2011 and 2015, about 2,000 deaths from radiation-related evacuations due to ill health and suicides.
  • An as yet unquantified number of thyroid cancers.
  • An increased infant mortality rate in 2012 and a decreased number of live births in December 2011.

Non-health effects include:

  • 8% of Japan (30,000 sq.km), including parts of Tokyo, contaminated by radioactivity.
  • Economic losses estimated between $300 and $500 billion.

New evidence from Fukushima shows that as many as 2,000 people have died from the necessary evacuations, writes Ian Fairlie, while another 5,000 will die from cancer. Future assessments of fatalities from nuclear disasters must include deaths from displacement-induced ill health and suicide in addition to those from direct radiation impacts.

“The Fukushima accident is still not over and its ill-effects will linger for a long time into the future … 2,000 Japanese people have already died from the evacuations and another 5,000 are expected to die from future cancers.”

Official data from Fukushima show that nearly 2,000 people died from the effects of evacuations necessary to avoid high radiation exposures from the disaster.

The uprooting to unfamiliar areas, cutting of family ties, loss of social support networks, disruption, exhaustion, poor physical conditions and disorientation can and do result in many people, in particular older people, dying.

Increased suicide has occurred among younger and older people following the Fukushima evacuations, but the trends are unclear.

A Japanese Cabinet Office report stated that, between March 2011 and July 2014, 56 suicides in Fukushima Prefecture were linked to the nuclear accident. This should be taken as a minimum, rather than a maximum, figure.

 

Source:  globalresearch.ca

Only 1% of 84,000 Chemicals Have Been Tested

There are around 84,000 chemicals on the market, and we come into contact with many of them every single day. And if that isn’t enough to cause concern, the shocking fact is that only about 1 percent of them have been studied for safety.

In 2010, at a hearing of the Senate Subcommittee on Superfund, Toxics and Environmental Health, Lisa Jackson, then the administrator of the EPA, put our current, hyper-toxic era into sharp perspective: “A child born in America today will grow up exposed to more chemicals than any other generation in our history.”

Just consider your morning routine: If you’re an average male, you use up to nine personal care products every single day: shampoo, toothpaste, soap, deodorant, hair conditioner, lip balm, sunscreen, body lotion and shaving products — amounting to about 85 different chemicals. Many of the ingredients in these products are harmless, but some are carcinogens, neurotoxins and endocrine disruptors.

Women are particularly at risk because they generally use more personal care products than men: 25 percent of women apply 15 or more products daily, including makeup and anti-aging creams, amounting to an average of 168 chemicals. For a pregnant woman, the risk is multiplied as she can pass on those toxins to her unborn child: 300 contaminants have been detected in the umbilical cord blood of newborns.

Many people don’t think twice about the chemicals they put on their bodies, perhaps thinking that the government regulates the personal care products that flood the marketplace. In reality, the government plays a very small role, in part because it doesn’t have the legal mandate to protect the public from harmful substances that chemical companies and manufacturers sell in their products. Federal rules designed to ensure product safety haven’t been updated in more than 75 years. New untested chemicals appear on store shelves all the time.

“Under federal law, cosmetics companies don’t have to disclose chemicals or gain approval for the 2,000 products that go on the market every year,” notes environment writer Jane Kay in Scientific American. “And removing a cosmetic from sale takes a battle in federal court.”

It’s high time these rules were revisited. Not only have thousands of new chemicals entered the market in the past several decades, but there is also overwhelming evidence that the public is unnecessarily exposed to health hazards from consumer products. In 2013, the American College of Obstetricians and Gynecologists issued a report that found “robust” evidence linking “toxic environmental agents” — which include consumer products — to “adverse reproductive and developmental health outcomes.”

Formaldehyde is a good example. It is a known carcinogen used as a preservative to kill or inhibit the growth of microorganisms in a wide range of personal care products, from cosmetics, soaps, shampoos and lotions to deodorants, nail polishes and hair gels. It is also used in pressed-wood products, permanent-press fabrics, paper product coatings and insulation, and as a fungicide, germicide, disinfectant and preservative. The general public is also exposed to formaldehyde through automobile tailpipe emissions. Formaldehyde has been linked to spontaneous abortion and low birth weight.

While the main concern about formaldehyde exposure centers around industrial use (e.g., industrial workers, embalmers and salon workers), the Cosmetic Ingredient Review, an independent panel of experts that determines the safety of individual chemical compounds as they are used in cosmetics, recommends that for health and safety reasons cosmetics should not contain formaldehyde at amounts greater than 0.2 percent. It’s a small amount, but the problem is that the FDA doesn’t regulate the use of formaldehyde in cosmetics (except for nail polish), and companies aren’t required by law to follow CIR’s recommendations.

 

Source:  alternet.org

NSA planned to infect Samsung with spyware

If you’re in the business of writing spyware or malware, smartphones are a tempting target. For many people, a phone or tablet is now the primary computing device they use to surf the web, access content, and explore new software. Google has had problems keeping the Google Play store free from malware and spyware, but new information suggests that both Google and Samsung almost faced a much more potent opponent — the NSA itself.

A report from The Intercept highlights how the NSA explored options for hacking the App Store and Google Play over several workshops held in Australia and Canada between November 2011 and February 2012. The projects used the Internet-monitoring Xkeyscore system to identify smartphone traffic, then trace that traffic back to app stores. This led to a project dubbed Irritant Horn, the point of which was to develop the ability to distribute “implants” that could be installed when the smartphones in question attempted to connect to Google or Samsung app stores.

The NSA has targeted mobile devices ever since the post-Patriot Act era made such warrantless comprehensive spying legal, but it’s never been clear how the organization managed to tap certain hardware in the first place. The goal was twofold: first, to use app stores to launch spyware campaigns; and second, to gather information about the phone users themselves by infiltrating the app stores in question.

The reference to “another Arab Spring” in the documents refers to the fact that the events of 2010-2011 apparently caught Western intelligence agencies off-guard, with few resources that could quickly be brought to bear. The NSA wanted to be aware of future events before they happened. Note, however, that this has precious little to do with the direct goal of protecting the United States from terrorism.

Few would argue that the US should not monitor the activities of known threats, but where was the threat from internal strife and the possible toppling of autocratic governments? It’s true that in the longer run, some new governments might pursue policies that the United States found less desirable than those of the previous regime, but there’s an enormous leap between “We don’t like Country X’s new trade policy,” and “Country X is actively assisting terrorist groups to carry out an attack on the United States.”

The NSA was primarily interested in the activities of African countries. But in the course of investigating these possibilities, it discovered significant security flaws in a program called UC Browser, used by nearly half a billion people in East Asia. Instead of disclosing the security vulnerability, the NSA and other foreign intelligence groups chose to exploit it — thereby increasing the chances that other criminal elements would have time to find and exploit it as well.

These issues are at the heart of the debate over what the NSA’s role should be in the future. There’s always been tension over whether the NSA should weaken or strengthen the cryptographic standards that allow for secure communication. That discussion may be even more nuanced when it involves software produced by foreign companies. There are few signs, however, that such nuanced discussions of capability have ever occurred. Instead, we continue to see intelligence resources deployed with the goal of vacuuming up all information from any source, regardless of legal precedent or cooperation.

The future of the Patriot Act and the scope of the NSA’s future powers remain in some doubt. Senator Rand Paul gave a 10-hour speech yesterday aimed at derailing support for the Patriot Act (his actions were not properly a filibuster, because a vote on the renewal of Section 215 wasn’t actually before the chamber at the time). Others in the House of Representatives have called for a full repeal of the Patriot Act’s provisions, and the US Court of Appeals for the Second Circuit recently ruled that the current spying program is illegal under the Patriot Act as it stands.

 

Source:  extremetech.com

 

800 terabecquerels of Cesium-137 by 2016

Michio Aoyama, a professor at Japan’s Fukushima University Institute of Environmental Radioactivity, told Kyodo in April that the West Coast of North America will be hit with around 800 terabecquerels of Cesium-137 by 2016.

EneNews notes that this is 80% of the cesium-137 deposited in Japan by Fukushima, according to Tepco, the company that runs Fukushima.

(A petabecquerel, or “PBq”, equals 1,000 terabecquerels.)
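
As a consistency check on the figures quoted above: 800 TBq is 0.8 PBq, and the 80% claim implies a Tepco estimate for total deposition over Japan of roughly 1,000 TBq (1 PBq); that denominator is inferred here from the stated percentages rather than given directly in this excerpt:

\[
800~\text{TBq} = 0.8~\text{PBq}, \qquad \frac{800~\text{TBq}}{1{,}000~\text{TBq}} = 0.8 = 80\%
\]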

This is not news for those who have been paying attention.  For example, we noted 2 days after the 2011 Japanese earthquake and tsunami that the West Coast of North America could be slammed with radiation from Fukushima.

We pointed out the next year that a previously-secret 1955 U.S. government report concluded that the ocean may not adequately dilute radiation from nuclear accidents, and there could be “pockets” and “streams” of highly-concentrated radiation.

The same year, we noted that 15 out of 15 bluefin tuna tested in California waters were contaminated with Fukushima radiation.

In 2013, we warned that the West Coast of North America would be hit hard by Fukushima radiation.

And we’ve noted for years that there is no real testing of Fukushima radiation by any government agency.

Indeed, scientists say that the amount of cesium off the West Coast of North America could end up exceeding that off the Japanese coast.

What’s the worst-case scenario? That the mass die-off of sealife off the West Coast of North America – which may have started only a couple of months after the Fukushima meltdown – is being caused by radiation from Fukushima.

 

Source:  globalresearch.ca