New gene with a key role in obesity

The researchers observed that blocking expression of the TRIP-Br2 gene protects mice against obesity and insulin resistance. The study shows that the gene modulates fat storage by regulating energy expenditure and lipolysis, the process that breaks stored fat down into lipids the body can use for energy. When expression of the gene is blocked, lipolysis and energy expenditure increase in the mice, thereby reducing their obesity.

Obesity is the result of an alteration in the processes that regulate the absorption of food and the production of energy, a balance that tips toward excessive fat storage. According to the researchers, understanding how the factors controlling the storage, movement and use of excess energy in fat cells (adipocytes) are regulated could lead to the development of therapies for obesity and related conditions such as type 2 diabetes.

In the words of Cristina Mallol, a researcher at the Autonomous University of Barcelona and co-author of the study: “The protection seen in mice lacking expression of the TRIP-Br2 gene, together with its selective elevation in visceral fat in humans, points the way to future gene therapy to counteract obesity, insulin resistance and excess lipids in the blood.”

Big Bang Abandoned

A new cosmology successfully explains the accelerating expansion of the universe without dark energy; but only if the universe has no beginning and no end.

 

As one of the few astrophysical events that most people are familiar with, the Big Bang has a special place in our culture. And while there is scientific consensus that it is the best explanation for the origin of the Universe, the debate is far from closed. However, it’s hard to find alternative models of the Universe without a beginning that are genuinely compelling.

That could change now with the fascinating work of Wun-Yi Shu at the National Tsing Hua University in Taiwan. Shu has developed an innovative new description of the Universe in which the roles of time, space and mass are related by a new kind of relativity.

Shu’s idea is that time and space are not independent entities but can be converted back and forth between each other. In his formulation of the geometry of spacetime, the speed of light is simply the conversion factor between the two. Similarly, mass and length are interchangeable in a relationship in which the conversion factor depends on both the gravitational constant G and the speed of light, neither of which need be constant.

So as the Universe expands, mass and time are converted to length and space and vice versa as it contracts.
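As a rough dimensional sketch rather than Shu's actual formulation: the speed of light turns a time into a length, and $G/c^{2}$ is the only combination of the two constants that turns a mass into a length, so the interchanges described above can be written as

$$x = c\,t, \qquad \ell = \frac{G}{c^{2}}\,m .$$

If $c$ and $G$ are themselves allowed to drift, as Shu's model permits, the exchange rates between time and space and between mass and length evolve as the Universe expands and contracts.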

This universe has no beginning or end, just alternating periods of expansion and contraction. In fact, Shu shows that singularities cannot exist in this cosmos.

It’s easy to dismiss this idea as just another amusing and unrealistic model dreamed up by those wacky cosmologists.

That is until you look at the predictions it makes. During a period of expansion, an observer in this universe would see an odd kind of change in the red-shift of bright objects such as Type-I supernovas, as they accelerate away. It turns out, says Shu, that his data exactly matches the observations that astronomers have made on Earth.

This kind of acceleration is an ordinary feature of Shu’s universe.

That’s in stark contrast to the various models of the Universe based on the Big Bang. Since the accelerating expansion of the Universe was discovered, cosmologists have been performing some rather worrying contortions with the laws of physics to make their models work.

The most commonly discussed idea is that the universe is filled with a dark energy that is forcing the universe to expand at an increasing rate. For this model to work, dark energy must make up 75 per cent of the energy-mass of the Universe and be increasing at a fantastic rate.

But there is a serious price to pay for this idea: it violates the law of conservation of energy. The embarrassing truth is that the world’s cosmologists have conveniently swept one of the fundamental laws of physics under the carpet in an attempt to square this circle.

That paints Shu’s ideas in a slightly different perspective. There’s no need to abandon conservation of energy to make his theory work.

That’s not to say Shu’s theory is perfect. Far from it. One of the biggest problems he faces is explaining the existence and structure of the cosmic microwave background, which many astrophysicists believe to be the strongest evidence that the Big Bang really did happen. The CMB, they say, is the echo of the Big Bang.

How it might arise in Shu’s cosmology isn’t yet clear but I imagine he’s working on it.

Even if he finds a way, there will need to be some uncomfortable rethinking before his ideas gain traction. His approach may well explain the Type-I supernova observations without abandoning conservation of energy, but it asks us to give up the notion of the Big Bang and the constancy of the speed of light, and to accept a vast new set of potential phenomena related to the interchangeability of mass, space and time.

 

DARPA’S high resolution drone-mounted camera

DARPA Can See You… From 17,500 Feet In The Air:

A new video from the world's highest-resolution drone-mounted camera is mind-blowingly clear. And terrifying.

 

Curious as to how the Defense Department could be spying on you next? PBS checked in with DARPA about the latest in drone camera technology for the NOVA special “Rise of the Drones,” including the world’s highest-resolution camera. Actually seeing the sensor on ARGUS-IS, or Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System, is still classified, but the basics of how it works have been deemed fit for public consumption. ARGUS-IS uses 368 imaging chips, like those found in cell phone cameras, to stitch together a 1.8-billion-pixel video. That means from 17,500 feet in the air, ARGUS-IS can see someone on the ground waving their arms, and it generates that kind of high-definition video for an area of 15 square miles. It can see a bird flying through a parking lot from more than three miles in the air. It can store a million terabytes of video a day, up to 5,000 hours of footage, so soon drones will not only be able to see everything that happens on the ground but also keep a record of it. Whether or not ARGUS has been used in the field is still classified. Let’s get real, though: does a toy this cool get put in a corner?
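The headline figures are easy to sanity-check with a little arithmetic. This is a minimal sketch using only the numbers quoted above, plus one assumption flagged in the comments: that "15 square miles" refers to the total imaged area.

```python
# Back-of-the-envelope check on the ARGUS-IS figures quoted above.
# Assumption: "15 square miles" is read as the total imaged area.

SENSOR_COUNT = 368                  # cell-phone-class imaging chips
TOTAL_PIXELS = 1.8e9                # pixels in the stitched mosaic
COVERAGE_SQ_MILES = 15              # area imaged from 17,500 feet
SQ_METERS_PER_SQ_MILE = 2.59e6

pixels_per_chip = TOTAL_PIXELS / SENSOR_COUNT
coverage_m2 = COVERAGE_SQ_MILES * SQ_METERS_PER_SQ_MILE
ground_area_per_pixel_m2 = coverage_m2 / TOTAL_PIXELS
ground_sample_distance_m = ground_area_per_pixel_m2 ** 0.5   # metres per pixel side

print(f"~{pixels_per_chip / 1e6:.1f} MP per chip")            # ~4.9 MP: cell-phone class
print(f"~{ground_sample_distance_m * 100:.0f} cm per pixel")  # ~15 cm ground footprint
```

Roughly 4.9 megapixels per chip is indeed cell-phone territory, and a ground footprint of about 15 cm per pixel is consistent with being able to pick out a person waving their arms.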

The New Moore’s Law

A New and Improved Moore’s Law:

Researchers have, for the first time, shown that the energy efficiency of computers doubles roughly every 18 months. The conclusion, backed up by six decades of data, mirrors Moore’s law, the observation from Intel co-founder Gordon Moore that computer processing power doubles about every 18 months. But the power-consumption trend might have even greater relevance than Moore’s law as battery-powered devices such as phones, tablets, and sensors proliferate.

“The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half,” says Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University and lead author of the study. More mobile computing and sensing applications become possible, Koomey says, as energy efficiency continues its steady improvement.

The research, conducted in collaboration with Intel and Microsoft, examined peak power consumption of electronic computing devices since the construction of the Electronic Numerical Integrator and Computer (ENIAC) in 1946. The first general-purpose electronic computer, ENIAC was used to calculate artillery firing tables for the U.S. Army, and it could perform a few hundred calculations per second. It used vacuum tubes rather than transistors, took up 1,800 square feet, and consumed 150 kilowatts of power.

Even before the advent of discrete transistors, Koomey says, energy efficiency doubled every 18 months. “This is a fundamental characteristic of information technology that uses electrons for switching,” he says. “It’s not just a function of the components on a chip.” The sort of engineering considerations that go into improving computer performance, such as reducing component size, capacitance, and the communication time between components, also improve energy efficiency, Koomey says.

In July, Koomey released a report that showed, among other findings, that the electricity used in data centers worldwide increased by about 56 percent from 2005 to 2010, a much lower rate than the doubling observed from 2000 to 2005. While better energy efficiency played a part in this change, the total electricity used in data centers was less than the forecast for 2010 partly because fewer new servers were installed than expected, thanks to technologies such as virtualization, which lets existing systems run more programs simultaneously. Koomey notes that data center computers rarely run at peak power; most computers are, in fact, “terribly underutilized,” he says.

The information technology world has gradually been shifting its focus from computing capability to better energy efficiency, especially as people become more accustomed to using smartphones, laptops, tablets, and other battery-powered devices. Since the Intel Core microarchitecture was introduced in 2006, the company has experienced “a sea change in terms of focus on power consumption,” says Lorie Wigle, general manager of the eco-technology program at Intel. “Historically, we have focused on performance and battery life, and increasingly, we’re seeing those two things come together,” she says.

“Everyone’s familiar with Moore’s law and the remarkable improvements in the power of computers, and that’s obviously important,” says Erik Brynjolfsson, professor at the Sloan School of Management at MIT. But people are paying more attention to the battery life of their electronics as well as to how fast they can run. “I think that’s more and more the dimension that matters to consumers,” Brynjolfsson says. “And in a sense, ‘Koomey’s law,’ this trend of power consumption, is beginning to eclipse Moore’s law for what matters to consumers in a lot of applications.”

To Koomey, the most interesting aspect of the trend is thinking about the possibilities for computing: the theoretical limits are still so far away, he says. In 1985, the physicist Richard Feynman analyzed the electricity needs of computers and estimated that efficiency could theoretically improve by a factor of 100 billion before it hit a limit, excluding new technologies such as quantum computing. Since then, efficiency has improved by a factor of about 40,000. “There’s so far to go,” says Koomey. “It’s only limited by our cleverness, not the physics.”
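Taken at face value, the trend is easy to put into numbers. The sketch below is a toy projection rather than anything from the study itself: it assumes the 18-month doubling simply continues, and it reuses the article's factor-of-100-billion Feynman estimate and the factor-of-40,000 improvement achieved so far to gauge how much headroom remains.

```python
import math

# Toy projection of Koomey's law as described above; the 18-month doubling
# period and the 100-billion / 40,000 factors come from the article,
# everything else is illustrative.

DOUBLING_PERIOD_YEARS = 1.5

def efficiency_gain(years: float) -> float:
    """Factor by which computations-per-joule improve after `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

def battery_needed(initial_wh: float, years: float) -> float:
    """Battery capacity required for a fixed computing load after `years`."""
    return initial_wh / efficiency_gain(years)

# A workload that needs a 10 Wh battery today would need about 2.5 Wh
# in three years, if the trend holds.
print(f"{battery_needed(10.0, 3.0):.1f} Wh")

# Headroom left before Feynman's estimated limit.
FEYNMAN_LIMIT_FACTOR = 100e9   # total possible improvement since 1985
ACHIEVED_FACTOR = 40_000       # improvement realised so far
remaining_doublings = math.log2(FEYNMAN_LIMIT_FACTOR / ACHIEVED_FACTOR)
print(f"~{remaining_doublings:.0f} doublings, roughly "
      f"{remaining_doublings * DOUBLING_PERIOD_YEARS:.0f} more years at this pace")
```

On those assumptions there are roughly 21 doublings, or around three more decades, of efficiency gains available before the estimated limit, which is the gap Koomey is pointing to.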