Reconciling predictions of climate change

Harvard University researchers have resolved a conflict in estimates of how much the Earth will warm in response to a doubling of carbon dioxide in the atmosphere.
That conflict – between temperature ranges based on global climate models and paleoclimate records and ranges generated from historical observations – prevented the United Nations’ Intergovernmental Panel on Climate Change (IPCC) from providing a best estimate in its most recent report for how much the Earth will warm as a result of a doubling of atmospheric CO2.
The researchers found that the low range of temperature increase – between 1 and 3 degrees Celsius – offered by the historical observations did not take into account long-term warming patterns. When these patterns are taken into account, the researchers found that not only do temperatures fall within the canonical range of 1.5 to 4.5 degrees Celsius but that even higher ranges, perhaps up to 6 degrees, may also be possible.
The research is published in Science Advances.
CO2 in Earth’s atmosphere if half of global-warming emissions are not absorbed (NASA simulation). By NASA/GSFC [Public domain], via Wikimedia Commons
It’s well documented that different parts of the planet warm at different speeds. The land over the northern hemisphere, for example, warms significantly faster than water in the Southern Ocean.
“The historical pattern of warming is that most of the warming has occurred over land, in particular over the northern hemisphere,” said Cristian Proistosescu, PhD ’17, and first author of the paper. “This pattern of warming is known as the fast mode – you put CO2 in the atmosphere and very quickly after that, the land in the northern hemisphere is going to warm.”
But there is also a slow mode of warming, which can take centuries to realize. That warming, which is most associated with the Southern Ocean and the Eastern Equatorial Pacific, comes with positive feedback loops that amplify the process. For example, as the oceans warm, cloud cover decreases and a white reflecting surface is replaced with a dark absorbent surface.
The researchers developed a mathematical model to parse the two different modes within different climate models.
“The models simulate a warming pattern like today’s, but indicate that strong feedbacks kick in when the Southern Ocean and Eastern Equatorial Pacific eventually warm, leading to higher overall temperatures than would simply be extrapolated from the warming seen to date,” said Peter Huybers, Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and co-author of the paper.
Huybers and Proistosescu found that while the slow mode of warming contributes a great deal to the ultimate amount of global warming, it is barely present in present-day warming patterns. “Historical observations give us a lot of insight into how climate changes and are an important test of our climate models,” said Huybers, “but there is no perfect analogue for the changes that are coming.”
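The fast/slow-mode picture can be illustrated with a toy two-timescale step response (an illustrative sketch with made-up amplitudes and timescales, not the authors' mathematical model):

```python
import math

# Toy step-response: total warming = fast mode + slow mode, each a
# saturating exponential. Amplitudes and timescales are illustrative only.
FAST_AMP, FAST_TAU = 2.0, 4.0     # degrees C, years (fast, northern-land mode)
SLOW_AMP, SLOW_TAU = 2.0, 300.0   # degrees C, years (slow, Southern Ocean mode)

def warming(t_years):
    fast = FAST_AMP * (1 - math.exp(-t_years / FAST_TAU))
    slow = SLOW_AMP * (1 - math.exp(-t_years / SLOW_TAU))
    return fast + slow

equilibrium = FAST_AMP + SLOW_AMP    # warming once both modes are complete
after_century = warming(100)         # warming "observed" after 100 years
# Extrapolating sensitivity from the first century alone underestimates
# the equilibrium response, as the paper argues.
print(f"after 100 yr: {after_century:.2f} C of {equilibrium:.1f} C equilibrium")
```

With these illustrative numbers, only about 64 percent of the equilibrium warming is realized after a century, which is why a sensitivity estimated from historical observations alone falls below the true value.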

On this day in science history: foam rubber was developed

In 1929, foam rubber was developed at the Dunlop Latex Development Laboratories in Birmingham, where British scientist E.A. Murphy whipped up the first batch using an ordinary kitchen mixer to froth natural latex rubber. His colleagues were unimpressed – until they sat on it. Within five years it was everywhere: on motorcycle seats, London bus seats and Shakespeare Memorial Theatre seats, and eventually in mattresses.
In 1937, isocyanate-based materials were first used to form foam rubbers, and after World War II styrene-butadiene rubber replaced many natural foams. Foam rubber has been used commercially for a wide range of applications since around the 1940s. Two types of foam are in use today: flexible and rigid. Flexible foam is used in furniture, car seats and wall insulation, and even in the shoes we wear. Rigid foam is used to insulate buildings and appliances such as freezers and refrigerated trucks.
Foam rubber mattress [Public domain], via Wikimedia Commons
So, how is foam rubber manufactured? Rates of polymerization range from many minutes to just a few seconds. Fast-reacting polymers have short cycle times and require machinery to mix the reacting agents thoroughly. Slow-reacting polymers may be mixed by hand but require long mixing periods, so industrial production tends to use machinery in either case. Processing techniques include, but are not limited to, spraying, open pouring, and molding.
  • Material preparation – Liquid and solid materials generally arrive on site by rail or truck; once unloaded, liquid materials are stored in heated tanks. When producing slabstock, typically two or more polymer streams are used.
  • Mixing – Open pouring, better known as continuous dispensing, is used primarily to form rigid, low-density foams. Specific amounts of chemicals are combined in a mixing head, much like an industrial blender. The foam is poured onto a conveyor belt, where it cures ready for cutting.
  • Curing and Cutting – After curing on the conveyor belt, the foam is forced through a horizontal band saw, which cuts the pieces to a set size for the application. General contracting uses 4’x12’x2’’.
  • Further processing – Once cut and cured, the slabstock can be sold as-is, or a lamination process can be applied that turns it into a rigid foam board known as boardstock. Boardstock is used for metal roof insulation, oven insulation, and many other durable goods.
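As a rough arithmetic sketch of the cutting step (the bun dimensions below are hypothetical; only the 4’x12’x2’’ general-contracting piece size comes from the text):

```python
# Hypothetical cured slabstock bun: 4 ft wide, 60 ft long, 48 in tall.
BUN_WIDTH_FT, BUN_LENGTH_FT, BUN_HEIGHT_IN = 4, 60, 48

# General-contracting piece size from the article: 4 ft x 12 ft x 2 in.
PIECE_WIDTH_FT, PIECE_LENGTH_FT, PIECE_THICKNESS_IN = 4, 12, 2

pieces = (
    (BUN_WIDTH_FT // PIECE_WIDTH_FT)          # 1 across the width
    * (BUN_LENGTH_FT // PIECE_LENGTH_FT)      # 5 along the length
    * (BUN_HEIGHT_IN // PIECE_THICKNESS_IN)   # 24 slices through the height
)
print(pieces)  # 120 pieces per bun
```

Real yields would be lower once trim and skin losses are accounted for; the point is only to show the cutting arithmetic.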
Unfortunately, because of the variety of polyurethane chemistries, it is difficult to recycle foam materials with a single method. Most recycling reuses slabstock foam for carpet backing: scrap is shredded and the small flakes are bonded together into sheets. Other methods break the foam down into granules and disperse them into a polyol blend to be molded into the same part as the original. Recycling processes for foam rubber are still developing, and the future will hopefully bring new and easier methods.

Tipping points are real: Gradual changes in CO2 levels can induce abrupt climate changes

During the last glacial period, within only a few decades the influence of atmospheric CO2 on the North Atlantic circulation resulted in temperature increases of up to 10 degrees Celsius in Greenland – as indicated by new climate calculations from researchers at the Alfred Wegener Institute and the University of Cardiff. Their study is the first to confirm that there have been situations in our planet’s history in which gradually rising CO2 concentrations have set off abrupt changes in ocean circulation and climate at “tipping points.” These sudden changes, referred to as Dansgaard-Oeschger events, have been observed in ice cores collected in Greenland. The results of the study have just been released in the journal Nature Geoscience.
Ice core sample taken from drill. Photo by Lonnie Thompson, Byrd Polar Research Center, Ohio State University. [Public domain], via Wikimedia Commons
Previous glacial periods were characterised by several abrupt climate changes in the high latitudes of the Northern Hemisphere. However, the cause of these past phenomena remains unclear. In an attempt to better grasp the role of CO2 in this context, scientists from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) recently conducted a series of experiments using a coupled atmosphere-ocean-sea ice model.
First author Xu Zhang explains: “With this study, we’ve managed to show for the first time how gradual increases of CO2 triggered rapid warming.” This temperature rise is the result of interactions between ocean currents and the atmosphere, which the scientists used the climate model to explore. According to their findings, the increased CO2 intensifies the trade winds over Central America, as the eastern Pacific is warmed more than the western Atlantic. This in turn produces increased moisture transport from the Atlantic, and with it, an increase in the salinity and density of the surface water. Finally, these changes lead to an abrupt amplification of the large-scale overturning circulation in the Atlantic. “Our simulations indicate that even small changes in the CO2 concentration suffice to change the circulation pattern, which can end in sudden temperature increases,” says Zhang.
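The “tipping point” behaviour – a gradual forcing producing an abrupt state change – can be illustrated with a generic saddle-node (fold) model. This is a textbook conceptual sketch, not the AWI coupled atmosphere-ocean-sea ice model:

```python
import numpy as np

# dx/dt = f + x - x^3: a bistable system whose lower equilibrium
# disappears (saddle-node bifurcation) once the forcing f exceeds
# about 2 / (3 * sqrt(3)) ~ 0.385.
dt, steps = 0.01, 120_000
forcing = np.linspace(-0.6, 0.6, steps)   # slow, gradual ramp in f
x = -1.2                                  # start on the lower branch
trajectory = np.empty(steps)
for i, f in enumerate(forcing):
    x += dt * (f + x - x**3)              # forward-Euler integration
    trajectory[i] = x

# The state tracks the lower branch for most of the ramp, then jumps
# abruptly to the upper branch once f passes the critical value.
jump_index = int(np.argmax(trajectory > 0.0))
print(f"abrupt jump near f = {forcing[jump_index]:.2f}")
```

The forcing changes smoothly and slowly throughout, yet the state variable changes abruptly – the qualitative behaviour the study attributes to gradually rising CO2 and the Atlantic circulation.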
Further, the study’s authors reveal that rising CO2 levels are the dominant cause of changed ocean currents during the transitions between glacial and interglacial periods. As climate researcher Gerrit Lohmann explains, “We can’t say for certain whether rising CO2 levels will produce similar effects in the future, because the framework conditions today differ from those in a glacial period. That being said, we’ve now confirmed that there have definitely been abrupt climate changes in the Earth’s past that were the result of continually rising CO2 concentrations.”

Solar paint offers endless energy from water vapor

Researchers have developed a solar paint that can absorb water vapour and split it to generate hydrogen – the cleanest source of energy.
The paint contains a newly developed compound that acts like silica gel, which is used in sachets to absorb moisture and keep food, medicines and electronics fresh and dry.
Sun with sunspots and limb darkening as seen in visible light with solar filter. By Geoff Elston [CC BY 4.0], via Wikimedia Commons
But unlike silica gel, the new material, synthetic molybdenum-sulphide, also acts as a semiconductor and catalyses the splitting of water molecules into hydrogen and oxygen.
Lead researcher Dr Torben Daeneke, from RMIT University in Melbourne, Australia, said: “We found that mixing the compound with titanium oxide particles leads to a sunlight-absorbing paint that produces hydrogen fuel from solar energy and moist air.
“Titanium oxide is the white pigment that is already commonly used in wall paint, meaning that the simple addition of the new material can convert a brick wall into energy harvesting and fuel production real estate.
“Our new development has a big range of advantages,” he said. “There’s no need for clean or filtered water to feed the system. Any place that has water vapour in the air, even remote areas far from water, can produce fuel.”
His colleague, Distinguished Professor Kourosh Kalantar-zadeh, said hydrogen was the cleanest source of energy and could be used in fuel cells as well as conventional combustion engines as an alternative to fossil fuels.
“This system can also be used in very dry but hot climates near oceans. The sea water is evaporated by the hot sunlight and the vapour can then be absorbed to produce fuel.
“This is an extraordinary concept – making fuel from the sun and water vapour in the air.”
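As a back-of-envelope check of the chemistry (water splitting follows 2 H2O → 2 H2 + O2; the 1 kg of water vapour is an arbitrary example, not a figure from the study):

```python
# Molar masses in g/mol (standard values).
M_H2O, M_H2 = 18.015, 2.016

water_vapour_g = 1000.0              # example: 1 kg of absorbed water vapour
mol_h2o = water_vapour_g / M_H2O     # ~55.5 mol of water
mol_h2 = mol_h2o                     # 2 H2O -> 2 H2 + O2: one H2 per H2O
h2_g = mol_h2 * M_H2                 # ~112 g of hydrogen

# Lower heating value of hydrogen is roughly 120 MJ/kg.
energy_mj = (h2_g / 1000.0) * 120.0  # ~13.4 MJ per kg of water split
print(f"{h2_g:.0f} g H2, ~{energy_mj:.1f} MJ")
```

So each kilogram of water vapour split yields on the order of 13 MJ of chemical fuel energy, before any conversion losses.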

Which Earth-Size Planets are Habitable?

A University of Oklahoma post-doctoral astrophysics researcher, Billy Quarles, has identified the possible compositions of the seven planets in the TRAPPIST-1 system. Using thousands of numerical simulations to identify the planets stable for millions of years, Quarles concluded that six of the seven planets are consistent with an Earth-like composition. The exception is TRAPPIST-1f, which has about 25 percent of its mass in water, suggesting that TRAPPIST-1e may be the best candidate for future habitability studies.
“The goal of exoplanetary astronomy is to find planets that are similar to Earth in composition and potentially habitable,” said Quarles. “For thousands of years, astronomers have sought other worlds capable of sustaining life.”
The Earth seen from space, by NASA/Apollo 17 crew; taken by either Harrison Schmitt or Ron Evans [Public domain], via Wikimedia Commons
Quarles, a researcher in the Homer L. Dodge Department of Physics and Astronomy, OU College of Arts and Sciences, collaborated with scientists E.V. Quintana, E. Lopez, J.E. Schlieder and T. Barclay at NASA Goddard Space Flight Center on the project. Numerical simulations for this project were performed on the Pleiades Supercomputer provided by the NASA High-End Computing Program through the Ames Research Center, and at the OU Supercomputing Center for Education and Research.
The TRAPPIST-1 planets are more tightly spaced than those in Kepler systems, which allows transit timing variations to be measured alongside the photometric observations. These variations tell the researchers about the planets’ masses, while their radii are measured from the eclipses. From mass and radius, density can then be inferred. By comparing Earth’s density (mostly rock) to those of the TRAPPIST-1 planets, Quarles can determine what the planets are likely composed of and gain insight into whether they are potentially habitable.
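The mass-and-radius-to-density step is straightforward; a sketch using the figures the article quotes for TRAPPIST-1f (about 70 percent of Earth's mass at the same radius):

```python
EARTH_DENSITY_G_CC = 5.51   # Earth's mean density, g/cm^3

def relative_density(mass_rel_earth, radius_rel_earth):
    """Bulk density relative to Earth, from mass and radius in Earth units."""
    return mass_rel_earth / radius_rel_earth**3

# TRAPPIST-1f figures from the article: ~70% of Earth's mass, Earth's radius.
rel = relative_density(0.70, 1.0)   # 0.70 of Earth's density
rho = rel * EARTH_DENSITY_G_CC      # ~3.9 g/cm^3, too light for pure rock
print(f"{rho:.2f} g/cm^3")
```

A bulk density well below Earth's ~5.5 g/cm³ points to a substantial volatile (e.g. water) fraction, which is how the mass and radius measurements feed the composition inference.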
TRAPPIST-1f has the tightest constraints, with 25 percent of its mass in water – unusual given its radius. The concern with this planet is that its mass is only 70 percent that of Earth, yet it is the same size as Earth. Because the radius is so large relative to the mass, pressure turns the water to steam, and the planet is likely too hot for life as we know it. The search for planets with a composition as close to Earth’s as possible is key to finding places we could identify as habitable. Quarles said he is continually learning about the planets and will investigate them further in his studies.
TRAPPIST-1 is a nearby ultra-cool dwarf about 40 light-years from Earth and host to a remarkable planetary system of seven transiting planets, known as TRAPPIST-1b, c, d, e, f, g and h.

On this day in science – the rubber fire hose was patented

In 1821, a fire hose of cotton web lined with rubber was patented by James Boyd of Boston, Mass. He invented it to replace riveted leather hose, which had many drawbacks, including drying out, cracking and bursting from excessive pressure. The introduction of rivets (1807) to replace stitching had allowed higher pressures and greater delivery of water on the fireground. The improved hose was now 40 to 50 feet in length and weighed more than 85 pounds with the couplings. Hose oilers were developed to keep the leather supple and pliable, and various types of oils and other substances were used to keep the hose in shape. By 1871, the Cincinnati Fire Department was using the B.F. Goodrich Company’s new rubber hose reinforced with cotton ply.
Indoor fire hose with a fire extinguisher, by Raysonho @ Open Grid Scheduler / Grid Engine (Own work) [CC0], via Wikimedia Commons
Modern fire hoses use a variety of natural and synthetic fabrics and elastomers in their construction. These materials allow the hoses to be stored wet without rotting and to resist the damaging effects of exposure to sunlight and chemicals. Modern hoses are also lighter than older designs, which has helped reduce the physical strain on firefighters.
Devices that remove the air from the interior of a fire hose, commonly referred to as fire hose vacuums, are becoming more prevalent. This process makes hoses smaller and somewhat rigid, allowing more hose to be packed or loaded into the same compartment on a firefighting apparatus.
There are several types of hose designed specifically for the fire service. Those designed to operate under positive pressure are called discharge hoses; they include attack hose, supply hose, relay hose, forestry hose, and booster hose. Those designed to operate under negative pressure are called suction hoses.

A guide to the twenty common amino acids

Have you ever thought about what makes up your body? Only 20 amino acids! Take a look at the graphic below to discover the structure of each of these, plus information on the notation used to represent them.
Source: Compound Interest.
Amino acids are organic compounds containing amine (-NH2) and carboxyl (-COOH) functional groups, along with a side chain (R group) specific to each amino acid. The key elements of an amino acid are carbon, hydrogen, oxygen, and nitrogen, although other elements are found in the side chains of certain amino acids. About 500 amino acids are known, and they can be classified in many ways: by the locations of the core structural functional groups as alpha- (α-), beta- (β-), gamma- (γ-) or delta- (δ-) amino acids, or by polarity, pH level, and side-chain group type (aliphatic, acyclic, aromatic, containing hydroxyl or sulfur, etc.). In the form of proteins, amino acid residues form the second-largest component (water is the largest) of human muscles and other tissues. Beyond their role as residues in proteins, amino acids participate in a number of processes such as neurotransmitter transport and biosynthesis.
In biochemistry, amino acids having both the amine and the carboxylic acid groups attached to the first (alpha-) carbon atom have particular importance. They are known as 2-, alpha-, or α-amino acids (generic formula H2NCHRCOOH in most cases, where R is an organic substituent known as a “side chain”); often the term “amino acid” is used to refer specifically to these. They include the 22 proteinogenic (“protein-building”) amino acids, which combine into peptide chains (“polypeptides”) to form the building blocks of a vast array of proteins. These are all L-stereoisomers (“left-handed” isomers), although a few D-amino acids (“right-handed”) occur in bacterial envelopes, as a neuromodulator (D-serine), and in some antibiotics.

Twenty of the proteinogenic amino acids are encoded directly by triplet codons in the genetic code and are known as “standard” amino acids. The other two (“non-standard” or “non-canonical”) are selenocysteine (present in many non-eukaryotes as well as most eukaryotes, but not coded directly by DNA) and pyrrolysine (found only in some archaea and one bacterium). Pyrrolysine and selenocysteine are encoded via variant codons; for example, selenocysteine is encoded by a stop codon in combination with a SECIS element. N-formylmethionine (often the initial amino acid of proteins in bacteria, mitochondria, and chloroplasts) is generally considered a form of methionine rather than a separate proteinogenic amino acid. Codon–tRNA combinations not found in nature can also be used to “expand” the genetic code and create novel proteins, known as alloproteins, incorporating non-proteinogenic amino acids.
Many important proteinogenic and non-proteinogenic amino acids have biological functions. For example, in the human brain, glutamate (standard glutamic acid) and gamma-amino-butyric acid (“GABA”, a non-standard gamma-amino acid) are, respectively, the main excitatory and inhibitory neurotransmitters. Hydroxyproline, a major component of the connective tissue collagen, is synthesised from proline. Glycine is a biosynthetic precursor to porphyrins used in red blood cells. Carnitine is used in lipid transport.
Nine proteinogenic amino acids are called “essential” for humans because they cannot be created from other compounds by the human body and so must be taken in as food. Others may be conditionally essential for certain ages or medical conditions. Essential amino acids may also differ between species.
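For reference, the twenty standard amino acids with their conventional three-letter and one-letter codes, marking the nine that are essential for humans (a standard biochemistry table, not taken from the graphic):

```python
# name: (three-letter code, one-letter code, essential for humans?)
AMINO_ACIDS = {
    "Alanine":       ("Ala", "A", False),
    "Arginine":      ("Arg", "R", False),
    "Asparagine":    ("Asn", "N", False),
    "Aspartic acid": ("Asp", "D", False),
    "Cysteine":      ("Cys", "C", False),
    "Glutamic acid": ("Glu", "E", False),
    "Glutamine":     ("Gln", "Q", False),
    "Glycine":       ("Gly", "G", False),
    "Histidine":     ("His", "H", True),
    "Isoleucine":    ("Ile", "I", True),
    "Leucine":       ("Leu", "L", True),
    "Lysine":        ("Lys", "K", True),
    "Methionine":    ("Met", "M", True),
    "Phenylalanine": ("Phe", "F", True),
    "Proline":       ("Pro", "P", False),
    "Serine":        ("Ser", "S", False),
    "Threonine":     ("Thr", "T", True),
    "Tryptophan":    ("Trp", "W", True),
    "Tyrosine":      ("Tyr", "Y", False),
    "Valine":        ("Val", "V", True),
}

one_letter = [codes[1] for codes in AMINO_ACIDS.values()]
essential = sorted(name for name, codes in AMINO_ACIDS.items() if codes[2])
print(len(AMINO_ACIDS), len(essential))  # 20 amino acids, 9 essential
```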

Because of their biological significance, amino acids are important in nutrition and are commonly used in nutritional supplements, fertilizers, and food technology. Industrial uses include the production of drugs, biodegradable plastics, and chiral catalysts.

Diesels pollute more than lab tests detect

Because of testing inefficiencies, maintenance inadequacies and other factors, cars, trucks and buses worldwide emit 4.6 million tons more harmful nitrogen oxide (NOx) than standards allow, according to a new study co-authored by University of Colorado Boulder researchers.
The study, published in Nature, shows these excess emissions alone lead to 38,000 premature deaths annually worldwide, including 1,100 deaths in the United States.
The findings reveal major inconsistencies between what vehicles emit during testing and what they emit in the real world – a problem that’s far more severe, said the researchers, than the incident in 2015, when federal regulators discovered Volkswagen had been fitting millions of new diesel cars with “defeat devices.”
Red Diesel Tank, by Meena Kadri [CC BY 2.0], via Wikimedia Commons
The devices sense when a vehicle is undergoing testing and reduce emissions to comply with government standards. Excess emissions from defeat devices have been linked to about 50 to 100 U.S. deaths per year, studies show.
“A lot of attention has been paid to defeat devices, but our work emphasizes the existence of a much larger problem,” said Daven Henze, an associate professor of mechanical engineering at CU Boulder who, along with postdoctoral researcher Forrest Lacey, contributed to the study. “It shows that in addition to tightening emissions standards, we need to be attaining the standards that already exist in real-world driving conditions.”
The research was conducted in partnership with the International Council on Clean Transportation, a Washington, D.C.-based nonprofit organization, and Environmental Health Analytics LLC.
For the paper, the researchers assessed 30 studies of vehicle emissions under real-world driving conditions in 11 major vehicle markets representing 80 percent of new diesel vehicle sales in 2015: Australia, Brazil, Canada, China, the European Union, India, Japan, Mexico, Russia, South Korea and the United States.
They found that in 2015, diesel vehicles emitted 13.1 million tons of NOx, a chemical precursor to particulate matter and ozone; exposure to these pollutants can lead to heart disease, stroke, lung cancer and other health problems. Had the emissions met standards, the vehicles would have emitted closer to 8.6 million tons of NOx.
Heavy-duty vehicles, such as commercial trucks and buses, were by far the largest contributor worldwide, accounting for 76 percent of the total excess NOx emissions.
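The headline numbers can be cross-checked with simple arithmetic (figures as quoted in the article; the small gap to the reported 4.6 million tons reflects rounding in the quoted totals):

```python
emitted_mt = 13.1    # million tons of NOx diesels actually emitted in 2015
compliant_mt = 8.6   # million tons had all vehicles met standards

excess_mt = emitted_mt - compliant_mt   # ~4.5 Mt excess from these rounded figures

# Heavy-duty vehicles accounted for 76 percent of the total excess.
heavy_duty_excess_mt = 0.76 * excess_mt
print(f"excess: {excess_mt:.1f} Mt, heavy-duty share: {heavy_duty_excess_mt:.1f} Mt")
```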
Henze used computer modeling and NASA satellite data to simulate how particulate matter and ozone levels are, and will be, impacted by excess NOx levels in specific locations. The team then computed the impacts on health, crops and climate.
“The consequences of excess diesel NOx emissions for public health are striking,” said Susan Anenberg, co-lead author of the study and co-founder of Environmental Health Analytics LLC.
China suffers the greatest health impact, with 31,400 deaths annually attributed to diesel NOx pollution, 10,700 of them linked to excess NOx emissions beyond certification limits. In Europe, where diesel passenger cars are common, 28,500 deaths annually are attributed to diesel NOx pollution, with 11,500 of those deaths linked to excess emissions.
The study projects that by 2040, 183,600 people will die prematurely each year due to diesel vehicle NOx emissions unless governments act.
The authors say emission certification tests, both prior to sale and by vehicle owners, could be more accurate if they simulated a broader variety of speeds, driving styles and ambient temperatures. Some European countries now use portable testing devices that track the emissions of a car in motion.
“Tighter vehicle emission standards coupled with measures to improve real-world compliance could prevent hundreds of thousands of early deaths from air pollution-related diseases each year,” said Anenberg.

On this day in science history: the Hindenburg Zeppelin arrived at Lakehurst, New Jersey, USA

In 1936, the Hindenburg Zeppelin arrived at Lakehurst, New Jersey, USA, from Germany, marking the beginning of a regular transatlantic passenger service. The flight, carrying 51 passengers and 56 crew, took 61 hours.
Hindenburg at Lakehurst, by U.S. Department of the Navy. Bureau of Aeronautics. Naval Aircraft Factory, Philadelphia, Pennsylvania (USA). [Public domain], via Wikimedia Commons
The Hindenburg was a large German commercial passenger-carrying rigid airship, the lead ship of the Hindenburg class, the longest class of flying machine and the largest airship by envelope volume. It was designed and built by the Zeppelin Company (Luftschiffbau Zeppelin GmbH) on the shores of Lake Constance in Friedrichshafen and was operated by the German Zeppelin Airline Company (Deutsche Zeppelin-Reederei). The Hindenburg had a duralumin structure, incorporating 15 Ferris wheel-like bulkheads along its length, with 16 cotton gas bags fitted between them. The bulkheads were braced to each other by longitudinal girders placed around their circumferences. The airship’s outer skin was of cotton doped with a mixture of reflective materials intended to protect the gas bags within from radiation, both ultraviolet (which would damage them) and infrared (which might cause them to overheat). The gas cells were made by a new method pioneered by Goodyear, using multiple layers of gelatinized latex rather than the previous goldbeater’s skins. In 1931 the Zeppelin Company purchased 5,000 kg (11,000 lb) of duralumin salvaged from the wreckage of the October 1930 crash of the British airship R101, which may have been re-cast and used in the construction of the Hindenburg.
The interior furnishings of the Hindenburg were designed by Fritz August Breuhaus, whose design experience included Pullman coaches, ocean liners, and warships of the German Navy. The upper “A” Deck contained small passenger quarters in the middle, flanked by large public rooms: a dining room to port and a lounge and writing room to starboard. Paintings on the dining room walls portrayed the Graf Zeppelin’s trips to South America, and a stylized world map covered the wall of the lounge. Long slanted windows ran the length of both decks. The passengers were expected to spend most of their time in the public areas, rather than their cramped cabins.
The lower “B” Deck contained washrooms, a mess hall for the crew, and a smoking lounge. Harold G. Dick, an American representative from the Goodyear Zeppelin Company, recalled: “The only entrance to the smoking room, which was pressurized to prevent the admission of any leaking hydrogen, was via the bar, which had a swiveling air lock door, and all departing passengers were scrutinized by the bar steward to make sure they were not carrying out a lit cigarette or pipe.”
Helium was initially selected as the Hindenburg’s lifting gas because it was the safest to use in airships, as it is not flammable. One proposed measure to save helium was to make double-gas cells for 14 of the 16 gas cells: an inner hydrogen cell would be protected by an outer cell filled with helium, with vertical ducting to the dorsal area of the envelope to permit separate filling and venting of the inner hydrogen cells. At the time, however, helium was relatively rare and extremely expensive, as the gas was only available in industrial quantities from distillation plants at certain oil fields in the United States. Hydrogen, by comparison, could be cheaply produced by any industrialized nation and, being lighter than helium, also provided more lift. Because of its expense and rarity, American rigid airships using helium were forced to conserve the gas at all costs, and this hampered their operation.
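The lift comparison can be quantified with a simple buoyancy estimate. Gas densities are approximate values at 0 °C and 1 atm, and the ~200,000 m³ gas volume is an approximate published figure for the Hindenburg, used here as an assumption:

```python
# Approximate densities at 0 degC, 1 atm, in kg/m^3.
RHO_AIR, RHO_H2, RHO_HE = 1.293, 0.090, 0.179

lift_h2 = RHO_AIR - RHO_H2   # ~1.20 kg of lift per m^3 of hydrogen
lift_he = RHO_AIR - RHO_HE   # ~1.11 kg of lift per m^3 of helium

GAS_VOLUME_M3 = 200_000      # assumed approximate Hindenburg gas volume
extra_lift_kg = (lift_h2 - lift_he) * GAS_VOLUME_M3
print(f"H2: {lift_h2:.3f} kg/m^3, He: {lift_he:.3f} kg/m^3, "
      f"extra lift with H2: {extra_lift_kg:.0f} kg")
```

Hydrogen buys only about 8 percent more lift per cubic metre, but at the Hindenburg's scale that amounts to many tonnes of additional payload.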
Despite a U.S. ban on the export of helium under the Helium Control Act of 1927, the Germans designed the airship to use the far safer gas in the belief that they could convince the US government to license its export. When the designers learned that the National Munitions Control Board would refuse to lift the export ban, they were forced to re-engineer the Hindenburg to use hydrogen for lift. Despite the danger of using flammable hydrogen, no alternative lighter-than-air gas could provide sufficient lift. One beneficial side effect of employing hydrogen was that more passenger cabins could be added. The Germans’ long history of flying hydrogen-filled passenger airships without a single injury or fatality engendered a widely held belief that they had mastered the safe use of hydrogen. The Hindenburg’s first season’s performance appeared to demonstrate this; however, the airship was destroyed by fire 14 months later, on May 6, 1937, at the end of the first North American transatlantic journey of its second season of service. Thirty-six people died in the accident, which occurred while landing at Lakehurst. This was the last of the great airship disasters; it was preceded by the crashes of the British R38 in 1921 (44 dead), the US airship Roma in 1922 (34 dead), the French Dixmude in 1923 (52 dead), the British R101 in 1930 (48 dead), and the US Akron in 1933 (73 dead).



Mice with missing lipid-modifying enzyme heal better after heart attack

Two immune responses are important for recovery after a heart attack – an acute inflammatory response that attracts leukocyte immune cells to remove dead tissue, followed by a resolving response that allows healing.
The human heart, by Patrick J. Lynch, medical illustrator [CC BY 2.5], via Wikimedia Commons
Failure of the resolving response can allow a persistent, low-grade nonresolving inflammation, which can lead to progressive acute or chronic heart failure. Despite medical advances, 2 to 17 percent of patients die within one year after a heart attack due to failure to resolve inflammation. More than 50 percent die within five years.
Using a mouse heart attack model, Ganesh Halade, Ph.D., and his University of Alabama at Birmingham colleagues have shown that knocking out one particular lipid-modifying enzyme, along with a short-term dietary excess of a certain lipid, can improve post-heart attack healing and clear inflammation. Halade, an assistant professor in the UAB Department of Medicine, hopes that future physicians will be able to use knowledge from studies like his to boost healing in patients after heart attacks and prevent heart failure.
“Our goal is healing, and we are reaching that goal,” he said of efforts in the UAB Division of Cardiovascular Medicine.
Why are lipids and lipid-modifying enzymes important in inflammation and resolving inflammation? Three key lipid modifying enzymes in the body change the lipids into various signaling agents. Some of these signaling agents regulate the triggering of inflammation, and others promote the reparative pathway.
The lipids modified by the enzymes are two types of essential fatty acids that come from food, since mammals cannot synthesize them. One is n-6 or omega-6 fatty acids, and the other type is n-3 or omega-3 fatty acids. The balance of these two types is important.
The Mediterranean diet, with a near balance of omega-3 and omega-6 fatty acids, promotes heart health. The Western diet, with large amounts of omega-6 fatty acids that greatly exceed the levels of omega-3 fatty acids, can lead to heart disease.
The three main lipid-modifying enzymes compete with each other to modify whatever fatty acids are available from the diet. So, Halade and colleagues asked, what will happen if we knock out one of the key enzymes, the 12/15 lipoxygenase?
They reasoned that this would increase the metabolites produced by the other two main enzymes, cyclooxygenase and cytochrome P450, because those enzymes would no longer have to compete with 12/15 lipoxygenase for lipids to modify. This could be a benefit, because the signaling lipids produced through the cyclooxygenase and cytochrome P450 pathways were already known to promote resolution and post-heart attack healing.
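The competition argument can be sketched as a toy flux-partition model (purely illustrative, with equal made-up enzyme activities; not the authors' kinetics):

```python
def partition(activities):
    """Share of available lipid substrate captured by each competing enzyme,
    assumed proportional to its activity (a deliberately crude model)."""
    total = sum(activities.values())
    return {enzyme: act / total for enzyme, act in activities.items()}

# Equal (hypothetical) activities for the three main lipid-modifying enzymes.
wild_type = partition({"12/15-LOX": 1.0, "COX": 1.0, "CYP450": 1.0})

# Knock out 12/15 lipoxygenase: the remaining two enzymes split the substrate.
knockout = partition({"COX": 1.0, "CYP450": 1.0})

# COX and CYP450 flux rises from 1/3 to 1/2 each, boosting their
# resolution-promoting metabolites, as the researchers reasoned.
print(wild_type["COX"], knockout["COX"])
```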
The UAB researchers found that knocking out the 12/15 lipoxygenase and feeding the mice a short-term excess of polyunsaturated fatty acids led to increased leukocyte clearance after experimental heart attack, meaning less chronic inflammation. It also improved heart function, increased the levels of bioactive lipids during the reparative phase of healing, and led to higher levels of reparative cytokine markers. Additionally, the heart muscle showed less of the fibrosis that is a factor in heart failure.
Besides congestive heart failure, persistent inflammation aggravates a vicious cycle in many cardiovascular diseases, including atherogenesis, atheroprogression, atherosclerosis and peripheral artery disease.
Halade says further mechanistic studies are warranted to develop novel targets for treatment and to find therapies that support the onset of left ventricle healing and prevent heart failure pathology.