
NNadir's Journal
October 22, 2017

Front Lines in the Battle Against Antibiotic Resistance and a Posthumous Synthesis.

My favorite section heading in the scientific review article I will discuss in this post, Natural Products as Platforms To Overcome Antibiotic Resistance (Wuest, Fletcher and Rossiter, Chem. Rev. 2017, 117, 12415–12474), is this one:

"Woodward’s Posthumous Synthesis of Erythromycin A."

Robert Burns Woodward is the only person to have synthesized Erythromycin A, which has this structure:



"Only person" is too strong a word; a better term would that Woodward's group is the only group to synthesize erythromycin A, the synthesis which was almost certainly largely designed by him, being completed by his students and post-docs after he died.

A nice internet riff on Woodward's synthesis of erythromycin A is here: Woodward Wednesdays

He did not, however, synthesize the drug after he died, although he was such a remarkable chemist, probably the greatest organic chemist ever, that one is inclined to think of him in quasi-mystical terms.

The completion of the work was conducted under Harvard Professor Yoshito Kishi, who had been one of Woodward's students.

Here is the final paper on this synthesis, published after Woodward died: Asymmetric total synthesis of erythromycin. 3. Total synthesis of erythromycin

Here are the triumphal remarks on the completion of the synthesis:

Completion of the synthesis of erythromycin was carried out in the following manner. Simultaneous deprotection of both the C-4″ hydroxyl group of the cladinose moiety and the C-9 amino group in 7d by Na–Hg/MeOH13 furnished (9S)-erythromycylamine (10a) [mp 126–129 °C, [α]25D −48.1° (c 0.59, CHCl3); 75% yield], which was found to be identical with an authentic sample prepared from natural erythromycin by a known method.20 Treatment of 10a with N-chlorosuccinimide (1 equiv) in pyridine at 25 °C gave 10b (mp 166–170 °C with partial melting at 130–134 °C), which was dehydrochlorinated by AgF in HMPA at 70 °C to yield erythromycinimine (10c). Hydrolysis of 10c in water at 50 °C afforded the corresponding ketone (40% overall yield from 10a), which was found to be identical with erythromycin (2) in all respects (1H NMR, mp, mmp, [α]D, mass, IR, and chromatographic mobility).22


Here, from the review cited at the opening of the post, is a scheme of Woodward's synthesis of erythromycin A:




Woodward, by the way, was an interesting fellow. When he was 11 years old, in 1928, he wrote to the German consulate in Boston to request copies of scientific papers relating to a class of chemical reactions now known as cycloadditions (specifically the Diels-Alder reaction). He entered MIT at 16 in 1933, was kicked out in 1934 for neglecting his studies, was readmitted in 1935, and was granted a bachelor's degree in 1936 and a Ph.D. in 1937, when he was 20 years old. It is said he never actually had much interaction with his Ph.D. advisers, and barely knew them, their "adviser" status being a mere formality. He joined the faculty of Harvard University shortly after, remaining there until his death in 1979. Besides designing the only successful total synthesis of erythromycin A, Woodward also supervised (with Albert Eschenmoser at ETH) the only successful total synthesis of vitamin B12. He was awarded the Nobel Prize in Chemistry in 1965.

Later in his life he worked with Roald Hoffmann - who was born a Jew in Poland and, between the ages of 7 and 9 (1943 to 1944), was hidden in an attic, Anne Frank style, where his mother read him geography texts - to formulate the Woodward-Hoffmann rules, for which Hoffmann, a chemistry professor at Cornell University, was awarded the Nobel Prize in 1981, two years after Woodward's death.

I have their book, one of my happiest possessions, in my personal library in the original Verlag Chemie edition, with an inserted "errata" page and the orbitals drawn in green and blue:




These rules govern the Diels-Alder reaction (and many other reactions) about which Woodward was inquiring when he was 11 years old.

As I remarked earlier in this space, one reason to do the synthesis of complex molecules is that such syntheses are high art, expressions of the beauty of the human mind:

A Beautiful Review Article on the Total Synthesis of Vancomycin.

Commercially, erythromycin is not obtained by organic synthesis; it is isolated from cell cultures, a process which makes it about as cheap as aspirin, and it is still a widely used drug. (Woodward's synthesis involved 48 steps; a synthesis of this type would be commercially - and environmentally - prohibitive.) However, the species it kills are rapidly evolving; many resistant strains are known.
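To make "commercially prohibitive" concrete: in a linear synthesis, the overall yield is the product of the per-step yields, so long routes decay exponentially. A minimal sketch in Python (the per-step yields are hypothetical round numbers, and Woodward's route was partly convergent, so this somewhat overstates the penalty):

```python
# Back-of-the-envelope: overall yield of a long *linear* synthesis is the
# product of the per-step yields. Per-step yields here are hypothetical,
# chosen only to illustrate the scaling over 48 steps.
for per_step_yield in (0.95, 0.90, 0.80):
    overall = per_step_yield ** 48
    print(f"{per_step_yield:.0%} per step over 48 steps -> {overall:.2%} overall")
# 95% per step -> ~8.5% overall
# 90% per step -> ~0.64% overall
# 80% per step -> ~0.002% overall
```

Even at a heroic 95% average yield per step, more than 90% of the material is lost along the way, which is why fermentation wins commercially.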

In Woodward's time, the goal of the organic synthesis of complex natural products was to prove their structure. Modern instrumentation coupled with modern software, high field NMR, high resolution mass spec, and x-ray and neutron diffraction systems have made organic synthesis less important in structural assignments.

Today, the goal of organic synthesis is less about structure and pure science and more about improving the "SAR" (structure activity relationships) of molecules that are modified versions of the natural products. Organic synthesis can also, in some cases, resolve drug shortages caused by the rarity of the species producing the natural product. An example is the anti-neoplastic (anti-cancer) drug Taxol, originally isolated from the bark of a slow-growing yew tree in the Pacific Northwest, but ultimately provided commercially by partial synthesis from a precursor found in far more renewable yew needles. Another reason for modifying natural products is that they may not be optimized for use as drugs; they may exhibit unacceptable toxicity which needs moderation, or poor bioavailability, short half-lives in vivo, or poor stability.

So this is the goal of modern-day synthesis of natural products: to improve upon nature, optimizing natural products to serve humanity, or to improve their availability.

This long-winded intro about R.B. Woodward brings me to the paper.

One of the many crises now before humanity is the fact that the effectiveness of antibiotics is being defeated by evolution, irrespective of whether the nut cases in the Republican party "believe" in evolution or not. (Evolution is not about "belief"; it is a fact, and morons who cannot accept facts are, um, well, morons.) This evolution is being driven by overuse and by misuse, often on the part of patients who stop taking their antibiotics when they start feeling better, even though some of the infectious organisms still survive in their systems - indeed, precisely those organisms with the strongest resistance to the antibiotic deployed to defeat them. The consequences should be self-evident.

Antibiotic development is not a big money maker in the pharmaceutical industry by the way. The industry makes more money on drugs that are palliative than drugs which cure diseases. A blood pressure drug that a patient is required to take for the rest of his or her life is going to make more money than a drug that cures a bacterial infection and only needs to be taken for a week or two.

A nice cartoon from the paper cited at the beginning of this post demonstrates the modes of action of almost all antibiotics now on the market:



Another graphic shows the number of people who contract, and die from, antibiotic-resistant infections each year.



Figure 2. Total infections (gray) and deaths (black) in the US associated with various pathogenic bacteria.11 CRE = carbapenem-resistant Enterobacteriaceae; VRE = vancomycin-resistant enterococci; MDR = multidrug-resistant.


Remarks from the introduction of the review article:

Among the greatest achievements of humankind in recent history stands the discovery and production of penicillin as a life-saving antibiotic. However, nearly a century of unchecked usage has rendered the world’s supply of antibiotics severely weakened; Sir Alexander Fleming noted in his Nobel lecture that underdosage can apply the selective pressure that induces bacteria to evolve resistance to these drugs. In this review, we contrast the traditional method of semisynthetic modifications to natural products with modern synthetic approaches to develop new antibiotics around the privileged scaffolds that informed drug discovery for decades in order to overcome contemporary antibiotic resistance. In the 90 years since the discovery of penicillin (1), natural products have provided a major foundation for the development of antibiotic drugs. The reliance on natural products to provide new molecular entities for virtually every disease is also well established.1 Of the nine antibiotic classes in Figure 1, six represent naturally occurring compounds, with only three (the sulfonamides, fluoroquinolones, and oxazolidinones) conceived entirely through synthetic chemistry. We note the impressive structural diversity and complexity within the natural product antibiotics especially when compared to the synthetic classes.

Scientists have warned for decades that bacteria are rapidly evolving resistance to antibiotics.2–4 Resistance has proliferated due to a confluence of two key factors: the frequent prescription against infections of a nonbacterial nature, such as viral infections, and unregulated usage, which can lead to sublethal doses, permitting resistance to spread rapidly.5 We also observe that prescribing habits vary drastically from country to country; the United States is particularly likely to use recently developed antibiotics rapidly, possibly shortening their lifetime of efficacy.6 Analysis of the IMS Health Midas database indicated that between 2010 and 2014 consumption of antibiotics worldwide increased by 36%;7 the carbapenems and polymyxins, two "last resort" drugs, have increased in usage by 45% and 13%, respectively. This resistance is extensively observed in hospitals where immunocompromised patients are particularly vulnerable.8 Hospital-acquired resistant infections have spread rapidly since the initial discovery of sulfonamide- and penicillin-resistant strains shortly after the introduction of these drugs in the 1930s and 1940s.9,10 In the U.S. and U.K. this problem has not abated, as nearly 40–60% of hospital-acquired S. aureus strains are methicillin-resistant.11 These public health threats will continue to rise without new antibiotics and meaningful changes in treating infections. Beyond prescription in humans, antibiotics find extensive use as prophylactic agricultural supplements to promote livestock growth and prevent diseases. It is estimated that the US livestock industry consumes a staggering 80% of antibiotics produced.5 Antibiotic-resistant strains of Salmonella have been identified in ground meat,12 and antibiotic use in livestock has been strongly linked to fluoroquinolone-resistant Salmonella.1


The paper is 51 pages long, and regrettably I cannot reproduce it all in a post like this. The point of the article is to review techniques for the utilization and semi-synthesis of natural products that show (out of evolutionary necessity) bactericidal effects.

It's chock full of beautiful synthetic schemes; if you can find your way to a good scientific library, and have a love and understanding of organic chemistry, an afternoon reading it might be really well spent.

One of the funnier parts of the article (well, it would be funny were it not so awful) comes after the part just quoted above:

The need for new antibiotics is increasingly widely appreciated as a pressing concern by governments, scientists, and the general public.14,15 These factors, in tandem with the reduced research and development toward discovering new antibiotics, have worsened the recent eruption of antibiotic resistant bacterial populations across the globe. As the golden age of antibiotics has clearly ended, the most pessimistic view of the current state of affairs is that a postantibiotic era may be approaching.


The bold is mine.

Um, not your government. Your government is controlled by ignorant clowns and stupid people who hate science because they are not bright enough or educated enough to know a damned thing about it other than that they hate it. And while the neo-nazis in the White House and their racist pals in Congress may be slightly worse than the general public, the general public - and we need to include some people on the left as well: anti-vaxxers, anti-nukes and their ilk, anti-this, anti-that - belongs to the set that has put this country well on the path to a post-scientific age.

The conclusion of the review begins with a restatement of what I said above:

In most talks given by natural product chemists, it is commonly noted that while natural products can have outstanding biological activities, they are often poor choices for drugs, exemplified by erythromycin. While semisynthetic modifications have historically redirected this potent activity into a clinically useful drug, we have demonstrated that the need for new antibiotics to overcome resistant pathogens is so great as to require new generations of drugs occupying previously unexplored regions of chemical space. We have shown herein that the most direct, efficient, and fruitful method of generating drugs that can evade bacterial resistance mechanisms is through the power of total synthesis. We have outlined synthetic achievements toward many antibiotic scaffolds, both traditional and unexplored, and have discussed how these compounds fare against pathogenic bacteria. Traditional SAR studies can be undertaken to identify the key bioactive moieties, which can then be modified to generate more potent compounds. Additionally, synthetic approaches such as DOS, FOS, and CtD can be used to construct unprecedented scaffolds bearing the complexity of natural products, despite that these molecules may be foreign to Nature to the best of our knowledge. We hope that the examples discussed herein will spark further inspiration in the synthetic community to continue exploring innovative targets and methods to ensure a sustainable antibiotic supply.


And it ends with this plaintive scientific call:

Despite the potential for new antibiotic isolates and scaffolds, we must take care to preserve the efficacy of drugs currently prescribed (and overprescribed!). This requires improving upon our antibiotic stewardship by encouraging reduction in both the over prescription and misuse of these medicines. We must also continue to educate the public about the causes and persistence of antibiotic resistance, in part to drive public favor for a greater allocation of resources to address this crisis. Although the recognition of the term “antibiotic resistance” has increased, the understanding of how to avoid it and how it is caused has not been translated as effectively.15 Only by actively combining scientific innovation and communication can we avoid a postantibiotic era.


I've bolded the line to wish the fine scientists who wrote this review good luck with that. We, in this country, have just established ourselves as a nation of morons, an international laughingstock with an educational system being directed by an Amway scammer who absolutely hates education.

I don't mean to be too depressing.

Have a nice Sunday in spite of me.
October 20, 2017

Moisture Swing Absorption/Desorption of CO2 Using Polymeric Ionic Liquids.

The paper I will discuss in this thread is this one: Designing Moisture-Swing CO2 Sorbents through Anion Screening of Polymeric Ionic Liquids (Wang et al, Energy Fuels, 2017, 31 (10), pp 11127–11133)

All of humanity's efforts - or lack of effort mixed with an unhealthy dollop of denial - to address climate change have failed.

No one now living will ever again see a reading of under 400 ppm of the dangerous fossil fuel waste carbon dioxide in the atmosphere, an irreversible milestone that was passed last year.

On the right, the open hatred of science has put an uneducated and unintelligent orange human nightmare in the White House who is actually trying to revive the worst dangerous fossil fuel, coal, while on the left, there has been a delusional embrace of so called "renewable energy" that has led to the very dangerous, and frankly criminal, surge in the dangerous natural gas industry, for which so called "renewable energy" is merely lipstick on the pig.

So called "renewable energy" did not work, is not working, and will not work.

Thus it will fall to future generations - however many, if any, survivors there are - to remove carbon dioxide from the atmosphere, an engineering challenge that is enormous, to the extent it is even feasible. Our practiced contempt for them will leave them with few resources with which to address it.

In recent years I've been thinking and reading a great deal about this challenge. It seems to me that there are only two reasonable pathways that might work: one is extraction of carbon dioxide from seawater (where on a volume basis it is far more concentrated than in air), and the other is the pyrolytic processing of biomass, which - because life is self-replicating - can provide huge surface areas capable of absorbing CO2.

Right now, of course, the primitive way that biomass is utilized - combustion - is responsible for roughly half of the 7 million people killed each year by air pollution, while mindless dullards whine about, for example (one actually sees this kind of thing), the collapse of a tunnel at the Hanford site near Richland, Washington, where radioactive materials are stored.

However, in pyrolytic treatment of biomass, the biomass is heated in a closed system that is not vented to the atmosphere. If the heat to drive the pyrolytic reactions is nuclear heat - using the only real resource that we are leaving to future generations: used nuclear fuel, depleted uranium, and the thorium waste from the stupid and wasteful wind turbine/electric car industry - pyrolytic treatment of biomass can almost certainly be carried out in a way that is carbon negative, to the extent that carbon is captured in products like graphite.

What is necessary in this case is the ability to separate carbon dioxide from other gases, notably hydrogen.

It is thus with some interest that I read the paper linked at the beginning of this post, published by scientists at the State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou, Zhejiang Province 310018, P.R. China. (China loses close to 2 million people per year to air pollution, and unlike Americans, the Chinese are actually interested in solving the problem.)

An ionic liquid is a salt, almost always at least partially organic, that is liquid at or near room temperature. A great deal of research has been conducted on ionic liquids, and they are very promising reagents for the capture of carbon dioxide.

I discussed them earlier in this space, providing a link to a lecture by the American scientist Joan Brennecke, an expert in ionic liquids:

On the Solubility of Carbon Dioxide in Ionic Liquids.

The work by the Chinese scientists expands greatly on Dr. Brennecke's talk and offers some very novel approaches.

As the title suggests, this is a moisture swing approach to the capture and release of carbon dioxide. There are several types of "swings" utilized in the separation of gases. "Pressure swings" rely on a material that will preferentially absorb or release one gas faster than another when subjected to changes in pressure; commercial nitrogen and oxygen generators work this way to separate either oxygen or nitrogen from air, and consist of little more than a compressor and a valve. "Temperature swings" rely on heating and cooling a material that absorbs a gas; monoethanolamine is such an agent, often proposed (but seldom used on a meaningful scale) for carbon capture.

This "moisture swing" is unique in as much as it relies on wetting and drying a polymeric ionic liquid.

Here's some text from the paper:

Carbon dioxide capture and storage (CCS) is regarded as one of the most effective approaches to alleviating global warming.(1) Among the developed CO2 capture technologies, adsorption based on a solid material is advantageous, since sorbents have unique interfacial properties, such as porous structures, modifiable functional groups,(2, 3) and low emissions to the environment.(4) Recently, polymeric ionic liquids (PILs) have been developed as a family of promising CO2 sorbents, as they possess the unique characteristics of ionic liquids (ILs) and the feasibility of a macromolecular framework.(5-7) Both the CO2 adsorption capacity and rate of PILs can be substantially higher than those of their corresponding ionic liquid monomers.(8) Similar to the fine-tunability of ILs,(9) these properties of PILs can further be tuned by the choice of cations and anions.

Among PILs, the quaternary-ammonium-based PILs have been highlighted in the literature due to their higher stability and CO2 sorption capacity than those of imidazolium-based PILs.(10) More interestingly, when coordinated with relatively basic anions such as OH– and CO32–, these PILs show the unique property of moisture-swing adsorption (MSA).(11) During the moisture-swing cycle, the sorbents adsorb CO2 in a dry atmosphere and release CO2 in a humid atmosphere. For poly[4-vinylbenzyltrimethylammonium carbonate] (P[VBTEA][CO32–]), the equilibrium partial pressure of CO2 under wet conditions is 2 orders of magnitude higher than that under dry conditions.(11) This moisture-induced cycle utilizes the free energy released by water evaporation, and thus, it can avoid the use of high-grade heat for sorbent regeneration and is also environmentally benign...


Humanity will never invent a form of energy that is as safe, as sustainable, or as reliable as nuclear energy. However, nuclear energy is not perfect, of course (a point raised frequently by stupid people with selective attention who love to burn gas and coal while ranting about, say, Fukushima). From my perspective, the biggest single problem with nuclear energy is what to do with the waste heat.

This system, however, offers a wonderful way to utilize waste heat in, say, hot climates, where such heat may be of less value than in climates having cold spells - not that this planet will have cold spells as frequently as it once did.

The authors continue:

...Of particular interest in this work is the screening of anions for PIL materials with MSA ability for CO2. Previous studies have shown that the type of anion plays a key role in defining the PIL features.(12) After introducing CO32– or OH– anions into P[VBTEA], the sorbents exhibited such a strong CO2 affinity that they were proposed for directly capturing CO2 from the ambient air (400 ppm).(11, 13-16) For quaternary-ammonium-based salt hydrates with F– or acetate (Ac–), the CO2 absorption capacity was observed to be affected by the hydration state,(17) which is similar to the MSA. The CO2 adsorption isotherms indicate that they are suitable for gas separation from concentrated sources (15–100 kPa).(17) To gain insight into the MSA, several theoretical studies have been conducted on ionic interactions, reaction pathways, and hydration energy. Density functional theory (DFT) calculations by Wang et al.(18) demonstrated the reaction pathways of proton transfer for the PIL with the anion CO32–. The results showed that the hydrated water acts as both a reactant and a catalyst during CO2 adsorption. By building a molecular dynamic model for ion pairs of the quaternary-ammonium cation (N(CH3)4+) and CO32– in a confined space, Shi et al.(19) found that the free energy of CO32– hydrolysis in nanopores decreases with the decrease in water availability. However, how the chemical structures, especially the types of anion, of quaternary-ammonium-based PILs would affect the moisture-swing performance for CO2 capture remains unknown. Therefore, extensive work is required to reveal the relationship between the structural and physicochemical properties of PILs, especially through theoretical approaches.


In this paper, the ionic liquid is bound to polystyrene, as the following picture from the paper shows:




The structure of the ionic liquid is optimized so that it can react freely with water, and the authors screen various anions to form ion pairs with the positively charged polymeric backbone, such that the structure can accommodate three water molecules.



The thermodynamic reaction diagram is shown:



The structures are optimized, reflecting the finding that the fluoride ion is among the best counterions:



The authors conclude thusly:

Quaternary-ammonium-based PILs are promising sorbents for CO2 capture. In this work, theoretical studies were performed to systematically investigate the effect of varying the counteranion on CO2 adsorption. Our results showed that the PIL with carbonate as the counteranion has the lowest activation energy, strongest CO2 affinity, and largest swing size, and it functions through a two-step mechanism, which indicates that it is a better candidate as a sorbent to capture CO2 from ultralow concentration mixtures, such as that of air. The PIL with fluoride as the counteranion has a low activation energy, strong CO2 affinity, and medium-to-large swing size, and it adsorbs CO2 through a two-step mechanism, owing to the unique ability of fluoride to strongly attract protons. The PIL with acetate as the counteranion has a high activation energy, weak CO2 affinity, and small swing size and functions with a one-step mechanism, which indicates that it is suitable for capturing CO2 at a high partial pressure because of its large capacity and its feasibility to be regenerated through conventional approaches. Further investigations revealed that the repulsion between the two quaternary ammonium cations, which interact with the carbonate anion or two fluoride anions, could promote the dissociation of hydrated water and lower the activation energy of the CO2 adsorption. The two-step reaction pathways also exhibit low activation energy owing to the relatively small structural changes of each step. Our findings could provide a fundamental understanding of CO2 capture by quaternary-ammonium-based PILs and pave the way toward determining the optimal structure of a PIL to be used for CO2 capture in specific circumstances.


This is interesting. Note that if the ionic liquids themselves are synthesized utilizing carbon dioxide as a starting material - this is definitely possible, particularly using biological lignin as a source for the aromatic rings - the carbon is sequestered in the capture agent itself.

I thought this paper was very cool and thought I'd share it.

Have a nice Friday tomorrow.
October 19, 2017

Long-Persistent Organic Luminescent Systems.

Here's another cool paper: Organic long persistent luminescence

Long persistent luminescence (LPL) materials—widely commercialized as ‘glow-in-the-dark’ paints—store excitation energy in excited states that slowly release this energy as light1. At present, most LPL materials are based on an inorganic system of strontium aluminium oxide (SrAl2O4) doped with europium and dysprosium, and exhibit emission for more than ten hours2. However, this system requires rare elements and temperatures higher than 1,000 degrees Celsius during fabrication, and light scattering by SrAl2O4 powders limits the transparency of LPL paints1. Here we show that an organic LPL (OLPL) system of two simple organic molecules that is free from rare elements and easy to fabricate can generate emission that lasts for more than one hour at room temperature. Previous organic systems, which were based on two-photon ionization, required high excitation intensities and low temperatures3. By contrast, our OLPL system—which is based on emission from excited complexes (exciplexes) upon the recombination of long-lived charge-separated states—can be excited by a standard white LED light source and generate long emission even at temperatures above 100 degrees Celsius. This OLPL system is transparent, soluble, and potentially flexible and colour-tunable, opening new applications for LPL in large-area and flexible paints, biomarkers, fabrics, and windows. Moreover, the study of long-lived charge separation in this system should advance understanding of a wide variety of organic semiconductor devices4.




...Upon exposure to ultraviolet or visible light, some substances absorb the excitation energy and release it as a differently coloured light either quickly, as fluorescence, or slowly, in the form of either phosphorescence or long persistent luminescence. Exploiting the slowest emission process, LPL materials have been widely commercialized as glow-in-the-dark paints for watches and emergency signs1, and are being explored for application in in vivo biological imaging because their long-lived emission makes it possible to take time-resolved images long after excitation5. In the mid-1990s, a highly efficient LPL system was developed that uses SrAl2O4 doped with europium and dysprosium, and this inorganic system forms the basis of most commercial glow-in-the-dark paints because of its long emission (more than ten hours) and high durability2. However, this system requires not only rare elements for long-lived emission but also very high fabrication temperatures of more than 1,000 °C. Moreover, the manufacturing of paints from the insoluble SrAl2O4 requires many steps, including grinding of the compounds into micrometre-scale powders for dispersion into solvents or matrices, and light scattering by the powders prevents the formulation of a transparent paint1. The realization of LPL from organic molecules would solve many of these problems.


Their glow in the dark organic molecules can stay lit up for about an hour.
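As an illustration of why the two emission channels look so different: ordinary phosphorescence dies off exponentially, while afterglow fed by the slow recombination of charge-separated states typically follows a power law, which is what lets it remain visible for so long. The sketch below is purely illustrative; the lifetime, exponent, and regularization constant are made-up numbers, not fits to the paper's data:

```python
import math

# Illustrative decay profiles (all parameters hypothetical): exponential
# decay for ordinary phosphorescence versus a power-law tail of the kind
# characteristic of charge-recombination afterglow.

def phosphorescence(t, tau=1.0):
    """Exponential decay with lifetime tau (seconds)."""
    return math.exp(-t / tau)

def recombination_afterglow(t, t0=0.1, m=1.0):
    """Power-law decay ~ t^-m, regularized near t = 0."""
    return (t0 / (t0 + t)) ** m

for t in (1, 10, 60, 600, 3600):  # seconds after the light is switched off
    print(f"t = {t:6d} s: exponential {phosphorescence(t):.3g}, "
          f"power law {recombination_afterglow(t):.3g}")
```

After an hour the exponential channel is utterly dead, while the power-law tail has only fallen by a few orders of magnitude - still enough, for a bright enough initial state, to see in the dark.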
October 19, 2017

Ion Sieves from Graphene Oxide.

Cool paper, this one: Ion sieving in graphene oxide membranes via cationic control of interlayer spacing (Wu, Jin, Fang, Li et al Nature 550, 380–383 (19 October 2017))

We're in a hell of a place with water and metals on this planet, whether we know it or not, and our foolish "investment" in so called "renewable energy" - which is yet another distributed (and thus difficult to control) "thing" in our portfolio of "things" - will make it worse.

This kind of filter could potentially be utilized to manage distributed waste, which is what distributed things become after a short interlude.

As always, the caveat is the requirement for energy.

Anyway, some text from the very interesting materials science paper:

Graphene oxide membranes—partially oxidized, stacked sheets of graphene1—can provide ultrathin, high-flux and energy-efficient membranes for precise ionic and molecular sieving in aqueous solution2, 3, 4, 5, 6. These materials have shown potential in a variety of applications, including water desalination and purification7, 8, 9, gas and ion separation10, 11, 12, 13, biosensors14, proton conductors15, lithium-based batteries16 and super-capacitors17. Unlike the pores of carbon nanotube membranes, which have fixed sizes18, 19, 20, the pores of graphene oxide membranes—that is, the interlayer spacing between graphene oxide sheets (a sheet is a single flake inside the membrane)—are of variable size. Furthermore, it is difficult to reduce the interlayer spacing sufficiently to exclude small ions and to maintain this spacing against the tendency of graphene oxide membranes to swell when immersed in aqueous solution21, 22, 23, 24, 25. These challenges hinder the potential ion filtration applications of graphene oxide membranes. Here we demonstrate cationic control of the interlayer spacing of graphene oxide membranes with ångström precision using K+, Na+, Ca2+, Li+ or Mg2+ ions. Moreover, membrane spacings controlled by one type of cation can efficiently and selectively exclude other cations that have larger hydrated volumes.





There have been previous efforts to tune the interlayer spacing. For example, it can be widened, to increase the permeability of the graphene oxide membrane (GOM), by intercalating large nanomaterials21, 22 as well as by cross-linking large and rigid molecules23. Reducing GOMs can lead to a sharp decrease in the interlayer spacing, but renders them highly impermeable to all gases, liquids and aggressive chemicals24, 25. Recent work reported a way of sieving ions through GOMs by encapsulating the graphene oxide sheets in epoxy films and varying the relative humidity27 to tune the interlayer spacing. It remains difficult to reduce the interlayer spacing sufficiently (to less than a nanometre) to exclude small ions while still permitting water flow and enabling scalable production25. This limits the potential of GOMs for separating ions from bulk solution or for sieving ions of a specific size range from a mixed salt solution—such as the most common ions in sea water and those in the electrolytes of lithium-based batteries and super-capacitors (Na+, Mg2+, Ca2+, K+ and Li+)2, 25. Here we combine experimental observations and theoretical calculation to show that cations (K+, Na+, Ca2+, Li+ and Mg2+) themselves can determine and fix the interlayer spacing of GOMs at sizes as small as a nanometre


From the conclusion:

In summary, we have experimentally achieved facile and precise control of the interlayer spacing in GOMs, with a precision of down to 1 Å, and corresponding ion rejection, through the addition of one kind of cation. This method is based on our understanding of the strong noncovalent hydrated cation–π interactions between hydrated cations and the aromatic ring, and its production is scalable. We note that our previous density functional theory computations show that other cations (Fe2+, Co2+, Cu2+, Cd2+, Cr2+ and Pb2+) have a much stronger cation–π interaction with the graphene sheet26, suggesting that other ions could be used to produce a wider range of interlayer spacings. Overall, our findings represent a step towards graphene-oxide-based applications, such as water desalination and gas purification, solvent dehydration, lithium-based batteries and supercapacitors and molecular sieving.
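The sieving rule in that conclusion can be caricatured in a few lines of code: a membrane whose spacing has been fixed by one cation should pass only ions whose hydrated size is no larger than that cation's. The hydrated diameters below are rough literature values in ångströms, and the hard pass/block cutoff is a deliberate simplification of the real physics:

```python
# Toy version of the sieving rule: a membrane equilibrated with one cation
# passes only ions whose hydrated size is no larger than that cation's.
# Hydrated diameters are rough literature values (angstroms); illustrative.
HYDRATED_DIAMETER = {
    "K+": 6.6, "Na+": 7.2, "Li+": 7.6, "Ca2+": 8.2, "Mg2+": 8.6,
}

def permeating_ions(control_cation):
    """Ions expected to pass a membrane whose spacing is fixed by control_cation."""
    cutoff = HYDRATED_DIAMETER[control_cation]
    return [ion for ion, d in HYDRATED_DIAMETER.items() if d <= cutoff]

for control in ("K+", "Na+", "Mg2+"):
    print(f"{control}-controlled membrane passes: {permeating_ions(control)}")
# A K+-controlled membrane is predicted to block Na+, Li+, Ca2+ and Mg2+,
# consistent with the selectivity described in the quoted abstract.
```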


Have a nice day tomorrow.
October 19, 2017

On the future of work...

There are some interesting, if worrisome, reports in the current issue of Nature, one entitled "The Future of Work"


From a news item in the current issue of Nature:

Last year, entrepreneur Sebastian Thrun set out to augment his sales force with artificial intelligence. Thrun is the founder and president of Udacity, an education company that provides online courses and employs an armada of salespeople who answer questions from potential students through online chats. Thrun, who also runs a computer-science lab at Stanford University in California, worked with one of his students to collect the transcripts of these chats, noting which resulted in students signing up for a course. The pair fed the chats into a machine-learning system, which was able to glean the most effective responses to a variety of common questions.

Next, they put this digital sales assistant to work alongside human colleagues. When a query came in, the program would suggest an appropriate response, which a salesperson could tailor if necessary. It was an instantaneously reactive sales script with reams of data supporting every part of the pitch. And it worked; the team was able to handle twice as many prospects at once and convert a higher percentage of them into sales. The system, Thrun says, essentially packaged the skills of the company's best salespeople and bequeathed them to the entire team — a process that he views as potentially revolutionary. “Just as much as the steam engine and the car have amplified our muscle power, this could amplify our brainpower and turn us into superhumans intellectually,” he says.

The past decade has seen remarkable advances in digital technologies, including artificial intelligence (AI), robotics, cloud computing, data analytics and mobile communications. Over the coming decades, these technologies will transform nearly every industry — from agriculture, medicine and manufacturing to sales, finance and transportation — and reshape the nature of work. “Millions of jobs will be eliminated, millions of new jobs will be created and needed, and far more jobs will be transformed,” says Erik Brynjolfsson, who directs the Initiative on the Digital Economy at the Massachusetts Institute of Technology in Cambridge.


http://www.nature.com/polopoly_fs/7.47100.1508235986!/image/Future-of-work_graphic1.jpg_gen/derivatives/landscape_630/Future-of-work_graphic1.jpg

This is actually slightly encouraging. Robots do better in extreme environments than human beings, and since we are well on the way to making this entire planet an extreme environment, it's possible someone, um, something, will be here to um, "enjoy?!?" it.

In another news item in the same issue, there's a report of a computer that learned to play the game "Go" without human intervention and taught itself, in days, to beat the world's best players.

Self-taught AI is best yet at strategy game Go


Artificial-intelligence program AlphaGo Zero trained in just days, without any human input.


An artificial intelligence (AI) program from Google-owned company DeepMind has reached superhuman level at the strategy game Go — without learning from any human moves.

This ability to self-train without human input is a crucial step towards the dream of creating a general AI that can tackle any task. In the nearer-term, though, it could enable programs to take on scientific challenges such as protein folding or materials research, said DeepMind chief executive Demis Hassabis at a press briefing. “We’re quite excited because we think this is now good enough to make some real progress on some real problems.”

Previous Go-playing computers developed by DeepMind, which is based in London, began by training on more than 100,000 human games played by experts. The latest program, known as AlphaGo Zero, instead starts from scratch using random moves, and learns by playing against itself. After 40 days of training and 30 million games, the AI was able to beat the world's previous best 'player' — another DeepMind AI known as AlphaGo Master. The results are published today in Nature1, with an accompanying commentary2.

October 19, 2017

An FDA panel has recommended approval of the first gene therapy drug.

FDA advisers back gene therapy for rare form of blindness. (News Item, Nature, Vol. 550 Issue 7676)



Advisers to the US Food and Drug Administration (FDA) have paved the way for the agency’s first approval of a gene therapy to treat a disease caused by a genetic mutation.

On 12 October, a panel of external experts unanimously voted that the benefits of the therapy, which treats a form of hereditary blindness, outweigh its risks. The FDA is not required to follow the guidance of its advisers, but it often does. A final decision on the treatment, called voretigene neparvovec (Luxturna), is expected by 12 January.

An approval in the lucrative US drug market would be a validation that gene-therapy researchers have awaited for decades. “It’s the first of its kind,” says geneticist Mark Kay of Stanford University in California, of the treatment. “Things are beginning to look more promising for gene therapy.”

Luxturna is made by Spark Therapeutics of Philadelphia, Pennsylvania, and is designed to treat individuals who have two mutated copies of a gene called RPE65. The mutations impair the eye’s ability to respond to light, and ultimately lead to the destruction of photoreceptors in the retina.

The treatment consists of a virus loaded with a normal copy of the RPE65 gene. The virus is injected into the eye, where the gene is expressed and supplies a normal copy of the RPE65 protein.

In a randomized controlled trial that enrolled 31 people, Spark showed that, on average, patients who received the treatment improved their ability to navigate a special obstacle course1. This improvement was sustained for the full year during which the company gathered data. The control group, however, showed no improvement overall. This was enough to convince the FDA advisory committee that the benefits of the therapy outweigh the risks.

Long road

That endorsement is an important vote of confidence for a field that has struggled over the past 20 years. In the early 1990s, gene therapy was red hot, says David Williams, chief scientific officer at Boston Children’s Hospital in Massachusetts. “You couldn’t keep young people out of the field,” he says. “Everyone wanted in.” Then came the death of a young patient enrolled in a gene-therapy clinical trial, and the realization that a gene therapy used to treat children with an immune disorder could cause leukaemia.

Investors backed away from gene therapy, and some academics grew scornful of it. Although European regulators approved one such therapy in 2012, for a condition that causes severe pancreatitis, many doubted that it worked. (The company that makes it has announced that it will not renew its licence to market the drug when it expires on 25 October.) “You’re too smart to work in this field,” a colleague told Kay. “It’s a pseudoscience.”

But some researchers kept plugging away at the problem, improving the vectors that shuttle genes into human cells. Over time, new clinical trials began to show promise, and pharmaceutical companies became more interested in developing treatments for rare genetic diseases. Gradually, investors returned...


These people will be, of course, GMO, and we can look for the bourgeois assholes at Greenpeace to protest their existence.

Blindness has never worried Greenpeace types, as we can see from their awful and frankly criminal campaign against golden rice, which might have addressed vitamin A deficiencies in, um, poor people.

It is perfectly acceptable, of course, to make people suffer if people who neither understand, nor like, nor are competent to understand science object loudly.

This is good news in any case.

I am involved (peripherally) professionally in a project involving gene therapy for a disease that kills people. Thus I find this approval encouraging.



October 14, 2017

Despacito/Desperate Cheeto.

A senior person in my wife's office is a Trumper; she was out yesterday, and the group left behind, including one American native of Puerto Rico, had a celebration of "Less than Sympathy for the Devil."

As an old guy decidedly on the nerdist side, I'm certainly unaware of what's hottest and most notable in pop music.

But I understand this song - allegedly slightly oversexed (I don't speak Spanish) - from Puerto Rico, the devastated American island hated by the racist-in-chief, is a big hit:

<iframe width="854" height="480" src="

" frameborder="0" allowfullscreen></iframe>

My wife showed me this video yesterday, so that I could understand the humor of "Desperate Cheeto."

<iframe width="854" height="480" src="
" frameborder="0" allowfullscreen></iframe>

Enjoy



October 12, 2017

Two Interesting Papers On the Utilization of Low Grade Heat.

It is an incontrovertible law of thermodynamics - the famous second law - that, for a fixed heat sink, the system that can sustain the highest temperatures will be the most efficient.

In the dangerous natural gas industry - which is the fastest growing energy business in the world, if one is intelligent enough and educated enough to understand that the peak power of a so called "renewable energy" facility is nowhere near its average continuous power (intelligence and education which seem increasingly rare) - the most efficient systems are combined cycle systems, which can demonstrate efficiencies of close to 60%. In a combined cycle gas plant, the gas is burned and expands against a turbine at temperatures that would actually exceed the melting point of the turbine blades were they not protected by thermal barrier coatings. The exhaust leaving the turbine is still hot enough to boil water, even under pressure, and as such is used to power a standard Rankine steam cycle.
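For readers who want the second-law point in numbers: the Carnot limit, eta = 1 - T_cold/T_hot (temperatures in kelvin), caps the efficiency of any heat engine, which is why turbine inlet temperature is everything. Real machines fall well short of these bounds; the temperatures in this sketch are illustrative round numbers, not plant data:

```python
# The second-law point made quantitative: the Carnot limit
# eta = 1 - T_cold / T_hot caps the efficiency of any heat engine.
# Temperatures below are illustrative round numbers.

def carnot_efficiency(t_hot_c, t_cold_c):
    """Carnot limit for hot/cold reservoir temperatures given in Celsius."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

for label, t_hot in (("gas turbine inlet", 1400),
                     ("supercritical steam", 600),
                     ("low-grade waste heat", 150)):
    print(f"{label:22s} ({t_hot:4d} C, 25 C sink): "
          f"Carnot limit {carnot_efficiency(t_hot, 25):.0%}")
```

A 150 °C source against a 25 °C sink is Carnot-limited to about 30%, and practical machines extract far less - which is exactly why waste heat is so often simply dumped.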

I discussed this state of affairs in a relatively recent post in this space: An interesting thesis on the utility of MAX phases in the manufacture of turbine blades.

Despite its use in high efficiency systems, and despite having the so called "renewable energy" industry function as a smokescreen for its use, natural gas is not sustainable. Its continued use is a crime against all future generations: humans who are babies today, and humans who will be babies 500 years from now, assuming that there will be humans in 500 years.

The only sustainable form of energy is nuclear energy, whether we admit it to ourselves or prefer to go along stupidly claiming otherwise, and nuclear energy, and only nuclear energy, is capable of continuously providing combustion-free high temperatures.

The problem, of course, is that the laws of thermodynamics require that the heat must go somewhere, and that "somewhere" is most often a body of water. (This is true not only of nuclear plants, but also of gas plants and coal plants. This is the subject of considerable public ignorance, which assumes, incorrectly, that only nuclear plants have cooling towers.) Where thermal electric plants operate away from ocean water, they are generally net water consumers. This point is made in the first of the two papers I will briefly discuss in this post: A Combined Heat- and Power-Driven Membrane Capacitive Deionization System (Hatzell, Hatzell and Zhang, Environ. Sci. Technol. Lett., Article ASAP, published online October 2, 2017)

The paper is open sourced, anyone can read it.

The introduction says it all:

Managing energy consumption during water treatment processes and water use during energy generation is a critical component of the water–energy nexus.(1) Thermoelectric power plants account for 38% of fresh water withdrawals in the United States, and a majority of this water is used for on-site cooling (~80%) and power generation (~10%).(2, 3) Technologically, dry cooling could aid in minimizing the demand for water during cooling, and low-energy water treatment technologies could reduce the amount of energy spent on boiler water treatment. Furthermore, improvements made in treating boiler water have a direct impact on improving plant thermal efficiencies, as high total dissolved solids (TDS) result in a low rate of heat transfer due to corrosion and fouling.(4) Boiler water treatment will become increasingly important because high-efficiency supercritical based power plants require more stringent water quality.(4)

Treating water to pure and ultrapure levels can be energy intensive and traditionally requires treatment strategies that combine softening with multiple passes through a reverse osmosis (RO) system. Most energy generation and industrial sites that require pure water also have access to an abundance of waste heat, which could act as an ideal free energy source for water treatment.(5) Therefore, developing synergistic approaches to use this “waste energy” source has become desirable. Currently, indirect and direct means for converting low-grade waste heat into deionized water do exist. Indirect approaches include those that convert heat to power through technologies such as thermoelectric devices and then use that power to operate a water treatment system.(6) While possible, undesirable energy conversion losses, larger system footprints, and cost typically limit their practical implementation.


Here is what the authors claim:

Here, we aim to detail a process for harvesting thermal energy within an electrochemically driven deionization system termed membrane capacitive deionization (MCDI). MCDI offers several advantages for effective brackish water treatment (low specific energy consumption), yet is purely driven by electrical energy.(14) We experimentally investigate the potential for harvesting thermal energy through exploiting the electrostatic and membrane potentials dependence on temperature. We also highlight the role heat plays in limiting losses that arise when moving MCDI toward high-water recovery operating conditions.


The details are in the paper, which is, again, open sourced.

Here's a nice cartoon that suggests what is going on:



This desalination system consumes electricity, but does so more efficiently than, for example, reverse osmosis, and it utilizes low grade heat in such a way as to eliminate its disposal to water.

Some comments on desalination. I'm not entirely sanguine about desalination, owing to a concern about changes to ocean currents deriving from saline gradients. This is probably not quite as serious as changing planetary weather patterns, as the wind industry proposes to do - I've seen several very silly references in the literature to using wind turbines to stop hurricanes, one proposed by the anti-nuke idiot Mark Z. Jacobson at Stanford, which suggests (were it true; undoubtedly it's not) that wind turbines can stop the, um, wind. This said, changing salt gradients in the ocean is certainly problematic. (Happily, the wind industry is too expensive and useless to actually produce significant energy, so weather patterns are safe for the time being.)

However, desalination may be a risk that future generations may have to assume, since we have left them with nothing other than trillion ton quantities of dangerous fossil fuel waste.

I note that the concentration of carbon dioxide on a volumetric scale is much higher in seawater than in air, which suggests that if one were trying to remove dangerous fossil fuel waste from the atmosphere, the processing of seawater would be a key to accomplishing that task - if it can be achieved. I've written about that in this space elsewhere.

Another paper I've seen on waste heat, written by Mexican scientists (both of whom are far more intelligent than that orange excuse for a human being in the White House), proposes to add a third layer to a combined cycle system by using a working fluid that boils at a temperature much lower than that of water (100 °C).

Here is the paper, which is regrettably not open sourced but must be obtained in a good science library: Thermo-Economic Multiobjective Optimization of a Low Temperature Organic Rankine Cycle for Energy Recovery (Ruben Omar Bernal-Lara and Antonio Flores-Tlacuahuac, Ind. Eng. Chem. Res., 2017, 56 (40), pp 11477–11495)

Here is the introduction:

The high energy demand in the world in the last years has become a focus of attention for researchers due to the constant quest of alternative processes to produce power in a economic way and low environmental impact. The recovery of energy from waste heat streams is an example of an alternative energy process seeking to meet energy demand and taking care of sustainability issues.1,2 In fact, waste heat sources are commonly found in solar and geothermal sources, as well as in industrial process streams. Thermodynamic cycles have been widely studied to use waste heat sources and produce power.3–5 Most of those cycles use water as working fluid for low-temperature energy recovery. The most common thermodynamic cycle is the water Rankine cycle. However, the thermodynamic efficiency of the Rankine cycle is low at temperatures below 370 °C.6 To improve the performance of the Rankine cycle for low-temperature energy recovery, the Organic Rankine Cycle (ORC), featuring organic compounds as working fluids, has been proposed.1 Organic compounds are good candidates as working fluids because of their low boiling point, low medium vapor pressure, and high vaporization enthalpy at low temperature ranges.


Unfortunately, several of the working fluids in these systems are, in fact, HFCs (and even one banned CFC), all of which are potent greenhouse gases in their own right; a safer and more sustainable option would be to use flammable working fluids, and several are examined, including cyclopropane, n-butane, isobutane, and n-pentane.
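The logic of stacking a third, low-temperature stage onto a combined cycle is easy to quantify under an idealizing assumption: if each stage converts a fraction of the heat handed down to it and passes the rest along, the unconverted fractions multiply. The stage efficiencies below are illustrative round numbers, not values from the paper:

```python
# Why a bottoming cycle helps: if each stage converts a fraction eta_i of
# the heat passed down to it (idealization: all rejected heat reaches the
# next stage), the unconverted fraction multiplies through the cascade,
# so the total is 1 - (1 - eta_1)(1 - eta_2)...  Numbers are illustrative.
from functools import reduce

def cascade_efficiency(stage_efficiencies):
    """Overall efficiency of heat-engine stages in series."""
    unconverted = reduce(lambda acc, eta: acc * (1.0 - eta),
                         stage_efficiencies, 1.0)
    return 1.0 - unconverted

two_stage = cascade_efficiency([0.40, 0.33])          # gas turbine + steam
three_stage = cascade_efficiency([0.40, 0.33, 0.10])  # ... + bottoming ORC
print(f"two-stage combined cycle: {two_stage:.1%}")   # ~59.8%
print(f"with low-temperature ORC: {three_stage:.1%}") # ~63.8%
```

Even a modest 10%-efficient bottoming stage, working on heat that would otherwise be thrown away, adds several points of overall efficiency - which is the whole argument of the paper.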

This is a math-heavy paper - clearly over the head of an American President with tiny hands and an even tinier brain - but the point is well taken. No matter how one goes about this, however, all these systems require a low-temperature heat sink. And since we have elected to do absolutely nothing about climate change except post ever more absurd fantasies about the so called "renewable energy" nirvana - which did not come, is not here, and will never come - it is not clear that there will be appropriate low-temperature heat sinks in the future, almost certainly not in Mexico.

Here is a nice graphic that touches on the cost of these systems:



Here is the Pareto curve and a schematic cartoon of the system:



I have been thinking and reading about approaches to waste heat utilization for some time - these are just two examples - and note that there are many other useful things that might be done with it, but those are subjects for another day.

Have a nice day tomorrow.





October 9, 2017

Zeolites for the Rapid and Selective Uptake of Cesium.

If you were born after the late 1940's or early 1950's, you have always been "contaminated" with the radionuclide cesium-137.

This isotope was released in large quantities during the era of open air nuclear weapons testing, and a clearly detectable amount of it usually leaks out after underground nuclear weapons testing. The isotope is so ubiquitous that it is often utilized as a tracer to understand soil erosion.

I explored this issue of nuclear testing and its relationship to modern soil testing elsewhere: Every Cloud Has A Silver Lining, Even Mushroom Clouds: Cs-137 and Watching the Soil Die.

Of course, this isotope was also released by the nuclear screw up at Chernobyl as well as the failure of the Fukushima reactors in a natural disaster.

As it happens, the amounts of radioactive cesium-137 leaching into the oceans from the Fukushima event are trivial when compared with the natural radioactivity of the ocean, which is largely associated with the huge amounts of naturally occurring (and impossible to avoid) potassium-40, and with the polonium-210 produced in the uranium decay series. This point was clearly made in the famous "Fukushima Tuna" paper, which was immediately and grotesquely misinterpreted by journalists around the world, causing a fair amount of hysterical gas and coal burning to power the computers of people who can't read a scientific paper (or anything else) very well, leading them to freak out and announce that we're all going to die because of Fukushima.

We didn't.

Here is a link to the famous Fukushima Tuna paper: Pacific bluefin tuna transport Fukushima-derived radionuclides from Japan to California


Here is the text comparing the Fukushima radioactivity with the natural radioactivity that has always been in the ocean and always will be in the ocean, no matter how much we do - and we're doing a lot of it with dangerous fossil fuels - to destroy the ocean:

Inferences about the safety of consuming radioactivity-contaminated seafood can be complicated due to complexities in translating food concentration to actual dose to humans (12), but it is important to put the anthropogenic radioactivity levels in the context of naturally occurring radioactivity. Total radiocesium concentrations of post-Fukushima PBFT were approximately thirty times less than concentrations of naturally occurring 40K in post-Fukushima PBFT and YFT and pre-Fukushima PBFT (Table 1). Furthermore, before the Fukushima release the dose to human consumers of fish from 137Cs was estimated to be 0.5% of that from the α-emitting 210Po (derived from the decay of 238U, naturally occurring, ubiquitous and relatively nonvarying in the oceans and its biota (13); not measured here) in those same fish (12). Thus, even though 2011 PBFT showed a 10-fold increase in radiocesium concentrations, 134Cs and 137Cs would still likely provide low doses of radioactivity relative to naturally occurring radionuclides, particularly 210Po and 40K.


Here is another paper in the same journal by the same authors (with a few added to boot), complaining about the stupidity, fear, and ignorance with which their original paper - which was about the migration of tuna and not about risk to human health - was handled by the media and the general public:

Evaluation of radiation doses and associated risk from the Fukushima nuclear accident to marine biota and human consumers of seafood (Madigan et al, PNAS, 110, 26, 10670-10675.)

Recent reports describing the presence of radionuclides released from the damaged Fukushima Daiichi nuclear power plant in Pacific biota (1, 2) have aroused worldwide attention and concern. For example, the discovery of 134Cs and 137Cs in Pacific bluefin tuna (Thunnus orientalis; PBFT) that migrated from Japan to California waters (2) was covered by >1,100 newspapers worldwide and numerous internet, television, and radio outlets. Such widespread coverage reflects the public’s concern and general fear of radiation. Concerns are particularly acute if the artificial radionuclides are in human food items such as seafood. Although statements were released by government authorities, and indeed by the authors of these papers, indicating that radionuclide concentrations were well below all national safety food limits, the media and public failed to respond in measure. The mismatch between actual risk and the public’s perception of risk may be in part because these studies reported radionuclide activity concentrations in tissues of marine biota but did not report dose estimates and predicted health risks for the biota or for human consumers of contaminated seafood. We have therefore calculated the radiation doses absorbed by diverse marine biota in which radioactivity was quantified (1, 2) and humans that potentially consume contaminated PBFT


Never underestimate the power of stupidity.

The ocean contains about 530 billion curies of potassium-40, which corresponds to about 2 times 10 to the 22nd power nuclear decays per second, or, since there are 31,556,736 seconds in a year, about 6.2 times 10 to the 29th power per year. The specific activity of cesium-137 is about 3.2 times 10 to the 12th power becquerels per gram, meaning that to match the natural radioactivity associated with radiopotassium in the ocean, we would need to deliberately and directly dump roughly 6,100 tons of cesium-137 into the ocean. This is more radiocesium, by orders of magnitude, than has ever been produced by all the world's nuclear reactors operating for more than half a century - and by all the nuclear weapons detonations combined.
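
Checking this arithmetic is straightforward. Here is a minimal sketch in Python; the curie-to-becquerel conversion, Avogadro's number, and the Cs-137 half-life are standard values, and the 530-billion-curie figure is the one quoted above:

```python
import math

CI_TO_BQ = 3.7e10        # becquerels per curie (exact definition)
AVOGADRO = 6.022e23      # atoms per mole
YEAR_S = 31_556_736      # seconds per year (365.24 days), as in the text

# Ocean potassium-40 inventory: ~530 billion curies
k40_activity_bq = 530e9 * CI_TO_BQ
print(f"K-40 activity: {k40_activity_bq:.2e} Bq")        # ~2.0e22 decays/s
print(f"Per year:      {k40_activity_bq * YEAR_S:.2e}")  # ~6.2e29 decays/yr

# Specific activity of Cs-137 from first principles: A = (ln 2 / t_half) * N
t_half_s = 30.08 * YEAR_S                # Cs-137 half-life, ~30 years
atoms_per_gram = AVOGADRO / 137.0        # atoms of Cs-137 per gram
spec_act_bq_per_g = math.log(2) / t_half_s * atoms_per_gram
print(f"Cs-137 specific activity: {spec_act_bq_per_g:.2e} Bq/g")  # ~3.2e12

# Mass of Cs-137 whose activity matches the ocean's K-40 inventory
mass_tonnes = k40_activity_bq / spec_act_bq_per_g / 1e6  # grams -> tonnes
print(f"Equivalent Cs-137 mass: {mass_tonnes:.0f} tonnes")  # ~6100 tonnes
```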

In any case, in high concentrations Cs-137 can and does represent a real risk - though hardly on the scale of dangerous fossil fuel waste, which kills roughly half of the seven million people who die each year from air pollution, the other half being killed by the combustion products of "renewable" biomass.

Further, it turns out that as a strong gamma emitter, cesium-137 should be regarded as an extremely valuable material for mineralizing the vast amounts of halocarbons that have been dumped into the environment, where they represent a huge risk: fluorocarbons like PFOS, chlorocarbons like the famous CFCs, the extremely toxic PCBs, certain insecticides like DDT, and the awful brominated flame retardants like the PBDEs and their relatives.
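
To put a rough number on that value: a minimal sketch, assuming the standard decay data for Cs-137 (a mean beta energy near 0.19 MeV, plus the 0.662 MeV gamma from the short-lived Ba-137m daughter in about 94% of decays), gives the specific power available for radiolysis:

```python
import math

MEV_TO_J = 1.602e-13   # joules per MeV
YEAR_S = 3.156e7       # seconds per year

# Specific activity of Cs-137 (same calculation as above): ~3.2e12 Bq/g
t_half_s = 30.08 * YEAR_S
activity_bq_per_g = math.log(2) / t_half_s * (6.022e23 / 137.0)

# Mean energy per decay: ~0.19 MeV (beta) plus the 0.662 MeV gamma
# emitted by the Ba-137m daughter in ~94.4% of decays
energy_per_decay_mev = 0.19 + 0.662 * 0.944

power_w_per_g = activity_bq_per_g * energy_per_decay_mev * MEV_TO_J
print(f"Cs-137 specific power: {power_w_per_g:.2f} W/g")  # ~0.4 W/g
```

Every gram of recovered cesium-137 is thus, for decades, a continuous source of roughly 0.4 watts of ionizing radiation that could in principle be directed at destroying these recalcitrant compounds.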

It is thus a pretty bad idea to throw this stuff away, whether in the idiotic form of waste dumps or by deliberate or unintentional release; we need to recover it in order to utilize it.

It was therefore with interest that I came across this paper on the highly selective removal of cesium from dilute solutions:

Highly Selective and Rapid Uptake of Radionuclide Cesium Based on Robust Zeolitic Chalcogenide via Stepwise Ion-Exchange Strategy (Feng et al., Chem. Mater. 2016, 28, 8774–8780)

From the introduction:

As an efficient and low-carbon power generation method, nuclear power plays a critical role in meeting the increasing energy needs. However, nuclear wastes and reactor accidents could result in the leak of radionuclides into environments, which is a key reason limiting more widespread use of nuclear energy.1,2 Among various radioactive nuclides, 137Cs+ is the most hazardous due to its high fission yield (6.09%), long half-life (~30 years), and high solubility.3−5 When accidentally released to the sea or ground, it must be decontaminated immediately for public safety. The 137Cs+ ions also need to be recycled effectively from nuclear waste solutions in the reprocessing plants. Therefore, for 137Cs+ cleanup, high selectivity for Cs+ in the presence of relatively high concentration of competing cations (Na+, K+, Ca2+, Mg2+), fast kinetics, and commercial availability are desired in large-scale application.6


The authors then describe some problems with systems similar to the one they synthesize here, and claim to address them:


For ion-exchange applications, maximizing the concentration of exchangeable cations is of critical importance for the process efficiency. The concentration of cations in traditional oxide-based zeolites is determined by the framework Al3+/Si4+ molar ratio which has a maximum value of 1 due to the Löwenstein’s rule (as in NaAlSiO4). Surprisingly, the Löwenstein’s rule is not obeyed in zeolite-type metal chalcogenides so that the M3+/M4+ ratio can be significantly greater than 1.31 This motivates us to initiate investigating the ion-exchange applications of such materials, because we expect that the large negative charge of framework and the associated high concentration of exchangeable cations will lead to a record-high cation exchange capacity. In addition, unlike low-dimensional materials, these zeolite-type chalcogenides have 3-D multidimensional, and mutually intersecting channels that could greatly facilitate ion diffusion and ion exchange kinetics. At the early stage of this study, we encountered a major obstacle to unlock the aforementioned intrinsic advantages of zeolite-type chalcogenides. Specifically, the as-synthesized materials typically contain bulky protonated amines in the channels and their ion exchange process is quite sluggish.29,32−34

Herein we designed a two-step ion-exchange strategy to address this issue (Scheme 1). In this work, we selected a highly stable and porous amine-directed zeolitic chalcogenide framework, namely UCR-20 (zeolite type code: RWY).34,35 We demonstrated that the protonated amines located in its channels can be exchanged completely into “hard” alkali ions through the stepwise ion-exchange strategy. Interestingly, the K+-exchanged RWY (K@RWY) can rapidly capture Cs+ with high selectivity. This material also shows an excellent ability for Cs+ capture from real water samples including potable water and even seawater.


Here's a cute cartoon of their approach:



Here's some graphic data:



The caption:

Figure 2. (a) Equilibrium curve for cesium uptake fitted by Langmuir model with Ci = 1−500 ppm (RT, V:m = 1000 mL/g). (b) Distribution coefficient and removal efficiency for cesium uptake under different initial concentrations. (c) Adsorption kinetics of K@RWY and pristine RWY for Cs+ uptake with initial concentration around 50 ppm at room temperature (V:m = 1000 mL/g). (d) pH dependent cesium uptake. The initial concentrations were set to 10 ppm.
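
The quantities in panels (a)–(c) are standard batch-sorption metrics rather than anything exotic. Here is a minimal sketch of how they are computed; the formulas are textbook definitions, and the example numbers are invented for illustration, not taken from the paper:

```python
def distribution_coefficient(c0_ppm, ce_ppm, v_over_m_ml_per_g):
    """Kd (mL/g) = ((C0 - Ce) / Ce) * (V/m), from the initial and
    equilibrium concentrations in a batch uptake experiment."""
    return (c0_ppm - ce_ppm) / ce_ppm * v_over_m_ml_per_g

def removal_efficiency(c0_ppm, ce_ppm):
    """Percentage of the ion removed from solution at equilibrium."""
    return 100.0 * (c0_ppm - ce_ppm) / c0_ppm

def langmuir_uptake(ce_ppm, q_max_mg_per_g, k_l):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max_mg_per_g * k_l * ce_ppm / (1.0 + k_l * ce_ppm)

# Invented example: 10 ppm initial Cs+, 0.05 ppm left at equilibrium,
# at the V:m ratio of 1000 mL/g used in the figure
print(distribution_coefficient(10.0, 0.05, 1000.0))  # ~2.0e5 mL/g
print(removal_efficiency(10.0, 0.05))                # 99.5 %
```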


Their conclusion:

In conclusion, we designed and successfully realized a stepwise ion-exchange strategy based on zeolitic chalcogenide (RWY) to replace the organic amines in the channels with “hard” K+. The K@RWY could rapidly capture Cs+ with excellent selectivity, high capacity, good resistance against acid and alkali, and excellent resistance to β- and γ-irradiation. High selectivity of Cs+ uptake against Na+, K+, Ca2+, and Mg2+ has been confirmed by further competitive ion exchange experiments. It should also be noted that K@RWY could capture Cs+ efficiently in real water samples including seawater with trace levels. The results indicated that K@RWY is a very promising ion exchanger for the removal of radioactive 137Cs+. Because amine-directed chalcogenide frameworks are a large family of compounds with various compositions and topologies, this strategy reported here could greatly extend the applications of this family of materials to nuclear waste remediation and toxic metal sequestration.


Nice work I think.

Enjoy the coming workweek.



October 8, 2017

Conjugated Polymeric Photosensitizers for Photodynamic Cancer Therapy.

I stumbled upon a very cool paper this afternoon on cancer therapy.

It's here: Two-Dimensional Fully Conjugated Polymeric Photosensitizers for Advanced Photodynamic Therapy (Dai et al., Chem. Mater., 2016, 28 (23), pp 8651–8658)

Many people are aware that radiation can both cause and cure cancer - and sometimes both - and the mechanism by which this takes place often involves highly energetic species, often free radicals. Of course, "radiation" is a broad term; it applies not just to high-energy radiation, but also to light, radio waves, microwaves and radiant heat (infrared). The most efficient forms of radiation for generating free radicals are the high-energy ones: UV radiation (as in the generation of sunburns and melanoma), X-rays, and gamma rays. However, under certain circumstances lower-energy radiation can also generate reactive species, and that is what this paper describes.

Photodynamic therapy generates reactive oxygen species (high-energy species) that react locally with cancer cells and kill them. However, since human tissue is nearly opaque to visible light, the idea is to use wavelengths that can penetrate tissue with minimal absorption and scattering on the way to the target. As anyone who has watched a microwave oven cook food from within knows, longer wavelengths - infrared, microwave, and radio - penetrate matter far more readily than visible light does; a quick photon-energy comparison appears below.
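
To see concretely why the 700–1000 nm "therapeutic window" discussed in the paper matters, compare photon energies across the spectrum. A minimal sketch, using only standard physical constants:

```python
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Photon energy E = h*c/lambda, expressed in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

for label, nm in [("UV-C (germicidal)", 254),
                  ("red edge of the visible", 700),
                  ("NIR therapeutic window", 900)]:
    print(f"{label:24s} {nm:4d} nm -> {photon_energy_ev(nm):.2f} eV")

# UV photons (~4.9 eV) carry enough energy to break covalent bonds
# directly; NIR photons (~1.4-1.8 eV) do not, but they penetrate tissue
# and can still excite a photosensitizer that passes its energy to O2,
# producing singlet oxygen right where the light is delivered.
```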

From the introductory text:

Photodynamic therapy (PDT) has attracted tremendous attention as an emerging clinical modality for treatment of neoplastic and nonmalignant lesions, including cancers of the head and neck, brain, lung, pancreas, intraperitoneal cavity, breast, prostate, and skin, to name a few.(1-3) PDT generally involves photoexcitation of a photosensitizer, which transfers energy to surrounding O2 to generate reactive oxygen species (ROS), especially singlet oxygen (1O2),(4) to impart a selective irreversible cytotoxic process to malignant cells with respect to noncancerous tissues. PDT with an optical precision could show a minimal toxicity to normal tissues, negligible systemic (organ) or long-term effect, and excellent cosmetic appeal. However, the near-infrared (NIR) light is often required to effectively penetrate biological tissues, such as skin and blood, with minimal normal tissue damage.(2, 4) This is because visible light below 700 nm cannot penetrate deep into tissues with a high level of endogenous scatters and/or absorbers, such as oxy-/deoxy-hemoglobin, lipids, and water, in skin and blood.(5) Therefore, it is important to develop efficient photosensitizers with strong absorption in the desired therapeutic window (particularly, 700–1000 nm) for advanced PDT.

Porphyrin, a conjugated macrocycle with intense optical absorption, plays important roles in our life (e.g., in heme to act as a cofactor of the protein hemoglobin) and has been widely used as a photosensitizing reagent for PDT.(2) Due to the short conjugation length intrinsically associated with individual porphyrin macrocycles of a limited size, however, most of the clinically approved porphyrin-based photosensitizers show optical absorption well below 700 nm with insignificant absorption within the tissue transparency window (e.g., 700–900 nm)2. As a typical example, porfimer sodium (Photofrin®), one of the widely used clinical PDT agents, with oligomeric porphyrin units being linked by nonconjugated ester and ether linkages to gain solubility, shows diminished absorption above 630 nm.(6) Therefore, it is highly desirable to develop new photosensitizers of a long conjugation length with alternating C–C single and C═C double bonds, and hence efficient absorption within the tissue transparency window (e.g., 700–900 nm).


Here's a picture of the molecules they make:



The reactive molecule these photosensitizers generate is "singlet oxygen".

Mechanism study on singlet oxygen generation

To study the mechanism of singlet oxygen generation from photoirradiation of COP-P-SO3H, we performed the first-principles calculations with B3LYP hybrid density functional theory with the Gaussian program(15) based on the cluster model derived from the above experimental characterization data. Our calculations revealed that O2 molecules prefer to adsorb via the Yeager model(12, 27) (Figures S8–S17 and Table S2) in COP-P-SO3H and that the optimized O2 adsorption site is on the top of the H-free pyrrolic N in the porphyrin ring (Figure S13). Furthermore, our molecular orbital calculations indicated that the HOMO is almost entirely localized in the porphyrin ring, being dominantly associated with the π-bonding orbital from the pyrrolic N (No. 22 and No. 24 in Figure S18) and the π-bonding orbital from the pyrrolic and methine bridge carbons in the porphyrin ring, while the LUMO is mainly associated with the π*-antibonding orbital from oxygen molecules and the π*-antibonding orbital from the pyrrolic ring and the methine bridge carbon in the porphyrin (Figure 7A). Clearly, therefore, the charge density distribution calculations indicate that the electron was dominantly transferred from pyrrolic N (No. 24 in Figure S18) to an oxygen molecule when it approached the porphyrin ring (Figure S19).


The conclusion:

In summary, we have, for the first time, developed a well water-dispersive, fully conjugated two-dimensional covalent organic polymer (i.e., COP-P-SO3H) via a facile and scalable, but very efficient and cost-effective, Yamamoto–Ullmann cross-coupling of multiple porphyrin macrocycles through conjugated linkages followed by sulfonation. The resultant COP-P-SO3H of a good dispersiveness and low band gap exhibited strong optical absorption up to 1100 nm and acted as an efficient photosensitizer for advanced photodynamic therapy with a 20% higher singlet oxygen quantum yield than that of the clinically used protoporphyrin IX (PPIX), but a negligible cyto-/genotoxicity without light exposure. Compared to those isolated porphyrin macrocycles (e.g., PPIX, TBBPP), COP-P-SO3H, with fully conjugated multiple porphyrin macrocycles, could effectively generate singlet oxygen species during the PDT process to efficiently kill the breast tumor cells (MDA-MB-231 cells) through successive DNA damage, as revealed by the combined experimental and theoretical approach used in this study. This is the first time for 2D covalent organic polymers (COPs) to be used as efficient photosensitizers of practical significance for advanced PDT. This work clearly opens up exciting new applications for COPs as well as new avenues for the development of other novel 2D COPs for advanced cancer therapy and beyond.


The nice graphic overview:



Cool I think.

Enjoy your Sunday evening.
