Category Archives: Human Genetic Engineering

An Introduction to PCR – Technology Networks

Polymerase chain reaction (PCR) is a technique that has revolutionized the world of molecular biology and beyond. In this article, we will discuss a brief history of PCR and its principles, highlighting the different types of PCR and the specific purposes to which they are being applied.

In 1983, American biochemist Kary Mullis was driving home late at night when a flash of inspiration struck him. He wrote on the back of a receipt the idea that would eventually grant him the Nobel Prize in Chemistry in 1993. The concept was straightforward: reproducing in a laboratory tube the DNA replication process that takes place in cells. The outcome is the same: the generation of new DNA strands complementary to the existing ones.

Mullis used the basis of Sanger's DNA sequencing as a starting point for his new technique. He realized that the repeated use of DNA polymerase triggered a chain reaction resulting in a specific DNA segment's amplification.

The foundations for his idea were laid by the 1976 discovery of a thermostable DNA polymerase, Taq, isolated from the bacterium Thermus aquaticus found in hot springs.1 Taq DNA polymerase has a temperature optimum of 72 °C and survives prolonged exposure to temperatures as high as 96 °C, meaning that it can tolerate several denaturation cycles.

Before the discovery of Taq polymerase, molecular biologists were already trying to optimize cyclic DNA amplification protocols, but they needed to add fresh polymerase at each cycle because the enzyme could not withstand the high temperatures needed for DNA denaturation. Having a thermostable enzyme meant that they could repeat the amplification process many times over without the need for fresh polymerase at every cycle, making the whole process scalable, more efficient and less time-consuming.

The first description of this polymerase chain reaction (PCR) using Taq polymerase was published in Science in 1985.2

In 1993, the first FDA-approved PCR kit came to market. Since then, PCR has been steadily and systematically improved. It has become a game-changer in everything from forensic evidence analysis and diagnostics, to disease monitoring and genetic engineering. It is undoubtedly considered one of the most important scientific advances of the 20th century.

PCR is used to amplify a specific DNA fragment from a complex mixture of starting material called template DNA. The sample preparation and purification protocols depend on the starting material, including the sample matrix and the accessibility of the target DNA. Often, minimal DNA purification is needed. However, PCR does require knowledge of the sequences that flank the DNA fragment to be amplified (called the target DNA).

From a practical point of view, a PCR experiment is relatively straightforward and can be completed in a few hours. In general, a PCR reaction needs five key reagents:

- DNA to be amplified: also called the PCR template or template DNA. This DNA can be of any source, such as genomic DNA (gDNA), cDNA, or plasmid DNA.
- DNA polymerase: all PCR reactions require a DNA polymerase that can work at high temperatures. Taq polymerase is commonly used; it can incorporate nucleotides at a rate of 60 bases/second at 70 °C and can amplify templates of up to 5 kb, making it suitable for standard PCR without special requirements. New generations of polymerases are being engineered to improve reaction performance. For example, some are engineered to be activated only at high temperatures, to reduce non-specific amplification at the beginning of the reaction. Others incorporate a proofreading function, which is important when the amplified sequence must match the template sequence exactly, such as during cloning.
- Primers: DNA polymerases require a short sequence of nucleotides to indicate where they need to initiate amplification. In a PCR, these sequences are called primers and are short pieces of single-stranded DNA (approximately 15-30 bases). When designing a PCR experiment, the researcher determines the region of DNA to be amplified and designs a pair of primers, one on the forward strand and one on the reverse, that specifically flanks the target region. Primer design is a key component of a PCR experiment and should be done carefully. Primer sequences must be chosen to target the unique DNA of interest, avoiding the possibility of binding to a similar sequence. They should have similar melting temperatures because the annealing step occurs simultaneously for both strands. The melting temperature of a primer is influenced by the percentage of bases that are guanine (G) or cytosine (C) compared to adenine (A) or thymine (T), with higher GC content increasing the melting temperature. Adjusting primer lengths can help to compensate for this when matching a primer pair (see the sketch after this list). It is also important to avoid sequences that tend to form secondary structures or primer dimers, as these reduce PCR efficiency. Many free online tools are available to aid in primer design.
- Deoxynucleotide triphosphates (dNTPs): these serve as the building blocks to synthesize the new strands of DNA and include the four basic DNA nucleotides (dATP, dCTP, dGTP, and dTTP). dNTPs are usually added to the PCR reaction in equimolar amounts for optimal base incorporation.
- PCR buffer: the PCR buffer ensures that optimal conditions are maintained throughout the PCR reaction. Its major components include magnesium chloride (MgCl2), Tris-HCl and potassium chloride (KCl). MgCl2 serves as a cofactor for the DNA polymerase, while Tris-HCl and KCl maintain a stable pH during the reaction.

The PCR reaction is carried out in a single tube by mixing the reagents mentioned above and placing the tube in a thermal cycler. The PCR amplification consists of three defined sets of times and temperatures termed steps: denaturation, annealing, and extension (Figure 1).
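A primer's melting temperature can be estimated with simple rules of thumb. The minimal Python sketch below applies the classic Wallace rule (roughly 2 °C per A/T base plus 4 °C per G/C base), which is only a rough guide for short oligonucleotides; dedicated primer-design tools use nearest-neighbor thermodynamic models instead. The primer sequences are hypothetical examples, not from any real assay.

```python
# A minimal sketch of rule-of-thumb primer checks, assuming the Wallace rule
# (Tm ~ 2*(A+T) + 4*(G+C) degrees C). Real primer-design tools use
# nearest-neighbor thermodynamic models; this is only an approximation.

def gc_content(primer: str) -> float:
    """Fraction of G/C bases in a primer sequence."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer: str) -> int:
    """Approximate melting temperature (degrees C) via the Wallace rule."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    at = p.count("A") + p.count("T")
    return 2 * at + 4 * gc

# Hypothetical primer pair; a real pair would flank the chosen target region.
primers = {"forward": "AGCTGGTCAACGTTAGCCAT", "reverse": "TGCACCTTGGAAGTCGATCC"}
for name, seq in primers.items():
    print(f"{name}: length={len(seq)}, GC={gc_content(seq):.0%}, Tm~{wallace_tm(seq)} C")
```

Run on this pair, both primers come out near 60 °C with ~50% GC, the kind of match the annealing step requires.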

Figure 1: Steps of a single PCR cycle.

This set of three steps, termed a cycle, is repeated 30-40 times, doubling the amount of DNA at each cycle to achieve amplification (Figure 2).

Figure 2: The different stages and cycles of DNA molecule amplification by PCR.

Let's take a closer look at each step.

The first step of PCR, called denaturation, heats the template DNA up to 95 °C for a few seconds, separating the two DNA strands as the hydrogen bonds between them are rapidly broken.

The reaction mixture is then cooled for 30 seconds to 1 minute. Annealing temperatures are usually 50-65 °C; however, the exact optimal temperature depends on the primers' length and sequence, and must be carefully optimized with every new set of primers.

The two DNA strands could rejoin at this temperature, but most do not, because the mixture contains a large excess of primers that bind, or anneal, to the template DNA at specific, complementary positions, forming hydrogen bonds between primer and template as they do so. Once the annealing step is completed, the polymerase is ready to extend the DNA sequence.

The temperature is then raised to the ideal working temperature for the DNA polymerase present in the mixture, typically around 72 °C (74 °C in the case of Taq).

The DNA polymerase attaches to one end of each primer and synthesizes new strands of DNA, complementary to the template DNA. Now we have four strands of DNA instead of the two that were present to start with.

The temperature is raised back to 94 °C, and the double-stranded DNA molecules – both the "original" molecules and the newly synthesized ones – denature again into single strands. This begins the second cycle of denaturation-annealing-extension. At the end of this second cycle, there are eight molecules of single-stranded DNA. By repeating the cycle 30 times, the double-stranded DNA molecules present at the beginning are converted into over 130 million new double-stranded molecules, each one a copy of the region of the starting molecule delineated by the annealing sites of the two primers.
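The arithmetic behind this exponential growth is easy to check. The toy Python calculation below assumes perfectly efficient cycling, in which every template is copied every cycle; real yields are somewhat lower, since full-length products only begin to accumulate after the first few cycles and the reaction eventually plateaus as reagents run out.

```python
# Idealized PCR amplification: with perfect efficiency, each cycle doubles the
# number of template molecules, so n cycles give 2**n molecules per starting
# template. This is an upper bound; real reactions lag early and plateau late.
start_templates = 1
for cycles in (10, 20, 30, 40):
    print(f"after {cycles:2d} cycles: up to {start_templates * 2**cycles:,} molecules")
```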

To determine if amplification has been successful, PCR products may be visualized using gel electrophoresis, indicating amplicon presence/absence, size and approximate abundance. Depending on the application and the research question, this may be the endpoint of an experiment, for example, if determining whether a gene is present or not. Otherwise, the PCR product may just be the starting point for more complex downstream investigations such as sequencing and cloning.

Thanks to their versatility, PCR techniques have evolved over recent years, leading to the development of several different types of PCR technology.

Some of the most widely used ones are:

One of the most useful developments has been quantitative real-time PCR, or qPCR. As the name suggests, qPCR is a quantitative technique that allows real-time monitoring of the amplification process and detection of PCR products as they are made.3 It can be used to determine the starting concentration of the target DNA, negating the need for gel electrophoresis in many cases. This is achieved thanks to the inclusion of non-specific fluorescent intercalating dyes, such as SYBR Green, that fluoresce when bound to double-stranded DNA, or of sequence-specific fluorescent DNA oligonucleotide probes, such as hydrolysis (TaqMan) probes and molecular beacons. Probes bind specifically to DNA target sequences within the amplicon and use the principle of Förster Resonance Energy Transfer (FRET) to generate fluorescence via the coupling of a fluorescent molecule on one end and a quencher at the other end. For both fluorescent dyes and probes, as the number of copies of the target DNA increases, the fluorescence level increases proportionally, allowing real-time quantification of the amplification with reference to standards containing known copy numbers (Figure 3).

qPCR uses specialized thermal cyclers equipped with fluorescent detection systems that monitor the fluorescent signal as the amplification occurs.

Figure 3: Example qPCR amplification plot and standard curve used to enable quantification of copy number in unknown samples.
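Quantification from a standard curve amounts to a linear fit of the quantification cycle (Cq) against the log of input copy number, then inverting that fit for unknown samples. The Python sketch below (using NumPy) shows the idea; all Cq values are invented for illustration, and a real experiment would take them from the instrument software.

```python
# A minimal sketch of qPCR standard-curve quantification (cf. Figure 3),
# assuming hypothetical Cq values; a real run reads these from the instrument.
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])   # known standard copy numbers
std_cq = np.array([30.1, 26.8, 23.4, 20.1, 16.7])  # measured Cq (hypothetical)

# Fit Cq = slope * log10(copies) + intercept; a slope near -3.32 corresponds
# to ~100% amplification efficiency (a perfect doubling per cycle).
slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")

# Invert the fit to estimate the starting copy number of an unknown sample.
unknown_cq = 22.0
copies = 10 ** ((unknown_cq - intercept) / slope)
print(f"unknown sample: ~{copies:.2e} starting copies")
```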

Reverse transcription PCR (RT-PCR) and RT-qPCR are two commonly used PCR variants enabling gene transcription analysis and quantification of viral RNA, both in clinical and research settings.

RT is the process of making cDNA from a single-stranded template RNA4 and is consequently also called first-strand cDNA synthesis. The first step of RT-PCR is to synthesize a DNA/RNA hybrid between the RNA template and a DNA oligonucleotide primer. The reverse transcriptase enzyme that catalyzes this reaction has RNase activity that then degrades the RNA portion of the hybrid. Subsequently, a single-stranded DNA molecule is synthesized by the DNA polymerase activity of the reverse transcriptase. High-purity, high-quality starting RNA is essential for a successful RT-PCR.

RT-PCR can be performed following two approaches: one-step RT-PCR and two-step RT-PCR. In the first case, the RT reaction and the PCR reaction occur in the same tube, while in the two-step RT-PCR, the two reactions are separate and performed sequentially.

The reverse transcription described above often serves as the first step in qPCR too, quantifying RNA in biological samples (either RNA transcripts or derived from viral RNA genomes).

As with RT-PCR, there are two approaches for quantifying RNA by RT-qPCR: one-step RT-qPCR and two-step RT-qPCR. In both cases, RNA is first reverse-transcribed into cDNA, which is used as the template for qPCR amplification. In the two-step method, the reverse transcription and the qPCR amplification occur sequentially as two separate experiments. In the one-step method, RT and qPCR are performed in the same tube.
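For gene-expression work, RT-qPCR results are often reported as fold changes computed with the widely used 2^(-ΔΔCt) (Livak) method, which normalizes the target gene to a reference gene and to a control condition. The minimal Python sketch below assumes roughly 100% amplification efficiency for both assays; all Ct values are hypothetical.

```python
# A minimal sketch of relative quantification by the common 2^(-ddCt) method,
# assuming ~100% efficiency for both assays. All Ct values are hypothetical.

def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Target-gene expression in treated vs. control, normalized to a reference gene."""
    dct_treated = ct_target_treated - ct_ref_treated   # delta-Ct, treated sample
    dct_control = ct_target_control - ct_ref_control   # delta-Ct, control sample
    ddct = dct_treated - dct_control                   # delta-delta-Ct
    return 2 ** -ddct

# Target gene vs. a housekeeping reference in treated and untreated cells
print(f"fold change: {fold_change(24.0, 18.0, 26.5, 18.2):.1f}x")  # ~4.9x up
```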

Digital PCR (dPCR) is another adaptation of the original PCR protocol.5 Like qPCR, dPCR technology uses DNA polymerase to amplify target DNA from a complex sample using a primer set and probes. The main difference, though, lies in the partitioning of the PCR reactions and data acquisition at the end.

dPCR and ddPCR are based on the concept of limiting dilutions. The PCR reaction is split into a large number of nanoliter-sized sub-reactions (partitions), and the amplification is carried out within each partition. Some partitions may contain one or more copies of the target, while others may contain no target sequences; each partition therefore classifies as either positive (target detected) or negative (target not detected), providing the basis for a digital output format. Following PCR, the fraction of PCR-positive partitions is analyzed with Poisson statistics to estimate the number of target copies in the original sample.
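Because a positive partition can contain more than one target copy, the count of positive partitions is converted to a mean copies-per-partition via the Poisson correction lambda = -ln(1 - p), where p is the fraction of positive partitions. A minimal Python sketch follows; the partition volume is an assumption for illustration, since droplet volumes vary by platform.

```python
# A minimal sketch of the Poisson correction used in dPCR/ddPCR. The partition
# volume below (~0.85 nL) is an assumption; actual volumes vary by platform.
import math

def dpcr_estimate(positive: int, total: int, partition_vol_nl: float = 0.85):
    """Estimate mean copies per partition and concentration from partition counts."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean target copies per partition
    copies_total = lam * total               # copies across all partitions
    copies_per_ul = lam / (partition_vol_nl / 1000.0)  # convert nL to uL
    return lam, copies_total, copies_per_ul

lam, copies, conc = dpcr_estimate(positive=4500, total=20000)
print(f"lambda = {lam:.3f}, total copies ~ {copies:.0f}, conc ~ {conc:.0f} copies/uL")
```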

Droplet digital PCR (ddPCR) is a more recent technology that became available in 2011.6 ddPCR utilizes a water-oil emulsion to form the partitions that separate the template DNA molecules. The droplets essentially serve as individual test tubes in which the PCR reaction takes place.

The recent development of microfluidic handling systems with microchannels and microchambers has paved the way for a range of practical applications, including the amplification of DNA via PCR on microfluidic chips.

PCR performed on a chip benefits from the advantages of microfluidics in speed, sensitivity and low reagent consumption. These features make microfluidic PCR particularly appealing for point-of-care testing, for example, in diagnostics applications. From a practical point of view, the sample flows through a microfluidic channel, repeatedly passing through the three temperature zones reflecting the different steps of PCR. It takes just 90 seconds for a 10 µL sample to undergo 20 PCR cycles.7 The subsequent analysis can then easily be carried out off-chip.

The different PCR approaches all have advantages and disadvantages that impact the applications to which they are suited.8 These are summarized in Table 1.

| Approach | Advantages | Limitations |
| --- | --- | --- |
| PCR | Easiest PCR to perform; low cost of equipment and reagents; several downstream applications (e.g., cloning) | Results are only qualitative; requires post-amplification analyses that increase time and risk of error; products may need to be confirmed by sequencing |
| qPCR | Produces quantitative results; probe use can ensure high specificity; high analytical sensitivity; low turnaround time; eliminates requirements for post-amplification analysis | Requires more expensive reagents and equipment; less flexibility in primer and probe selection; less amenable to other downstream product confirmation analyses (such as sequencing) due to the small length of the amplicon; not suitable for some downstream applications such as cloning |
| RT-PCR and RT-qPCR | Can be used with all RNA types | RNA is prone to degradation; the RT step may increase the time and potential for contamination |
| dPCR and ddPCR | Fast; no DNA purification step; provides absolute quantification; increased sensitivity for detecting the target in limited clinical samples; highly scalable | Costly; based on several statistical assumptions |
| Microfluidic PCR | Accelerated PCR process; reduced reagent consumption; can be adapted for high throughput; portable device for point-of-care applications; allows single-cell analysis | Still a very new technology; requires extensive sample preparation to remove debris and unwanted compounds; restricted choice of materials for the microfluidic device due to high temperatures |

Table 1: Key advantages and disadvantages of different PCR approaches.

PCR has become an indispensable tool in modern molecular biology and has completely transformed scientific research. The technique has also opened up the investigation of cellular and molecular processes to those outside the field of molecular biology, and consequently finds utility among scientists in many disciplines.

Whilst PCR is itself a powerful standalone technique, it has also been incorporated into wider techniques, such as cloning and sequencing, as one small but important part of these workflows.

Research applications of PCR include:

- Gene transcription – PCR can examine variations in gene transcription among cell types, tissues and organisms at a specific time point. In this process, RNA is isolated from samples of interest and reverse-transcribed into cDNA. The original levels of RNA for a specific gene can then be quantified from the amount of cDNA amplified in PCR.
- Genotyping – PCR can detect sequence variations in alleles of specific cells or organisms. A common example is the genotyping of transgenic organisms, such as knock-out and knock-in mice. In this application, primers are designed to amplify either a portion of the transgene (in a transgenic animal) or the mutation (in a mutant animal).
- Cloning and mutagenesis – PCR cloning is a widely used technique in which double-stranded DNA fragments amplified by PCR from virtually any source (e.g., gDNA, cDNA, plasmid DNA) are inserted into vectors. This, for example, enables the creation of bacterial strains from which genetic material has been deleted or inserted. Site-directed mutagenesis can also be used to introduce point mutations via cloning. This often employs a technique known as recombinant PCR, in which overlapping primers are specifically designed to incorporate base substitutions (Figure 4). This technique can also be used to create novel gene fusions.

Figure 4: Diagram depicting an example of recombinant PCR.

- Sequencing – PCR can be used to enrich template DNA for sequencing. The type of PCR recommended for the preparation of sequencing templates is called high-fidelity PCR and is able to maintain DNA sequence accuracy. In Sanger sequencing, PCR-amplified fragments are purified and run in a sequencing reaction. In next-generation sequencing (NGS), PCR is used at the library preparation stage, where DNA samples are enriched by PCR to increase the starting quantity and tagged with sequencing adaptors to allow multiplexing. Bridge PCR is also an important part of the second-generation NGS sequencing process.

Both as an independent technique and as a workhorse within other methods, PCR has transformed a range of disciplines. These include:

- Genetic research – PCR is used in most laboratories worldwide. One of the most common applications is gene transcription analysis9, aimed at evaluating the presence or abundance of particular gene transcripts. It is a powerful technique for manipulating the genetic sequence of organisms – animal, plant and microbe – through cloning. This enables genes or sections of genes to be inserted, deleted or mutated to engineer in genetic markers, alter phenotypes, elucidate gene functions and develop vaccines, to name but a few. In genotyping, PCR can be used to detect sequence variations in alleles in specific cells or organisms. Its use isn't restricted to humans either: genotyping plants in agriculture assists plant breeders in selecting, refining, and improving their breeding stock. PCR is also the first step in enriching sequencing samples, as discussed above. For example, most mapping techniques in the Human Genome Project (HGP) relied on PCR.
- Medicine and biomedical research – PCR is used in a host of medical applications, from diagnostic testing for disease-associated genetic mutations to the identification of infectious agents. Another great example of PCR use in the medical realm is prenatal genetic testing, which can identify chromosome abnormalities and genetic mutations in the fetus, giving parents-to-be important information about whether their baby has certain genetic disorders. PCR can also be used as a preimplantation genetic diagnosis tool to screen embryos for in vitro fertilization (IVF) procedures.
- Forensic science – Our unique genetic fingerprints mean that PCR can be instrumental in both paternity testing and forensic investigations to pinpoint the sources of samples. Small DNA samples isolated from a crime scene can be compared with a DNA database or with suspects' DNA, for example. These procedures have changed the way police investigations are carried out. Authenticity testing also makes use of PCR genetic markers, for example, to determine the species from which meat is derived. Molecular archaeology, too, utilizes PCR to amplify DNA from archaeological remains.
- Environmental microbiology and food safety – Detection of pathogens by PCR, not only in patients' samples but also in matrices like food or water, can be vital in diagnosing and preventing infectious disease.

PCR is the benchmark technology for detecting nucleic acids in every area, from biomedical research to forensic applications. Kary Mullis's idea, written on the back of a receipt on the side of the road, turned out to be a revolutionary one.

References

1. Chien A, Edgar DB, Trela JM. Deoxyribonucleic acid polymerase from the extreme thermophile Thermus aquaticus. J Bacteriol 1976;127(3):1550-57. doi: 10.1128/JB.127.3.1550-1557.1976

2. Saiki RK, Scharf S, Faloona F, et al. Enzymatic amplification of beta-globin genomic sequences and restriction site analysis for diagnosis of sickle cell anemia. Science 1985;230(4732):1350 doi: 10.1126/science.2999980

3. Arya M, Shergill IS, Williamson M, Gommersall L, Arya N, Patel HRH. Basic principles of real-time quantitative PCR. Expert Review of Molecular Diagnostics 2005;5(2):209-19 doi: 10.1586/14737159.5.2.209

4. Bachman J. Chapter Two - Reverse-Transcription PCR (RT-PCR). In: Lorsch J, ed. Methods in Enzymology: Academic Press, 2013:67-74. doi: 10.1016/B978-0-12-420037-1.00002-6

5. Morley AA. Digital PCR: A brief history. Biomol Detect Quantif 2014;1(1):1-2 doi: 10.1016/j.bdq.2014.06.001

6. Taylor SC, Laperriere G, Germain H. Droplet Digital PCR versus qPCR for gene expression analysis with low abundant targets: from variable nonsense to publication quality data. Scientific Reports 2017;7(1):2409 doi: 10.1038/s41598-017-02217-x

7. Ahrberg CD, Manz A, Chung BG. Polymerase chain reaction in microfluidic devices. Lab on a Chip 2016;16(20):3866-84 doi: 10.1039/C6LC00984K

8. Garibyan L, Avashia N. Polymerase chain reaction. J Invest Dermatol 2013;133(3):1-4 doi: 10.1038/jid.2013.1

9. VanGuilder HD, Vrana KE, Freeman WM. Twenty-five years of quantitative PCR for gene expression analysis. BioTechniques 2008;44(5):619-26 doi: 10.2144/000112776


Interview: Elizabeth Kolbert on why we'll never stop messing with nature – Grist

In Australia, scientists collect buckets of coral sperm, mixing one species with another in an attempt to create a new super coral that can withstand rising temperatures and acidifying seas. In Nevada, scientists nurse a tiny colony of one-inch-long Devils Hole pupfish in an uncomfortably hot, Styrofoam-molded pool. And in Massachusetts, Harvard University scientists research injecting chemicals into the atmosphere to dim the sun's light and slow down the runaway pace of global warming.

These are some of the scenes from Elizabeth Kolbert's new book, Under a White Sky, a global exploration of the ways that humanity is attempting to engineer, fix, or reroute the course of nature in a climate-changed world. (The title refers to one of the consequences of engineering the Earth to better reflect sunlight: Our usual blue sky could turn a pale white.)

Kolbert, a New Yorker staff writer, has been covering the environment for decades: Her first book, Field Notes from a Catastrophe, traced the scientific evidence for global warming from Greenland to Alaska; her second, The Sixth Extinction, followed the growing pace of animal extinctions.

Under a White Sky covers slightly different ground. Humanity is now, Kolbert explains, in the midst of the Anthropocene – a geologic era in which we are the dominant force shaping earth, sea, and sky. Faced with that reality, humans have gotten more creative at using technology to fix the problems that we unwittingly spawned: stamping out Australia's cane toad invasion with genetic engineering, for example, or using giant air conditioners to suck carbon dioxide out of the air and turn it into rock. As Kolbert notes, tongue-in-cheek: What could possibly go wrong?

This interview has been condensed and lightly edited for clarity.

Q. Under a White Sky is about a lot of things – rivers, solar geoengineering, coral reefs – but it's also about what nature means in our current world. What got you interested in that topic?

A. All books have complicated births, as it were. But about four years ago, I went to Hawaii to report on a project that had been nicknamed the super coral project. And it was run by a very charismatic scientist named Ruth Gates, who very sadly passed away about two years ago. We have very radically altered the oceans by pouring hundreds of billions of tons of CO2 into the air, and we can't get that heat out of the oceans in any foreseeable timescale. We can't change the chemistry back. And if we want coral reefs in the future, we're going to have to counter what we've done to the oceans by remaking reefs so they can withstand warmer temperatures. The aim of the project was to see if you could hybridize or crossbreed corals to get more vigorous varieties.

This idea that we have to counteract one form of intervention in the natural world (climate change) with another form of intervention (trying to recreate reefs) just struck me as a very interesting new chapter in our long and very complicated relationship with nature. And once I started to think about it that way, I started to see that as a pretty widespread pattern. That's really what prompted the book.

Q. Some of these human interventions to save nature seem hopeful and positive, and others go wrong in pretty epic ways. How do you balance those two types of stories?

A. The book starts with examples that probably will strike many Grist readers as "OK, that makes sense. That makes sense." But it goes from regional engineering solutions through biotechnology, through gene editing, and all the way up to solar geoengineering. So it kind of leads you down what we might call a slippery slope. And one of the interesting things about these cases is that they will divide up people differently. Even people who consider themselves environmentalists will come down on different sides of some of these technologies. The bind we're in is so profound that there's no right answer.

Q. So someone who accepts what we're doing to save the Devils Hole pupfish might not necessarily accept gene-editing mosquitoes or dimming the sun through solar geoengineering.

A. Exactly. And I think sometimes those lines seem clearer than they are once you start to think about it.

Q. At one point in the book, there's a quote that is (apocryphally) attributed to Einstein: "We cannot solve our problems with the same thinking we used when we created them." But you don't say whether you agree with that sentiment or not. Is that on purpose?

A. Yeah, you can read the book and say, "I'm really glad people are doing these things, and I feel better." Or you can read the book and say, as one scientist quoted in the book does, "This is a broad highway to hell." And both of those are very valid reactions.

Q. When you write about geoengineering, you point out that many scientists conclude that it's necessary to avoid catastrophic levels of warming, but that it could also be a really bad idea. Do you think that in 15 or 20 years you'll be writing about a geoengineering experiment gone wrong, much as you're writing now about failed attempts to protect Louisiana from flooding?

A. I might argue about the timescales. I'm not sure I'll be reporting on it in 15 years, but I think you might be reporting on it in 30 years.

At the moment, it's still the realm of sci-fi, and I'm not claiming to have any particular insight into how people are going to respond in the future. But the case that's made in the book by some very smart scientists is that we don't have very many tools in our toolbox for dealing with climate change quickly, because the system has so much inertia. It's like turning around a supertanker: It takes literally decades, even if we do everything absolutely right.

Q. You've reported on climate change for a long time. How does it feel to see geoengineering being explored as a more valuable and potentially necessary option?

A. Well, one thing I learned in the course of reporting the book was that what we now refer to as geoengineering was actually the very first thing that people started to think about when they realized we were warming the climate. The very first report about climate change that was handed to Lyndon Johnson in 1965 wasn't about how we should stop emitting – it was: Maybe we should find some reflective stuff to throw into the ocean to bounce more sunlight back into space!

It's odd, it's kind of almost freakish, and I can't explain it, except to say that it sort of fits the pattern of the book.

Q. There's been a longstanding fight in environmentalism between a technology-will-save-us philosophy and a return-to-nature philosophy. Based on the reporting in this book, do you think that the technology camp has won?

A. I think the book is an attempt to take on both of those schools of thought. On some level, technology has won – even people who would say don't do geoengineering still want to put up solar panels and build huge arrays of batteries, and those are technologies! But where does that leave us? It goes back to Ruth Gates and the super coral project. There was a big fight among coral biologists about whether a project like that should even be pursued. The Great Barrier Reef is the size of Italy – even if you have some replacement coral, how are you going to get them out on the reef? But Gates's point was, we're not returning. Even if we stopped emitting CO2 tomorrow, you're not getting the Great Barrier Reef back as it was in a foreseeable timeframe.

My impulse as an old-school environmentalist is to say, "Well, let's just leave things alone." But the sad fact is that we've intervened so much at this point that even not intervening is itself an intervention.

Q. Now that we have a U.S. president who takes climate change seriously, do you think we could actually start cutting carbon emissions quickly?

A. I really do want to applaud the first steps that the Biden administration has taken. I think they show a pretty profound understanding of the problem. But the question, and it's a big one, is: What are the limits? Will Congress do anything? What will happen in the Supreme Court? The U.S. is no longer the biggest emitter on an annual basis, but on a cumulative basis we're still the biggest. And we still don't have resolution on how much CO2 we can put up there to avoid 1.5 or 2 degrees Celsius of warming. Those are questions with big error bars. If we're lucky, I think we can avoid disastrous climate change. But if we're not lucky, we're already in deep trouble.

Q. Is there anything else you want to say about the book?

A. It sounds kind of weird after our conversation, but the book was actually a lot of fun to write. It sounds odd when you're talking about a book where the subject is so immensely serious.

Q. You mean like when the undergraduates in Australia are tossing each other buckets of coral sperm?

A. Yes! There is always humor in all these situations. I hope that sense of fun comes through.



Genomics and genre – Science

"If the double helix is an icon of the modern age, then the genome is one of the last grand narratives of modernity," writes Lara Choksey in her new book, Narrative in the Age of the Genome. Hybridizing literary criticism with a genre-spanning consideration of a dozen distinct literary works, and imbued throughout with deep concern for the peripheral, the possible, and the political, the book seeks to challenge the whole imaginative apparatus for constructing the self into a coherent narrative, via the lexicon and syntax of the molecular.

To a reading of Richard Dawkins's The Selfish Gene (1976) as a repudiation of class struggle and E. O. Wilson's Sociobiology (1975) as a defense of warfare, Choksey juxtaposes another kind of ambiguous heterotopia in which genetic engineering is a tool of neoliberal self-fashioning. In Samuel R. Delany's Trouble on Triton (1976), Bron, a transgender ex-gigolo turned informatics expert, is caught between sociobiology and the selfish gene, between the liberal developmentalism of progressive evolution, and the neoliberal extraction and rearrangement of biological information. Even the undulating interruptions and parentheticals of Bron's thoughts [mimic] the description of the activation and silencing of genes, she suggests, tying together gene and genre in a way that encapsulates neoliberal alienation.

Choksey next explores the ways in which collectivist fantasies of biological reinvention under Soviet Lysenkoism fused code and cultivation, through a close reading of Arkady and Boris Strugatsky's Roadside Picnic (1972), in which cultivated utopian dreamworlds become contaminated by alien forces, resulting in fundamental ecological transformations beyond the promised reach of human control. The novel brings to light not forgotten Soviet utopias but literal zombies and mutations. In a world where planned cultivation fails entirely in the face of the unfamiliar, even as new biological weapons are being developed, Earth itself viscerally reflects a fractured reality of lost promises – a world in crisis with all meaning gone, and survival itself a chancy proposition.

Framed as a family history, The Immortal Life of Henrietta Lacks is actually a horror story, argues Choksey.

As the promise of precision medicine emerged, so too did new forms of memoir. In Kazuo Ishiguro's Never Let Me Go (2005) and the film Gattaca (1997), for example, the traditional aspirational narrative of a pilgrim's progress is subverted: As the unitary subject disappears into data, algorithms, and commodities, a new grammar of existence emerges, albeit one in which the inherited problems of the past – racism, ableism, and the fiction of heteronormativity – remain ever-present.

In Saidiya Hartman's Lose Your Mother (2006) and Yaa Gyasi's Homegoing (2016), Choksey sees a reorientation of genomics away from the reduction of self to code and toward new forms of kinship and belonging that offer a reckoning with the histories of brutalization and displacement upon which liberal humanism is founded. Even as genomics seeks to locate the trauma of enslavement at the level of the molecular, communities seeking reunion and reparation know that technology alone cannot do the cultural work of caring for history that narrative can offer.

Reading Rebecca Skloot's The Immortal Life of Henrietta Lacks (2010) as a biography of Black horror which tries, time and again, to resolve itself as family romance, Choksey identifies the perils of narratives unable to recognize their own genre. She argues that by blurring the lines not between fact and fiction but between horror and family history, the dehumanization of Black lives as experimental biomatter echoes inescapably with larger histories of the extraction of Black flesh for the expansion of colonial-capitalist production.

What emerges as most compelling out of this entire tapestry of readings is the author's interpretation of the limits and failures of the extraordinary cultural power of the genome. Concluding that genomics has privileged a particular conception of the human that is in the process of being reconfigured, Choksey ventures that the uncomplicated subject, the Vitruvian Man of the Human Genome Project, has reached its end. What is left is neither dust, stardust, nor a face erased in the sand (as Foucault would have it) but rather whatever might emerge next from the unwieldy kaleidoscope of possible meanings.


Berkeley Lab Celebrates 90th Anniversary, Imagines the Next 90 Years | Berkeley Lab – Lawrence Berkeley National Laboratory

Ninety years ago, in August of 1931, physics professor Ernest Lawrence created the Radiation Laboratory in a modest building on the UC Berkeley campus to house his cyclotron, a particle accelerator that ushered in a new era in the study of subatomic particles. The invention of the cyclotron would go on to win Lawrence the 1939 Nobel Prize in physics.

From this start, Lawrence's unique approach of bringing together multidisciplinary teams, world-class research facilities, and bold discovery science has fueled nine decades of pioneering research at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab). His team science approach also grew into today's national laboratory system.

Over the years, as Berkeley Lab's mission expanded to cover a remarkable range of science, this approach has delivered countless solutions to challenges in energy, environment, materials, biology, computing, and physics.

And this same approach will continue to deliver breakthroughs for decades to come.

In 2021, Berkeley Lab's 90th year, we invite you to join our anniversary celebration, Berkeley Lab: The Next 90, as we celebrate our past and imagine our future.

"The pursuit of discovery science by multidisciplinary teams has brought, and will continue to bring, tremendous benefits to the nation and world," said Berkeley Lab Director Mike Witherell. "Our celebration is a chance to honor everyone who has contributed to solving human problems through science, and to imagine what we can accomplish together in the next 90 years."

Berkeley Lab's 90th anniversary celebration honors the diverse efforts of the Lab community: from scientists and engineers to administrative and operations staff.

It also celebrates our commitment to discovery science, which explores the fundamental underpinnings of the universe, materials, biology, and more. This research requires patience – the dividends can be decades in the future – but the results are often surprising and profound, from the cyclotron of yesteryear to today's CRISPR-Cas9 genetic engineering technology.

It's an incredible story we're proud to share, and inspired to continue with your support. Over the next several months, we'll offer many ways to join our celebration. Visit Berkeley Lab: The Next 90 to learn more, and engage with us on Twitter at #BerkeleyLab90.

Here are several ways to join our celebration, all highlighted on the website:

Celebrate the past

90 Breakthroughs: To celebrate Berkeley Lab's nine decades of transforming discovery science into solutions that benefit the world, we'll roll out 90 Berkeley Lab breakthroughs over the next several months.

Interactive Timeline: Explore the Lab's many remarkable achievements and events through the decades.

History and photos: Check out our decade-by-decade photo album and historical material.

Imagine the Future

Charitable giving: In 2021, Berkeley Lab will support five non-profit organizations that help prepare young scholars to become leaders and problem solvers.

Basics 2 Breakthroughs: Research at Berkeley Lab often starts with basic science, which leads to breakthroughs that help the world. In this video series, early career scientists discuss their game-changing research and what inspires them.

A Day in the Half Life: This podcast series chronicles the incredible and often unexpected ways that science evolves over time, as told by scientists who helped shape a research field, and those who will bring it into the future.

Speaker series: These monthly lectures offer a look at game-changing scientific breakthroughs of the last 90 years, highlight current research aimed at tackling the nation's most pressing challenges, and offer a glimpse into future research that will spur discoveries yet to be made.

Connect

Virtual tours: These live, interactive tours will enable you to learn more about Berkeley Lab's research efforts, hear from the scientists who conduct this important work, and peek inside our amazing facilities.

Social media: Join us on social media for fun and engaging content that will help you discover the Lab's incredible history, and learn what we're imagining for the future. #BerkeleyLab90

# # #

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.


Genetic Analysis Reveals Evolution of the Enigmatic Y Chromosome in Great Apes – SciTechDaily

Researchers have reconstructed the ancestral sequence of the great ape Y chromosome by comparing three existing (gorilla, human, and chimpanzee) and two newly generated (orangutan and bonobo) Y chromosome assemblies. The new research shows that many gene families and multi-copy sequences were already present in the great ape Y common ancestor and that the chimpanzee and bonobo lineages experienced accelerated gene death and nucleotide substitution rates after their divergence from the human lineage. Credit: Dani Zemba and Monika Cechova, Penn State

Researchers reconstruct the ancestral great ape Y and show its rapid evolution in bonobo and chimpanzee.

New analysis of the DNA sequence of the male-specific Y chromosomes from all living species of the great ape family helps to clarify our understanding of how this enigmatic chromosome evolved. A clearer picture of the evolution of the Y chromosome is important for studying male fertility in humans as well as our understanding of reproduction patterns and the ability to track male lineages in the great apes, which can help with conservation efforts for these endangered species.

A team of biologists and computer scientists at Penn State sequenced and assembled the Y chromosome from orangutan and bonobo and compared those sequences to the existing human, chimpanzee, and gorilla Y sequences. From the comparison, the team was able to clarify patterns of evolution that seem to fit with behavioral differences between the species and reconstruct a model of what the Y chromosome might have looked like in the ancestor of all great apes.

A paper describing the research was published in the journal Proceedings of the National Academy of Sciences.

"The Y chromosome is important for male fertility and contains the genes critical for sperm production, but it is often neglected in genomic studies because it is so difficult to sequence and assemble," said Monika Cechova, a graduate student at Penn State at the time of the research and co-first author of the paper. "The Y chromosome contains a lot of repetitive sequences, which are challenging for DNA sequencing, assembling sequences, and aligning sequences for comparison. There aren't out-of-the-box software packages to deal with the Y chromosome, so we had to overcome these hurdles and optimize our experimental and computational protocols, which allowed us to address interesting biological questions."

The Y chromosome is unusual. It contains relatively few genes, many of which are involved in male sex determination and sperm production; large sections of repetitive DNA, short sequences repeated over and over again; and large DNA palindromes, inverted repeats that can be many thousands of letters long and read the same forwards and backwards.

Previous work by the team comparing human, chimpanzee, and gorilla sequences had revealed some unexpected patterns. Humans are more closely related to chimpanzees, but for some characteristics, the human Y was more similar to the gorilla Y.

"If you just compare the sequence identity – comparing the As, Ts, Cs, and Gs of the chromosomes – humans are more similar to chimpanzees, as you would expect," said Kateryna Makova, Pentz Professor of Biology at Penn State and one of the leaders of the research team. "But if you look at which genes are present, the types of repetitive sequences, and the shared palindromes, humans look more similar to gorillas. We needed the Y chromosome of more great ape species to tease out the details of what was going on."

The team, therefore, sequenced the Y chromosome of a bonobo, a close relative of the chimpanzee, and an orangutan, a more distantly related great ape. With these new sequences, the researchers could see that the bonobo and chimpanzee shared the unusual pattern of accelerated rates of DNA sequence change and gene loss, suggesting that this pattern emerged prior to the evolutionary split between the two species. The orangutan Y chromosome, on the other hand, which serves as an outgroup to ground the comparisons, looked about like what you expect based on its known relationship to the other great apes.

"Our hypothesis is that the accelerated change that we see in chimpanzees and bonobos could be related to their mating habits," said Rahulsimham Vegesna, a graduate student at Penn State and co-first author of the paper. "In chimpanzees and bonobos, one female mates with multiple males during a single cycle. This leads to what we call sperm competition – the sperm from several males trying to fertilize a single egg. We think that this situation could provide the evolutionary pressure to accelerate change on the chimpanzee and bonobo Y chromosome, compared to other apes with different mating patterns, but this hypothesis, while consistent with our findings, needs to be evaluated in subsequent studies."

In addition to teasing out some of the details of how the Y chromosome evolved in individual species, the team used the set of great ape sequences to reconstruct what the Y chromosome might have looked like in the ancestor of modern great apes.

"Having the ancestral great ape Y chromosome helps us to understand how the chromosome evolved," said Vegesna. "For example, we can see that many of the repetitive regions and palindromes on the Y were already present on the ancestral chromosome. This, in turn, argues for the importance of these features for the Y chromosome in all great apes and allows us to explore how they evolved in each of the separate species."

The Y chromosome is also unusual because, unlike most chromosomes, it doesn't have a matching partner. We each get two copies of chromosomes 1 through 22, and then some of us (females) get two X chromosomes and some of us (males) get one X and one Y. Partner chromosomes can exchange sections in a process called recombination, which is important for preserving the chromosomes evolutionarily. Because the Y doesn't have a partner, it had been hypothesized that the long palindromic sequences on the Y might be able to recombine with themselves and thus still be able to preserve their genes, but the mechanism was not known.

"We used the data from a technique called Hi-C, which captures the three-dimensional organization of the chromosome, to try to see how this self-recombination is facilitated," said Cechova. "What we found was that regions of the chromosome that recombine with each other are kept in close proximity to one another spatially by the structure of the chromosome."

"Working on the Y chromosome presents a lot of challenges," said Paul Medvedev, associate professor of computer science and engineering and of biochemistry and molecular biology at Penn State and the other leader of the research team. "We had to develop specialized methods and computational analyses to account for the highly repetitive nature of the sequence of the Y. This project is truly cross-disciplinary and could not have happened without the combination of computational and biological scientists that we have on our team."

Reference: "Dynamic evolution of great ape Y chromosomes" by Monika Cechova, Rahulsimham Vegesna, Marta Tomaszkiewicz, Robert S. Harris, Di Chen, Samarth Rangavittal, Paul Medvedev and Kateryna D. Makova, 5 October 2020, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2001749117

In addition to Cechova, Makova, Vegesna, and Medvedev, the research team at Penn State included Marta Tomaszkiewicz, Robert S. Harris, Di Chen, and Samarth Rangavittal. The research was supported by the U.S. National Institutes of Health, the U.S. National Science Foundation, the Clinical and Translational Sciences Institute, the Institute of Computational and Data Sciences, the Huck Institutes of the Life Sciences, and the Eberly College of Science of the Pennsylvania State University, and by the CBIOS Predoctoral Training Program awarded to Penn State by the National Institutes of Health.


Will We Ever Fully Understand Humans' Impact on Nature? – The Nation

Elizabeth Kolbert. (Photo by John Kleiner)

To say that Earth is in crisis is an understatement. Atmospheric warming, ocean warming, ocean acidification, sea-level rise, deglaciation, desertification, eutrophication – these are just some of the by-products of our species' success that journalist Elizabeth Kolbert warns us about in her new book, Under a White Sky: The Nature of the Future. Kolbert has been studying the consequences of humanity's impact on Earth for decades as a contributor to The New Yorker and as the author of such books as the 2015 Pulitzer Prize-winning The Sixth Extinction, an exploration of the concept of extinction that posits mankind as a cataclysm as great as the asteroid that annihilated the dinosaurs.

In Under a White Sky, Kolbert ponders the nature of the future by examining a new pattern she attributes to the recursive logic of the Anthropocene: human interventions attempting to answer for past human interventions in the environment. The book chronicles the casualties of short-sighted human meddling with the planet and its resources and the present-day efforts being made to address that meddling – or, as Kolbert puts it, efforts to control the control of nature. Interviews with scientists in a wide array of disciplines – climate scientists, climate entrepreneurs, biologists, glaciologists, and geneticists – reveal a trend of projects aiming to transform nature in order to save it. From the Mojave to lava fields in Iceland, Kolbert takes readers on a globe-spanning journey to explore these projects while weighing their pros, cons, and ethical implications (the book's title refers to the way the sky could be bleached of color as a potential side effect of solar geoengineering, one of the proposed interventions to combat global warming). "The issue, at this point," Kolbert writes, "is not whether we're going to alter nature, but to what end?"

I spoke to Kolbert over the phone the day after President Joe Biden's inauguration. We talked about what it's like to write a book about a big question you don't yet have the answer to, and what it will take to undo the environmental damage incurred during the Trump years.

Naomi Elias

Naomi Elias: You describe Under a White Sky as a book about people trying to solve problems created by people trying to solve problems. Can you explain that a little?

Elizabeth Kolbert: The pattern that I'm looking at in the book is ways in which humans have intervened in – or, if you prefer, mucked around with – the natural world and then have decided that the consequences are bad and are now looking for new forms of intervention to try to solve those problems. I start with the example of the Chicago River, which was reversed in an extraordinary engineering project at the beginning of the 20th century. The Chicago River used to flow east into Lake Michigan, which also happened to be Chicago's only source of drinking water. All of Chicago's human and animal waste flowed into Lake Michigan, and there were constant outbreaks of typhoid and cholera. So Chicago decided, "Well, we really have to do something about it," and what they did was this incredible engineering project, and now it flows basically to the southwest and eventually into the Mississippi, and all of Chicago's waste flows in the same direction. When the canal that reversed the river was put into place, there was a headline in The New York Times that ran something like, "Water Flows in the Chicago River Again." It was so thick with muck that people joked a chicken could walk across it without getting its feet wet. That created a big problem that connected two huge drainage systems, the Great Lakes drainage system and the Mississippi drainage system, which has now led to all these species, including many invasive species, crossing from one basin into the other. It was having bad effects on the ecology of both systems, so to try to prevent these species from crossing from one basin to the other, they've now electrified a significant chunk of this canal. So that's an intervention, as it were, on top of an intervention, and that is really the pattern that the book explores.

NE: The book visits project sites in Iceland, Australia, New Orleans, and the California desert. What drew you to the projects you write about?

EK: The first project that got me started down this whole path was the super coral project, which is currently in Hawaii and partly in Australia. As the oceans warm, corals are having a lot of trouble surviving. We get these coral-bleaching events that I'm sure people have heard about. Some scientists were looking at how we can save coral reefs, and the idea they came up with was that we need to intervene and try to coax along evolution so that these creatures can survive climate change. That struck me as a really interesting project, and got me thinking about this question of, "Can we intervene to redress our own interventions?" Once I started seeing that pattern, I started to see it everywhere. I could have gone to many different parts of the world and written stories that made the same point, but the projects that I went to were emblematic in some way. They were taking on different issues like climate change or invasive species, the loss of wetlands – the list goes on.

NE: Did any of these efforts – be it the Harvard team trying to combat global warming by firing diamonds into the stratosphere or the group looking to reduce rodent populations with genetic manipulation – convince you that our best chance of averting climate apocalypse really is to control the control of nature? Are we digging ourselves out of a hole or just digging a deeper one?

EK: You know, you have identified the question at the center of the book. That is a question that I don't claim to answer. I'm not a prophet. I'm really trying to tease out that question in the book. Look at it, and have some fun with it, to be honest, and get people to think about the pattern. In many cases, these solutions are working to a certain extent. New Orleans would not exist without massive human intervention to solve the problems of water. In New Orleans – a city that's essentially significantly below sea level – it turns out you need flooding to keep the land from subsiding even further, because that's actually what built the land: the flooding that dropped a lot of sediment across the Mississippi Delta over many millennia. Are you getting into a trap when you pile these interventions on top of each other? Do you have alternatives? These are the big questions of our time.


NE: I'd like to talk about your feelings about the popular phrase for the geological epoch we're living in, the Anthropocene. In 2017 you gave a lecture at Manhattan's New School, in which you said, "Thinking scientifically about man's place in the world used to mean acknowledging our insignificance." This new human-centered term, the age of man, completely upends that. Can you talk about your feelings about the term and what it means for how we think about our relationship to the Earth?

EK: I think we are at this interesting turning point that's on some level been the subject of all the books I've written, and a lot of the articles as well. We first decentered humans, right? It wasn't that the sun revolved around the Earth, it was that the Earth revolved around the sun. There's a lot of these discoveries that have proved people are not the center of the universe, but then we get to the present moment, where we do have to acknowledge that we are becoming the dominant force in many very essential ways. We have to acknowledge that and, on some level, take responsibility for that. This term, the Anthropocene, is kind of a shorthand for all the ways that humans are affecting the Earth on what is sometimes called a geological scale. We are changing the carbon cycle very dramatically, we're changing the nitrogen cycle, we're acidifying the ocean. We've even got to the point where we regularly cause earthquakes. We are definitely driving evolution; we are probably driving speciation. We are at this moment of tremendous human impact and we need to rise to that challenge of thinking about what we want the world to look like now that we are such a dominant force.

NE: In the book, you take note of the way the scientists you speak to encode a sense of moral urgency into their analysis of the climate crisis, which is something I feel present in contemporary climate reporting too. You've been on the climate beat for decades. Have you felt a shift in the work? Do you feel like you now have an agenda when you write?

EK: There's definitely been a shift in the sense that, when I started out almost 20 years ago, there was still, among a lot of pretty knowledgeable people, a lot of confusion. What is climate change? Is it real? Do I have to worry about it? The conversation has moved dramatically, at least in a big chunk of the US and a big chunk of the world. But I do not consider myself an advocate. I'm a journalist and I try to report stories that I think illuminate the situation that we're in. I've thought about, you know, "Should I be writing some sort of prescriptive journalism?" But that's not really me.

NE: At the end of that same 2017 lecture you conclude, "We are the fate of Earth." You call humans ethical agents and say that we're failing as ethical agents if we don't acknowledge our impact.

EK: Yes, I certainly stand by those words. I mean, this book is on the one hand grappling with, on the other hand sort of playing around with, those questions. Our impact on the planet and the untold number of other species with whom we share the planet – and whom we frankly don't spend a lot of time thinking about, and don't even understand to a great extent – I think will come to be seen as one of the great tragedies and the great ethical failings of humanity.

NE: So many of the things you discuss in the book were set in motion long before the 2016 election, but it's hard to overstate what a setback the last four years of the Trump administration have been for the climate. An analysis from The New York Times cites over 100 environmental protections Trump reversed concerning areas like wetland and wildlife protection, and air and water pollution. In his inauguration speech yesterday, President Biden talked about answering the cry for survival Earth was letting out, and he immediately signed an order to rejoin the Paris climate accord. I'm wondering what your thoughts are on what your job is going to look like under the Biden administration, and if you think we dodged some kind of metaphorical asteroid?

EK: I think what Trump did was egregious. It was an attempt to set us off completely on the wrong trajectory. It's a very complicated situation legally, because now a lot of regulations will have to be rewritten. It's going to occupy the EPA for years, unfortunately. That's very sad and just a waste of time and of human effort, when we should be doing a lot of other things. But, you know, there are great forces at work here, and fortunately some of those continue to go in the right direction, like the tremendous decrease in prices of wind power and solar power that continued despite Donald Trump's best efforts to try to undermine renewable power. One could spend the next four years doing nothing but looking at the legal ins and outs of trying to undo that, and I think that that would be a noble thing to do. What I'm thinking about are – I don't want to call them bigger questions, but they're the questions of our human impact on the planet, which are not going to change because Joe Biden suddenly rejoined the Paris Agreement, unfortunately.
