

Category Archives: Human Genetic Engineering

Global CRISPR Gene Editing Market (2020 to 2030) – Focus on Products, Applications, End-users, Country Data and Competitive Landscape -…

DUBLIN--(BUSINESS WIRE)--The "Global CRISPR Gene Editing Market: Focus on Products, Applications, End Users, Country Data (16 Countries), and Competitive Landscape - Analysis and Forecast, 2020-2030" report has been added to ResearchAndMarkets.com's offering.

The global CRISPR gene editing market was valued at $846.2 million in 2019 and is expected to reach $10,825.1 million by 2030, registering a CAGR of 26.86% over the forecast period.
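As a quick sanity check on those figures, the standard CAGR formula can be applied; here is a minimal Python sketch, assuming the 2019 value as the base over an 11-year horizon (the release's forecast actually runs from an unstated 2020 base, so the result differs slightly from the reported 26.86%):

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: (end / start)^(1/years) - 1."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Values in $ millions, taken from the release; the 2019 base year over
# 11 years is an assumption, since the 2020 base value is not quoted.
print(f"{cagr(846.2, 10825.1, 11):.2%}")  # ~26.1%, near the reported 26.86%
```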

Genome engineering and its potential applications are expected to have a remarkable impact on the future of the healthcare and life science industries. The high efficiency of the CRISPR-Cas9 system for genome editing has been demonstrated in numerous studies, which has driven significant investment in the field. However, several limitations need to be addressed before clinical application, and many researchers are working to overcome them and improve results. The potential of CRISPR gene editing to alter the human genome and modify disease is remarkable, but it comes with ethical and social concerns.

Growth is attributed to increasing demand in the food industry for higher-quality, nutrient-enriched products and in the pharmaceutical industry for targeted treatments for various diseases. Continued investment by healthcare companies to meet this demand, along with the growing prominence of gene therapy procedures with shorter turnaround times, is also propelling the global CRISPR gene editing market.

Research organizations, pharmaceutical and biotechnology companies, and institutes are looking for more efficient genome editing technologies that increase specificity and cost-effectiveness while reducing turnaround time and human error. The evolution of genome editing technologies has enabled a wide range of applications in fields such as industrial biotechnology and agricultural research. These advanced methods are simple, efficient, and cost-effective, and they offer multiplexing and high-throughput capabilities. The growing geriatric population and the increasing number of cancer cases and genetic disorders worldwide are expected to translate into significantly higher demand in the CRISPR gene editing market.

Furthermore, companies are investing heavily in the research and development of CRISPR gene editing products and gene therapies. The number of clinical trials for genetic and chronic diseases has risen in recent years, which will fuel the CRISPR gene editing market in the future.

Within the research report, the market is segmented by product type, application, end user, and region. Each segment provides a snapshot of the market over the forecast period, revenue trends, and underlying patterns, based on analysis of the primary and secondary data obtained.

Key Questions Answered in this Report:

Market Dynamics

Growth Drivers

Restraints

Opportunities

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/rky1va


Geoengineering: What could possibly go wrong? Elizabeth Kolbert’s take, in her new book – Bulletin of the Atomic Scientists

Editor's note: This story was originally published by Grist. It appears here as part of the Climate Desk collaboration. Elizabeth Kolbert is a former member of the Science and Security Board of the Bulletin of the Atomic Scientists.

In Australia, scientists collect buckets of coral sperm, mixing one species with another in an attempt to create a new super coral that can withstand rising temperatures and acidifying seas. In Nevada, scientists nurse a tiny colony of one-inch-long Devil's Hole pupfish in an uncomfortably hot, Styrofoam-molded pool. And in Massachusetts, Harvard University scientists research injecting chemicals into the atmosphere to dim the sun's light and slow down the runaway pace of global warming.

These are some of the scenes from Elizabeth Kolbert's new book, Under a White Sky, a global exploration of the ways that humanity is attempting to engineer, fix, or reroute the course of nature in a climate-changed world. (The title refers to one of the consequences of engineering the Earth to better reflect sunlight: Our usual blue sky could turn a pale white.)

Kolbert, a New Yorker staff writer, has been covering the environment for decades: Her first book, Field Notes from a Catastrophe, traced the scientific evidence for global warming from Greenland to Alaska; her second, The Sixth Extinction, followed the growing pace of animal extinctions.

Under a White Sky covers slightly different ground. Humanity is now, Kolbert explains, in the midst of the Anthropocene, a geologic era in which we are the dominant force shaping earth, sea, and sky. Faced with that reality, humans have gotten more creative at using technology to fix the problems that we unwittingly spawned: Stamping out Australia's cane toad invasion with genetic engineering, for example, or using giant air conditioners to suck carbon dioxide out of the air and turn it into rock. As Kolbert notes, tongue-in-cheek: What could possibly go wrong?

This interview has been condensed and lightly edited for clarity.

Osaka: Under a White Sky is about a lot of things: rivers, solar geoengineering, coral reefs. But it's also about what nature means in our current world. What got you interested in that topic?

Kolbert: All books have complicated births, as it were. But about four years ago, I went to Hawaii to report on a project that had been nicknamed the super coral project. And it was run by a very charismatic scientist named Ruth Gates, who very sadly passed away about two years ago. We have very radically altered the oceans by pouring hundreds of billions of tons of carbon dioxide into the air, and we can't get that heat out of the oceans in any foreseeable timescale. We can't change the chemistry back. And if we want coral reefs in the future, we're going to have to counter what we've done to the oceans by remaking reefs so they can withstand warmer temperatures. The aim of the project was to see if you could hybridize or crossbreed corals to get more vigorous varieties.

This idea, that we have to counteract one form of intervention in the natural world (climate change) with another form of intervention (trying to recreate reefs), just struck me as a very interesting new chapter in our long and very complicated relationship with nature. And once I started to think about it that way, I started to see that as a pretty widespread pattern. That's really what prompted the book.

Osaka: Some of these human interventions to save nature seem hopeful and positive, and others go wrong in pretty epic ways. How do you balance those two types of stories?

Kolbert: The book starts with examples that probably will strike many readers as "Okay, that makes sense. That makes sense." But it goes from regional engineering solutions through biotechnology, through gene editing, and all the way up to solar geoengineering. So it kind of leads you down what we might call a slippery slope. And one of the interesting things about these cases is that they will divide up people differently. Even people who consider themselves environmentalists will come down on different sides of some of these technologies. The bind we're in is so profound that there's no right answer.

Osaka: So someone who accepts what we're doing to save the Devil's Hole pupfish might not necessarily accept gene-editing mosquitos or dimming the sun through solar geoengineering.

Kolbert: Exactly. And I think sometimes those lines seem clearer than they are once you start to think about it.

Osaka: At one point in the book, there's a quote that is (apocryphally) attributed to Einstein: "We cannot solve our problems with the same thinking we used when we created them." But you don't say whether you agree with that sentiment or not. Is that on purpose?

Kolbert: Yeah, you can read the book and say, "I'm really glad people are doing these things, and I feel better." Or you can read the book and say, as one scientist quoted in it does, "This is a broad highway to hell." And both of those are very valid reactions.

Osaka: When you write about geoengineering, you point out that many scientists conclude that it's necessary to avoid catastrophic levels of warming, but that it could also be a really bad idea. Do you think that in 15 or 20 years you'll be writing about a geoengineering experiment gone wrong, much as you're writing now about failed attempts to protect Louisiana from flooding?

Kolbert: I might argue about the timescales. I'm not sure I'll be reporting on it in 15 years, but I think you might be reporting on it in 30 years.

At the moment, it's still the realm of sci-fi, and I'm not claiming to have any particular insight into how people are going to respond in the future. But the case that's made in the book by some very smart scientists is that we don't have very many tools in our toolbox for dealing with climate change quickly, because the system has so much inertia. It's like turning around a supertanker: It takes literally decades, even if we do everything absolutely right.

Osaka: You've reported on climate change for a long time. How does it feel to see geoengineering being explored as a more valuable, and potentially necessary, option?

Kolbert: Well, one thing I learned in the course of reporting the book was that what we now refer to as geoengineering was actually the very first thing that people started to think about when they first realized we were warming the climate. The very first report about climate change that was handed to Lyndon Johnson in 1965 wasn't about how we should stop emitting; it was: Maybe we should find some reflective stuff to throw into the ocean to bounce more sunlight back into space!

It's odd, it's kind of almost freakish, and I can't explain it, except to say that it sort of fits the pattern of the book.

Osaka: There's been a longstanding fight in environmentalism between a technology-will-save-us philosophy and a return-to-nature philosophy. Based on the reporting in this book, do you think that the technology camp has won?

Kolbert: I think the book is an attempt to take on both of those schools of thought. On some level, technology has won; even people who would say don't do geoengineering still want to put up solar panels and build huge arrays of batteries, and those are technologies! But where does that leave us? It goes back to Ruth Gates and the super coral project. There was a big fight among coral biologists about whether a project like that should even be pursued. The Great Barrier Reef is the size of Italy; even if you have some replacement coral, how are you going to get them out on the reef? But Gates's point was, we're not returning. Even if we stopped emitting carbon dioxide tomorrow, you're not getting the Great Barrier Reef back as it was in a foreseeable timeframe.

My impulse as an old-school environmentalist is to say, "Well, let's just leave things alone." But the sad fact is that we've intervened so much at this point that even not intervening is itself an intervention.

Osaka: Now that we have a US president who takes climate change seriously, do you think we could actually start cutting carbon emissions quickly?

Kolbert: I really do want to applaud the first steps that the Biden administration has taken. I think they show a pretty profound understanding of the problem. But the question, and it's a big one, is What are the limits? Will Congress do anything? What will happen in the Supreme Court? The United States is no longer the biggest emitter on an annual basis, but on a cumulative basis we're still the biggest. And we still don't have resolution on how much carbon dioxide we can put up there to avoid 1.5 or 2 degrees Celsius (2.7 or 3.6 degrees Fahrenheit) of warming. Those are questions with big error bars. If we're lucky, I think we can avoid disastrous climate change. But if we're not lucky, we're already in deep trouble.

Osaka: Is there anything else you want to say about the book?

Kolbert: It sounds kind of weird after our conversation, but the book was actually a lot of fun to write. It sounds odd when you're talking about a book where the subject is so immensely serious.

Osaka: You mean like when the undergraduates in Australia are tossing each other buckets of coral sperm?

Kolbert: Yes! There is always humor in all these situations. I hope that sense of fun comes through.



An Introduction to PCR – Technology Networks

Polymerase chain reaction (PCR) is a technique that has revolutionized the world of molecular biology and beyond. In this article, we will discuss a brief history of PCR and its principles, highlighting the different types of PCR and the specific purposes to which they are being applied.

In 1983, American biochemist Kary Mullis was driving home late at night when a flash of inspiration struck him. He wrote on the back of a receipt the idea that would eventually win him the Nobel Prize in Chemistry in 1993. The concept was straightforward: reproducing in a laboratory tube the DNA replication process that takes place in cells. The outcome is the same: the generation of new DNA strands complementary to the existing ones.

Mullis used the basis of Sanger's DNA sequencing as a starting point for his new technique. He realized that the repeated use of DNA polymerase triggered a chain reaction resulting in the amplification of a specific DNA segment.

The foundations for his idea were laid by the discovery in 1976 of a thermostable DNA polymerase, Taq, isolated from the bacterium Thermus aquaticus found in hot springs [1]. Taq DNA polymerase has a temperature optimum of 72 °C and survives prolonged exposure to temperatures as high as 96 °C, meaning that it can tolerate several denaturation cycles.

Before the discovery of Taq polymerase, molecular biologists were already trying to optimize cyclic DNA amplification protocols, but they needed to add fresh polymerase at each cycle because the enzyme could not withstand the high temperatures needed for DNA denaturation. Having a thermostable enzyme meant that they could repeat the amplification process many times over without the need for fresh polymerase at every cycle, making the whole process scalable, more efficient and less time-consuming.

The first description of this polymerase chain reaction (PCR) using Taq polymerase was published in Science in 1985 [2].

In 1993, the first FDA-approved PCR kit came to market. Since then, PCR has been steadily and systematically improved. It has become a game-changer in everything from forensic evidence analysis and diagnostics, to disease monitoring and genetic engineering. It is undoubtedly considered one of the most important scientific advances of the 20th century.

PCR is used to amplify a specific DNA fragment, the target DNA, from a complex mixture of starting material called template DNA. The sample preparation and purification protocols depend on the starting material, including the sample matrix and the accessibility of the target DNA. Often, minimal DNA purification is needed. However, PCR does require knowledge of the sequences that flank the fragment to be amplified.

From a practical point of view, a PCR experiment is relatively straightforward and can be completed in a few hours. In general, a PCR reaction needs five key reagents:

DNA to be amplified: also called the PCR template or template DNA. This DNA can be of any source, such as genomic DNA (gDNA), cDNA, or plasmid DNA.

DNA polymerase: all PCR reactions require a DNA polymerase that can work at high temperatures. Taq polymerase is commonly used; it can incorporate nucleotides at a rate of 60 bases/second at 70 °C and can amplify templates of up to 5 kb, making it suitable for standard PCR without special requirements. New generations of polymerases are being engineered to improve reaction performance. For example, some are engineered to be activated only at high temperatures, to reduce non-specific amplification at the beginning of the reaction. Others incorporate a proofreading function, which is important when the amplified sequence must match the template sequence exactly, such as during cloning.

Primers: DNA polymerases require a short sequence of nucleotides to indicate where they should initiate amplification. In PCR, these sequences are called primers and are short pieces of single-stranded DNA (approximately 15-30 bases). When designing a PCR experiment, the researcher determines the region of DNA to be amplified and designs a pair of primers, one on the forward strand and one on the reverse, that specifically flanks the target region. Primer design is a key component of a PCR experiment and should be done carefully. Primer sequences must be chosen to target the unique DNA of interest, avoiding the possibility of binding to a similar sequence. They should have similar melting temperatures because the annealing step occurs simultaneously for both strands. The melting temperature of a primer rises with the percentage of guanine (G) and cytosine (C) bases relative to adenine (A) and thymine (T), and adjusting primer lengths can help compensate for this when matching a primer pair (a rough calculation is sketched after this list). It is also important to avoid sequences that tend to form secondary structures or primer dimers, as these reduce PCR efficiency. Many free online tools are available to aid in primer design.

Deoxynucleotide triphosphates (dNTPs): these serve as the building blocks to synthesize the new strands of DNA and include the four basic DNA nucleotides (dATP, dCTP, dGTP, and dTTP). dNTPs are usually added to the PCR reaction in equimolar amounts for optimal base incorporation.

PCR buffer: the PCR buffer ensures that optimal conditions are maintained throughout the PCR reaction. Its major components are magnesium chloride (MgCl2), Tris-HCl, and potassium chloride (KCl). MgCl2 serves as a cofactor for the DNA polymerase, while Tris-HCl and KCl maintain a stable pH during the reaction.

The PCR reaction is carried out in a single tube by mixing the reagents mentioned above and placing the tube in a thermal cycler. The amplification consists of three defined sets of times and temperatures, termed steps: denaturation, annealing, and extension (Figure 1).
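As an illustration of those primer-design considerations, here is a minimal Python sketch using two rough textbook approximations, the Wallace rule for very short primers and a simple length-corrected formula for longer ones (the primer sequences are hypothetical, and real design tools use nearest-neighbor thermodynamic models instead):

```python
def gc_content(primer: str) -> float:
    """Fraction of bases in the primer that are G or C."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def melting_temp(primer: str) -> float:
    """Rough melting temperature (Tm) estimate in degrees Celsius.

    Wallace rule (2 degrees per A/T, 4 degrees per G/C) for primers
    shorter than 14 bases; a common length-corrected approximation
    otherwise. Both are coarse compared with nearest-neighbor models.
    """
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    at = p.count("A") + p.count("T")
    if len(p) < 14:
        return 2.0 * at + 4.0 * gc
    return 64.9 + 41.0 * (gc - 16.4) / len(p)

# Hypothetical forward/reverse primer pair flanking a target region.
for name, primer in [("forward", "AGCGGATAACAATTTCACACAGGA"),
                     ("reverse", "CGCCAGGGTTTTCCCAGTCACGAC")]:
    print(f"{name}: GC {gc_content(primer):.0%}, Tm ~{melting_temp(primer):.1f} C")
```

A matched pair should come out with Tm values within a couple of degrees of each other, so a single annealing temperature suits both.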

Figure 1: Steps of a single PCR cycle.

This set of three steps, termed a cycle, is repeated 30-40 times, doubling the amount of target DNA at each cycle and achieving exponential amplification (Figure 2).

Figure 2: The different stages and cycles of DNA molecule amplification by PCR.

Let's take a closer look at each step.

The first step of PCR, called denaturation, heats the template DNA to 95 °C for a few seconds, separating the two DNA strands as the hydrogen bonds between them are rapidly broken.

The reaction mixture is then cooled for 30 seconds to 1 minute. Annealing temperatures are usually 50-65 °C; however, the exact optimal temperature depends on the primers' length and sequence, and it must be carefully optimized with every new set of primers.

The two DNA strands could rejoin at this temperature, but most do not, because the mixture contains a large excess of primers that bind, or anneal, to the template DNA at specific, complementary positions. As annealing completes, hydrogen bonds form between the template DNA and the primers, and the polymerase is ready to extend the DNA sequence.

The temperature is then raised to the ideal working temperature for the DNA polymerase present in the mixture, typically around 72 °C (74 °C in the case of Taq).

The DNA polymerase attaches to one end of each primer and synthesizes new strands of DNA, complementary to the template DNA. Now we have four strands of DNA instead of the two that were present to start with.

The temperature is raised back to 94 °C, and the double-stranded DNA molecules, both the "original" molecules and the newly synthesized ones, denature again into single strands. This begins the second cycle of denaturation-annealing-extension. At the end of this second cycle, there are eight molecules of single-stranded DNA. By repeating the cycle 30 times, each double-stranded DNA molecule present at the beginning is converted into over 130 million new double-stranded molecules (ideal doubling would give 2^30, roughly a billion, but per-cycle efficiency falls below 100% in practice), each one a copy of the region of the starting molecule delineated by the annealing sites of the two primers.
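The arithmetic behind those numbers is simple exponential growth. A minimal sketch, idealized in that real reactions run below 100% per-cycle efficiency and eventually plateau as primers and dNTPs are consumed:

```python
def copies_after(cycles: int, start_copies: int = 1, efficiency: float = 1.0) -> float:
    """Target copies after n cycles; efficiency 1.0 means perfect doubling."""
    return start_copies * (1.0 + efficiency) ** cycles

print(f"{copies_after(30):,.0f}")                  # ideal: 2**30 = 1,073,741,824
print(f"{copies_after(30, efficiency=0.9):,.0f}")  # 90% efficiency: ~231 million
```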

To determine if amplification has been successful, PCR products may be visualized using gel electrophoresis, indicating amplicon presence/absence, size and approximate abundance. Depending on the application and the research question, this may be the endpoint of an experiment, for example, if determining whether a gene is present or not. Otherwise, the PCR product may just be the starting point for more complex downstream investigations such as sequencing and cloning.

Thanks to their versatility, PCR techniques have evolved over recent years, leading to the development of several different types of PCR technology.

Some of the most widely used ones are:

One of the most useful developments has been quantitative real-time PCR, or qPCR. As the name suggests, qPCR is a quantitative technique that allows real-time monitoring of the amplification process and detection of PCR products as they are made [3]. It can be used to determine the starting concentration of the target DNA, negating the need for gel electrophoresis in many cases. This is achieved thanks to the inclusion of non-specific fluorescent intercalating dyes, such as SYBR Green, that fluoresce when bound to double-stranded DNA, or of sequence-specific fluorescent oligonucleotide probes, such as hydrolysis (TaqMan) probes and molecular beacons. Probes bind specifically to DNA target sequences within the amplicon and use the principle of Förster Resonance Energy Transfer (FRET) to generate fluorescence via the coupling of a fluorescent molecule on one end and a quencher at the other end. For both fluorescent dyes and probes, as the number of copies of the target DNA increases, the fluorescence level increases proportionally, allowing real-time quantification of the amplification with reference to standards containing known copy numbers (Figure 3).

qPCR uses specialized thermal cyclers equipped with fluorescent detection systems that monitor the fluorescent signal as the amplification occurs.

Figure 3: Example qPCR amplification plot and standard curve used to enable quantification of copy number in unknown samples.
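To make the standard-curve idea concrete, here is a minimal sketch with invented Cq values. The relationships used, Cq = slope x log10(copies) + intercept and amplification efficiency E = 10^(-1/slope) - 1, are the standard ones; an ideal assay has a slope near -3.32, i.e. about 100% efficiency:

```python
import math

# Hypothetical standard curve: quantification cycle (Cq) measured for
# dilutions of a standard with known copy number.
standards = [(1e6, 15.1), (1e5, 18.4), (1e4, 21.8), (1e3, 25.2)]  # (copies, Cq)

# Least-squares fit of Cq = slope * log10(copies) + intercept.
xs = [math.log10(copies) for copies, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
efficiency = 10 ** (-1.0 / slope) - 1  # 1.0 would mean perfect doubling

def copies_from_cq(cq: float) -> float:
    """Read an unknown sample's starting copy number off the fitted curve."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {efficiency:.0%}")
print(f"unknown at Cq 20.0: ~{copies_from_cq(20.0):,.0f} copies")
```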

Reverse transcription PCR (RT-PCR) and RT-qPCR are two commonly used PCR variants enabling gene transcription analysis and quantification of viral RNA in both clinical and research settings.

RT is the process of making cDNA from a single-stranded RNA template [4] and is consequently also called first-strand cDNA synthesis. The first step of RT-PCR is to synthesize a DNA/RNA hybrid between the RNA template and a DNA oligonucleotide primer. The reverse transcriptase enzyme that catalyzes this reaction has RNase activity that then degrades the RNA portion of the hybrid. Subsequently, a single-stranded DNA molecule is synthesized by the DNA polymerase activity of the reverse transcriptase. High-purity, high-quality starting RNA is essential for a successful RT-PCR.

RT-PCR can be performed following two approaches: one-step RT-PCR and two-step RT-PCR. In the first case, the RT reaction and the PCR reaction occur in the same tube, while in the two-step RT-PCR, the two reactions are separate and performed sequentially.

The reverse transcription described above often serves as the first step in qPCR too, quantifying RNA in biological samples (either RNA transcripts or derived from viral RNA genomes).

As with RT-PCR, there are two approaches for quantifying RNA by RT-qPCR: one-step RT-qPCR and two-step RT-qPCR. In both cases, RNA is first reverse-transcribed into cDNA, which is used as the template for qPCR amplification. In the two-step method, the reverse transcription and the qPCR amplification occur sequentially as two separate experiments. In the one-step method, RT and qPCR are performed in the same tube.

Digital PCR (dPCR) is another adaptation of the original PCR protocol [5]. Like qPCR, dPCR uses a DNA polymerase to amplify target DNA from a complex sample using a primer set and probes. The main difference lies in the partitioning of the PCR reactions and in how the data are acquired at the end.

dPCR, including its droplet-based form, droplet digital PCR (ddPCR), is based on the concept of limiting dilutions. The PCR reaction is split into a large number of nanoliter-sized sub-reactions (partitions), and the amplification is carried out within each partition. Some partitions may contain one or more copies of the target, while others may contain none, so after PCR each partition is classified as either positive (target detected) or negative (target not detected), providing a digital output. Poisson statistics applied to the fraction of positive partitions then yield the absolute number of target copies in the original sample.
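A minimal sketch of that Poisson step, with hypothetical partition counts and droplet volume: because a positive partition may have started with more than one target copy, the mean copies per partition is estimated as lambda = -ln(1 - p), where p is the fraction of positive partitions, rather than as p itself.

```python
import math

def copies_per_partition(positives: int, total: int) -> float:
    """Poisson-corrected mean target copies per partition."""
    p = positives / total
    return -math.log(1.0 - p)

# Hypothetical ddPCR-style run: 20,000 partitions of ~0.85 nL each.
lam = copies_per_partition(positives=4200, total=20000)
concentration = lam / 0.85e-3  # copies per microliter (0.85 nL = 0.85e-3 uL)
print(f"lambda {lam:.3f} copies/partition, ~{concentration:.0f} copies/uL")
```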

ddPCR is a recent technology that became available in 2011 [6]. It utilizes a water-oil emulsion to form the partitions that separate the template DNA molecules; the droplets essentially serve as individual test tubes in which the PCR reaction takes place.

The recent development of microfluidic handling systems with microchannels and microchambers has paved the way for a range of practical applications, including the amplification of DNA via PCR on microfluidic chips.

PCR performed on a chip benefits from the advantages of microfluidics in speed, sensitivity, and low consumption of reagents. These features make microfluidic PCR particularly appealing for point-of-care testing, for example in diagnostics applications. From a practical point of view, the sample flows through a microfluidic channel, repeatedly passing through the three temperature zones corresponding to the different steps of PCR. It takes just 90 seconds for a 10 μL sample to undergo 20 PCR cycles [7]. The subsequent analysis can then easily be carried out off-chip.

The different PCR approaches all have advantages and disadvantages that affect the applications to which they are suited [8]. These are summarized in Table 1.

PCR
Advantages: easiest PCR to perform; low cost of equipment and reagents; several downstream applications (e.g., cloning).
Limitations: results are only qualitative; requires post-amplification analyses that increase time and risk of error; products may need to be confirmed by sequencing.

qPCR
Advantages: produces quantitative results; probe use can ensure high specificity; high analytical sensitivity; low turnaround time; eliminates the need for post-amplification analysis.
Limitations: requires more expensive reagents and equipment; less flexibility in primer and probe selection; less amenable to downstream product confirmation (such as sequencing) due to the short amplicon; not suitable for some downstream applications such as cloning.

RT-PCR and RT-qPCR
Advantages: can be used with all RNA types.
Limitations: RNA is prone to degradation; the RT step may increase the time and potential for contamination.

dPCR and ddPCR
Advantages: fast; no DNA purification step; provides absolute quantification; increased sensitivity for detecting the target in limited clinical samples; highly scalable.
Limitations: costly; based on several statistical assumptions.

Microfluidic PCR
Advantages: accelerated PCR process; reduced reagent consumption; can be adapted for high throughput; portable devices for point-of-care applications; allows single-cell analysis.
Limitations: still a very new technology; requires extensive sample preparation to remove debris and unwanted compounds; restricted choice of materials for the microfluidic device due to high temperatures.

Table 1: Key advantages and disadvantages of different PCR approaches.

PCR has become an indispensable tool in modern molecular biology and has completely transformed scientific research. The technique has also opened up the investigation of cellular and molecular processes to those outside the field of molecular biology, and it consequently finds utility among scientists in many disciplines.

Whilst PCR is itself a powerful standalone technique, it has also been incorporated into wider techniques, such as cloning and sequencing, as one small but important part of these workflows.

Research applications of PCR include:

Gene transcription: PCR can examine variations in gene transcription among cell types, tissues, and organisms at a specific time point. In this process, RNA is isolated from samples of interest and reverse-transcribed into cDNA. The original level of RNA for a specific gene can then be quantified from the amount of cDNA amplified in PCR.

Genotyping: PCR can detect sequence variations in alleles of specific cells or organisms. A common example is the genotyping of transgenic organisms, such as knock-out and knock-in mice. In this application, primers are designed to amplify either a portion of the transgene (in a transgenic animal) or the mutation (in a mutant animal).

Cloning and mutagenesis: PCR cloning is a widely used technique in which double-stranded DNA fragments amplified by PCR (from sources such as gDNA or cDNA) are inserted into vectors such as plasmids. This, for example, enables the creation of bacterial strains from which genetic material has been deleted or inserted. Site-directed mutagenesis can also be used to introduce point mutations via cloning. This often employs a technique known as recombinant PCR, in which overlapping primers are specifically designed to incorporate base substitutions (Figure 4). This technique can also be used to create novel gene fusions.

Figure 4: Diagram depicting an example of recombinant PCR.

Sequencing: PCR can be used to enrich template DNA for sequencing. The type of PCR recommended for the preparation of sequencing templates is called high-fidelity PCR, as it maintains DNA sequence accuracy. In Sanger sequencing, PCR-amplified fragments are purified and run in a sequencing reaction. In next-generation sequencing (NGS), PCR is used at the library preparation stage, where DNA samples are enriched by PCR to increase the starting quantity and tagged with sequencing adaptors to allow multiplexing. Bridge PCR is also an important part of second-generation NGS.

Both as an independent technique and as a workhorse within other methods, PCR has transformed a range of disciplines. These include:

Genetic research: PCR is used in most laboratories worldwide. One of the most common applications is gene transcription analysis [9], aimed at evaluating the presence or abundance of particular gene transcripts. It is also a powerful technique for manipulating the genetic sequence of organisms (animal, plant, and microbe) through cloning. This enables genes or sections of genes to be inserted, deleted, or mutated to engineer in genetic markers, alter phenotypes, elucidate gene functions, and develop vaccines, to name but a few. In genotyping, PCR can be used to detect sequence variations in alleles in specific cells or organisms, and its use isn't restricted to humans: genotyping plants in agriculture assists plant breeders in selecting, refining, and improving their breeding stock. PCR is also the first step in enriching sequencing samples, as discussed above. For example, most mapping techniques in the Human Genome Project (HGP) relied on PCR.

Medicine and biomedical research: PCR is used in a host of medical applications, from diagnostic testing for disease-associated genetic mutations to the identification of infectious agents. Another great example of PCR use in the medical realm is prenatal genetic testing, which can identify chromosome abnormalities and genetic mutations in the fetus, giving parents-to-be important information about whether their baby has certain genetic disorders. PCR can also be used as a preimplantation genetic diagnosis tool to screen embryos for in vitro fertilization (IVF) procedures.

Forensic science: our unique genetic fingerprints mean that PCR can be instrumental in both paternity testing and forensic investigations to pinpoint the sources of samples. Small DNA samples isolated from a crime scene can be compared with a DNA database or with suspects' DNA, for example; these procedures have changed the way police investigations are carried out. Authenticity testing also makes use of PCR genetic markers, for example to determine the species from which meat is derived. Molecular archaeology, too, uses PCR to amplify DNA from archaeological remains.

Environmental microbiology and food safety: detection of pathogens by PCR, not only in patients' samples but also in matrices like food or water, can be vital in diagnosing and preventing infectious disease.

PCR is the benchmark technology for detecting nucleic acids in every area, from biomedical research to forensic applications. Kary Mullis's idea, written on the back of a receipt on the side of the road, turned out to be a revolutionary one.

References

1. Chien A, Edgar DB, Trela JM. Deoxyribonucleic acid polymerase from the extreme thermophile Thermus aquaticus. J Bacteriol. 1976;127(3):1550-57. doi: 10.1128/JB.127.3.1550-1557.1976

2. Saiki RK, Scharf S, Faloona F, et al. Enzymatic amplification of beta-globin genomic sequences and restriction site analysis for diagnosis of sickle cell anemia. Science 1985;230(4732):1350 doi: 10.1126/science.2999980

3. Arya M, Shergill IS, Williamson M, Gommersall L, Arya N, Patel HRH. Basic principles of real-time quantitative PCR. Expert Review of Molecular Diagnostics 2005;5(2):209-19 doi: 10.1586/14737159.5.2.209

4. Bachman J. Chapter Two - Reverse-Transcription PCR (RT-PCR). In: Lorsch J, ed. Methods in Enzymology. Academic Press; 2013:67-74. doi: 10.1016/B978-0-12-420037-1.00002-6

5. Morley AA. Digital PCR: A brief history. Biomol Detect Quantif 2014;1(1):1-2 doi: 10.1016/j.bdq.2014.06.001

6. Taylor SC, Laperriere G, Germain H. Droplet Digital PCR versus qPCR for gene expression analysis with low abundant targets: from variable nonsense to publication quality data. Scientific Reports 2017;7(1):2409 doi: 10.1038/s41598-017-02217-x

7. Ahrberg CD, Manz A, Chung BG. Polymerase chain reaction in microfluidic devices. Lab on a Chip 2016;16(20):3866-84 doi: 10.1039/C6LC00984K

8. Garibyan L, Avashia N. Polymerase chain reaction. J Invest Dermatol 2013;133(3):1-4 doi: 10.1038/jid.2013.1

9. VanGuilder HD, Vrana KE, Freeman WM. Twenty-five years of quantitative PCR for gene expression analysis. BioTechniques 2008;44(5):619-26 doi: 10.2144/000112776


Genomics and genre – Science

"If the double helix is an icon of the modern age, then the genome is one of the last grand narratives of modernity," writes Lara Choksey in her new book, Narrative in the Age of the Genome. Hybridizing literary criticism with a genre-spanning consideration of a dozen distinct literary works, and imbued throughout with deep concern for the peripheral, the possible, and the political, the book seeks to challenge the whole imaginative apparatus for constructing the self into a coherent narrative, via the lexicon and syntax of the molecular.

To a reading of Richard Dawkins's The Selfish Gene (1976) as a repudiation of class struggle and E. O. Wilson's Sociobiology (1975) as a defense of warfare, Choksey juxtaposes another kind of ambiguous heterotopia in which genetic engineering is a tool of neoliberal self-fashioning. In Samuel R. Delany's Trouble on Triton (1976), Bron, a transgender ex-gigolo turned informatics expert, is caught between sociobiology and the selfish gene, between the liberal developmentalism of progressive evolution, and the neoliberal extraction and rearrangement of biological information. Even the undulating interruptions and parentheticals of Bron's thoughts [mimic] the description of the activation and silencing of genes, she suggests, tying together gene and genre in a way that encapsulates neoliberal alienation.

Choksey next explores the ways in which collectivist fantasies of biological reinvention under Soviet Lysenkoism fused code and cultivation, through a close reading of Arkady and Boris Strugatsky's Roadside Picnic (1972), in which cultivated utopian dreamworlds become contaminated by alien forces, resulting in fundamental ecological transformations beyond the promised reach of human control. The novel brings to light not forgotten Soviet utopias but literal zombies and mutations. In a world where planned cultivation fails entirely in the face of the unfamiliar, even as new biological weapons are being developed, Earth itself viscerally reflects a fractured reality of lost promises: a world in crisis with all meaning gone, and survival itself a chancy proposition.

Framed as a family history, The Immortal Life of Henrietta Lacks is actually a horror story, argues Choksey.

As the promise of precision medicine emerged, so too did new forms of memoir. In Kazuo Ishiguro's Never Let Me Go (2005) and the film Gattaca (1997), for example, the traditional aspirational narrative of a pilgrim's progress is subverted: As the unitary subject disappears into data, algorithms, and commodities, a new grammar of existence emerges, albeit one in which the inherited problems of the past (racism, ableism, and the fiction of heteronormativity) remain ever-present.

In Saidiya Hartman's Lose Your Mother (2006) and Yaa Gyasi's Homegoing (2016), Choksey sees a reorientation of genomics away from the reduction of self to code and toward new forms of kinship and belonging that offer a reckoning with the histories of brutalization and displacement upon which liberal humanism is founded. Even as genomics seeks to locate the trauma of enslavement at the level of the molecular, communities seeking reunion and reparation know that technology alone cannot do the cultural work of caring for history that narrative can offer.

Reading Rebecca Skloot's The Immortal Life of Henrietta Lacks (2010) as a biography of Black horror which tries, time and again, to resolve itself as family romance, Choksey identifies the perils of narratives unable to recognize their own genre. She argues that by blurring the lines not between fact and fiction but between horror and family history, the dehumanization of Black lives as experimental biomatter echoes inescapably with larger histories of the extraction of Black flesh for the expansion of colonial-capitalist production.

What emerges as most compelling out of this entire tapestry of readings is the author's interpretation of the limits and failures of the extraordinary cultural power of the genome. Concluding that genomics has privileged a particular conception of the human that is in the process of being reconfigured, Choksey ventures that the uncomplicated subject, the Vitruvian Man of the Human Genome Project, has reached its end. What is left is neither dust, stardust, nor a face erased in the sand (as Foucault would have it) but rather whatever might emerge next from the unwieldy kaleidoscope of possible meanings.


Genetic Analysis Reveals Evolution of the Enigmatic Y Chromosome in Great Apes – SciTechDaily

Researchers have reconstructed the ancestral sequence of the great ape Y chromosome by comparing three existing (gorilla, human, and chimpanzee) and two newly generated (orangutan and bonobo) Y chromosome assemblies. The new research shows that many gene families and multi-copy sequences were already present in the great ape Y common ancestor and that the chimpanzee and bonobo lineages experienced accelerated gene death and nucleotide substitution rates after their divergence from the human lineage. Credit: Dani Zemba and Monika Cechova, Penn State

Researchers reconstruct the ancestral great ape Y and show its rapid evolution in bonobo and chimpanzee.

New analysis of the DNA sequence of the male-specific Y chromosomes from all living species of the great ape family helps to clarify our understanding of how this enigmatic chromosome evolved. A clearer picture of the evolution of the Y chromosome is important for studying male fertility in humans as well as our understanding of reproduction patterns and the ability to track male lineages in the great apes, which can help with conservation efforts for these endangered species.

A team of biologists and computer scientists at Penn State sequenced and assembled the Y chromosome from orangutan and bonobo and compared those sequences to the existing human, chimpanzee, and gorilla Y sequences. From the comparison, the team was able to clarify patterns of evolution that seem to fit with behavioral differences between the species and reconstruct a model of what the Y chromosome might have looked like in the ancestor of all great apes.

A paper describing the research was published in the journal Proceedings of the National Academy of Sciences.

The Y chromosome is important for male fertility and contains the genes critical for sperm production, but it is often neglected in genomic studies because it is so difficult to sequence and assemble, said Monika Cechova, a graduate student at Penn State at the time of the research and co-first author of the paper. The Y chromosome contains a lot of repetitive sequences, which are challenging for DNA sequencing, assembling sequences, and aligning sequences for comparison. There aren't out-of-the-box software packages to deal with the Y chromosome, so we had to overcome these hurdles and optimize our experimental and computational protocols, which allowed us to address interesting biological questions.

The Y chromosome is unusual. It contains relatively few genes, many of which are involved in male sex determination and sperm production; large sections of repetitive DNA, short sequences repeated over and over again; and large DNA palindromes, inverted repeats that can be many thousands of letters long and read the same forwards and backwards.
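To make the palindrome idea concrete, here is a tiny illustrative Python sketch (not from the study): in the molecular sense, a sequence is palindromic when it equals its own reverse complement, so the two strands read the same in opposite directions. The Y-chromosome palindromes are vastly longer, with two near-identical arms flanking a spacer, but the core test is the same.

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Complement each base, then reverse: the opposite strand read 5'->3'."""
    return seq.upper().translate(COMPLEMENT)[::-1]

def is_dna_palindrome(seq: str) -> bool:
    """True if the sequence reads the same on both strands."""
    seq = seq.upper()
    return seq == reverse_complement(seq)

print(is_dna_palindrome("GAATTC"))   # True: a classic 6-base palindrome
print(is_dna_palindrome("GATTACA"))  # False
```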

Previous work by the team comparing human, chimpanzee, and gorilla sequences had revealed some unexpected patterns. Humans are more closely related to chimpanzees, but for some characteristics, the human Y was more similar to the gorilla Y.

If you just compare the sequence identity, comparing the As, Ts, Cs, and Gs of the chromosomes, humans are more similar to chimpanzees, as you would expect, said Kateryna Makova, Pentz Professor of Biology at Penn State and one of the leaders of the research team. But if you look at which genes are present, the types of repetitive sequences, and the shared palindromes, humans look more similar to gorillas. We needed the Y chromosome of more great ape species to tease out the details of what was going on.

The team, therefore, sequenced the Y chromosome of a bonobo, a close relative of the chimpanzee, and an orangutan, a more distantly related great ape. With these new sequences, the researchers could see that the bonobo and chimpanzee shared the unusual pattern of accelerated rates of DNA sequence change and gene loss, suggesting that this pattern emerged prior to the evolutionary split between the two species. The orangutan Y chromosome, on the other hand, which serves as an outgroup to ground the comparisons, looked about like what you expect based on its known relationship to the other great apes.

Our hypothesis is that the accelerated change that we see in chimpanzees and bonobos could be related to their mating habits, said Rahulsimham Vegesna, a graduate student at Penn State and co-first author of the paper. In chimpanzees and bonobos, one female mates with multiple males during a single cycle. This leads to what we call sperm competition, the sperm from several males trying to fertilize a single egg. We think that this situation could provide the evolutionary pressure to accelerate change on the chimpanzee and bonobo Y chromosome, compared to other apes with different mating patterns, but this hypothesis, while consistent with our findings, needs to be evaluated in subsequent studies.

In addition to teasing out some of the details of how the Y chromosome evolved in individual species, the team used the set of great ape sequences to reconstruct what the Y chromosome might have looked like in the ancestor of modern great apes.

Having the ancestral great ape Y chromosome helps us to understand how the chromosome evolved, said Vegesna. For example, we can see that many of the repetitive regions and palindromes on the Y were already present on the ancestral chromosome. This, in turn, argues for the importance of these features for the Y chromosome in all great apes and allows us to explore how they evolved in each of the separate species.

The Y chromosome is also unusual because, unlike most chromosomes, it doesn't have a matching partner. We each get two copies of chromosomes 1 through 22, and then some of us (females) get two X chromosomes and some of us (males) get one X and one Y. Partner chromosomes can exchange sections in a process called recombination, which is important for preserving chromosomes evolutionarily. Because the Y doesn't have a partner, it had been hypothesized that the long palindromic sequences on the Y might be able to recombine with themselves and thus still preserve their genes, but the mechanism was not known.

We used the data from a technique called Hi-C, which captures the three-dimensional organization of the chromosome, to try to see how this self-recombination is facilitated, said Cechova. What we found was that regions of the chromosome that recombine with each other are kept in close proximity to one another spatially by the structure of the chromosome.

Working on the Y chromosome presents a lot of challenges, said Paul Medvedev, associate professor of computer science and engineering and of biochemistry and molecular biology at Penn State and the other leader of the research team. We had to develop specialized methods and computational analyses to account for the highly repetitive nature of the sequence of the Y. This project is truly cross-disciplinary and could not have happened without the combination of computational and biological scientists that we have on our team.

Reference: "Dynamic evolution of great ape Y chromosomes" by Monika Cechova, Rahulsimham Vegesna, Marta Tomaszkiewicz, Robert S. Harris, Di Chen, Samarth Rangavittal, Paul Medvedev and Kateryna D. Makova, 5 October 2020, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2001749117

In addition to Cechova, Makova, Vegesna, and Medvedev, the research team at Penn State included Marta Tomaszkiewicz, Robert S. Harris, Di Chen, and Samarth Rangavittal. The research was supported by the U.S. National Institutes of Health, the U.S. National Science Foundation, the Clinical and Translational Sciences Institute, the Institute of Computational and Data Sciences, the Huck Institutes of the Life Sciences, and the Eberly College of Science of the Pennsylvania State University, and by the CBIOS Predoctoral Training Program awarded to Penn State by the National Institutes of Health.
