Category Archives: Human Genetic Engineering

Evolution and Bible cannot mix

By Babu G. Ranganathan

Christian leaders across denominations have compromised with Darwinian evolutionary theory, which has wreaked havoc on the faith of millions. Many Christian leaders argue that God used evolution to create all life. This position is neither biblical nor scientific, and it opens the door to utter ridicule and disrespect. If Darwinian macro-evolutionary theory is true, then any belief in God is nothing more than blind faith, because God is not necessary to the process.

The Bible teaches that God began with a perfect creation in which there was no suffering and death. There was death of plant life, but not of life possessing "soul" (i.e., animals and humans). There was even perfect peace and harmony between animals. There were no meat-eating animals in the beginning (Genesis 1:30). Animal and human death did not occur until after man sinned. There was no struggle and survival-of-the-fittest among animals or man in the beginning. Man and all creation, which was placed under man, fell to imperfection, struggle, suffering, and death because of the sin of Adam and Eve, mankind's first parents. All of this is the opposite of what Darwinian macro-evolutionary theory teaches. Unlike the Bible, the theory of evolution teaches that the world was never perfect and that the animal kingdom always existed with struggle, suffering, and death (extinction). Scripture teaches that these conditions came into being after the fall of man, not before!

You cannot mix Darwinian macro-evolution and the Bible. To say God used evolution to create man is in direct contradiction to the doctrine of the Fall of Man because the process of evolution involves struggle, pain, suffering, and death. God did not begin with these or with half-evolved fish, amphibians, reptiles, birds, mammals, etc. God created a perfect world with complete, fully-functioning, and fully-formed species from the very beginning (i.e. complete fish, complete amphibians, complete reptiles, complete birds, complete mammals, etc.). How could a partially-evolved species survive anyway? It would be unfit for survival. Survival-of-the-fittest wouldn't allow for partially-evolved species with partially-evolved tissues, organs, and biological functions and structures to survive!

More and more evolutionary scientists are abandoning the theory of gradual macro-evolution and adopting a new theory, "Punctuated Equilibrium," which teaches that life forms changed suddenly, not gradually, by chance from one kind to another as a result of massive random genetic mutations caused by massive random radiation from the environment.

The reason for this big change among evolutionists is that they realize species cannot survive in a partially evolved state with incomplete traits, tissues, organs, and body functions. A reptile with scales in the process of turning into feathers would have the function of neither trait.

The problem with punctuated equilibrium, however, is that it is contrary to what we know about the nature of mutations and radiation. Punctuated equilibrium is nothing more than blind faith.

In Genesis 1, God says 10 different times that all living things must reproduce after their own "kind," not into other kinds! A dog must reproduce a dog. Different varieties of dogs are possible genetically, but they will all still be dogs and not something else.

God placed within the "kinds" the genetic ability for variation and change to adapt to changing environments, but this is not the same as evolution from one kind into another kind as Darwinian macro-evolutionary theory teaches.

All the biological similarities between species are because of a common Designer (God) Who designed similar functions for similar purposes in all the various forms of life, not because of a common ancestry as evolutionists teach.

Here is the original post:
Evolution and Bible cannot mix

Reification is alright by me! | Gene Expression

Long-time readers know that I'm generally OK with reification, as long as we don't take it too seriously. And we do that all the time. An object is really only an object in a human sense. Reduced down to particle physics, it is an altogether different entity. But on the human scale, asserting that a chair is indeed a chair, rather than cellulose (or, further down, polymers and basic macromolecules), is a useful fudge. Similarly, I'm generally skeptical of the idea that we have a clear & distinct model for what a species is. The framework is very different when you're talking about prokaryotes, as opposed to plants, as opposed to mammals. The question is not what a species is, but what utility or instrumental value the category or class "species" has.

For most of the stuff I'm concerned with, the messy shapes of reality which are the purview of biological science, we are all fundamentally nominalists in our metaphysic. We may accept that we're idealists in the sense of cognitive or evolutionary psychology, but human intuition does not make it so. The categories and classes we construct are simply the semantic sugar which makes the reality go down easier. They should never be confused with the reality that is, the reality which we perceive but darkly and with biased lenses. The hyper-relativists and subjectivists who are moderately fashionable in some humane studies today are correct to point out that science is a human construction and endeavor. Where they go wrong is that they are often ignorant of the fact that the orderliness of many facets of nature is such that even human ignorance and stupidity can be overcome with adherence to particular methods and institutional checks and balances. The predictive power of modern science, giving rise to modern engineering, is the proof of its validity. No talk or argumentation is needed. Boot up your computer. Drive your car.

All this is to preface my explanation of my post below, "Finding Fake Roots." Some readers & commenters were a bit unclear about what I was getting at. Here, in a nutshell, is the problem as I see it: in Finding Your Roots, Henry Louis Gates Jr. shifts back and forth between uniparental phylogenies, more contemporary model-based clustering assessments of genetic relatedness and relationships, and natural history. The map above illustrates human migrations as they were commonly depicted in the mid-2000s, during the peak of uniparental-lineage studies. By uniparental, I mean the direct maternal line, as defined by mtDNA, and the direct paternal line, as defined by Y chromosomes. These two methods were easy to conceive of as a phylogenetic tree, because that's what these two genomic regions are. There's no recombination, no mixing & matching between parents. You get your Y from your paternal lineage (if you are male), and your mtDNA from your mother. These neat phylogenetic trees of mtDNA and Y-chromosomal lineages were then easily transposed to a spatial and temporal scale. Ergo, nice cartographic infographics like the one above.
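The contrast between a uniparental line and total genealogical ancestry is easy to make concrete. Below is a minimal sketch of my own (not from the original post) of why mtDNA and the Y chromosome each trace exactly one path through a pedigree, while autosomal DNA can, in principle, arrive from any member of an exponentially growing set of ancestors.

def count_lineages(generations: int) -> tuple[int, int]:
    """Return (uniparental lines, maximum distinct ancestors) g generations back."""
    uniparental = 1               # mtDNA or Y: exactly one ancestor per generation
    autosomal = 2 ** generations  # autosomal DNA can come from any ancestor
    return uniparental, autosomal

for g in (1, 5, 10):
    uni, auto = count_lineages(g)
    print(f"{g:>2} generations back: {uni} uniparental line, up to {auto} ancestors")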

But all good things come to an end. With SNP chips, which allow researchers to type individuals across hundreds of thousands of markers, we are now at the stage where we can move beyond the direct maternal and paternal lines. Rather than just one line of descent, we get a full picture of one's ancestors. To the left are my daughter's results from 23andMe. As I have noted before, I'm 99% sure there's a problem with their results for people with particular mixed ancestries (my half East Asian/half European friends get good results, so I assume it's a reference-population problem). She is about 40 percent South Asian on ADMIXTURE, and that makes sense, since I'm 80-90 percent South Asian (the balance is East Asian, which is not surprising given my family's origin). But let's put that to the side. What are these results telling us? These algorithms are powerful. But we always need to be very careful about imposing our own frame upon them. They are excellent at gauging genetic relatedness, but they are not so excellent at telling us our history.
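The "makes sense" above is just expectation arithmetic: a child's expected autosomal ancestry fraction is roughly the mean of the parents' fractions. Here is a back-of-the-envelope sketch; the mother's 0% South Asian fraction is my assumption for illustration, not a figure from the post.

def expected_child_fraction(parent_a: float, parent_b: float) -> float:
    """Expected autosomal ancestry fraction of a child given both parents."""
    return (parent_a + parent_b) / 2.0

father_south_asian = (0.80, 0.90)  # the 80-90 percent range stated above
mother_south_asian = 0.0           # assumed for illustration only

lo = expected_child_fraction(father_south_asian[0], mother_south_asian)
hi = expected_child_fraction(father_south_asian[1], mother_south_asian)
print(f"expected South Asian fraction in the child: {lo:.0%}-{hi:.0%}")
# prints 40%-45%, consistent with the ~40 percent ADMIXTURE estimate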

To the left is my best guess at my history. I'm 80-90% South Asian. That's pretty clear, and it will show up on any clustering algorithm with explicit models. It will also be clear on a PCA, where South Asians can shake out as their own independent dimension. But 10,000 years ago, South Asians as we understand them genetically probably did not exist. By "probably," I mean I'm about 95% sure. The genetic analyses which support this proposition are abstruse, but they make sense of a great deal of other data. The South Asian genetic cluster is a real one, but it is a compound, formed within the last 10,000 years, of two very distinct populations. Just because it is a hybrid does not mean that one should automatically reduce it to its antecedents; many real populations originate as hybrids. Similarly, I think there is a high chance that Europeans, as we understand them today, did not exist 10,000 years ago. Rather, modern Europeans may well be a compound of various populations which expanded demographically, and synthesized with each other, between the end of the Ice Age and history.
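For readers who have not seen how population structure "shakes out" of genotype data, here is a minimal, self-contained sketch using simulated allele frequencies (placeholders, not real data). Real analyses use tools like ADMIXTURE or EIGENSOFT, but the core linear algebra is essentially this.

import numpy as np

rng = np.random.default_rng(0)
n_per_pop, n_snps = 50, 500

# Two populations whose allele frequencies have drifted apart at each SNP.
freq_a = rng.uniform(0.1, 0.9, n_snps)
freq_b = np.clip(freq_a + rng.normal(0.0, 0.2, n_snps), 0.01, 0.99)

# Genotypes are 0/1/2 counts of an allele: two binomial draws per SNP.
geno = np.vstack([
    rng.binomial(2, freq_a, (n_per_pop, n_snps)),
    rng.binomial(2, freq_b, (n_per_pop, n_snps)),
]).astype(float)

# Center each SNP column, then project samples onto the top principal component.
geno -= geno.mean(axis=0)
_, _, vt = np.linalg.svd(geno, full_matrices=False)
pc1 = geno @ vt[0]

print("mean PC1, population A:", pc1[:n_per_pop].mean().round(2))
print("mean PC1, population B:", pc1[n_per_pop:].mean().round(2))
# The two simulated populations separate cleanly along PC1.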

In my post below, some commenters argued that obviously implausible inferences from a thin set of reference populations are acceptable considering Henry Louis Gates Jr.'s target audience. But that really wasn't my main point. Rather, it was that he was eliding the distinction between uniparental markers and the clusters generated by model-based ancestry-assignment algorithms, ascribing the phylogenies of the former to the latter. It is important to note that categories like "Europeans" are only approximations. But they're damn good approximations today! Nevertheless, note the qualification of time: they may have had basically no meaning at some point in the recent past. They're powerful when it comes to precisely partitioning modern variation, but they don't tell us the history of that variation.

When constructs lead us to a false perception of reality, we're using them incorrectly. We shouldn't blame the abstractions. Rather, we should blame humans. Ourselves.

Read the original here:
Reification is alright by me! | Gene Expression

Robot reveals the inner workings of brain cells: Automated way to record electrical activity inside neurons in the …

ScienceDaily (May 6, 2012) -- Gaining access to the inner workings of a neuron in the living brain offers a wealth of useful information: its patterns of electrical activity, its shape, even a profile of which genes are turned on at a given moment. However, achieving this entry is such a painstaking task that it is considered an art form; it is so difficult to learn that only a small number of labs in the world practice it.

But that could soon change: Researchers at MIT and the Georgia Institute of Technology have developed a way to automate the process of finding and recording information from neurons in the living brain. The researchers have shown that a robotic arm guided by a cell-detecting computer algorithm can identify and record from neurons in the living mouse brain with better accuracy and speed than a human experimenter.

The new automated process eliminates the need for months of training and provides long-sought information about living cells' activities. Using this technique, scientists could classify the thousands of different types of cells in the brain, map how they connect to each other, and figure out how diseased cells differ from normal cells.

The project is a collaboration between the labs of Ed Boyden, associate professor of biological engineering and brain and cognitive sciences at MIT, and Craig Forest, an assistant professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech.

"Our team has been interdisciplinary from the beginning, and this has enabled us to bring the principles of precision machine design to bear upon the study of the living brain," Forest says. His graduate student, Suhasa Kodandaramaiah, spent the past two years as a visiting student at MIT, and is the lead author of the study, which appears in the May 6 issue of Nature Methods.

The method could be particularly useful in studying brain disorders such as schizophrenia, Parkinson's disease, autism and epilepsy, Boyden says. "In all these cases, a molecular description of a cell that is integrated with [its] electrical and circuit properties has remained elusive," says Boyden, who is a member of MIT's Media Lab and McGovern Institute for Brain Research. "If we could really describe how diseases change molecules in specific cells within the living brain, it might enable better drug targets to be found."

Kodandaramaiah, Boyden and Forest set out to automate a 30-year-old technique known as whole-cell patch clamping, which involves bringing a tiny hollow glass pipette in contact with the cell membrane of a neuron, then opening up a small pore in the membrane to record the electrical activity within the cell. This skill usually takes a graduate student or postdoc several months to learn.

Kodandaramaiah spent about four months learning the manual patch-clamp technique, giving him an appreciation for its difficulty. "When I got reasonably good at it, I could sense that even though it is an art form, it can be reduced to a set of stereotyped tasks and decisions that could be executed by a robot," he says.

To that end, Kodandaramaiah and his colleagues built a robotic arm that lowers a glass pipette into the brain of an anesthetized mouse with micrometer accuracy. As it moves, the pipette monitors a property called electrical impedance -- a measure of how difficult it is for electricity to flow out of the pipette. If there are no cells around, electricity flows and impedance is low. When the tip hits a cell, electricity can't flow as well and impedance goes up.

The pipette takes two-micrometer steps, measuring impedance 10 times per second. Once it detects a cell, it can stop instantly, preventing it from poking through the membrane. "This is something a robot can do that a human can't," Boyden says.
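As described, the cell-detection step is a threshold test on a slowly advancing impedance time series, which makes it easy to sketch. The following is a hedged illustration of that loop, not the authors' actual control code; the read_impedance() and step_down() hardware calls and the 15% threshold are hypothetical stand-ins, while the step size and sampling rate come from the article.

import time

STEP_UM = 2.0          # step size in micrometers (from the article)
SAMPLE_HZ = 10         # impedance measurements per second (from the article)
RISE_THRESHOLD = 1.15  # assumption: a 15% rise over baseline flags a cell

def seek_neuron(read_impedance, step_down, max_steps=1000):
    """Lower the pipette until impedance rises above baseline, then halt."""
    baseline = read_impedance()
    for _ in range(max_steps):
        step_down(STEP_UM)                 # advance one two-micrometer step
        time.sleep(1.0 / SAMPLE_HZ)
        z = read_impedance()
        if z > baseline * RISE_THRESHOLD:
            return True                    # probable cell contact: stop before puncturing
        baseline = 0.9 * baseline + 0.1 * z  # track slow drift in the open-pipette reading
    return False                           # no cell found within max_steps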

The rest is here:
Robot reveals the inner workings of brain cells: Automated way to record electrical activity inside neurons in the ...

H5N1 Paper Published: Deadly, Transmissible Bird Flu Closer than Thought?

After an epic debate over whether to release research detailing how scientists created mutant strains of H5N1 in the lab, Nature finally published one of the two controversial papers on Wednesday.

You might not have noticed, but the influenza world has been in a bit of an uproar since late last year, when news leaked out that two teams of researchers had purposefully tweaked H5N1 bird flu in the lab to potentially make it more transmissible among human beings. (H5N1 spreads like wildfire among birds and usually kills them, but the virus only rarely seems to jump to human beings, though when it does the infections are often fatal.)

The two scientists, Yoshihiro Kawaoka of the University of Wisconsin-Madison and TIME 100 honoree Ron Fouchier of Erasmus University in the Netherlands, had submitted their research to Nature and Science, respectively, with the expectation of swift publication. In December, the National Science Advisory Board for Biosecurity (NSABB) did something unprecedented: it ruled that the two papers should be censored if published, scrubbed of the complete methods and the viral mutations the researchers studied, in order to head off the risk that terror groups could use the information to craft a deadly bioweapon.

That led to intense fighting within the scientific community. Some researchers wanted the papers published in full, both because they believed the work could help arm us against a future flu pandemic, and because they worried about the chill of government censorship on science. Other scientists were against publication and even the experiments themselves, believing that nothing gleaned from the work could be important enough to offset the risk of creating a potentially deadly flu virus.

In the end, Fouchier explained that his man-made flu virus wasn't the merciless killer that early media reports had made it out to be (Kawaoka's man-made virus was always believed to be less dangerous), and in March the NSABB took a look at revised papers submitted by the two research teams and voted to recommend that they be published.

On Wednesday, Nature finally published Kawaoka's research. (We're still waiting for the Fouchier paper, though the Dutch scientist was recently granted an export license for his work, so it should appear soon.) The sobering takeaway: avian H5N1 flu viruses in nature may be only one mutation away from spreading effectively between mammals, likely including human beings. If that happens, and if H5N1 retains its apparently sky-high mortality rate, we could be in for serious trouble.

For all the controversy, the research itself is actually quite fascinating. Kawaoka and his team mutated H5N1's hemagglutinin (HA) gene, the "H" in H5N1, which produces the protein the virus needs to attach itself to host cells. They produced millions of mutant genes, mimicking the effect of random mutation in nature, and found one version of H5N1 hemagglutinin that seems particularly effective at invading human cells.

The genes for that protein contained four new mutations, three of which altered the shape of the resulting protein, while the fourth changed the pH level at which the protein attaches to the cell and injects the virus's genetic material inside. (It's a bit reminiscent of Alien, if the virus is the face-hugger and this poor guy's face is the cell.) The team combined the mutated HA gene with seven other genes (flu viruses have eight genes in all) from the highly transmissible, if not highly deadly, H1N1 strain, which caused the 2009 flu pandemic. The result was an H1N1 virus with mutant H5N1 hemagglutinin proteins on the outside.

Go here to read the rest:
H5N1 Paper Published: Deadly, Transmissible Bird Flu Closer than Thought?

Penn scientists develop large-scale simulation of human blood

Public release date: 1-May-2012

Contact: Evan Lerner elerner@upenn.edu 215-573-6604 University of Pennsylvania

PHILADELPHIA -- Having a virtual copy of a patient's blood in a computer would be a boon to researchers and doctors. They could examine a simulated heart attack caused by blood clotting in a diseased coronary artery and see whether a drug like aspirin would be effective in reducing the size of such a clot.

Now, a team of biomedical engineers and hematologists at the University of Pennsylvania has made large-scale, patient-specific simulations of blood function under the flow conditions found in blood vessels, using robots to run hundreds of tests on human platelets responding to combinations of activating agents that cause clotting.

Their work was published in the journal Blood.

Patient-specific information on how platelets form blood clots can be a vital part of care. Normally, clots prevent bleeding, but they can also cause heart attacks when they form in plaque-laden coronary arteries. Several drugs, including aspirin, are used to reduce the size of such clots and prevent heart attacks, but, as platelets differ from person to person, the efficacy of such drugs differs as well.

"Blood platelets are like computers in that they integrate many signals and make a complex decision of what to do," said senior author Scott Diamond, professor of chemical and biomolecular engineering in the School of Engineering and Applied Science. "We were interested to learn if we could make enough measurements in the lab to detect the small differences that make each of us unique. It would be impossible to do this with the cells of the liver, heart or brain. But we can easily obtain a tube of blood from each donor and run tests of platelet calcium release."

When blood platelets are exposed to the conditions of a cut or, in a more dangerous situation, a ruptured atherosclerotic plaque, they respond by elevating their internal calcium, which causes release of two chemicals, thromboxane and ADP. These two activating agents further enhance calcium levels and are the targets of common anti-platelet drugs such as aspirin or clopidogrel, also known as Plavix. By preventing platelets from increasing their calcium levels, these drugs make them less able to stick together and block blood vessels, decreasing the likelihood of a heart attack.
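That feedback loop (calcium drives agonist release; agonists drive calcium) is simple enough to caricature in a few lines. Below is a toy simulation of my own, not the Penn team's model; every rate constant is made up, and drug_block crudely stands in for how aspirin or clopidogrel damp the agonist arm of the loop.

def simulate(drug_block=0.0, steps=200, dt=0.1):
    """Integrate a two-variable caricature of the platelet feedback loop."""
    ca, agonist = 0.1, 0.0                       # calcium and agonist levels (arbitrary units)
    for _ in range(steps):
        d_ca = 0.5 * agonist - 0.2 * ca + 0.05   # agonists raise calcium; calcium decays
        d_ag = (1.0 - drug_block) * 0.4 * ca - 0.3 * agonist  # calcium drives agonist release
        ca += dt * d_ca
        agonist += dt * d_ag
    return ca

print("final calcium, untreated:       ", round(simulate(0.0), 2))  # runaway feedback
print("final calcium, feedback blocked:", round(simulate(0.9), 2))  # settles near baseline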

Since blood is a liquid, the liquid-handling robots originally developed for drug-screening tests were ideal for testing platelet function.

"We used a technique developed in our lab called 'pairwise agonist scanning' on platelets from three different donors to generate a massive data set of how their cells responded to all different pairs of these activating agents," Diamond said. "Then we trained neural network models for each donor based on this data to simulate how each and every cell in a blood clot is responding."

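As a rough illustration of the pipeline the quote describes (pairwise dose combinations in, per-donor response model out), here is a hedged sketch. The synthetic response surface, the network size, and all doses are placeholders of mine; the paper's actual models were trained on measured calcium-release data from real donors.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_samples = 500

# Doses of two agonists (think ADP and a thromboxane analog), scaled to [0, 1].
doses = rng.uniform(0.0, 1.0, (n_samples, 2))

# Stand-in "donor" calcium response: additive effects plus a synergy term.
response = (doses[:, 0] + doses[:, 1]
            + 2.0 * doses[:, 0] * doses[:, 1]
            + rng.normal(0.0, 0.05, n_samples))

# One model per donor; here, a single small network fit to the synthetic donor.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(doses, response)

# Predict the response to a dose pair that was never measured directly.
print(model.predict([[0.3, 0.7]]).round(2))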
Read more:
Penn scientists develop large-scale simulation of human blood
