Tuesday, January 22, 2019



What you are born with matters a lot more than your education

It's true and has long been known, but it is a bit surprising coming from UCSD.  They avoided mentioning that what they were studying was IQ, however.

Summary:

Youthful cognitive ability strongly predicts mental capacity later in life.

Early adult general cognitive ability [IQ] is a stronger predictor of cognitive function and reserve later in life than other factors, such as higher education, occupational complexity or engaging in late-life intellectual activities.
    
FULL STORY

Early adult general cognitive ability (GCA) -- the diverse set of skills involved in thinking, such as reasoning, memory and perception -- is a stronger predictor of cognitive function and reserve later in life than other factors, such as higher education, occupational complexity or engaging in late-life intellectual activities, report researchers in a new study publishing January 21 in PNAS.

Higher education and late-life intellectual activities, such as doing puzzles, reading or socializing, have all been associated with reduced risk of dementia and sustained or improved cognitive reserve. Cognitive reserve is the brain's ability to improvise and find alternate ways of getting a job done and may help people compensate for other changes associated with aging.

An international team of scientists, led by scientists at University of California San Diego School of Medicine, sought to address a "chicken or egg" conundrum posed by these associations. Does being in a more complex job help maintain cognitive abilities, for example, or do people with greater cognitive abilities tend to be in more complex occupations?

The researchers evaluated more than 1,000 men participating in the Vietnam Era Twin Study of Aging. Although all were veterans, nearly 80 percent of the participants reported no combat experience. All of the men, now in their mid-50s to mid-60s, took the Armed Forces Qualification Test at an average age of 20. The test is a measure of GCA. As part of the study, researchers assessed participants' performance in late midlife, using the same GCA measure, plus assessments in seven cognitive domains, such as memory, abstract reasoning and verbal fluency.

They found that GCA at age 20 accounted for 40 percent of the variance in the same measure at age 62, and approximately 10 percent of the variance in each of the seven cognitive domains. After accounting for GCA at age 20, the authors concluded, other factors had little effect. For example, lifetime education, complexity of job and engagement in intellectual activities each accounted for less than 1 percent of variance at average age 62.
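For readers who want a concrete sense of what "accounted for 40 percent of the variance" means, here is a minimal sketch (my own illustration, not the authors' actual statistical model): with a single predictor, the proportion of variance explained is simply the squared correlation, so 40 percent corresponds to a correlation of roughly 0.63.

# A minimal sketch, not the authors' model: with one predictor,
# "variance accounted for" is the squared correlation (R squared).
r_age20_age62 = 0.63                       # hypothetical correlation; 0.63 squared is about 0.40
variance_explained = r_age20_age62 ** 2
print(f"Variance in age-62 GCA explained by age-20 GCA: {variance_explained:.0%}")   # ~40%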

"The findings suggest that the impact of education, occupational complexity and engagement in cognitive activities on later life cognitive function likely reflects reverse causation," said first author William S. Kremen, PhD, professor in the Department of Psychiatry at UC San Diego School of Medicine. "In other words, they are largely downstream effects of young adult intellectual capacity."

In support of that idea, researchers found that age 20 GCA, but not education, correlated with the surface area of the cerebral cortex at age 62. The cerebral cortex is the thin, outer region of the brain (gray matter) responsible for thinking, perceiving, producing and understanding language.

The authors emphasized that education is clearly of great value and can enhance a person's overall cognitive ability and life outcomes. Comparing their findings with other research, they speculated that the role of education in increasing GCA takes place primarily during childhood and adolescence when there is still substantial brain development.

However, they said that by early adulthood, education's effect on GCA appears to level off, though it continues to produce other beneficial effects, such as broadening knowledge and expertise.

Kremen said remaining cognitively active in later life is beneficial, but "our findings suggest we should look at this from a lifespan perspective. Enhancing cognitive reserve and reducing later life cognitive decline may really need to begin with more access to quality childhood and adolescent education."

The researchers said additional investigations would be needed to fully confirm their inferences, such as a single study with cognitive testing at different times throughout childhood and adolescence.

SOURCE 


Thursday, January 17, 2019



Are our life chances determined by our DNA?

In less than two decades, the bid to read the human genome has shrunk from billion-dollar space-race project to cheap parlour game. In 2000, President Bill Clinton and Tony Blair, then UK prime minister, jointly announced that scientists had elucidated the three billion letters of the human genome — or discovered “the language in which God created life”, as the US president solemnly phrased it.

In 2018, prompted by opposition goading, the Democratic US senator Elizabeth Warren took a consumer DNA test to prove a strain of Cherokee ancestry. Flashing a sliver of exotic bloodline for political advantage turned out to be a calamitous misjudgment: her actions upset Native Americans, who regard identity and culture as more than a matter of DNA.

Science too is engaged in the same enterprise: to reduce the complexity of human identity to genetics. While we have long known that genes build our bodies — determining eye and hair colour, influencing height and body shape — there is a growing conviction that genes also sculpt the mind. As the cost of gene-sequencing technology has plunged to a few hundred dollars, millions of people have had their DNA sliced and diced by scientists seeking to quantify the genetic contribution to personality, intelligence, behaviour and mental illness.

This is the dark and difficult territory explored by three important books that embody a new zeitgeist of genetic determinism. If DNA builds the brain and mind — the puppetmasters pulling our behavioural strings — then selfhood becomes circumscribed largely by our genes. The idea that we are little more than machines driven by our biology raises a profound conundrum: if the genes we inherit at conception shape personality, behaviour, mental health and intellectual achievement, where is the space for society and social policy — even parents — to make a difference? What of free will?

As might be guessed from its klaxon of a title, Blueprint is unequivocal in stating the supremacy of the genome. “Genetics is the most important factor shaping who we are,” opens Robert Plomin, a behavioural geneticist at King’s College London recognised globally (and reviled by some) for his research into the genetics of intelligence. “It explains more of the psychological differences between us than everything else put together,” he writes, adding that “the most important environmental factors, such as our families and schools, account for less than 5 per cent of the differences between us in our mental health or how well we did at school.”

For decades, Professor Plomin has been using twin and adoption studies to tease out the relative effects of genes and environment. Identical twins share 100 per cent of their DNA; in non-identical twins this drops to 50 per cent (the same genetic overlap as regular siblings). Adopted children share a home environment, but no DNA, with their adoptive parents; and 50 per cent of their DNA, but no home environment, with each of their biological parents.

A careful study of these permutations can point to the “heritability” of various characteristics and psychological traits. Body weight, for example, shows a heritability of about 70 per cent: thus 70 per cent of the differences in weight between people can be attributed to differences in their DNA. Identical twins tend to be more similar than non-identical, fraternal twins; adopted children are more like their biological parents than their adoptive parents.
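As a rough illustration of how such twin comparisons translate into a heritability figure, the classic back-of-the-envelope method is Falconer's formula, which doubles the difference between the identical-twin and fraternal-twin correlations. The sketch below uses hypothetical correlations, not figures from Plomin's own studies, purely to show the arithmetic.

# Falconer's formula: h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are the within-pair
# trait correlations for identical (monozygotic) and fraternal (dizygotic) twins.
def falconer_heritability(r_mz, r_dz):
    return 2 * (r_mz - r_dz)

# Hypothetical twin correlations roughly consistent with ~70% heritability for body weight
print(round(falconer_heritability(r_mz=0.85, r_dz=0.50), 2))   # 0.7, i.e. about 70 per cent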

Breast cancer, widely thought of as a genetic disease, shows a heritability of only 10 per cent. In contrast, it is 50 per cent for schizophrenia; 50 per cent for general intelligence (reasoning); and 60 per cent for school achievement. Last year Plomin claimed that children with high “polygenic scores” for educational achievement — showing a constellation of genetic variants known to be associated with academic success — gained good GCSE grades regardless of whether they went to non-selective or selective schools. His conclusion was that genes matter pretty much above all else when it comes to exam grades.
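For readers unfamiliar with the term, a "polygenic score" is essentially a weighted sum over many genetic variants: each variant's allele count is multiplied by its estimated effect on the trait, and the products are added up. The sketch below uses made-up numbers for just three variants to show the arithmetic; real scores for educational achievement aggregate thousands of variants.

# A polygenic score as a weighted sum over variants:
# (allele count of 0, 1 or 2 copies) x (estimated effect size), summed across variants.
def polygenic_score(allele_counts, effect_sizes):
    return sum(count * beta for count, beta in zip(allele_counts, effect_sizes))

# Hypothetical allele counts and effect sizes for three variants only
print(polygenic_score([2, 0, 1], [0.01, -0.02, 0.015]))   # ~0.035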

Even the home, the very definition of “environment”, is subject to genetic influence, he says. If kids in book-filled homes exhibit high IQs, it is because high-IQ parents tend to create book-filled homes. The parents are passing on their intelligence to their children via their genes, not their libraries: “The shocking and profound revelation . . . is that parents have little systematic effect on their children’s outcomes, beyond the blueprint that their genes provide.” His conclusion is that “parents matter, but they don’t make a difference”.

That is not the only seemingly contradictory message. Plomin describes DNA as a “fortune-teller” while simultaneously emphasising that “genetics describes what is — it does not predict what could be”. This caveat is odd, given his later enthusiasm for using genetic testing predictively in almost every aspect of life: in health, education, choosing a job and even attracting a spouse. He suggests, for example, that we could use polygenic scores for schizophrenia “to identify problems on the basis of causes rather than symptoms”.

This vision sounds worryingly like pre-medicalisation. Plomin proclaims himself a cheerleader for such implications but is disappointingly light on the ethical issues. A predisposition might never manifest as a symptom — and besides, “possible schizophrenic” is not the kind of descriptor I would want to carry around from birth.

Plomin admits that cowardice stopped him writing such a book before now; it probably also stopped him from addressing alleged racial differences in intelligence. This is a grave omission, as he is one of the few academics capable of authoritatively quashing the notion. James Watson, the 90-year-old DNA pioneer, recently restated his belief that blacks are cognitively inferior to whites. Those, like Plomin, responsible for fuelling the resurgence in genetic determinism have a responsibility to speak out — and early — against those who misuse science to sow division. (Plomin is writing an afterword for future editions.)

Neuroscientist Kevin Mitchell believes that genes conspire with a hidden factor — brain development — to shape psychology and behaviour. Neural development, he contends persuasively in his book Innate, adds random variation to the unfurling of the genetic blueprint, ensuring individuality, even among identical twins. These special siblings, though clones, rarely score identically for psychological traits. Genes are the ingredients but a lot depends on the oven: “You can’t bake the same cake twice.”

Mitchell, associate professor of neuroscience at Trinity College Dublin, explains: “It is mainly genetic variation affecting brain development that underlies innate differences in psychological traits. We are different from each other in large part because of the way our brains get wired before we are born.” Genetic relatives have brains that are wired alike. Thus, we should look to the cranium, not only to chromosomes, to learn how minds are shaped.

Indeed, each of us is a miniature study in how a genetic blueprint can quiver under the influence of random variation, like a pencil tracing that does not conform exactly to the original outline. The genes directing the development of each side of your body are identical — but you are still slightly asymmetrical (put a mirror down the middle of a mugshot and see how weird you look with perfect symmetry). Fascinatingly, identical twins do not always show the same handedness, despite shared DNA and upbringing.

What goes on in that oven, or the brain, cannot be described as environmental — the catch-all term for non-genetic factors — because it is intrinsic to the individual rather than shared. Mitchell labels it the “non-shared environment”, a crucial but overlooked component of innate traits. Once this factor is folded in, “many traits are even more innate than heritability estimates alone would suggest”.

This, he insists, does not close the door to free will and autonomy. Genes plus neural development pre-programme a path of possible action, not the action itself: “We still have free will, just not in the sense that we can choose to do any old random thing at any moment . . . when we do make deliberative decisions, it is between a limited set of options that our brain suggests.” Having free will, he adds, does not mean doing things for no reason, but “doing them for your reasons.” Those include wanting to conform to social and familial norms; unlike Plomin, Mitchell recognises the reality that societies and families can and do make a difference.

While both discuss heritable conditions such as autism and schizophrenia in terms of defective genes, Randolph Nesse turns this thinking on its head. In Good Reasons for Bad Feelings, he asks: why do such disorders persist in the human population, given that natural selection tends to weed out “bad” genes?

Mental illness and psychological ill-health, he theorises, could be the collateral damage caused by the selection, over evolutionary time, of thousands of genes for survival and fitness. Autism, for example, has a well-documented genetic overlap with higher cognitive ability: some biologists now regard autism as a disorder of high intelligence. Once, only the clever survived.

Nesse, who runs the Centre for Evolution and Medicine at Arizona State University, can also explain why life offers mental torment in abundance: “Natural selection does not give a fig about our happiness. In the calculus of evolution, only reproductive success matters.”

Charles Darwin was one of the first to see the similarity in facial expressions between humans and other animals: these hint at a shared evolutionary heritage when it comes to emotions. Jealousy and fear, for example, are thought to promote genetic survival: a jealous man who controls his partner is more likely to end up raising his own genetic offspring, according to the evolutionary scientist David Buss; fear makes us cautious and keeps us alive.

These are indeed good reasons for bad feelings. But extreme jealousy can lead to murder; extreme fear can become debilitating phobia. Panic attacks — an exceedingly common experience — mirror the fight-or-flight response. Anxiety, meanwhile, works on the smoke detector principle: “a useful response that often goes overboard”.

Nesse’s book offers fresh thinking in a field that has come to feel stagnant, even if new therapeutic avenues are not immediately obvious. The prevailing orthodoxy that each mental disorder must have its own distinct cause, possibly correctable through chemicals, has not been wholly successful over the decades. Biologists have also failed to uncover tidy genetic origins for heritable conditions such as schizophrenia and autism, instead finding the risk sprinkled across thousands of genes. Recasting our psychiatric and psychological shortcomings as the unintended sprawling by-products of evolution seems a useful way of understanding why our minds malfunction in the multiple, messy ways that they do. The UK’s Royal College of Psychiatrists thinks so: it recently set up a special interest group on evolutionary psychiatry.

Given that natural selection is blind to organisms being happy, sad, manic or depressed, Nesse notes that things could have turned out worse: “Instead of being appalled at life’s suffering, we should be astounded and awed by the miracle of mental health for so many.”

SOURCE  

Wednesday, January 16, 2019


'Father of DNA' James Watson Stripped of Honors Over More IQ Comments

The story below shows the incredible power of America's racism hysteria. Its counter-factual beliefs must not be disputed.  Black IQ really is the third rail of political commentary in America. The reality is just too disturbing to face.

Note that NO evidence is mentioned to dispute Watson's claims -- for the excellent reason that Watson's comments are a good summary of the available evidence on the question.  Even the APA has acknowledged a large and persistent gap (one SD) between average black and white IQ and it would itself be floridly racist to say that what is genetic in whites is not genetic in blacks.



The acclaimed Nobel Prize-winning scientist James Watson will be forever remembered as one of the 'fathers of DNA'. But also as something much worse.

In a resurfaced controversy that further dims the shine of one of the 20th century's most esteemed scientists, Watson – awarded the Nobel in 1962 for his role in the discovery of DNA's 'double helix' molecular structure – has been stripped of academic titles after repeating offensive racist views that began to shred his reputation over a decade ago.

After new racist comments by Watson surfaced in the recent PBS documentary American Masters: Decoding Watson, Cold Spring Harbor Laboratory (CSHL) – the pioneering research lab Watson led for decades – had finally had enough.

"Cold Spring Harbor Laboratory unequivocally rejects the unsubstantiated and reckless personal opinions," CSHL said in statement.

"Dr. Watson's statements are reprehensible, unsupported by science, and in no way represent the views of CSHL… The Laboratory condemns the misuse of science to justify prejudice."

In the new documentary, Watson states: "There's a difference on the average between blacks and whites on IQ tests. I would say the difference is, it's genetic."

It's not the first time Watson has come under fire for stating these kinds of beliefs.

In 2007, Watson created a furore after he was quoted as saying he was "inherently gloomy about the prospect of Africa" because "all our social policies are based on the fact that their intelligence is the same as ours – whereas all the testing says not really".

In the same article by The Times, Watson acknowledged such views were a "hot potato", but said that while he hoped that everyone was equal, "people who have to deal with black employees find this not true".

Watson later apologised for the comments, but the damage was done.

CSHL relieved him of all remaining administrative duties at the lab, leaving him only as an honorary figurehead in respect of his previous contributions to science. Now, those last accolades are also gone.

"In response to his most recent statements, which effectively reverse the written apology and retraction Dr. Watson made in 2007, the Laboratory has taken additional steps, including revoking his honorary titles of Chancellor Emeritus, Oliver R. Grace Professor Emeritus, and Honorary Trustee," the CSHL statement reads.

It's an indisputably inglorious end for one of the most glorious career arcs in 20th century science.

While the lesser-known story of Rosalind Franklin's unrecognised contributions to Watson and Francis Crick's famous DNA research is a telling reminder of the struggles women still face to be recognised in science, nobody denies the landmark contributions Watson himself made.

But, sadly, these famous accomplishments – which helped usher in a whole new era of knowledge in molecular biology and genetics – will now forever be linked with the offensive opinions of an old man in decline.

And an old man who, some say, should not be asked such questions any more.

"It is not news when a ninety-year-old man who has lost cognitive inhibition, and has drifted that way for decades as he aged, speaks from his present mind," CSHL Michael Wigler told The New York Times.

"It is not a moment for reflection. It is merely a peek into a corner of this nation's subconscious, and a strong whiff of its not-well-shrouded past secrets."

The last time Watson's racism created such controversy, the scientist ended up selling his Nobel Prize – citing financial issues from the resulting fallout that had rendered him an "unperson".

The buyer actually returned the Prize to Watson as a gesture of respect – but as time and the world move on, the ageing scientist may find himself running out of such good will.

As for what we can ultimately make of the scientist's legacy, given the ugly shadow that now hangs over his earlier wins, helpful advice may come from a 2014 op-ed in The Guardian written about Watson.

"Celebrate science when it is great, and scientists when they deserve it," geneticist Adam Rutherford wrote.

"And when they turn out to be awful bigots, let's be honest about that too. It turns out that just like DNA, people are messy, complex and sometimes full of hideous errors."

SOURCE  


Friday, January 4, 2019




For James Watson, the Price Was Exile. At 90, the Nobel winner still thinks that black people are born intellectually inferior to whites

The NYT article below shows how powerful political correctness can be. James Watson has been severely sanctioned for saying in public little more than what most psychometricians are agreed on -- that the average black IQ is much lower than white IQ and that the difference is persistent -- nothing seems able to change it. The American Psychological Association is generally Left-leaning but it is the world's most prestigious body of academic psychologists. And even they (under the chairmanship of Ulric Neisser) have had to concede a large and persistent gap in black vs. white average IQ.

It is true that very few psychometricians will attribute the persistence of the black/white gap to genetics.  It would be career death for them if they did, as it was for Watson.  Yet they cheerfully attribute differences between white individuals to genetics.  There is powerful evidence of that. So why is a particular group difference not also genetic?  Groups are made up of individuals and group scores are the sum of individual scores. 

The only way out of that inference would be to say that blacks are a different species, or at least fundamentally different genetically -- that something produced by genes in whites is not produced by genes in blacks. Yet that denies the humanity of blacks.  It is saying that their brains are different in how they function.  That, it seems to me, is REALLY racist.  It is an attempt to deny racial differences that ends up proclaiming racial differences.  If we respect the humanity of blacks we have to say that the causes of IQ variation are the same in blacks and whites.  You have to say that the black/white gap is persistent because it is genetic.

But we can go beyond that.  The question is really the validity of IQ scores among blacks.  Do they measure what we think they measure?  Do they measure the same things that they measure among whites?  And the answer is very clear.  From their average IQ score we would expect blacks to be at the bottom of every heap where anything intellectual is remotely involved.  We would expect them to be economically unsuccessful (poor), mired in crime and barely educable.  And they are.  The tests are valid among blacks.

The education situation is particularly clear.  The large gap between black and white educational attainment has been loudly bewailed by all concerned for many years.  Leftist educators have turned themselves inside out trying to change it.  But nothing does.  It persists virtually unchanged year after year. It alone is graphic testimony to inborn lesser black intellectual competence.  No talk of IQ is really needed.

But it is exactly what we would predict from black IQ scores.  It is a large gap that mirrors a large IQ gap. It is exactly what we would expect from the black difference being a genetic given.  IQ in blacks works the same way as it does in whites.  So if it is genetically determined in whites it must be genetically determined among blacks.  Some whites are born dumb.  Many blacks are born dumb.



It has been more than a decade since James D. Watson, a founder of modern genetics, landed in a kind of professional exile by suggesting that black people are intrinsically less intelligent than whites.

In 2007, Dr. Watson, who shared a 1962 Nobel Prize for describing the double-helix structure of DNA, told a British journalist that he was “inherently gloomy about the prospect of Africa” because “all our social policies are based on the fact that their intelligence is the same as ours, whereas all the testing says, not really.”

Moreover, he added, although he wished everyone were equal, “people who have to deal with black employees find this not true.”

Dr. Watson’s comments reverberated around the world, and he was forced to retire from his job as chancellor of the Cold Spring Harbor Laboratory on Long Island, although he retains an office there.

He apologized publicly and “unreservedly” and in later interviews he sometimes suggested that he had been playing the provocateur — his trademark role — or had not understood that his comments would be made public.

Ever since, Dr. Watson, 90, has been largely absent from the public eye. His speaking invitations evaporated. In 2014, he became the first living Nobelist to sell his medal, citing a depleted income from having been designated a “nonperson.”

But his remarks have lingered. They have been invoked to support white supremacist views, and scientists routinely excoriate Dr. Watson when his name surfaces on social media.

Eric Lander, the director of the Broad Institute of M.I.T. and Harvard, elicited an outcry last spring with a toast he made to Dr. Watson’s involvement in the early days of the Human Genome Project. Dr. Lander quickly apologized.

“I reject his views as despicable,” Dr. Lander wrote to Broad scientists. “They have no place in science, which must welcome everyone. I was wrong to toast, and I’m sorry.”

And yet, offered the chance recently to recast a tarnished legacy, Dr. Watson has chosen to reaffirm it, this time on camera. In a new documentary, “American Masters: Decoding Watson” to be broadcast on PBS on Wednesday night, he is asked whether his views about the relationship between race and intelligence have changed.

“No,” Dr. Watson said. “Not at all. I would like for them to have changed, that there be new knowledge that says that your nurture is much more important than nature. But I haven’t seen any knowledge. And there’s a difference on the average between blacks and whites on I.Q. tests. I would say the difference is, it’s genetic.”

Dr. Watson adds that he takes no pleasure in “the difference between blacks and whites” and wishes it didn’t exist. “It’s awful, just like it’s awful for schizophrenics,” he says. (Doctors diagnosed schizophrenia in his son Rufus when he was in his teens.) Dr. Watson continues, “If the difference exists, we have to ask ourselves, how can we try and make it better?”

Dr. Watson’s remarks may well ignite another firestorm of criticism. At the very least, they will pose a challenge for historians when they take the measure of the man: How should such fundamentally unsound views be weighed against his extraordinary scientific contributions?

In response to questions from The Times, Dr. Francis Collins, the director of the National Institutes of Health, said that most experts on intelligence “consider any black-white differences in I.Q. testing to arise primarily from environmental, not genetic, differences.” Dr. Collins said he was unaware of any credible research on which Dr. Watson’s “profoundly unfortunate” statement would be based.

“It is disappointing that someone who made such groundbreaking contributions to science,” Dr. Collins added, “is perpetuating such scientifically unsupported and hurtful beliefs.”

Dr. Watson is unable to respond, according to family members. He made his latest remarks last June, during the last of six interviews with Mark Mannucci, the film’s producer and director.

But in October Dr. Watson was hospitalized after a car accident, and he has not been able to leave medical care.

Some scientists said that Dr. Watson’s recent remarks are noteworthy less because they are his than because they signify misconceptions that may be on the rise, even among scientists, as ingrained racial biases collide with powerful advances in genetics that are enabling researchers to better explore the genetic underpinnings of behavior and cognition.

“It’s not an old story of an old guy with old views,” said Andrea Morris, the director of career development at Rockefeller University, who served as a scientific consultant for the film. Dr. Morris said that, as an African-American scientist, “I would like to think that he has the minority view on who can do science and what a scientist should look like. But to me, it feels very current.”

David Reich, a geneticist at Harvard, has argued that new techniques for studying DNA show that some human populations were geographically separated for long enough that they could plausibly have evolved average genetic differences in cognition and behavior.

But in his recent book, “Who We Are and How We Got Here,” he explicitly repudiates Dr. Watson’s presumption that such differences would “correspond to longstanding popular stereotypes” as “essentially guaranteed to be wrong.”

Even Robert Plomin, a prominent behavioral geneticist who argues that nature decisively trumps nurture when it comes to individuals, rejects speculation about average racial differences.

“There are powerful methods for studying the genetic and environmental origins of individual differences, but not for studying the causes of average differences between groups,” Dr. Plomin writes in an afterword to be published this spring in the paperback edition of his book “Blueprint: How DNA Makes Us Who We Are.”

SOURCE


Friday, December 21, 2018


A South African who styles himself as "Rational Thinker" is critical of many claims about IQ

He is not a scientific thinker, however. He starts out by describing something as a lie without any prior presentation of evidence to support that characterization.

He then describes a claim as "refuted" without giving any information about how, when, where and why this refutation took place.

He goes on to say that the claim of lower average IQ among blacks when compared with whites is "discredited".  Again: When and by whom?

The American Psychological Association is generally Left-leaning but it is the world's most prestigious body of academic psychologists. And even they have had to concede a big inborn gap in black vs. white average IQ.  See here

By this stage I think it is obvious that the screed below is simply an abusive ramble by an egotist who "just knows" what the truth is without recourse to a proper consideration of the evidence.  So further consideration of his assertions is unlikely to shed any light on anything.

The thing that seems to be burning him up most is the claim that atheists are smarter than religious people, so I will close with a comment on that.  I have looked at the survey evidence on that on a number of occasions so I will summarize by saying that atheists DO test out as slightly more intelligent on average but that difference is fairly clearly due to sociological factors rather than psychological ones. A university environment in particular tends to undermine religion.  See here and  here and here and here and here.

So it would really help our know-all writer if he were to acquire some Christian humility about the veracity of his beliefs.  His existing beliefs certainly should go back into the hole where they belong.



One of the major lies preached by atheists is that "atheists are smarter" than theists. In support of this nonsense, they cite research from Richard Lynn, John Harvey and Helmuth Nyborg which compares belief in God and average IQs in 137 countries.

The research has been heavily criticized, as well as refuted, by most scientists since its publication, but this unfortunately doesn't stop atheists from still using it.

Firstly, Helmuth Nyborg is not a good researcher. This is a researcher who attempted to say that women were less intelligent than men and that black people were less intelligent than white people. These sexist and racist "theories" have long been discredited in modern science.

His research on atheists being more intelligent was highly flawed in several areas, in that it did not take into account social, economic or financial factors, with most of the countries with lower average IQs being less developed African countries.

Artificial intelligence researcher Randy Olson concludes in his criticism of the research that both the religious and the atheists in the well-developed countries were all within the bounds of average intelligence (90-109) and that, from a practical standpoint, none were well distinguishable from one another.

When we examine the research we find that this is the case, so all in all the paper failed to prove that atheists were more intelligent and only showed what has been a well-known fact for ages: that poorer countries have higher numbers of religious people, as religion serves as a source of comfort for those who are struggling.

The lower measured intelligence in these countries, meanwhile, is down to a lack of education. So, to rephrase this entire paragraph: when the religious in the same countries as the "smarter atheists" were compared, they were of the same intelligence as the atheists.

When examining the statistics, we find that atheists aren't more intelligent. In fact, according to The Psychology of Atheism, many atheists became atheist for motivational reasons rather than rational reasons during their adolescence, suggesting a link between emotional thinking and atheism (source: The Psychology of Atheism [2013], Oxford Handbook of Atheism, p. 470). This seems to be reinforced by the fact that many atheists use emotional arguments to support atheism (e.g. "bad things happen in the world, ergo God doesn't exist").

Finally, studies have shown that Christians in East Asia are smarter than non-Christians there. East Asian countries also have a higher average IQ than Western countries (where the majority of atheists in the first study came from). So if we go by the atheists' reasoning we can now say that Christians are smarter. Either way, the myth of the smarter atheist has been put back into the hole where it belongs.

SOURCE  

Wednesday, November 21, 2018


Pregnant women who take paracetamol could lower their child's IQ and raise their risk of autism, research finds

No drug is free of side-effects, but I have long noted that paracetamol (APAP) is much more dangerous than aspirin, principally because of its well-established liver toxicity.  The findings below would seem to add to that message, but maybe not.  People who take a lot of painkillers are probably in worse health overall.  So maybe all we are seeing is that the children of unhealthy mothers are unhealthy too.

The journal article is "Prenatal Exposure to Acetaminophen and Risk for Attention Deficit Hyperactivity Disorder and Autistic Spectrum Disorder: A Systematic Review, Meta-Analysis, and Meta-Regression Analysis of Cohort Studies" and the authors are also cautious about the meaning of the findings.  They say: "These findings are concerning; however, results should be interpreted with caution given that the available evidence consists of observational studies and is susceptible to several potential sources of bias."



Women who take paracetamol during pregnancy risk lowering their child's IQ, a study has revealed. Taking the drug is also associated with a higher risk of ADHD (attention deficit hyperactivity disorder) and autism.  

Researchers from US universities, including Harvard, reviewed nine studies that looked at 150,000 mothers and babies in total.  Their findings suggest that the balance of hormones in the uterus is altered by taking paracetamol, also known as acetaminophen (APAP).

One study analysed found a three-point drop in IQ for five-year-old children whose mothers had taken paracetamol for pain relief without fever. Other research shows youngsters exposed to the drug in the womb struggled with speech.

It's not the first time scientists have found a link between paracetamol use and delayed speech.

In January, research from New York found that taking the go-to-pain relieving drug during pregnancy delays babies' speech by up to six times.  

Expectant mothers who take acetaminophen more than six times during their early pregnancies are significantly more likely to have daughters with limited vocabularies, the study found.   

Paracetamol is generally available without prescription and is the most commonly used medication in pregnancy.   

Research this year has shown the common painkiller can raise a child's risk of ADHD by up to 30 per cent, and of autism by up to 20 per cent, when taken by their mothers.

The study, led by Dr Ilan Matok, from the Hebrew University of Jerusalem, analysed 132,738 mother-child pairs over three-to-11 years.

Dr Matok said: 'Our findings suggest an association between prolonged acetaminophen use and an increase in the risk of autism and ADHD.'

SOURCE 

Monday, November 12, 2018


Transgenerational advantage

Summary below of a particularly dumb TED talk from a New School professor. The New School has been far Left from way back, so the idea presented is as dumb and impractical as you would expect. It's true that economic advantage tends to be passed on from father to son, but why and how?  The Newschooler doesn't know.  He just knows that it is.  So he resorts to vague generalities -- which apparently sounded clever to his audience.

That wealth is transmitted in some automatic way once you have it is absolute bunkum.  How many times have we read of people winning big in a lottery and blowing the lot in short order?  Having money does not even encourage you to keep it, let alone pass it on.

But there is no need for "cleverness" in order to explain the phenomenon that our Newschooler has noticed.  It's perfectly plain why rich men tend to have economically successful children.  It's because you have to be pretty smart to get rich (as Charles Murray showed decades ago) and IQ is highly heritable.  Both father and son get rich because they are both smarter than average.

Giving a son money will do nothing to alter the main operative factor in wealth acquisition: IQ.  If he is smart he doesn't need it and if he is dumb he will simply blow it.



Economists often point out the simple truth that having wealth makes it easier to get more wealth, which means those who have a lot of money pass on an advantage from one generation to the next.

To adjust for that, economist Darrick Hamilton, a professor at The New School in New York, recently proposed a kind of baby trust fund system. His idea is to give all kids in the US a chunk of cash when they're born, ranging between $US500 and $US60,000 based on their family's wealth. That would help give all of them a fair shot at a prosperous future, he said.

“Wealth is the paramount indicator of economic security and well-being,” Hamilton told a crowd at the TED Conferences headquarters in New York in September. “It is time to get beyond the false narrative that attributes inequalities to individual personal deficits while largely ignoring the advantages of wealth.”

SOURCE