“Is We Getting Dummer?” Oh, Yes. Does We Care?

Why yes, it DOES remind me of “Idiocracy,” which is only funny if it isn’t true.

Today, just prior to convicting Drew Peterson of killing his wife, his jury sent a message to the judge asking what the word “unanimous” meant.

Think about the implications of this. First of all, it means that one man’s life and the U.S. justice system’s integrity are resting on the judgment of twelve people, not one of whom possesses a fifth grade vocabulary, or, if one of them does, he or she lacked the persuasive skills or credibility to convince a majority of his colleagues that yes, “unanimous” means that everybody is in agreement. It means that the voir dire system managed to carefully select the most ignorant and inarticulate jury of adults imaginable for a first degree murder trial.

That’s not all. It means that in Joliet, Illinois, a select group of twelve adults, in addition to possessing only a rudimentary English vocabulary, was completely uninformed about the jury system. To reach adulthood this stunningly ignorant about one of the basic features of our justice system and democracy, these individuals could not have regularly read newspapers or watched the news, and if they did, could not possibly have understood what they were reading or seeing. They not only ignored the O.J. Simpson case and the Casey Anthony trial, but also failed to comprehend “Law and Order,” and any number of other examples of popular entertainment that regularly involve the jury system. Is there any chance that individuals who have to ask what “unanimous” means (what did they think it might mean, when the judge told them that “your decision must be unanimous”? Enthusiastic?) could comprehend far more difficult concepts like reasonable doubt? Could they have a working knowledge of the Constitution or the Bill of Rights? What are the chances that such a jury understood, with its 4th grade vocabulary, the questions asked by the lawyers during the trial? The evidence? Expert testimony?

It does not take a lot of school, great teachers or intelligence to learn what “unanimous” means. It takes the intellectual curiosity necessary to pay minimally adequate attention to the world around you. This isn’t a problem that can be solved by paying teachers more. This is a problem that can be solved by creating a culture in which everyone is encouraged to value knowledge, to accept the responsibilities of living in a community, and to take them seriously. For more than a year, I have been listening to heated rhetoric based on the false premise that effort, education, knowledge and character have no relationship to one’s success, that it is all connections and luck. I will say right now that I resent having to contribute one penny to the living expenses of anyone who passes through puberty without a serious closed head injury and who has not interacted with the rest of the world sufficiently to encounter and learn the word “unanimous.” I resent even more the fact that such people not only determine the direction of our political process, but also are the ones candidates pitch their appeals to. I resent that such people respond by electing Todd Akin, Michele Bachmann and Maxine Waters to Congress. I resent that our justice system, criminal or civil, is warped by their intellectual apathy and laziness.

I have mentioned “I.Q. 83” here previously; it was a thin paperback techno-thriller by Arthur Herzog that I read during a lunch hour at a book shop in the ’80s. Yes, I should have bought it. I wish I had, because not to do so was unethical, and also because I have wanted to re-read it many times since. The book was a cross between “The Andromeda Strain” and “Flowers For Algernon,” with a genius scientist hero who discovers a virus responsible for mental retardation. The virus gets out into the population, of course, and spreads quickly, gradually lowering the IQ of the United States to a mean of 83 and falling. He is infected too, and the suspense arises from his race to find a cure before he is too stupid to care. The book is pretty funny as it describes life in an increasingly stupid country, with newspapers full of blank pages and typos, and the New York Times sporting the headline, as I recall it, “Is We Geting Dummer? Sientsist Consernd.”

The story of the Peterson jury is one of many I have encountered in the intervening years that makes me feel that the nightmare of the novel is coming to pass. There is no virus to blame in the real United States, however, which makes the evidence of creeping stupidity and ignorance less excusable and more frightening. The causes are apathy, lack of standards, lack of pride, and the failure to build a culture where civic involvement is regarded as an honor, and being prepared to fully and competently engage in it is recognized as every citizen’s obligation. Unanimously.

Instead, we have whole juries filled with people who don’t know what the word means.

_________________________________________________

Pointer: Fark

Facts: WLSAM

Graphics: TV Tropes

Ethics Alarms attempts to give proper attribution and credit to all sources of facts, analysis and other assistance that go into its blog posts. If you are aware of one I missed, or believe your own work was used in any way without proper attribution, please contact me, Jack Marshall, at  jamproethics@verizon.net.

68 thoughts on ““Is We Getting Dummer?” Oh, Yes. Does We Care?”

  1. Maybe the jury was made up of really bad spellers. Maybe when the word was explained, several of them slapped their palms to their foreheads, “Kah-yeah! Of course! It LOOKS like un-anim-us-ly!”

    I disagree with the notion that knowledge can be gained by watching the news, reading the newspaper or tuning in to dramatizations on television. Maybe the jurors in Joliet do nothing BUT watch television and read the Joliet News & World Report (assuming there is such a publication).

    Side note: I’m always debating with myself whether I should break down and get myself a television. You’re always talking me out of it. Thank you.

    Christine (Media Hater & Social Cynic)

  2. Yet another reason why the privilege of voting should require passing the same test of knowledge of United States government that immigrants must pass before becoming naturalized citizens. But some idiots still don’t think that voter ID is a good idea, so fat chance of this happening.

  3. Peter Schiff, posing as an “anti-business crusader,” found a slew of people at the DNC in favor of banning or placing a cap on corporate profits. http://www.examiner.com/article/peter-schiff-to-democrats-let-s-ban-limit-corporate-profits?cid=rss

    Having served on a jury last year, I found the “voir dire” process in a routine trial somewhat different from the movies and the recent play of the same name. You may be excused for any of a number of reasons, but you are merely sent back to the daily jury pool to await another case. In our county, you must serve a full day as long as there are active trials; there is no early departure now.

    The mood on the jury is for conviction, quickly. Don’t know if this is caused by the various CSIs or the time constraints everybody lives under or both. Maybe it’s just human nature. I was somewhat surprised to find that jury members line up less by gender and race than by age. The younger folks see the scales of justice tilted towards “guilty as charged.”

    Presumption of innocence and “reasonable doubt,” though repeatedly mentioned, do not register on a conceptual level with most jurors. As you know, the legal process, while supposedly dealing with facts, tends to emphasize the emotional. Muddled arguments and logic carry over to the jury deliberations. Classical group dynamics are operative. In the face of a clear majority, pressure is exerted on jurors who favor acquittal. Probably there was a holdout or two in the Peterson case.

  4. I wouldn’t say that we’re getting dumber, per se, just that our knowledge has been redistributed. There is considerably more information available now, and, as a result, things that were once common knowledge get left out. The rise in complexity of our modern world has also made it much harder to be a renaissance man. To be good at one topic, other knowledge likely has to be essentially ignored.

    For instance, look at cars. I understand how older cars worked. An ’80s Toyota or Ford? I’m not a handy person, but I can understand the engines and see what needs to be done to fix something. For my 2004 Corolla or my current Mini? I’m just lost.

    Look at math. I had a deeper understanding of mathematics by the time I left high school than even most mathematicians of the 19th century. I knew more by the time I hit 8th grade than most of the era’s learned men.

    The same goes for biology and physics. Entire branches of chemistry didn’t even exist 100 years ago, and now they’re old science.

    When the depth of information required in any specific field grows, it’s impossible to keep pace with it while also keeping up the same breadth of knowledge that the previous generation could.

    • I remember a freshman class seminar given by one of the professors in a liberal arts field. He was speaking to STEM students and his basic message was the importance of a classical liberal arts education for exactly the reason you point out. I don’t know how much of an impact it had on other students, but it resonated with me.

      As you point out, our world is very complex and even an undergraduate degree in a science field requires a significant amount of specialization. But if all I got out of college was a specialized education in mathematics, I would not be a well-rounded, educated person. It’s one of my issues with the focus and push to get more STEM graduates: while the skills and knowledge you learn in these fields are useful and can lead to high pay, they do not necessarily lead to well-educated people. Significant effort needs to be made in these degree programs to make sure students also learn general knowledge.

      • I 100% agree, but the issue isn’t just STEM. It’s pretty much everything except literature and history. You want to become a manager? Well, now we know all this information about sociology that we didn’t before. You want to go into marketing? Before it was just coming up with ideas; now a few classes in statistics and psychology are important.

        Also, if we force everyone to have the full wide range of knowledge that was previously common, we’re essentially neutering their ability to go as deep as they need to in any given subject.

        It’s also not just with book learning, it’s with everything. There are so many more topics of information available all the time. The basics that were learned in the past are just some of the many possibilities now. I don’t think it’s possible for most people to cover all the old information while also being minimally competent in the new information.

        Some things just have to fall out.

  5. Sci-fi and true crime writer Arthur Herzog III, author of “I.Q. 83,” is huge. I turned up all kinds of hits on my library search, so I looked further. Wiki notes that “Orca” and “The Swarm” [whatever their artistic merits] were made into movies and “I.Q. 83” is in the works – Dreamworks, that is. His father was a big jazz songwriter, collaborating with Billie Holiday. Maybe there is a stage adaptation just waiting to happen, Jack!

  6. I graduated from a liberal arts college with a BA in biology and 24 credit hours in chemistry. And 8 credits of physics. I was required to take a year each of religion, philosophy, literature, and electives – all of which I enjoyed, BTW; this was the fun stuff! The in-depth study of your chosen field you get in graduate school. Or if you’re talented you skip it (skip college too) and obtain it on your own.

    STEM students are getting short-changed, in my view. And today’s undergrads are not getting an education in or exposure to critical thinking. To achieve this you have to analyze all sides of a question, including the un-PC sides, and no one, teachers or students, wants to go there. Also, no one really reads anymore, or if they do, they don’t read with comprehension. They lack the basic reading and writing skills, not to mention spelling, needed to make informed decisions in jury rooms and voting booths.

    But, yes, the advance of information (including literature and history) and technology is certainly staggering, particularly noticeable as you age. Would we still be admitted to the same schools, I wonder?! I feel like I’m slipping further behind with every trip to Best Buy!

    • The in-depth study of your chosen field you get in graduate school. Or if you’re talented you skip it (skip college too) and obtain it on your own.

      That worked…when only a few fields needed in depth study. Now, pretty much everything does.

      You want to be a software engineer? Well, you have to do 7 years to get there. Will that occur? No. We’ll just have more untutored programmers. Can they program? Yes. Do they understand requirements? Security holes? CM? Whether the kids are talented or not, without the tutoring, they’re going to repeat the errors that earlier generations made. Quality goes down, and the things we rely on to protect us (like every website) go even further down in quality.

      Throw in the increase in educational costs and the loss of productive working years, and I don’t see this as an easy or obvious solution.

  7. I guess it would depend on whether you want to be a specialist or a generalist, an expert or an innovator. If the former, yes, seven years or more may not do it. The latter would look outside his or her field to get what is needed. Or bring those people in. Gates and Jobs did so, as did many creative artists.

    The opportunity costs you mention, in fact, would argue against the traditional educational route. I feel that the technology and information age makes access to many disciplines easier than ever. Just last Sunday I watched an interesting segment on “60 Minutes” about a guy who developed a YouTube academic program to acquire knowledge at your own pace (www.khanacademy.org). Beyond that it’s all OJT.

    I bet you could have made many of the same points 100 years ago. The change in the concept of time, leap in technology, and move from an agrarian to an urban society must have been a cultural shock. Yet we somehow got through it, with far fewer resources.

    • I guess it would depend on whether you want to be a specialist or a generalist, an expert or an innovator.

      The point is that a generalist can’t do anything anymore.

      The latter would look outside his or her field to get what is needed. Or bring those people in. Gates and Jobs did so, as did many creative artists.

      Gates and Jobs were both experts in their field.

      The opportunity costs you mention, in fact, would argue against the traditional educational route. I feel that the technology and information age makes access to many disciplines easier than ever. Just last Sunday I watched an interesting segment on “60 Minutes” about a guy who developed a YouTube academic program to acquire knowledge at your own pace (www.khanacademy.org). Beyond that it’s all OJT.

      Arguing against universities is a lot different from arguing that universities should teach more general knowledge. I’d say it’s the exact opposite.

      I bet you could have made many of the same points 100 years ago. The change in the concept of time, leap in technology, and move from an agrarian to an urban society must have been a cultural shock. Yet we somehow got through it, with far fewer resources.

      There’s been a change in the acceleration of knowledge and complexity. Also, no one has claimed we can’t get through it, just that not all the same knowledge can be learned…which was true then as well. There was a time when everyone knew how to garden. That was no longer true in the 20th century.

  8. Actually, I believe the generalists have the advantage and are getting it done. They acquire knowledge and skills outside of their areas of expertise. Gates and Jobs are two prime examples. If you are saying that universities are training specialists, I agree with you. We need more of a true interdisciplinary (and intradisciplinary) approach to education. Though it may not seem obvious, our brains are now wired differently than they were then. The pace and complexity have increased, but so has our ability to process this data, structurally. And it will be easier for those younger than us to further adapt. My concern is that we are educating them as specialists. We have the tools, but we are not implementing them wisely. Is it harder to be a Renaissance man (or woman) today? In most ways yes, perhaps; in some ways no.

    • The pace and complexity have increased, but so has our ability to process this data, structurally.

      Um…what? What in our brain has changed that allows us to process more data?

      —-

      You’re mostly speaking in buzzwords and repeating what you would like to happen… despite the problems with it. If you can come up with a way for people to get the depth of knowledge they need while also keeping the same breadth of knowledge as yesteryear, please tell.

      • Nope, no buzzwords, unless you don’t know the meaning. Neuroplasticity is a fact – stroke and amputee studies have confirmed it on PET and fMRI. Synapses reorganize based on input. (Neural growth is greatest in childhood, which is why kids learn new routines so quickly. Also the reason why we need to vary the stimuli.) Additionally, to keep up, people make use of technology or people. There’s no question that we know more and have access to more information than any idealized group (Renaissance men included) in history.

        • Nope, no buzzwords, unless you don’t know the meaning

          This is all buzz and no substance: “We need more of a true interdisciplinary (and intradisciplinary ) approach to education.”

          Neuroplasticity is a fact – stroke and amputee studies have confirmed on PET and fMRI. Synapses reorganize based on input. (Neural growth is greatest in childhood, which is why the kids learn new routines so quickly. Also the reason why we need to vary the stimuli.)

          All true. None of it, though, shows a difference in our brains now as opposed to before. Fail.

          Additionally, to keep up, people make use of technology or people.

          That there is easier access to information does not greatly increase the ability to learn information that we have access to.

          There’s no question that we know more and have access to more information than any idealized group (Renaissance men, included) in history.

          And there’s no argument about this.

          • By “all buzz and no substance,” you mean you either don’t understand or don’t want to understand these concepts. So I’ll make it simple: if you want a deeper and broader view of a discipline, you need to go outside it. People give a lot of lip service to those approaches, but don’t actually educate that way.

            The data are all there pre- and post-injury, so I don’t know what you’re getting at, but it’s not logic. Skip the ad hominem arguments, which don’t convince anyone.

            I’m glad we agree on something.

            • By “all buzz and no substance,” you mean you either don’t understand or don’t want to understand these concepts. So I’ll make it simple: if you want a deeper and broader view of a discipline, you need to go outside it. People give a lot of lip service to those approaches, but don’t actually educate that way.

              I understand completely. What I’m saying is that interdisciplinary learning is trendy, and your claim that we need to do it is like claiming that we need better quality work done in less time using fewer resources. It’s empty.

              The data are all there pre and post injury, so I don’t know what you’re getting at, but it’s not logic. Skip the ad hominem arguments which don’t convince anyone.

              * You said that the brain works differently than it did previously.
              * I pointed out there was no evidence for that.
              * You went into how brains are awesome… but never contrasted that with the past.
              * I called out that your comments didn’t support your position.
              * You wrote the above.

              There was nothing close to ad hominem there (do you know what that term even means?), and the awesomeness of the post-injury brain still does not show that our brains are wired differently now than they were before. Again. Fail.

              I’m glad we agree on something.

              Except you somehow see the fact that individuals are exposed to deeper information than previously as evidence that we can handle more information than we previously could. Unless you have evidence that we were at our limit before, that doesn’t work.

              • For you, interdisciplinary sounds trendy. I’m not writing a paper here, but if, say, a specialist (or subspecialist) in the humanities were to teach a student not only in the humanities but in the sciences (also down to the subspecialty level), you would deepen the level of education. This is rarely, if ever, done, and if so, it’s sporadic.

                I supplied facts about the way the brain changes its structure. I did not use or imply that it was “awesome” or “possess awesomeness,” which come to think of it sounds trendy. Ditto, “calling me out.”

                Most neuroscientists would agree we are nowhere near our limit in our ability to retain and process information.

                An ad hominem argument (expansive usage) attacks the person and not the ideas.

                While you’ve spent much time trying to rebut me (I guess this is your style), I haven’t seen much in the way of evidence.

                • For you, interdisciplinary sounds trendy. I’m not writing a paper here, but if, say, a specialist (or subspecialist) in the humanities were to teach a student not only in the humanities but in the sciences (also down to the subspecialty level), you would deepen the level of education. This is rarely, if ever, done, and if so, it’s sporadic.

                  You know what else is rarely done? Doing more work, with better quality, more quickly. Maybe I should have said buzz-ideas instead of buzzwords. There is no there there.

                  I supplied facts about the way the brain changes its structure. I did not use or imply that it was “awesome” or “possess awesomeness,” which come to think of it sounds trendy. Ditto, “calling me out.”

                  *Sigh*. Yes, you supplied facts about how the brain works… but no evidence that it didn’t work that way before. You still haven’t, so you still just have an unbacked assertion.

                  The language was me characterizing your argument. I didn’t put the words in your mouth. I summed up what you said. You described something awesome about brains… it just didn’t back your point.

                  You just equivocated on trendy to impute that I am using buzzwords. It’s completely different kinds of trendy.

                  Most neuroscientists would agree we are nowhere near our limit in our ability to retain and process information.

                  Depending on how those words are all used, this statement can be everything from true but irrelevant to flat out false. Would you care to clarify?

                  An ad hominem argument (expansive usage) attacks the person and not the ideas.

                  Uh huh. Where did I attack you to claim your argument is wrong? A couple times I have said things about you that are based on what you have said, but the connection has always gone only the one way.

                  While you’ve spent much time trying to rebut me, I guess this is your style, I haven’t seen much in the way of evidence.

                  What statement from me do you feel is not sufficiently supported? Different types of statements require different information. Your claim that the brain works differently now than it did before is something that needs to be supported. My claim that there’s more depth of information in fields now than before is something else that needs to be supported… but I believe it’s “common knowledge” (in the game theory sense), so I didn’t bother. If it’s questioned, I’d support it.

                  On the other hand, my claim that the brain is essentially the same as before needs no evidence. It’s the default position.

                  • Please reread what I wrote about neuroplasticity and my earlier posts. I’m still waiting for evidence from you, and not more questions about my position, which you do not seem to understand. Better still, tell me (us) what it is you believe and support it if you can, so we all can have a discussion.

  9. Interestingly enough, LL Cool J did a segment on Sesame Street where he talks about the meaning of the word unanimous with Elmo, Abby and Oscar the Grouch.

  10. Re: tgt. You’ve heard of the evolution of the brain, at least for some? How do you think that higher cortical function came into being? Still haven’t seen any evidence from you that we have stayed the same – because there isn’t any.

    • Have you heard about the need for selection pressures before evolution occurs? And that it occurs over time? I’m pretty sure our brains haven’t changed that dramatically in 1-2 generations.

      I don’t need evidence that our brain has stayed the same. That’s the default position. It’s you that has to show the change. This is biology 101 and basic logic.

  11. Your default position is not mine. If you want to read about neuroplasticity and synaptic rewiring, check the literature. Structural change of the brain is a given, even at the Bio 101 level. You’re “pretty sure that our brains haven’t changed dramatically” (which I haven’t said) “in 1-2 generations.” You “don’t need evidence that the brain has stayed the same,” which is apparently why you won’t accept mine or anyone else’s. OK, I understand where you’re coming from.

    • *sigh*. Structural change of the brain that is different now than it was 100+ years ago is not Bio 101. Citing how the brain reacts pre- and post-injury still says nothing about how our brains were different 100 years ago.

      I’d accept your evidence if you SHOWED ANY EVIDENCE.

        • Have you seen any of my threads? I respect the evidence, whatever it may be. If you don’t have evidence, your position doesn’t get respect. Even if you just claim to have evidence, I’ll normally google for it to save any back and forth. My googling didn’t find any support for your contention that the flexibility of the mind is something new. From my novice understanding of evolution and the mind (mostly from Jerry Coyne and both agreed and disagreed links from there), I have not seen evidence of a rapid change in brain function in the last couple hundred years.

          It seems counter to what I know, I can’t find anything to support it, and you refuse to provide evidence. That’s three strikes and you’re out.

          Also, what in my logic is not logical? I’d like to know so I can correct it.

          • It seems you like to argue and have the last word. Also, your posts tend to confuse or misrepresent the arguments you have with discussants. If you have a novice understanding of evolution and the mind, perhaps googling won’t help you.

              • I’d say that the very little science in that article has to be taken with the largest grain of salt possible, as the writer seems to be making the facts fit his conclusions instead of the other way around. For instance: “[The concept of “individuality”] only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories – ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self.”

                So people weren’t autonomous and self-aware until the 1800s? Sure.

                The article claims that our brain changes with all input. That would support part of Glass’s point: that our brains are wired differently now than before, based on the difference in input. His claim that our ability to process data has ramped up is not supported there, and the evolution-of-the-brain comment is still ridiculously stupid.

                • Yeah, as someone with an abiding interest in ancient and medieval history, that whole line about ambition only arriving with the Industrial Revolution made me blow a gasket; plenty of old-school nobility came from middling (and even occasionally peasant) stock.

                  At this point, I wouldn’t rule out the idea that our brains might have changed in their ability to process data due to increased Internet usage or what-not, but we’re certainly not at the stage where we can say it’s a sure thing. I certainly don’t doubt there are “tricks” we can use to make our brains work “better”, but that’s a different point altogether.

                  P.S. On a slightly related note, there’s also been a debate on whether relatively recent “Malthusian”-type periods like the Middle Ages also had any major influence on modern genotypes. Just in case anyone’s interested in jumping further down the rabbit hole.

                  • That’s a step too far down the hole for me. While I can be accurate in saying that a generation or two isn’t enough time for natural selection to work on our brains, I don’t have the depth of knowledge to work with what could occur in 50 generations, or enough information about society over that time span to determine what was and was not a real selection pressure.

                  • Three hundred years ago, our notions of human identity were vastly simpler: we were defined by the family we were born into and our position within that family. Social advancement was nigh on impossible and the concept of “individuality” took a back seat.

                    That only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories – ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self.
                    *******************************************************************************************
                    I don’t have a problem with the first paragraph. What she’s spouting in the second are the theory-driven notions advanced in most English Literature and Cultural Studies graduate departments since the late 60s.

                  • For formal schooling, I have a B.S. in mathematics with minors in computer science and philosophy. I am a software engineer by trade who follows biology, evolution, and theories of the mind as (1) they are important in dealing with the updates in attacks from liars for Jesus, and (2) a few of the bloggers I follow on atheism and skepticism are biologists by trade (like PZ Myers and the aforementioned Jerry Coyne) who often post about the topics.

                    I have never claimed credentials in biology or neuroscience. My knowledge is limited to what I have read for pleasure… like I had previously noted.

                    Throw in the previous unfounded, rhetorical accusations against me, and this really feels like a setup for the courtier’s reply.

                    • No set-up for a courtier’s reply coming. Let’s assume that, instead of a brain, we have programmed a neural network. Would you agree that we could set one up, train it based on input-output criteria, and end up with something with higher functionality, based on improved interconnectivity?

                    • Of course.

                      I’m interested to see where you go with this considering that artificial neural networks are idealized objects whose behaviors are analogous to, but not representative of, the human brain.

                    • The general analogy is fine, but it just isn’t representative. I’ll give you an example. An artificial neural network can easily calculate square roots and find prime numbers. Humans can do the same things, but multiple orders of magnitude slower. How long would it take you to, by hand, come up with all prime numbers between 10,000 and 100,000? Hours? Days? Would you be able to concentrate on that task? The artificial neural net could do it in minutes if not seconds, and wouldn’t ever get distracted or tired.

                      Artificial neural nets are an amazing tool for trying to learn about how our brains work, but the information we glean from them is limited and subject to the differences in the building blocks. As I said, they’re idealized.
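                      To put a rough number on the speed gap described above, here is a minimal sketch (ordinary code, not a neural net of any kind) that enumerates every prime between 10,000 and 100,000; on any modern machine it finishes in a small fraction of a second:

```python
# Sieve of Eratosthenes: enumerate all primes in (10,000, 100,000).
# A toy illustration of the machine-vs-human speed gap discussed above.

def primes_up_to(n):
    """Return a list of all primes <= n using a boolean sieve."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Mark every multiple of p (starting at p*p) as composite.
            for multiple in range(p * p, n + 1, p):
                sieve[multiple] = False
    return [i for i, is_prime in enumerate(sieve) if is_prime]

primes = [p for p in primes_up_to(100_000) if p > 10_000]
print(len(primes), primes[0], primes[-1])  # 8363 primes, from 10007 to 99991
```

                      A human doing the same job by hand really would take days, so on raw arithmetic the analogy to the brain does break down, as the comment says.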

                    • Fair enough. An artificial neural network can do the brute force calculations quicker and more accurately than the brain. On the other hand the brain or its neuronal components have access to abilities through parallel distributed processing that an artificial neural net does not. At least not yet!

                      Staying at the lower level, though, a trained neural net functions a lot like a biological neural circuit. Pathways are strengthened in both by repetition; the likelihood of an output or outcome increases with every iteration of the input-output or stimulus-response cycle.

                      When a computer or electrical circuit has its output modified, it changes the functionality. When a neural process is changed at the cellular or subcellular level, biologists refer to this as a structural change. Neurochemicals are released at synapses, and an action potential sends a signal down a pathway – it fires – which becomes the preferred or more likely route. I’m glossing over a lot of details and simplifying, but these changes can be measured and visualized, some of them in real time. You can’t attend a neuroscience or related medical meeting without someone presenting a paper dealing with the structural changes to the brain – negative and positive – caused by a legion of agents and behaviors. Is there anything here that you take issue with?
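                      The “pathways are strengthened by repetition” point can be caricatured in a few lines of code. This is a deliberately minimal sketch, not a model of any real neuron – the weight, learning rate, and sigmoid squashing are all illustrative choices – but it shows the shared dynamic: one connection strengthened a little on each stimulus-response pairing, so the likelihood of the response rises with every repetition:

```python
import math

def response_probability(weight):
    """Squash a connection weight into a 0-1 response likelihood."""
    return 1.0 / (1.0 + math.exp(-weight))

weight = -2.0            # start with a weak, unlikely pathway (arbitrary)
learning_rate = 0.5      # fixed strengthening per pairing (arbitrary)
history = []
for trial in range(10):  # ten repetitions of the stimulus-response cycle
    weight += learning_rate  # Hebbian-style strengthening on each pairing
    history.append(response_probability(weight))

# The response likelihood climbs with every repetition of the cycle.
print(round(history[0], 3), round(history[-1], 3))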

                    • The changing of the artificial pathways does not actually model how biological pathways are changed, or how they are maintained.

                      I agree that input causes structural changes, but I still deny that there’s evidence that those structural changes allow for learning the increased amount of information we have now.

                    • Okay, so we both agree that inputs can cause structural changes in both systems. And you have some problems with drawing inferences about biological systems, as do I. Let’s put the artificial model aside for now.

                      There is disagreement on how much you can extrapolate (infer) from the experimental data to the organ-system level. For instance, I’ve seen claims for training the brain using software programs that seem pretty far-fetched. And who knows where you draw the line on qualitative versus quantitative data, and where you go with it. But confining the discussion to biological systems: do you believe that there is a limit on the amount or rate of information or knowledge we can store in our brains? Structural changes occur, based on learning; surely these improved pathways don’t lead to a dead end.

                    • Well, we were always taught that there are billions of neurons and we make use of only a small percentage of them. Also, the exquisite interconnectivity of those neurons – dynamically strengthening and weakening – increases the capacity of brain function. But let’s accept that the limit is finite and we don’t know what it is. Would you agree that we are running at nowhere near capacity?

                    • Well, we were always taught that there are billions of neurons and we make use of only a small percentage of them.

                      This is just another false “we only use X% of our brains” claim.

                      Would you agree that we are running at nowhere near capacity?

                      Not without evidence.

                    • General statements like “check out the literature” are suspicious in general, and usually the refuge of people who can’t actually cite valid evidence. Throw in the fact that you thought evolution of the brain would suggest differences over 100 years, and your general scientific statements cannot be trusted. Can you point to any actual evidence?

                    • Since your first few blogs, which are generalizations based on your experience, you have sat back and repeatedly asked for evidence while providing none yourself. This seems to be your pattern on other blogs. In order for us to have a dialogue, you have to be willing to look at and read other points of view; just reading Coyne and Myers to confirm your opinions won’t do it. BTW, I have read Coyne and I think he’s pretty funny – also prolific, with more blog posts per day than Jack! Is he right? Well, he has a strong POV and it seems he’s carrying baggage. Generally, I am suspicious of anyone advancing strong views who is not willing to admit they may be wrong. I might be mistaken. How about you?

                      You are misreading me on evolution.

                    • Again, a complaint that I don’t provide evidence, but no mention of which of my claims need more support.

                      A claim that I misinterpreted you, but no explanation of what you actually meant.

                      This is a smear campaign.

                    • I am perfectly willing to read multiple points of view and have previously indicated as much. Your refusal to provide evidence for your counterpoints is the problem. False accusations just dig your hole deeper.
