The Good Luck Jet Engine Sabotage

China Southern Airlines Flight 380 from Shanghai to Guangzhou was held up at the Shanghai Pudong International Airport after an elderly female passenger threw coins into the plane’s engine to ensure “good luck.” An investigation into the incident is under way, the airline says.

This bizarre story raises a serious and difficult ethics question. At what point should there be severe societal penalties for egregious life incompetence?

The elderly are obviously the most prone to this sort of thing. At some point many of them just stop paying attention, or lose the ability to keep up. In criminal law, we do not typically punish people for harm they do as a result of ignorance, but there are limits. There have to be.

I have a long-delayed post on life competencies on the Ethics Alarms drawing board; it will eventually be a permanent free-standing page, like the Rationalizations List. The topic is difficult. What skills and knowledge are all of us obligated to have, if not master? If our inattention to Continuing Life Education makes us menaces to society, how should society respond? With pity? Sympathy? Compassion? Pat the fools on the head, and give them a stipend?

Being an ethical member of society mandates being able to participate in society’s activities without constantly screwing up. That, in turn, requires a level of personal responsibility. Society needs reasonable, fair, not overly harsh or intrusive ways of persuading everyone to meet this minimum requirement of citizenship. What are they?

It doesn’t have to be as ludicrous as an old lady nearly crashing a jet by throwing good luck coins into its engine, either. As we become increasingly dependent on technology, and as technology moves up a J curve, the damage that can be done by, just to take a wild example that could never happen, someone in a sensitive position using “password” as their computer password, thus enabling a foreign government to steal confidential data and use it to set off an Ethics Trainwreck, is terrifying. How does a responsible society send a message that is sufficiently persuasive to people before they blunder into chaos?

I don’t know the answer yet.

I’m just asking.

And now, a song!

Three coins in the engine
Each one risking air distress
Thrown by one stupid granny
How should she pay for the mess?

Three coins in the engine
Each as deadly as the first
There they lie in the engine
See the flames there as they burst!

Which will make the airplane crash?
Which will make the airplane crash?

Three coins in the fountain
Through the turbines how they shine!
Just one wish will be granted
Hope the charred corpse isn’t mine…



Pointer: Fred

Source: Boing Boing


Filed under Around the World, Law & Law Enforcement, Science & Technology, U.S. Society

48 responses to “The Good Luck Jet Engine Sabotage”

  1. wyogranny

    Now this is an ethics question worthy of the name.
    What skills and knowledge are all of us obligated to have, if not master?
    I’d say not bringing superstition into public action is an obligation. Anyway, I’m glad someone caught it. I hope it was caught by good security and not moral luck.

    • But really, barring granny rolling down her plane window and tossing in the coins, perhaps at a minimum a protocol that funnels passengers maybe just a scootch further away from the massive jet engines might be in order.

  2. Let’s hope the now sabotage-repaired plane doesn’t crash for literally any other reason than not having the good luck coins thrown into the engine.

  3. I don’t know about this specific lady, but despite the economic growth there, China probably has 200 million people who live in conditions we’d consider primitive. I don’t think that you can impute competencies when the economic ladder extends that far down.

  4. dragin_dragon

    As an existing “senior citizen” (I just turned 72), I have noted both a loss of physical ability and, for right now, a minor loss of memory ability. I suspect as I continue to age, I will continue to see a lessening of both mental and physical abilities. I am, frankly, concerned that I will not be able to identify at what point I should discontinue participation in certain activities. And, by the same token, I am concerned that there are people who are so far removed from the mainstream of life that they believe throwing coins into a relatively fragile piece of technology like a jet engine is a good idea, or who are so ill-informed about computer security that they 1) use “password” as a password, then 2) compound the error by putting sensitive information behind such a useless password.

    Unfortunately, “diminished capacity” is, perhaps, a good criminal defense, but it sucks as a reason for being incompetent in important leadership roles. It also sucks as a description of a voter. Most of the solutions, I think, are going to require some sort of proof of competency for doing many things. Texas, for instance, already has an age-related requirement for a driving test for those seeking driver’s license renewal. I suspect that some form of either proof of competency or just a requirement for service (not necessarily military) may, in the future, be required for voter registration.

    As our electronic information flow increases, the ability to use the devices will become a job requirement. We will no longer be able to refer our ignorances to an IT department or ignore them altogether. Whether this will be regulated by law or job description I don’t know…possibly both, but our present arrogance is unsustainable, much as I hate that word. Yes, this will mean limiting the rights of certain people, but “rights” has a companion word…“responsibility”. Those who cannot demonstrate the responsibility may, of necessity, find themselves unable to exercise the right.

  5. Jack, you didn’t want to allude to Bill Engvall??

    We had a safety training for all employees at the refinery two years ago. I forget who presented, but his assertion was that safety measures always follow catastrophic incidents. It is rare that anyone foresees the possibility of a major incident and attempts to devise measures against it. Part of it is, we don’t even recognize that a major incident is possible until it happens. Only in retrospect do we understand how all the risk factors aligned to produce the event. Then we can devise all the safety measures we need to prevent that incident from reoccurring, and we continue blithely on our way with the next incident lurking in the shadows, waiting for the next unanticipated alignment of risk factors to strike.

    Robert Nelms, of the Latent Cause Experience that I’ve referenced before, also spoke about how culture change follows incidents. In his experience, culture change never anticipates problems, but reacts when problems reach a critical mass. The problem, and the reason his incident investigative technique is called Latent Cause Analysis, is that the causes behind incidents are hidden in that gap between expected behavior and observed behavior. Until we unearth the reasons why that gap exists, we can’t hope to make sure the problem is adequately addressed. At the same time, we have to realize that, as long as everything is going well, there will be no motivation to change, so it actually requires an incident to occur to shake us out of our complacency so that we can track down the true latent causes present in the system.

    I’m not sure that anyone prior to this incident would have ever conceived of protecting a jet engine from good-luck pennies. Similarly, even though my wife and I strive valiantly to toddler-proof our house, our daughters keep finding new and inventive ways to hurt themselves. We take care of the hazards we can predict, either because we’ve experienced them or have had other parents warn us about particular hazards. But our daughters don’t look at the world the same way we do. They’re still trying to grasp how things work, and the things they have pieced together in their minds may not accurately reflect reality. Because they are still so young and can’t express themselves very well, my wife and I have no idea how well they grasp a concept until they actually demonstrate that the understanding isn’t complete.

    Returning to the Latent Cause Experience, one of the key factors in dealing with incidents is the refusal to assign blame. Some people call it an amnesty, though it isn’t exactly that. For the incident itself, no punishment is meted out. The investigation searches for the latent causes, and in the end, all the stakeholders involved make statements of “I know I do this or am like this, and this is a problem.” In other words, everyone involved asks of himself, “what is it about me that made this incident happen?” From answering that question, a person takes on himself action items to address his particular role. Afterwards, if that person fails to complete those action items, he is then held accountable.

    In the case of this elderly coin-tosser, I would expect that she be given a pass for this first-time incident. She would then be expected to answer the “what is it about me” question and take on herself a corrective action to ensure that “something about her” is properly addressed. Only if she fails to follow through should she bear any further punishment.

    But we would also be expected to ask the same question of ourselves. What is it about me, Ryan Harkins, that allowed this incident to happen? Is it because I simply expect people to keep abreast of what is in the world and know how to handle it? Is it that I’m too impatient or self-absorbed to make sure that someone entering a situation they’ve never experienced before understands the dynamics? Is it that I’m dismissive of local or archaic customs? Whatever that answer is, I had better take it upon myself to address that problem. The airline company would likewise need to ask that question about itself, and find answers. It might be that there’s nothing really about them that’s wrong, other than failing to anticipate that someone would toss coins into the jet engine, which is that pervasive failure we all have, since we cannot possibly conceive all the different ways people will break things. Maybe what comes out of this incident is that now all airports will be required to post signs, in 13 languages, that say “Do not toss any objects into jet engines.” Sort of like the Preparation H warning, “Do not take orally.”

    What should we do about minimum expectation of competency? The last thing we should do is simply assume that people will absorb what they need to know through osmosis. If we expect that people should behave a particular way, we should take steps to explicitly educate them as such. I don’t have any answer for how to do this practically, though. But I can conceive that in that line of people waiting to board, more than a few were nose-down in their phones, paying no attention to the little old lady who wandered off to toss her coins. Perhaps it starts with people trying to be better brothers’ keepers, and perhaps that means being a little bit more aware when someone around us starts behaving in a way we never would have expected.

    • I nominate this for a COTD

    • Spoken like a true engineer.

      Personally, I am slightly surprised that the coins damaged the engine instead of just passing through it, just because they are wide open for all airborne objects to get sucked in. Granted, that’s probably why planes try to repel birds and avoid flying under hail clouds.

      That said, using security mindset to fortify the engines doesn’t require us to anticipate people’s behavior. It only requires us to pay attention to the input and output footprint of the engine (which we should also do for the rest of the plane). The input footprint (what it assumes/requires) of the engine is clear air going into it. The output footprint (what it causes or changes) is that it produces thrust, makes a lot of noise, anything in front of it could get sucked in, and anything behind it could be blown away. Now all we need to do is design protocols to make sure the input footprint remains true and keep things safe from the output footprint. To maintain clear air in front of the jet engine, we can keep everything far away from it, put some sort of conical filter on it (probably not a good idea), and/or simply put up a sign that tells people to keep the air in front of it clear.

      I agree that it shouldn’t always be necessary to put up instructions so people don’t break equipment they don’t understand. All they need to do is not mess with it, and they should already know not to do that, especially without permission. I’m going to chalk this debacle up to hubris on the part of the old lady (“as if the giant plane company needs your help to do their job! Their checklists are better than your coins!”) Just don’t let her anywhere near a helicopter. That’s a good way to get someone killed.

      Sometimes people do respond to crises that haven’t actually happened yet, and usually we can credit fiction writers for bringing it to people’s attention.
      Fiction allows us to learn from mistakes we haven’t made yet. If we want people to actually use foresight when designing systems, though, we should teach them to use strategy mindset and related mindsets (such as security, standardization, institution, preparation, clarification, reputation, salvage, overhaul, et cetera). I am perpetually irritated at how many computer programmers don’t use clarification when designing their code, to make it more robust or to at least alert the user as to its limitations. There’s no excuse for writing code and not documenting its input and output footprints.

      • I am slightly surprised that the coins damaged the engine instead of just passing through it


        Such engines work by compressing air streams into a flame chamber, which burns hotter, which produces enhanced thrust. This compression is accomplished by spinning blades in the front of the engine, usually in excess of 2000 rpm. Impacts on these blades can cause undesirable consequences (read “fireballs of death”) as the rotors shed high-velocity pieces (read “shrapnel”) for many yards around, usually impacting things that object to the experience (fuel tanks, airport walls, people).

        Needless to say, airports work very hard (some even say they are anal) to keep the runways clear of objects that can be sucked into a jet engine.

        It is fortunate that the coins in this running jet engine did not cause the ‘fireball of death,’ and I suspect the engine was merely idling, precisely because no fireball occurred.

        • But I like the input and output footprint concept. Engineers call this state diagramming, or requirements and deliverables (depends on your branch of reality).

        • John Billingsley

          Fortunately the FAA requires that to be certified for flight on a commercial aircraft a jet engine must contain any thrown turbine blades within the body of the engine. Don’t know what the requirements are for Chinese engines.
          turbine blade failure

          • You didn’t just confuse design spec with real world conditions? Such a spec, while great and worthy to aspire to, cannot stop the multiple events shedding a blade in flight creates. One blade has to be stopped by the cowling… in a lab. In real life, that blade makes a hash of other components which are not likely to be contained… like other blades.

            Catastrophic testing is great fun for an engineer, and the goal is to make things as safe as economically possible, not ‘as possible.’

            I have taken equipment through simulated earthquakes, burned it to measure how much toxic gas is emitted, left it in a caustic environment for a month, and so on. The spec in some of those tests specified that the equipment work after the test. It did not say how well, how long, or how I rigged it to find a working circuit path.

            Good times…

            • John Billingsley

              Certainly true that no test can cover everything that might happen in the real operational world. I looked at lots of clips of engine tests but didn’t see a single one with an elderly Chinese lady tossing coins into the intake. Still, it is fun to watch incredibly expensive engines tested to destruction. I initially trained in chemical engineering and for my masters worked on a system using high pressure and temperature to mix coal and hydrogen to make oil. I thought a lot about how things might go wrong with what was essentially an explosion waiting to happen.

              • I once set a supposedly certified bay of very expensive electronics on fire, just to see what would happen. Took a bit to catch (we simulated an overheating transformer in a Power Supply Unit using a gas torch), but once critical temperature was reached, the results were… disturbing. Nero burning the Christians comes to mind, with lots of toxic releases and flares. Nearby inflammables (like people) would likely have caught. Gone in 20 minutes.

                Side note: the company got irritated when we allowed the fire to scrap a $2000 Bellcore grade equipment rack (big steel frame) because we did not put the fire out quickly. They knew we would kill the half million dollar equipment, but thought the rack would come back more or less intact. We shut that down by explaining that they invented the blowtorch that caused that damage (their fan design sensed the temperature upswing and kicked into overdrive, working like a jet engine), and it was what they could expect any time a fire erupted in a Bell Central Switching Office (and no one in such an office would have the oxygen supply and personal protective equipment needed to get into the toxic room to extinguish it, short of the fire department).

                Fun times…

    • In planning and preparing our systems and processes, we would prefer the ability to anticipate realistic emergencies/contingencies and prepare buffers to avoid them, as well as mitigation should they occur. Broadly, a society has three roads: 1) under-preparing for catastrophic accidents; 2) preparing for reasonably expected catastrophic accidents; 3) over-preparing for catastrophic accidents, that is, seeing disaster at every turn, buffering out avenues that lead even generally in that direction, and shutting down eventualities that appear even remotely dangerous.

      Let’s make it a false dichotomy: which would you prefer, a society that under-prepares and only reacts when an accident occurs, or a society that over-prepares and shuts down anything that seems potentially risky?

      • Which would you prefer, a society that under-prepares and only reacts when an accident occurs, or a society that over-prepares and shuts down anything that seems potentially risky?

        Ugh, the choices! Without further clarification, I would probably have to lean toward the former, because the second sounds like paralysis by analysis. Either nothing would happen, or we’d realize there is risk even in the attempts to avoid risk, and go batty trying to sort through the contingency-laden minefield.

        The problem with anticipating realistic incidents is that sometimes we don’t even know what is realistic. When we do a PHA (process hazard analysis) on our units, either new or old, we try to chase down all possible failures, all possible causes of those failures, and all possible consequences of having a failure. We have people from multiple different disciplines so that as many different eyes are on the issue as possible. The results are then sorted through a risk matrix, which factors together likelihood of failure with cost of failure. It may be that there are consequences that are very remote in likelihood, but their cost is so high that they should receive due consideration ahead of lower-cost, but likelier events.
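        The risk-matrix weighting described here can be sketched in a few lines of Python. This is a toy illustration only: the likelihood and severity scales and the hazard names are invented, not any refinery's actual PHA criteria.

```python
# Hypothetical PHA-style risk matrix. The scales and hazard list are
# made up for illustration; real matrices use site-specific criteria.
LIKELIHOOD = {"remote": 1, "unlikely": 2, "possible": 3, "likely": 4}
SEVERITY = {"minor": 1, "serious": 2, "major": 3, "catastrophic": 4}

def risk_rank(likelihood: str, severity: str) -> int:
    """Factor likelihood of failure together with cost of failure."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]

def prioritize(hazards):
    """Sort hazards by rank, so a remote-but-catastrophic event can
    receive due consideration ahead of a likelier, cheaper one."""
    return sorted(hazards, key=lambda h: risk_rank(h[1], h[2]), reverse=True)

hazards = [
    ("pump seal leak", "likely", "minor"),         # rank 4
    ("compressor surge", "unlikely", "major"),     # rank 6
    ("vessel rupture", "remote", "catastrophic"),  # rank 4
]
```

        With these toy scales, "compressor surge" sorts first even though "pump seal leak" is far more likely, which is the point of weighting by cost of failure, not just frequency.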

        But all this evaluation depends on the experiences of people, plus whatever foresight, or the synthesis EC mentioned below, they possess. We would like to think that anything that would likely happen would be apparent enough to a team of experts, but sometimes an issue still eludes them. (I think that’s why PHAs have to be held every 5 years.) But, assuming they identify potential problems, the solutions themselves may lack in creativity. Or they may incidentally create more problems than they resolve, and no one at the time foresaw the difficulties their solution would raise. Or the solutions might be cost-prohibitive, or along the lines of “solve world hunger”.

        The reason I don’t like the second item on your false dichotomy, Tex, is what I’ve seen happen with new equipment in the refinery. A compressor, for example, might come with a shutdown PLC. The whole purpose of this PLC is to protect the compressor, so it will be programmed to read temperature, pressure, throughput, time running, vibration, amperage, and a dozen other parameters. All of these will have alarm levels which, if crossed, will trip the compressor off. Usually what seems to happen is that we install the new compressor and can’t keep it running for more than 5 minutes because something trips it off. So we end up adding delays, or disabling certain trips, just so we can actually use the thing. To make matters worse, sometimes tripping off that compressor can have bad consequences elsewhere in the refinery, so while we’re trying to solve a problem in one local area, we’re actually causing problems in other areas.
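        The "adding delays" workaround described here amounts to requiring an excursion to persist before the trip fires. A minimal sketch, with invented thresholds and sample counts rather than any real PLC logic:

```python
# Minimal sketch of a trip-with-delay: a reading must stay above the
# limit for `delay` consecutive samples before a shutdown fires, which
# is one way spurious trips get suppressed. Values are illustrative only.
def trip_index(readings, limit, delay):
    """Return the sample index at which a shutdown fires, or None."""
    consecutive = 0
    for i, value in enumerate(readings):
        if value > limit:
            consecutive += 1
            if consecutive >= delay:
                return i
        else:
            consecutive = 0  # excursion ended; reset the counter
    return None
```

        A brief spike like [1, 5, 1] no longer trips the machine, while a sustained excursion still does. The trade-off is exactly the one lamented above: a genuine fault now takes `delay` samples longer to act on.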

        Ugh. I also don’t like the second option because that seems to be the direction we’re headed with car seats and booster seats for kids. At the rate we’re going, my daughters won’t be out of booster seats until they’re legal to drive.

        At age 32.

        • That second option is an overdose on the paradigm of security mindset (strategy with extra analysis). For a high resource cost (money, time, effort, et cetera), security mindset can all but guarantee that something will or won’t happen. It usually reduces capacity or mobility (or both) in order to do so.

          However, because both order and chaos are inherent in life, there are always unknown unknowns–black swan events, outside context problems, x-factors–which make it impossible to 100% guarantee anything. Security can’t change that.

          (For contrast, standardization mindset (strategy with extra organization) creates a cheaper plan with a higher expected value, at the cost of occasional failures. Emergency mindset (tactics with extra synthesis) is more or less the polar opposite of security, as it makes the impossible possible for a steep cost (e.g. overclocking something).)

        • John Billingsley

          There is a problem in medicine similar to the compressor example. There is an immense amount of medical equipment used in patient care, and pretty much all of it has audible alarms designed to be very annoying and attention-getting. Various sources estimate that between 72% and 99% of the alarms that occur in practice signal a condition that requires no clinical intervention, such as a lead being accidentally pulled off. As a result, staff develop “alarm fatigue” and become desensitized to the alarms and tune them out, or in some cases have actually defeated the alarm. As might be expected, this has led to patient deaths and injuries. The problem is that if you try to treat everything as an emergency, then nothing is an emergency.

          I learned during my experience as a flight surgeon that there had been similar problems with pilots ignoring warnings because there were just too many of them that didn’t signify the need for any immediate action. Or they became overly focused on a noncritical alarm and failed to fly the plane resulting in a crash. There were also confusing alarms. The horn in the 737 that signaled loss of cabin pressure at altitude, a real biggy, was the same horn used before takeoff to indicate the aircraft was not configured for flight. Pilots associated the horn with a pre-flight danger and sometimes ignored it in flight which resulted in at least one fatal crash.

          Cockpit warnings are now designed with more human factors in mind. Something critical like an impending stall gives red lights, a written message, a voice warning and a stick shaker. Less critical things the pilot needs to know about give increasingly less intrusive warnings, such as just an amber light or voice message.

          Attempts are being made to improve the situation with medical equipment. This would involve categorizing events by system and how critical they are, having different types of alarms for different priorities of needed intervention, and having the alarm actually convey needed information such as what the equipment is actually indicating, patient medications and patient condition to the medical personnel by text or other method.
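          The tiered scheme described in these comments, reserving the intrusive channels for genuinely critical events, can be sketched as a simple lookup. The tier names and channel lists below are assumptions loosely modeled on the cockpit example, not any actual avionics or medical-device standard:

```python
# Hypothetical criticality tiers mapping an event to warning channels,
# so only the most critical conditions get the intrusive alarms.
ALARM_CHANNELS = {
    "critical": ["red_light", "written_message", "voice_warning", "stick_shaker"],
    "caution":  ["amber_light", "voice_message"],
    "advisory": ["amber_light"],
}

def annunciate(event: str, criticality: str):
    """Return (channel, event) pairs for the event's tier."""
    return [(channel, event) for channel in ALARM_CHANNELS[criticality]]
```

          Under this sketch, a detached lead would light a single amber lamp while an impending stall would fire all four channels, which is the human-factors point: alarm intensity should convey how urgently intervention is needed.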

  6. You can start your list with this, Jack.

    Core competencies all mature adult individuals should have:

    Analysis: assessing concepts to see what fits and infer causality and other types of order.
    Synthesis: generating concepts to imagine possibilities not yet realized.
    Organization: optimizing navigation to make the best use of the resources at hand based on needs and priorities.
    Operation: internalizing navigation for familiar situations to make decisions intuitive and actions graceful.
    Strategy: fortifying paths to make robust plans and close unwanted possibilities.
    Tactics: repurposing paths to find innovative ways to use resources and open up possibilities.
    Semantics: simplifying interactions to process and transmit information rapidly, and to find quick answers within established paradigms using labels and algorithms.
    Empathy: individualizing interactions to better understand others’ paradigms to better show them respect, help them feel at ease, leave positive impressions, and find solutions that work for them.

    What I’ve been working on for the past several years is characterizing all of the mindsets (these are just the basic ones) and figuring out how to nurture them in people.

    These are meta-skills: all skills are based on one or more of these mindsets. Computer savviness? Semantics. Secure passwords? Semantics and strategy (clarification). Not being superstitious? Analysis. Being respectful to people? Empathy. Being respectful to people you know very little about? Empathy boosted with semantics (background). Understanding bizarre points of view? Analysis and empathy (deconstruction).

    Rant follows: It may be trite to say so, but I wonder if the old lady ever considered that if the coin luck thing worked, not only would slot machines be obsolete, but all airlines would have a technician to throw coins at every plane. Also, why the engine? Why not just throw them at the plane? Did she think she’d have to do the other engine as well? *Sigh* By its very definition, luck comprises all unknown factors. It is chaotic; it cannot be predicted or controlled. If it could, it wouldn’t be luck. I find it amazing how many people indulge in magical thinking to convince themselves that “good or bad luck” can be predicted beforehand without knowing what’s actually going to happen.

  7. Other Bill

    This shows us why God, in his infinite wisdom, invented jet ways.

  8. This post is timely, as Texas is dealing with a competency issue regarding what mental illness is, and whether (and how) such a person should be executed.

    This fellow is either Joker quality batty (sorry for the reference, but it was fun to make), or is playing the system for all he is worth. He has avoided two execution dates already, and may now never face that again after the court ruling yesterday. And he might really be crazy, and should not be executed.

    The only sure thing is that he will never see the light of freedom again, and should not.

    • John Billingsley

      This has been a recurrent theme in Texas and elsewhere. In Texas, there was Adam Ward, who argued that he was mentally ill but was executed in 2016. Then there is Steven Staley, who got a court order that prevented him from being forcibly medicated so that he would become competent to be executed. Given the outcome with Staley, I expect Panetti will end up with life imprisonment.

      In Arkansas, the state ordered an inmate on death row, Charles Singleton, to be forcibly medicated so that he would become competent to be executed, and then executed him in 2004. The court’s reasoning in that case was interesting. “The court found that since the medication controlled Singleton’s mental illness, and since Singleton himself admitted to feeling better while on it, the only unwanted side effect of forcing him to take antipsychotic medication was the fact that it would make him eligible for execution, a sentence that was lawfully imposed upon him for a crime he was found to have committed. The state always has an essential interest in making sure that lawfully imposed sentences are carried out. Therefore, the 8th Circuit Court decided that Singleton was to continue to receive the medication regardless of the fact that an execution date had been set, because it was still in his best medical interest.”

      The American Medical Association and the American Psychiatric Association both hold that it is ethically unacceptable for a physician to prescribe drugs to restore competency for the purpose of execution.

      • be forcibly medicated so that he would become competent to be executed and then executed him


        That really makes me squirm. Emotionally, let him be detached from reality as he is executed. But rationally, you have to treat the illness when the criminal is incarcerated.

        Nasty situation, though.

        • I have the same problem with this idea. I’m not necessarily in favor of the death penalty, but doesn’t it only matter if someone was in their right mind when they actually committed the crime(s)? If they acted while “sane” and are sentenced to death, it doesn’t really matter whether they’re paying attention while they’re killed, does it?

          • If they acted while “sane” and are sentenced to death, it doesn’t really matter whether they’re paying attention while they’re killed, does it?

            I am of the mindset that putting someone to death is not about punishment, but about protecting society at large and serving as a deterrent to others.

            That said, I don’t care if the guy is in a coma: carry out the sentence with due process.

            • Yes, exactly. After all, a “punishment” usually carries the implication that a person will learn to behave better afterward, which can’t happen if they’re dead.

            • dragin_dragon

              With a nod to John Billingsley, my beliefs are fairly simple on this one. Execution, like EC said, is NOT a punishment, since revision of the behavior is not a ‘reasonable expectation’ from the act. It is, at its simplest, an eye for an eye, retribution…vengeance, if you will, AND (and this is a very big ‘and’) removal of a threat to the society in which the perpetrator exists. Thus, there is no valid reason to make sure he/she knows why the execution takes place, or even THAT it takes place. There is certainly nothing ‘cruel or unusual’ about not being aware of the approach of death. In fact, I would think just the opposite…knowing it’s coming could be viewed as a form of mental torture.

      • dragin_dragon

        Boy, is that a Catch 22. It really doesn’t matter what you decide, you’re probably wrong.

        • dragin_dragon

          Just out of curiosity, whose idea was it to say there was a competency test for dying?

          • John Billingsley

            Regarding the issue of competency to be executed, “In 1986, The United States Supreme Court decided Ford v. Wainwright, which declared that it was unconstitutional, a violation of the 8th Amendment to the Constitution prohibiting cruel and unusual punishment, to execute the “insane.” The court did not precisely define what “insanity” was, but in the opinion one of the justices suggested that it encompassed the idea that a person was too mentally ill to realize the reason that he or she was being put to death.” This quote is from an article by David Shapiro, Ph.D., J.D., in The National Psychologist, January 6, 2015.

            Since then the controversy seems to center on whether or not it is permissible to treat someone against their will to restore competency to be executed. Looking at the Staley case, I would think courts now are going to act more often to prevent forcible treatment in that instance.

            Individuals who are found incompetent to stand trial due to mental illness are typically confined on a forensic unit in a state mental hospital. If they are returned to competence, they then stand trial. Providing voluntary or involuntary treatment in this case does not have the same issues associated with restoring competency to be executed. After all, a trial might result in acquittal or a minimal sentence, while failure to recover from the mental illness could result in being a lifelong patient in a mental hospital.

            Competency to stand trial has nothing to do with whether or not the individual suffered insanity or diminished capacity at the time of the crime. That would be a defense that would have to be raised at a trial which can’t be held until the defendant is competent to participate in his or her defense. It is also important to keep in mind that an individual can suffer from significant mental illness and not be incompetent.

            I am not totally opposed to the death penalty, but I personally would not participate in treating a man or woman solely for the purpose of hastening their death.

            As a caveat, I am not a lawyer and have never even played one on TV. I am sure I have overly simplified some of the legal points but this is basically what I have been taught in terms of forensic psychiatry.

  9. Spartan

    Careful Jack, you might lead us down the primrose path of identifying the Useless Eaters.

    • Spartan

      Crickets. No one gets this? Slickwilly? Valkygrl? Awww.

      • Oh, I got it all right… just ran out of giveacraps for the day to reply to the dog whistle…

        And be careful, Sparty: Jack banned Chris today. I am minding my Ps and Qs while dad is on a tear. 😀😀😇

      • I got what you meant, and thought you were just joking. Deciding people are too stupid to live is not a slippery slope from deciding that they are too uneducated to be allowed to make decisions for the rest of society.

        The “only” danger in the latter is the incentives created when the people in power get to decide who they are allowed to completely ignore. Not that they don’t do that anyway, but this would be legally enforced.

        I don’t really see the need to restrict people from voting at this time, because it’s an insufficient solution. Votes by educated people do not translate to educated policy anyway.

  10. Wayne

    The song presents a good argument for the inevitability of rock and roll. As for the old folks who have lost their marbles (or thrown their coins in the wrong place!), if their actions present a danger to others, they belong in a sheltered environment. I have a feeling that the Chinese granny is now sitting in a Chinese prison cell, however.

  11. Pennagain

    I quit flying after they confiscated my horseshoe at the TSA counter.
