Episode 5: Happy Feelins in the Synapses

youresuchatardis asked:  Why do humans feel emotions so strongly at times, and barely at others? Think about times of feeling very angry or very in love, etc., and compare that to times of “meh”, as some of us would put it. I’m looking for the biological aspect, here.

Neurochemistry?  Come on, you could have at least asked a hard question. (Seriously, though, biology was my worst subject in the sciences.  I’m glad you asked this one — it’s something I’d like to learn more about, and I’d be happy to delve into it in further questions.)

Let’s start, as ever, by laying out some terms.  Your brain is composed of a very large number of neurons, in the neighborhood of eighty to one hundred billion — these are cells that transmit information in your brain via electrical and chemical signals.  This transmission occurs via synapses.  In a literal sense, this is a neural network — neurons connected to nearby neurons.  (I say that because the concept of a neural network also appears in artificial intelligence.  That’s a complicated topic — I’d be happy to touch on that more in a future post.)

Emotions, in general, are regulated by the transfer of neurotransmitters between neurons via synapses.  A neurotransmitter is, essentially, a chemical signal.  Nearby neurons release these chemicals, which induce the receiving neuron to behave in a certain way.  Neurotransmitters come in two flavors:  excitatory, which make the neuron more likely to fire an action potential, and inhibitory, which make it less likely.

[Image:  a neuron, center in the callout, connected by synapses, the thin tubes.  The red dots are neurotransmitters.]

Often, both excitatory and inhibitory neurotransmitters are released at the same time, competing for the neuron’s attention.  Like a bit in a computer, at any instant, a neuron either fires a message or it doesn’t.  But since a neuron has the opportunity to fire many times a second, you end up with a gradient:  how many times, and across how many neurons, does that message fire?  The more often that happens among neurons that trigger an emotional response, the stronger the emotion.
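
To make this concrete, here’s a toy “integrate-and-fire” neuron in Python.  This is a cartoon, not a biologically faithful simulation, and every number in it is invented; the point is just to watch excitatory and inhibitory inputs compete, and the firing rate rise as excitation wins out:

```python
import random

# Toy integrate-and-fire neuron: excitatory inputs push the membrane
# potential toward a firing threshold, inhibitory inputs push it away.
# All constants are invented for illustration.
THRESHOLD = 1.0
LEAK = 0.9  # the potential decays a little each tick

def spike_count(excitatory_rate, inhibitory_rate, ticks=1000):
    """Count how often the neuron fires, given competing input rates."""
    potential, spikes = 0.0, 0
    for _ in range(ticks):
        if random.random() < excitatory_rate:
            potential += 0.3   # an excitatory neurotransmitter arrives
        if random.random() < inhibitory_rate:
            potential -= 0.3   # an inhibitory neurotransmitter arrives
        potential *= LEAK
        if potential >= THRESHOLD:
            spikes += 1        # fire an action potential...
            potential = 0.0    # ...and reset
    return spikes

print(spike_count(0.8, 0.1))  # excitation dominates: many spikes ("strong feeling")
print(spike_count(0.5, 0.5))  # balanced inputs: few spikes ("meh")
```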

Wouldn’t it be nice if that were the whole story?  But this is biology and chemistry we’re talking about.  Of course it’s not that simple.  These chemical neurotransmitters don’t just disappear.  If they’re left floating around in the synapse, they’ll keep triggering action potentials, again and again.  So the chemical needs to be removed from the environment — it can be destroyed by enzymatic action, or reabsorbed into the neuron that sent it, a process called reuptake.
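
The same idea works as arithmetic.  Treat the synapse as a bucket that release fills and that two channels, enzymatic breakdown and reuptake, drain.  The rate constants below are made up; the takeaway is what happens to the steady-state level when you throttle the reuptake channel (which is exactly what the antidepressants discussed next do):

```python
# Toy model of neurotransmitter clearance: each time step, new transmitter
# is released, then a fixed fraction is removed by two channels (enzymatic
# breakdown and reuptake). All rate constants are invented.
def steady_level(release=1.0, enzymatic=0.10, reuptake=0.20, steps=50):
    level = 0.0
    for _ in range(steps):
        level += release                         # fresh transmitter released
        level -= level * (enzymatic + reuptake)  # cleared from the synapse
    return round(level, 2)  # settles to a steady level

print(steady_level())               # normal clearance: ~2.33
print(steady_level(reuptake=0.05))  # reuptake inhibited: ~5.67, much higher
```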

To illustrate how these effects work, let’s look at a common case — the prescription of antidepressant drugs to depressed patients.  The feeling of “happiness” is heavily influenced by a neurotransmitter called serotonin.  Interestingly, serotonin’s primary job in the body is mediating the action of the gastrointestinal tract, but in the brain it also mediates the perception of resource availability.  In most cases this means food, but it also has a lot to do with the perception of social standing; a socially superior animal generally has more access to resources.  (When they say the way to a man’s heart is through his stomach, serotonin may well have a lot to do with it!)  Intelligent animals, of course, have far more diverse resources to be concerned with.

[Image:  Resources are delicious.]

A commonly prescribed class of medication for depression is the selective serotonin reuptake inhibitor (SSRI), like fluoxetine (Prozac).  By blocking reuptake, these drugs slow the rate at which serotonin is cleared from the synapse; serotonin lingers longer, which increases the frequency at which “happy feelings” action potentials are fired.

Another chemical believed to be involved with “happy feelings” is dopamine.  Dopamine is involved in movement, emotional response, and the ability to feel pleasure and pain.  More dopamine means stronger feelings; less dopamine, weaker feelings.  L-DOPA, a precursor of dopamine, is converted by the brain into dopamine; it’s frequently used in the treatment of Parkinson’s disease, which reduces available dopamine through the death of dopamine-generating cells in the brain.  A class of drugs called amphetamines (Adderall, for example) also increases the release of dopamine and inhibits its reuptake — but with some nasty side effects.


The third one is norepinephrine, which acts as both a hormone and a neurotransmitter.  Among other things, norepinephrine affects the heart; more of it triggers stronger, more frequent contractions — a faster pulse.  It interacts directly with the body’s fight-or-flight response, as well as our reward system.  Related:  feelings of love and lust increase norepinephrine.  The excitement you feel when interacting with the object of your affection is directly related; effects of norepinephrine include arousal (go ahead, get your yuks out now), as well as focused attention, increased energy, mania, and elation.

We’re still learning about neurochemistry — our ability to study the brain is a very, very new thing, and there are surely plenty of triggers I’ve glossed over or that science hasn’t discovered yet — but as a vast oversimplification, you can think of the magnitude of happiness as the number of action potentials triggered by serotonin, times the number triggered by dopamine, times the number triggered by norepinephrine.

And who said math made people sad?

Episode 4: Come On Party People, Wave Your Waves In The Air

Anonymous asks:  What is happening when you can visibly see heat during summertime? For example, on top of cars.


On a hot summer day, you’ve almost certainly seen these reflective “wet spots” on the roadway as you drive.  As any experienced driver knows, these aren’t (necessarily!) water, but heat shimmer, a kind of inferior mirage.  A mirage is an optical phenomenon in which light waves are bent to produce a displaced image.  An “inferior mirage” just means the displaced image appears below the real object — in this case, an image of the sky appears down on the road!

Hot surfaces heat the air around them through a process called convection — more properly, convective heat transfer.  Natural buoyancy causes the warm air to rise and colder air to sink.  It’s this convection that causes your hand to get hot and burn if you hold it over a pan on the stove for too long, even if you don’t touch it.  It’s also the reason that Mario in the image below is freaking dead.  Ignoring for a moment the toxic fumes the lava puts out, the air around that pool (and on top of those inexplicably floating tortoise shells) is about 1000 °F.

[Image:  Mario above a pool of lava.]

So the light waves are bent.  Turns out that the reason behind this is that temperature affects the refractive index of air.  Refraction is the bending of a wave — light, sound, whatever — due to a change in medium.  You’ve almost certainly seen refraction in action if you put a pencil or a wooden spoon in a cup of water, and noticed how it appears bent.

[Image:  a pencil in a glass of water, appearing bent at the surface; X marks the pencil’s true end point, Y its apparent one.]

In the image above, the light waves from the pencil’s true end point, X, are bent as they cross the surface of the water, and so appear to originate at point Y.  The refractive index of water is about 1.33, and that of air about 1.0003.  These indices factor into a relation called Snell’s Law, which states that the ratio of the sines of the angles of incidence and refraction is equal to the reciprocal of the ratio of the indices of refraction.  That is to say, the bigger the difference between the two indices, the more sharply the light rays are bent.  The refractive index of air as a function of temperature and pressure is given by this equation:

n(T, P) = 1 + (n0 − 1) · (P/P0) · (T0/T), with n0 ≈ 1.0003 the index at standard conditions,

where P0 is the standard pressure of 1 atm, and T0 is the standard temperature of 300 K (about 27 °C, or about 80 °F).  So, as the air gets hotter, its refractive index decreases.
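
If you want to play with the numbers, here’s a quick Python sketch.  It uses the scaling above for the index of air, and Snell’s law for the bending.  The two-layer setup (a sheet of cool air over a sheet of hot air) is a cartoon of the real, continuous gradient, and the particular temperatures and angles are just ones I picked:

```python
import math

N0, T0 = 1.0003, 300.0  # index of air and temperature (K) at standard conditions

def n_air(temp_k, pressure_atm=1.0):
    """Refractive index of air, from the scaling above (an approximation)."""
    return 1.0 + (N0 - 1.0) * pressure_atm * (T0 / temp_k)

n_cool = n_air(283.15)  # ~50 F air up at eye level
n_hot = n_air(322.0)    # ~120 F air right above the asphalt
print(n_cool, n_hot)    # both barely above 1; the difference is tiny

# Snell's law: n1*sin(theta1) = n2*sin(theta2), angles measured from vertical.
theta1 = math.radians(89.0)  # a near-grazing ray headed down into the hot layer
sin2 = math.sin(theta1) * n_cool / n_hot
print(math.degrees(math.asin(sin2)))  # ~89.14 deg: bent even closer to horizontal

# At a still shallower angle, sin2 would exceed 1: no refracted ray exists,
# and the light reflects off the hot layer instead. That reflection is the
# "mirror" that shows you the sky.
theta1 = math.radians(89.9)
print(math.sin(theta1) * n_cool / n_hot > 1)  # True
```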

So that’s great, you’re telling me, but why are the refracted images so foggy?

[Image:  heat shimmer behind a jet engine.]

Remember I mentioned convection earlier?  The air doesn’t heat evenly.  If you had a plane of air at, let’s say, 80 °F sitting under a plane of air at 50 °F, you’d see a clear, consistent bend.  (In this case, the ratio of the sines of the angles would be about 1.00002; the bend is tiny, which is why you only notice it at near-grazing angles.)  But the thermal energy instead propagates through both the normal buoyant effects of warm air and any forced convection, like the jet engine in the image above, so the air’s index of refraction isn’t consistent.  Hence the foggy appearance you get from heat shimmer:  you’re being bombarded by light rays bent at all manner of angles.

Incidentally, the reason that heat mirages on the surface of a road look to the human eye like pools of water or oil is that water does the same thing.  Not surprisingly, water has a refractive index different from that of air, so a pool reflects an image of the sky the same way heat shimmer does.  This can be dangerous — it may be hard to tell what’s a heat mirage and what’s actually a pool of water until you get too close.  Be careful when driving on a hot summer day!

Episode 3: Isn’t It Ironic?

lynneskysong asks:  Why do you think the word “ironic” is misused the majority of the time by people? Also, since language is also changing, do you think there will come a time when the new “official” definition of the word will change?

This is why.

The prosecution rests.





You’re still here?  Okay, fine.

Via Wikipedia:  “Irony is a rhetorical device, literary technique, or situation in which there is an incongruity between the literal and the implied meaning.”  So, for example, if a child at a school playground were wearing garish orange tennis shoes, and another walked up to him and said, “Wow, those shoes are amazing!”, the latter child’s statement would be ironic.  (This presumes that the child really thinks the contrary — that the shoes are ugly and terrible.  If he actually likes the orange shoes, it wouldn’t be.)

A brief aside:  the previous example would apply just as well to “sarcasm”, which in most forms is a special case of irony:  “mocking, contemptuous, or ironic language intended to convey scorn or insult”.  This is a pretty important point:  sarcasm, by definition, is insulting to its subject.  I’ve seen a lot of people use the term “sarcasm” when they mean “irony”.

Not surprisingly, nothing in the lyrics to the song “Ironic” has any sort of ironic bent.  (Though I confess that calling the song “Unfortunate Happenstance” probably would’ve reduced its cultural impact somewhat.)  Ms. Morissette did have something clever to say on the topic, though:

"For me the great debate on whether what I was saying in ‘Ironic’ was ironic wasn’t a traumatic debate. I’d always embraced the fact that every once in a while I’d be the malapropism queen. And when Glen and I were writing it, we definitely were not doggedly making sure that everything was technically ironic."

The circumstances in the song, while not ironic in the traditional sense, do drift into the territory of situational irony, specifically “cosmic irony”, or “irony of fate”.  Situational irony is a more modern use of the term “irony”, and it does in fact relate to the song more directly:  “an outcome that turns out to be very different from what was expected, [or] the difference between what is expected to happen and what actually does”.  The citation for this definition comes from “Dictionary.com’s 21st Century Lexicon”, suggesting that this sense of “irony” is a recent addition to the term.


Above:  A classic example of “irony of fate”:  Oedipus Rex, or Oedipus the King.  Oedipus’s father, King Laius, hears a prophecy that his son Oedipus will kill him and sleep with his wife.  Through repeated happenstance, and through both Laius’s and Oedipus’s attempts to avoid the prophecy, they instead make it come true.

And now we reach the rarefied heights of linguistics.  Linguistic prescription is the practice of championing one way of speaking a language over others, potentially demeaning the rest as incorrect, improper, or ugly.  There’s a lot of debate over its validity compared to linguistic description, which is the study of the patterns of speech of a language.  All language research is (or at least begins as) descriptive — we analyze the way a language is spoken in the present, or was in the past.  Prescription then solidifies the fruits of descriptive research to define what is “correct”.

These approaches subsequently give way to a philosophical divide:  “prescriptivism” seeks to codify language, while “descriptivism” holds that the language, as spoken, is what it is.  More or less:  do we codify the language, or do we allow it to evolve?

Obviously, this is a false dichotomy.  Letting a language go wild and never standardizing any aspect of it would, over the course of years, produce a lot of local divergence; we wouldn’t be able to understand each other, and for that matter, even understanding our own writing after a few hundred or a thousand years may well be difficult.  What most people call Old English, for example, is actually Middle English.  Old English, from the neighborhood of 700-1000 CE, looks like this:

[Image:  a passage of Old English text.]

On the other hand, language has such a central place in cultural identity and norms that it’s not a big stretch to suggest that freezing a language in place and refusing to let it change has a chilling effect on the evolution of culture.  To a certain degree, the fact that we write at all slows the evolution of English; phrases would pass in and out of favor within a few years even as recently as the 17th century, while today clichés hang around for decades.  George Orwell explored prescriptivism taken to its extreme in Nineteen Eighty-Four with Newspeak, the notion being that if we fix language so that there is no word for an idea, that idea will cease to exist.  (And that, of course, is doubleplusungood.)

[Image:  Big Brother is watching you.]

A discussion in one of my high school English classes touched on this question.  The teacher asked whether removing the word for “freedom” would remove the concept of freedom from cultural understanding.  One of the students replied (paraphrased):  “I don’t think so — I suspect they’d just invent a new word.  ‘You can take me land, but you’ll never take me mitzelpix!’”  Humor aside, the weakness in the student’s argument is that an oppressive culture like Oceania’s would stamp out (on a person’s face, forever) the kind of linguistic innovation such a change would represent, unless it was done in secret.  In which case, culture marches on.

Even today, there are official bodies whose goal is the preservation of a “pure” form of language.  In particular:  French.  The Toubon Law, proposed by then French Minister of Culture Jacques Toubon, requires that in France:

  • “Any document that contains obligations for the employee or provisions whose knowledge is necessary for the performance of one’s work must be written in French.”  Manuals and software must therefore be translated into French to be used by French companies, for example.
  • Commercial advertisements and public announcements must be in French.
  • Product labels must be intelligible in French.
  • Public legal persons must produce publications in French.
  • Schools that do not use French as the primary language of instruction are ineligible for government funding.

It gets better.  The Commission Générale de Terminologie et de Néologie (General Commission for Terminology and Neology) in France made headlines recently for rejecting the use of the word “hashtag” in the French language, prescribing the term mot-dièse (“sharp/hash word”) in its place.  At a meeting about the interaction of technology and the increasing prevalence of English loanwords in French, a spokesperson for the Office Québécois de la Langue Française (Quebec Office for the French Language) said:  “Borrowing too many words from English opens the door to a mishmash of French and English[. …]  This can have an impact on French word formation, phonetics and grammar, not just terminology.”

The French language (or, as some purists would have it, Québécois) has a certain “high status” reputation, one that it seems the French and Quebec governments have no intention of relinquishing by allowing the language to evolve and intermix with “lower class” languages such as English, despite the fact that English has become the lingua franca of international commerce.  (Lingua franca, by the way, translates roughly to “Frankish language”, amusingly enough, and it means a language systematically used to enable interaction between people who don’t share a mother tongue.)  In blocking terms that are technically current, the French language risks holding its own technical culture back.

Getting back to the initial question on the word “ironic”, there really are two schools of thought.  The first, the prescriptivist answer, is that most people do indeed misuse the word, and the reason is an invasion of malapropism that threatens to undermine the word’s “true” meaning.  The second, the descriptivist answer, is that the meaning of “ironic” has already changed, and we’re at this point powerless to revert it, unless we take active steps to drift the word’s meaning again.  But then, in shifting the meaning again, would we be taking it away from what is now its “true” meaning?

Episode 2: The Bloat of Computers

"Gus’s male" asks on Facebook:  What makes Java such a hideously bloated resource hound? Seriously—Minecraft has all the graphical fidelity of an upscaled NES game—so apart from the number of objects it’s trying to load, what the hell causes it to want so many system resources just for a glorified game of Legos?

Not surprisingly, this is a complicated question. Yes, Minecraft does do a very good job of displaying a full 3D world in a manner reminiscent of NES-level graphics. But doing that, well, is harder than you think. And the level of system resources demanded by Java, Minecraft, and, truth be told, just about every computer program out there today, is a question of tradeoffs.


Computer programming has changed a lot since its infancy in the 1950s, and even since the 1990s.  Moore’s Law is an observation, named after Intel co-founder Gordon Moore, which states that roughly every eighteen months to two years, the number of transistors on an integrated circuit doubles.  This more or less directly implies a proportional increase in computing power.  (Incidentally, the trend is slowing down; we’re hitting practical physical limits, and we’re reaching a point where computing power only doubles once every three years.)

The space-time tradeoff goes back to Martin Hellman, who proposed it in 1980 in relation to cryptanalysis.  In most applications and algorithms in computer science, you can greatly speed up computation if you’re willing to have the data you’re working with take up more space (memory, disk space, or whatever store you’re using).  A good example is opening a file from a compressed archive, like a ZIP — it can be done, but first the data must be decompressed, which is slow.  You lose time in the translation, but the space the data takes up is much smaller.  But we’re talking games here, right?
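
Before we get back to games, here’s the tradeoff in miniature.  This is a generic Python sketch, nothing to do with Java or Minecraft; caching a function’s past results spends memory to avoid recomputation:

```python
import functools
import time

def fib_slow(n):
    """Deliberately naive Fibonacci: recomputes everything, every time."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@functools.lru_cache(maxsize=None)  # trade space (a cache) for time
def fib_fast(n):
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

for f in (fib_slow, fib_fast):
    start = time.perf_counter()
    f(32)
    print(f.__name__, time.perf_counter() - start)
# fib_slow takes on the order of a second; fib_fast, microseconds.
```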


.kkrieger is a simple first-person shooter game, with full graphics and sound, that is stored in a 95-kilobyte executable.  Yes, kilobytes.  How does it pull this off?  The images, sounds, and just about everything else are generated algorithmically.  The code contains rules for generating the graphics on the fly, so you can get marvelous images in very little space.  I have played this game, and frankly, it’s not much of a game.  What are you going to store in 95 KB?  It’s also slow, because all of the assets have to be generated just in time to display them, and it’s buggy:  the game locks each door behind you as you walk through it, and sometimes it forgets which enemies are dead and which items you’ve picked up.  But the concept is there; you can do amazing things with little space.
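
The underlying trick is easy to demonstrate, though.  The toy below (plain Python, nothing like .kkrieger’s actual generators) expands a few bytes of seed into an arbitrarily large ASCII “texture”, using the simple rule that neighboring cells keep similar shades:

```python
import random

def make_texture(seed, width=16, height=16):
    """Generate a deterministic texture: same seed, same output, every time."""
    rng = random.Random(seed)
    shades = " .:-=+*#%@"
    value = rng.randrange(len(shades))
    rows = []
    for _ in range(height):
        row = []
        for _ in range(width):
            value = (value + rng.choice((-1, 0, 1))) % len(shades)
            row.append(shades[value])  # drift one shade at most per cell
        rows.append("".join(row))
    return "\n".join(rows)

print(make_texture(seed=42))  # arbitrary amounts of "art" from a handful of bytes
```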

Having examined the kind of code required by this and other games that generate their content programmatically, I’ll tell you that it is an enormous pain to work like this.  Humans generally don’t work well with the kind of bit-twiddling required to develop a game under these constraints.  If you’re wondering how an NES, with its two kilobytes of onboard RAM, can run games at all (and, for that matter, why it took months or years to develop games that an amateur can now build in his spare time in days), it’s exactly that:  to get a functional game to work in such a constrained environment, you need to be a freaking super-genius.

[Image:  Mega Man: F***ing Genius!!!]

Not knowing much about the language in which NES games were programmed, I’ll head in a direction I’m slightly more familiar with:  PC gaming.  Most PC games — and in fact most software in general — have been written in C++ for decades at this point.  C++, originally “C with Classes”, is an evolution of the old C programming language.  You end up needing to interface with low-level libraries of code that directly manipulate the memory and features of your graphics card, system RAM, and so on.  In fact, early on this wasn’t even enough; some code would need to be hand-optimized in assembly language, to get the program to run even faster than the compiler could make it.  (And compilers generally don’t mess around.)


Assembly language is one step removed from the literal ones-and-zeroes machine language that your processor uses to operate.  Oh, and by the way, the assembly language of each type of processor architecture is different.  

Punchline:  this sucks.  No normal person could program a game like Minecraft, solo, in any reasonable length of time if they had to work in low-level languages.  Which is why they don’t.  Remember the “objects” the original question mentioned?  Abstraction is the bread and butter of the modern software engineer.

Abstraction, in computer science, means two different things:  first, the abstraction of real-world concepts into a data model; second, the abstraction of methods and functionality in a computer program.  These two “kinds” of abstraction work in tandem to make a computer program, such as a game, much easier to reason about.

A “class”, in computer parlance, is a package of code, variables, and methods that represents some notional “object”, ideally a literal component of a data model.  This takes advantage of the first of the two types of abstraction:  the abstraction of a data model.

In Gödel, Escher, Bach, Douglas Hofstadter illustrated six levels of abstraction for, say, a newspaper:

(1) a publication

(2) a newspaper

(3) The San Francisco Chronicle

(4) the May 18 edition of The San Francisco Chronicle

(5) my copy of the May 18 edition of The San Francisco Chronicle

(6) my copy of the May 18 edition of The San Francisco Chronicle as it was when I first picked it up (as contrasted with my copy as it was a few days later: in my fireplace, burning)

In a similar vein, you might have a class called “Enemy”, which encapsulates code that all enemy creatures in your game need; “inheriting” from that, you might have the subtype “Walker”, then individual enemy types like “Goomba” or “Koopa Troopa”, and, descending still further, individual instances of those enemies, with their own positioning, health, and so forth.
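
Here’s that hierarchy as a Python sketch (the names come from the example above, not from any real game’s source):

```python
class Enemy:
    """Anything the player can fight: has a position and hit points."""
    def __init__(self, x, y, health):
        self.x, self.y, self.health = x, y, health

    def take_damage(self, amount):
        self.health -= amount

class Walker(Enemy):
    """An enemy that patrols along the ground."""
    def update(self):
        self.x += 1  # shuffle forward one unit per tick

class Goomba(Walker):
    def __init__(self, x, y):
        super().__init__(x, y, health=1)  # one stomp and it's gone

class KoopaTroopa(Walker):
    def __init__(self, x, y):
        super().__init__(x, y, health=2)  # retreats into its shell first

# Individual instances each carry their own position and health:
enemies = [Goomba(10, 0), Goomba(40, 0), KoopaTroopa(25, 0)]
```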

Abstracting the functioning of the game code is a separate but related matter.  The “game loop”, as it’s called, creates instances of data-model objects and calls methods, packages of code which do further work.  A real-world analogue is driving:  “accelerate” turns into putting your foot on the gas, which works through a series of linkages that make the fuel-air mixture going into the engine richer, which turns the motor faster, which turns the wheels faster, which (usually) makes the car accelerate.  Each of these levels is an abstraction, and in the usual case the driver needs to worry about none of them in order to drive successfully.
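
A minimal sketch of such a loop, assuming a hypothetical world object that knows how to read input, update its entities, and draw itself:

```python
import time

def game_loop(world, target_fps=60):
    """Run the classic input/update/render cycle at a fixed frame rate."""
    frame = 1.0 / target_fps
    while not world.done:
        start = time.perf_counter()
        world.handle_input()      # abstraction: "what did the player do?"
        for entity in world.entities:
            entity.update()       # abstraction: "advance this object one tick"
        world.render()            # abstraction: "draw the current state"
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, frame - elapsed))  # hold a steady frame rate
```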

This abstraction, provided not just by the code we write but also by the programming language and environment, makes programming easier, but it also incurs a cost in space and time.  The good news?  Moore’s Law came into play, and we have loads of space and power to spare.  This is also why programs tend to be a lot more bloated these days.  (For example, Word for Windows 2.0 required 4 MB of RAM and 8 MB of hard disk space.  Word 2013, a.k.a. Word 15, requires 1 or 2 GB of RAM, depending on your system architecture, as well as 3 GB of hard disk space.)

Java, however, complicates the question further.  Remember I mentioned compilers earlier?  Java isn’t compiled straight down to machine code.  It runs on a virtual machine — a sort of intermediary between the Java code and the architecture of the actual machine.  You have one Java virtual machine implementation for each system, and code written in Java will then (theoretically) run on any architecture that has a Java VM.  The way this works is that Java is a semi-interpreted language:  it’s compiled down into “bytecode”, which is what you feed into the Java VM, which finishes the translation for whatever architecture you’re dealing with (Windows 32-bit or 64-bit, Unix, Mac, Android, whatever), and then you have a working program.
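
You can peek at bytecode yourself.  On the Java side, the JDK’s javap -c disassembles a compiled class; and since Python happens to work the same way (source compiled to bytecode, executed by a VM), its standard dis module makes for a quick demonstration:

```python
import dis

# Python, like Java, compiles source into bytecode for a virtual machine.
def add(a, b):
    return a + b

dis.dis(add)
# Prints instructions along the lines of:
#   LOAD_FAST    a
#   LOAD_FAST    b
#   BINARY_OP    + (BINARY_ADD on older Pythons)
#   RETURN_VALUE
# It's this intermediate form, not x86 or ARM machine code, that the VM
# actually executes.
```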

The good news is that, since the details of the architecture you’re working with are (mostly) abstracted away, writing programs is easier.  (In fact, this leads into the Rule of Least Power:  given a program and its performance requirements, you should use the least powerful language that gets the job done.  By “power”, here, I mean how low-level a language is:  the lower the level, the faster it runs, but, by necessity, the harder it is to use.  As an aside, low-level languages also make it easier to make mistakes, because you’re not as far removed from the system.)

The bad news is that you pay a penalty in speed.

But wait, I'm not finished yet!

Minecraft just looks like an 8-bit game.  Notch and the Mojang team have done a lot of work to get the game looking like a pop-up-book rendition of an NES title.  It’s “programmer art” — trying to stay on the abstract (as opposed to photorealistic) side of the uncanny valley.  The uncanny valley is that space where things look just real enough to be wrong somehow:

[Image:  the uncanny valley curve.]

But the world of Minecraft is a lot of things that most NES worlds aren’t:

1) Fully 3D.  Obvious.

2) Actually very high resolution, if you examine the borders of objects.  The blocky visuals are a stylistic choice.

3) Nearly 100% persistent.  The state of blocks is preserved across the entire world, and a surprisingly large amount of the world is kept in memory.  (“Chunks” of the world that are sufficiently far away from your current position are unloaded and saved to disk; a sketch of that follows below.)  If you mine, say, a bunny shape out of the wall, walk sixteen miles away, and come back, the bunny will still be there.  (Dropped items only persist for five minutes.)  Remember how, in an NES game, when you walked into a new area and came back, only a handful of changes were persisted?  (And that’s if you were lucky — sometimes enemies would respawn if you walked a screen width away and came back, even in the same area!)  Those changes were being explicitly saved in flags.  Not so in Minecraft.  (The original Deus Ex persisted the state of the areas you visited as well — that’s why its saves, at 10-20 MB, were unusually large for its era.)
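
Here’s roughly what that chunk bookkeeping looks like, as a hedged Python sketch.  Every name and number is invented (Minecraft itself is written in Java, and its real logic is far more involved); the point is the pattern of keeping nearby chunks in memory and swapping far ones out to disk:

```python
CHUNK = 16       # the world is divided into 16x16-block columns ("chunks")
VIEW_RADIUS = 8  # keep chunks within 8 chunks of the player loaded

loaded = {}      # (chunk_x, chunk_z) -> that chunk's block data

def chunk_of(x, z):
    return (x // CHUNK, z // CHUNK)

def update_loaded(player_x, player_z, load_from_disk, save_to_disk):
    """Load chunks near the player; save and evict the ones far away."""
    px, pz = chunk_of(player_x, player_z)
    wanted = {(cx, cz)
              for cx in range(px - VIEW_RADIUS, px + VIEW_RADIUS + 1)
              for cz in range(pz - VIEW_RADIUS, pz + VIEW_RADIUS + 1)}
    for key in set(loaded) - wanted:    # too far away: persist, then free RAM
        save_to_disk(key, loaded.pop(key))
    for key in wanted - set(loaded):    # newly nearby: read back from disk
        loaded[key] = load_from_disk(key)
```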

There’s a lot going on.  The choices that Notch, and subsequently his team, made were designed to build a game that delivered on its promises and used the technical capability available, while still maintaining a level of abstraction sufficient to make the game practical as a solo project.  He might have made different choices had he known how the game was going to explode in popularity — but that’s another story.

Follow-up: The 1000-mph Man

I realize that my explanation for the effects of air drag on a man traveling at 1000 mph was kind of a cop-out.  There’s a reason for this:

F_d = (1/2) · ρ · v² · C_d · A

The variables in there are the density of the fluid (ρ), the square of the object’s velocity (v²), the reference area (A), and the drag coefficient of the object (C_d).  The complicated part is the drag coefficient, which depends entirely on the object’s shape.

The drag coefficient of an upright man is in the neighborhood of 1 to 1.3.  Air at sea level has a density of 1.225 kg/m^3.  An adult man’s body surface area is about 1.9 m^2; assuming half of that area faces the oncoming air, we can take the man’s reference area to be about 0.95 m^2.

That gives us an applied drag force at 1000 mph of 133,727 newtons (N), taking C_d = 1.15, the middle of that range.  The force that gravity applies depends on the mass of the object; remember the acceleration due to gravity, 9.81 m/s^2?  We multiply that by the mass of the man.  Let’s say our man is 180 lb, or 82 kg.  One “g” of force, therefore, would be right around 800 N.  And 133,727 N / 800 N ≈ 167 g.  Therefore, our guy is totally wrecked, right?

Not exactly.  Don’t forget that drag isn’t the only force operating on the man!  The acceleration the man experiences is determined by the sum of all the forces acting on him.  The man has to be experiencing some measure of thrust behind him to be travelling at 1000 mph; presuming the man’s velocity holds steady, there’s going to be 133,727 N of thrust behind him as well, for a net force of zero.  (Of course, the man is standing on the ground and experiencing gravity, too, but the force of gravity and the “normal force” — normal in the sense of a vector perpendicular to the surface — cancel out as well, so we’re still looking at zero net force.)
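
If you’d like to check the arithmetic, here it is in a few lines of Python.  The drag coefficient of 1.15 is my assumption (the midpoint of the 1-to-1.3 range above); everything else comes straight from the numbers in the text:

```python
rho = 1.225          # kg/m^3, air density at sea level
v = 1000 * 0.44704   # 1000 mph converted to m/s
cd = 1.15            # assumed drag coefficient for an upright man
area = 0.95          # m^2, reference area from above
mass = 82            # kg (our 180 lb man)
g = 9.81             # m/s^2

drag = 0.5 * rho * v**2 * cd * area
print(round(drag))               # ~133,727 N
print(round(drag / (mass * g)))  # ~166 g (the post's ~167 used a rounded 800 N)
```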

But why are we sitting around, crunching numbers, anyway, when we have a real life example to look at?


Felix Baumgartner, the man who, on October 14 of last year, jumped from a height of 39 kilometers (24 miles, or about 130,000 feet), reached a maximum velocity of an estimated 843.6 mph, or about Mach 1.25!  He walked away without injury, thanks to the extremely controlled nature of his fall; had he gone into what’s called a “flat spin”, he could’ve been badly hurt or killed by injuries to his brain.  (We saw that with John Stapp last post.)

So, in sum:  a human most certainly can handle travelling at speeds approaching 1000 mph — he’d just better be really, really careful about it!

(Hat tip to my coworker Mike, for reminding me about Felix Baumgartner!)

Asks or full posts?

I realized after I posted my first response (thanks, ecshadow) that asks aren’t rebloggable.  Apparently they’re also a little touchy in general.  To those of you wiser in tumblr-fu than me, would it make more sense to just copy the text of the questions and do a full post to reply, or stick with the ask/answer system?

ecshadow Asked:
I'm going to try and be as sensible as I can about this, and I just sort of had it pop into my head. Say we were able to break the land speed record again with current technology. I would think it's a stretch to say that the body would be able to withstand a specific amount of force. If we were able to hit up 1000mph or more, do you think the body could withstand it?

All right, let’s get this started.  So there are a few assumptions we have to make here:

Are we in a land vehicle, or are we just flying through the air?  That is, how exposed is the body to the force of air drag/friction?

How fast are we accelerating?  This turns out to be the biggest question.  We’ll start by ignoring the effects of drag, just to simplify the math a bit.  (Air will come into play in a moment.)  It turns out that, barring any external factors, a body (and I use the term here in the physics sense of “a physical thing”) only experiences force when it accelerates.  Acceleration is a change in velocity; remember that velocity is a vector quantity, with both a magnitude and a direction, so acceleration can take the form of a change in magnitude, direction, or both.  If you’ve seen the parlor trick where a guy puts food and plates on a dinner tray suspended by a string or rope, and then whirls it around in a circle, and nothing falls off, you’re seeing the effects of an acceleration that changes the direction, but not the magnitude, of a body’s velocity.  That’s commonly called a centripetal force.  (Not to be confused with centrifugal force, which, depending on your reference frame, is either a fictitious force or a reaction to a centripetal force.)

Newton’s second law of motion states that the acceleration of a body is proportional to the force applied to it, and inversely proportional to the body’s mass.  That is, the faster we accelerate the body, the more force it experiences; if you took a gradual jaunt from 0 to 1000 mph over, say, an hour, you wouldn’t feel much.  But if you slammed on the accelerator and did the same over ten seconds, you’d probably need to visit a chiropractor.

Acceleration is typically measured in “G’s” — technically that should be a small g, the acceleration due to gravity at Earth’s sea level, approximately 9.81 m/s^2.  (“Big G”, the universal gravitational constant, is another matter entirely.)
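
To put numbers on that chiropractor remark, here’s a quick sketch (the time spans are just illustrative picks):

```python
MPH = 0.44704    # m/s per mph
g = 9.81         # m/s^2, one "small g"
dv = 1000 * MPH  # the change in speed: 0 to 1000 mph

for seconds in (3600, 10, 0.6):
    a = dv / seconds  # constant acceleration needed to get there in time
    print(f"{seconds} s: {a / g:.2f} g")
# An hour-long ramp is about a hundredth of a g; ten seconds is ~4.6 g;
# the 0.6-second case mentioned below works out to ~76 g, right at the edge.
```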

You remember the old vomit comet?

[Image:  the human decelerator.]

This is the human decelerator at Edwards Air Force Base in California.  This device was used in tests to determine how rapid a deceleration (that is, an acceleration that reduces the magnitude of velocity) a human could withstand.  The rider, John Stapp, suffered accelerations as high as 46.2 g, sustaining injuries that left him with broken limbs, a detached retina, and intermittent eyesight problems that would plague him forever, due to permanently burst blood vessels in his eyes.  It turns out they were pushing the limit — the “50th percentile” fatality level appears to be somewhere between 50 and 75 g’s, depending on the individual.  At 75 g’s, that corresponds to accelerating from zero to 1000 mph (or 1000 mph to zero) in 0.6 seconds.

Meanwhile, the maximum acceleration a person can take laterally (that is, side to side) without organ damage is in the neighborhood of 14 g’s.

Remember we mentioned we were neglecting air a moment ago?  Let’s bring it back in.  The reason the land speed record is so hard to break is that the air pushes back on the vehicle as it travels; the faster you go, the more drag you experience.  When drag and thrust equal out, you’ve hit your maximum speed.  

Put generally:  how aerodynamic is your vehicle, and how much thrust can you bring to the equation?  (And for that matter, can you achieve that thrust without generating lift, flying off the ground, and crashing back into the earth at ludicrous speeds?)

[Image:  She's gone to plaid!]

This is kind of a trick question, because we’d have to presume that if researchers built a vehicle that could achieve 1000 mph on land, it would be built with an insanely resilient safety system to keep the driver safe.  Speaking of which:

[Image:  the Thrust SSC.]

This is the Thrust SSC, the current land speed record holder, clocking in at 763 mph (Mach 1.02).

This is the cockpit.

And when it stops, it deploys parachutes to decelerate.  (Though in fairness, so do drag racers.)  It’s a real piece of work.


Let’s get started!

Your mind is like your body — if you don’t feed it, it atrophies.  I’m a software engineer by trade, and a fair-weather game designer by hobby, but I want to make sure I keep my mind open to other knowledge.  I don’t want to get stuck in a rut.

Here are the ground rules:

  1. Ask me a question.  This is an educational thing; think questions like “What’s the difference between Bayesian and frequentist statistics?” or “What’s the story behind Chicken Marengo?”
  2. I’m looking for a moderate scope of time and effort:  something that I can’t trivially answer in a few seconds (make me do some research!), but not something that will take me dozens of hours to answer.  (“Learn Python, and tell us about your experience.”)
  3. Subjective questions are fine — it’d allow me to compare and contrast different opinions.  Even controversial questions are okay, but don’t get too cross if I don’t take sides.
  4. While I’ll try to answer all questions asked of me, I reserve the right to decline a question, because it doesn’t follow the ground rules, or for any other reason I choose.

So let’s get started — what do you want to know?  Click the “Ask a question” link up top.