Engineering, Management, Technology Consulting

  • Find the full article at LiveScience: 11 Obvious Science Findings of 2011, by Stephanie Pappas, December 31, 2011.

    In science, it’s not enough to think something is so. Researchers must show that what we believe to be true is in fact true, proven through statistically significant and reproducible results. Questioning assumptions is, after all, what science is about.

    I know I am glad we have data now on these:

    1. Unsafe sex is more likely after drinking

    Drinking too much alcohol can impair decision-making.

    2. Men appear confident by suppressing fear, pain and empathy

    3. Smoking pot and driving isn’t safe

    4. Pigs love mud

    5. Fashion magazines glorify youth

    6. People with generous partners have happy marriages

    7. Parents don’t think their kids are doing drugs

    Parents are in denial about their own children’s bad habits.

    8. People aren’t doing anything in particular on the Internet

    9. Restricting driver’s licenses decreases teen fatalities

    10. Most shoppers ignore nutrition labels

    11. Presidents outlive their contemporaries

    Money and knowledge tend to buy health and longevity.

  • By Rebecca R. Ruiz

    Students who major in engineering and the physical sciences can expect to spend more hours in the library than those who take a concentration of courses in business and social sciences, according to a national survey of more than 400,000 undergraduates at nearly 700 colleges and universities.

    The annual survey, known as the National Survey of Student Engagement, is being released Thursday by the Indiana University Center for Postsecondary Research and offers particular insight this year into university students’ study habits. (And, for our purposes here on The Choice, it also provides something of a reality check for high school seniors who will soon be bound for college.)

    The average college student, the researchers reported, studied for 15 hours each week. Engineering students fell on the higher end of the spectrum — devoting roughly 19 hours to class preparation — while students with majors in business and social sciences were on the lower end, spending about 14 hours preparing for class. (Meanwhile, the survey found, students of business were more likely to hold a job during the school year.)

    Moderately demanding majors included the biological sciences and the arts and humanities (each requiring about 17 hours of study) and education (about 15 hours).

    Although they tended to study longer, on average, than their peers in other disciplines, engineers were the most likely to head to class without having completed all of their assignments.

    Even as seniors, engineers in the N.S.S.E. sample saw little letup in their workload; they were twice as likely as business majors to spend over 20 hours a week on coursework.

    Eighty-five percent of all students surveyed — engineers and business students alike — reported that they took careful notes during class, but only 65 percent said they wound up reviewing those notes afterward.

    Students who studied education reported the highest rates of learning (86 percent reported significant gains), while engineering and business students also rated their academic growth highly (80 percent reported what they considered to be great gains). In comparison, about 65 percent of those who studied biological, physical and social sciences or arts and humanities described their learning as significant.

    Separately, the survey found that students whose parents had attended college spent more time preparing for class than those who were the first in their families to attend school. At the same time, those first-generation students were more likely to employ “effective learning strategies,” including studying in groups and meeting with professors.

    The survey also asked about involvement in Greek life, concluding that while membership in a fraternity or sorority was good for personal development and community engagement, its benefits might have been overshadowed by “increased risky behaviors and smaller cognitive gains.”

    Notably, students involved in Greek life reported studying, socializing and working at rates comparable to non-Greek students, suggesting that their extracurricular commitments did not displace any other activities.

    Meanwhile, transfer students, the survey found, were more likely to work off campus and care for dependents, decreasing their sense of connection to the college community. They were, on average, older and more racially diverse than broader undergraduate populations.

  • Every new invention changes the world — in ways both intentional and unexpected. Historian Edward Tenner tells stories that illustrate the under-appreciated gap between our ability to innovate and our ability to foresee the consequences.

  • MoooJvM’s Channel – YouTube

    Sadly, as our daily lives become more and more digital, some things fall by the wayside, replaced by newer, “better” devices.
    Let us not forget those fallen appliances, tools and gadgets; relive those bygone times by visiting The Museum of Obsolete Objects. Step inside to step back in time!

  • From The New York Times, published December 5, 2011

    Neal Stephenson doesn’t like talking about how he predicted the future. Before the invention of the Web, he envisioned the Metaverse, a virtual place for people to meet.

    Mr. Stephenson is the author of a baker’s dozen novels whose dazzling range includes rocket-propelled thrillers and a vast historical epic known as the Baroque Cycle, which takes place at the dawn of modern science in the time of Newton and Leibniz.

    His reputation for prescience is well earned, even if he regards it lightly. His third novel, “Snow Crash,” was published way back in ancient 1992 and laid out many of the attributes of today’s online life, including the Metaverse, a virtual place where people meet, do business and play, presenting themselves as avatars. If you’ve ever played wildly popular multiplayer online games like World of Warcraft, or visited the virtual communities of Second Life, you can get a chill thinking about what he saw back before the popularization of the World Wide Web.

    The title of seer falls to novelists now and then: William Gibson got it in the early 1980s for works envisioning cyberspace; more recently, Gary Shteyngart picked it up for the eerie elements of a future foretold, including people’s total reliance on their “apparat” smartphones in his novel “Super Sad True Love Story.”

    Asked to join in a conversation about the future of computing grounded, in part, in that sense that many of us are living in a world he envisioned, Mr. Stephenson, 52, responded with the e-mail equivalent of a sigh.

    Politely, but with a suggestion of weariness, he wrote: “Thanks for thinking of me, but this is actually just about my least favorite interview topic of all time. I’m not sure why.”

    He added, “It seems reasonable, but I just can’t find the joy in talking about how I allegedly ‘foretold’ stuff.” He then agreed to the interview anyway, to coincide with a trip to New York.

    Over lunch at the Omni Berkshire Place hotel in Manhattan, he explained his reticence about taking credit for clairvoyance. For one thing, he said, it isn’t really true. “The way the Internet developed, in my mind, is completely different from the Metaverse in ‘Snow Crash,’ ” he said. “I can talk all day long about how wrong I got it. But there are a lot of people who feel as though that was an accurate prediction.”

    He described his reputation as a seer in terms resembling what psychologists call the focusing illusion — too much attention to some aspects of a thing and too little to others. Science fiction writers are “shotgunning ideas” through their works, he said, and people tend to recall the pellets that hit the target. “The perception of goodness is just selective,” he said.

    He suggested that the near-reverence for writerly efforts at prognostication (and Mr. Stephenson’s many fans can be disconcertingly passionate) is downright goofy in light of the fact that “there must be countless billions of dollars that go every year into really systematic and motivated future-predicting activities.”

    Of course, those systematic and well-financed efforts can fail spectacularly, as when the intelligence community failed to anticipate India’s test of nuclear bombs in 1974. Remember those Iraqi weapons of mass destruction?

    The thought receives a sardonic grin in reply.

    Regardless of whether he foresaw the future, he did foresee the world he would create in his most recent novel, “Reamde,” a 1,056-page return to the thriller genre after years of weightier fare like the Baroque Cycle.

    Its protagonist, Richard Forthrast, is a onetime marijuana smuggler who becomes the creator of a successful multiplayer online game, T’Rain. There are hackers who unleash a malicious computer program (the titular Reamde) and set in motion a swirling plot that involves Russian mobsters, British and American agents, Chinese hackers and Forthrast’s niece Zula and her hacker boyfriend. A tag line for the novel on Mr. Stephenson’s Web site reads, “When the virtual game becomes real, you win or you die.”

    Which is also a great line from George R. R. Martin’s “Game of Thrones,” but let’s let the folks in marketing thrash that one out.

    This use of the vast resources of a networked world to play games with mortal stakes is a central element of the world that Mr. Stephenson laid out in books like “Snow Crash,” and in another of his novels, “The Diamond Age: Or, a Young Lady’s Illustrated Primer.” That book, published in 1995, described in detail an interactive book that can teach lessons on many topics, entertain with games and provide virtual training for living in the real world.

    And the notion of learning real-world skills from online gaming is also a big theme of “Reamde,” in which young Chinese hackers hone their English, learn effective problem-solving, and even jury-rig a sail for a motorboat that has run out of fuel.

    When he was asked, toward the end of lunch, where he thought computing might be headed, he paused to rephrase the question. “I’ll tell you what I’d like to see happen,” he said, and began discussing what the future was supposed to have looked like, back in his 1960s childhood. He ticked off the tropes of what he called “techno-optimistic science fiction,” including flying cars and jetpacks. And then computers went from being things that filled a room to things that could fit on a desk, and the economy and industries changed. “The kinds of super-bright, hardworking geeky people who, 50 years ago, would have been building moon rockets or hydrogen bombs or what have you have ended up working in the computer industry, doing jobs that in many cases seem kind of ignominious by comparison.”

    Again, a beat. A consideration, perhaps, that he is talking about the core readership for his best sellers. No matter. He’s rolling. He presses on.

    “What I’m kind of hoping is that this is just kind of a pause, while we assimilate this gigantic new thing, ubiquitous computing and the Internet. And that at some point we’ll turn around and say, ‘Well, that was interesting — we have a whole set of new tools and capabilities that we didn’t have before the whole computer/Internet thing came along.’ ”

    He said people should say, “Now let’s get back to work doing interesting and useful things.”

    The needs of the world are great: New forms of energy, space transportation and infrastructure all need to be tackled with imagination and innovation, he said. He grew animated as he discussed his latest initiative: He is now pushing for a return to a can-do American culture that can “get big stuff done.”

    He is trying to carve out a place for science fiction to help by, well, predicting the future. Wait, that’s not quite it: He wants science fiction to help by creating the future — supplying the imagination and inspiration to the next generation of engineers and scientists, just as writers like Stanislaw Lem helped inform and usher in the space age. And as he might have done by giving engineers a vision of the Metaverse, though he doesn’t say that part out loud.

    He is doing it through Hieroglyph, a project of writers who hope to reignite the popular imagination to “develop new technologies and implement them on a heroic scale,” as he put it in a recent essay in World Policy Journal, with fiction that returns to its techno-optimistic roots.

    The tools of computing and a networked world can help to get us there, he said, but only if we use our newfound power as a means to those ends, instead of as ends in themselves.

    “We can’t Facebook our way out of the current economic status quo.”