NPR: "When Women Stopped Coding"

IBM System/360 ad, 1964

From NPR:

When Women Stopped Coding
by STEVE HENN
October 21, 2014 8:54 AM ET

Modern computer science is dominated by men. But it hasn’t always been this way.

IBM ad

A lot of computing pioneers — the people who programmed the first digital computers — were women. And for decades, the number of women studying computer science was growing faster than the number of men. But in 1984, something changed. The percentage of women in computer science flattened, and then plunged, even as the share of women in other technical and professional fields kept rising.

What happened?

We spent the last few weeks trying to answer this question, and there’s no clear, single answer.

But here’s a good starting place: The share of women in computer science started falling at roughly the same moment when personal computers started showing up in U.S. homes in significant numbers.

I.e., about the time when computing stopped being a career, it started being an adventure. Before the personal computer came along, computers were most famously associated with IBM. IBM was the most valuable company on the New York Stock Exchange for much of the 1960s and represented extreme respectability (with a certain muted sexy Mad Men glamor):

IBM System/360 ad

Part of IBM’s shtick had been that it shied away from the kind of Disruption Hype we’re used to hearing from the computer industry today. Instead, IBM presented its computers as a reassuring part of the evolution of office machines, such as its old keypunch machines and its superb electric typewriter beloved by secretaries everywhere. (Possession of an IBM Selectric was a status symbol among secretaries when I started working in offices in the 1970s.)

IBM emphasized how anti-Disruptive its computers were: it put tremendous efforts into making business computers as painless to adopt as possible for large corporations. They were immensely expensive for what they did, but IBM tried very hard to make them not scary. Not surprisingly, women had a not insignificant role in this latest version of Office Work.

Tom Watson Sr., the famous CEO of IBM, recognized that women made up a huge fraction of office workers. From an IBM promotional document:

By 1953, IBM had enacted an unequalled string of progressive workplace programs and policies, from hiring the disabled in 1914, to the arrival of professional women and equal pay for equal work in 1935, to appointing the company’s first female vice president, Ruth Leach Amonette, in 1943. Amonette was one of the first executives, male or female, to publicly state the business case for diversity. Upon her appointment she asked, rhetorically, “Doesn’t it make sense to employ people who are similar to your customers?”

A case study: In the fall of 1984, the late Dr. Gerry Eskin, the vice-chairman of the market research company where I worked, gave me his PC XT and I immediately went nuts over the potential of the PC. I worked full time on introducing PCs to the company from 1986 to mid-1988. My nemesis during this era was D., the woman in charge of the huge staff that ran the mainframe, who hated microcomputers.

Back to NPR:

These early personal computers weren’t much more than toys. You could play pong or simple shooting games, maybe do some word processing. And these toys were marketed almost entirely to men and boys.

Wozniak and Jobs, 1975

This idea that computers are for boys became a narrative. It became the story we told ourselves about the computing revolution. It helped define who geeks were and it created techie culture.

Movies like Weird Science, Revenge of the Nerds, and War Games all came out in the ’80s. And the plot summaries are almost interchangeable: awkward geek boy genius uses tech savvy to triumph over adversity and win the girl.

So, it’s like Society then engaged in a Giant Conspiracy to undermine the Rousseauan paradise of the gender-equal computing industry before The Evil Woz came along and ruined everything by inventing the personal computer.

The Woz, 2012

In reality, however, the IBM Era had been a giant conspiracy by IBM to make computers as non-disruptive as possible. Before the PC, computing was the most famously well-organized and decorous career-path in America. The PC liberated the male sex to finally do what a lot of guys had been itching to do for hundreds of thousands of years: not shower, stay up all night, and obsess over something in which human emotions and codes of polite manners played no role.

 
  1. This and this is what actually happened.

    When PCs first appeared in biz offices, the opportunity to hack on them and make big money was wide open to anybody who wanted to learn the systems and do the work.

    Employers pleaded with people to learn.

  2. It’s funny how they have to appeal to that evil, racist, and sexist decade of the 1950s as an example of how things “should” be.

  3. Anonymous says:

    It strikes me that this article would make a great ‘interesting history of how the image of computer scientists flipped over the second half of the 20th century’ article in a ‘respectable’ magazine if the gender elements were expunged.

  4. If you’ve read my links, I’ll continue to explain what, it seems to me, actually happened.

    I moved on to become a sophisticated coder and multimedia artist in the late 80s.

    So, what actually happened with the introduction of PCs was… the same old story. The Wild Wild West opened up when PCs entered the market in a big way. Anti-trust legislation broke up IBM. Systems were very different in every office. A coder wasn’t working, as he was during the IBM era, with a stagnant and unchanging system that could be taught in a classroom. He had to produce custom results on demand for a wide variety of businesses.

    When a stagnant system characterized by monopoly breaks up, the Wild Wild West becomes the reality. Only men and whores want to be in the Wild Wild West.

    • Replies: @Lagertha
    Sooo...funny! Finally, a great example that makes sense. As a mother of 3 gifted STEM boys, coders, and general renegades, who are also "worldly," social and athletic, I can relate to this idea. I have always been a renegade myself, but have only found a very small group of women in the last 35 years to share my struggles/triumphs with. Bold and driven women, whether in STEM or not (art world for me), are rare, and people should just accept that and get over themselves (it seems that people who aren't good at math/coding or have no creative talent are always seething about illusory "fairness" in the workplace). No one has dared to ever hold me back, or stifle my opinions, but I have actually had more negative comments from women who were/are intimidated by my intelligence, tenacity, and artistic/creative ability. I have been called a "ball buster" by men (not my husband!) and "intense/aloof/dismissive" by women.

    And, weirdly, I am relieved that my sons can just stay on their "trailblazing" (good idea, your "Wild West" analogy) paths without being hindered by petty envy from anyone, because they are just "faster" and ahead of the curve all the time. Had I had a daughter, she would be under the scrutiny of both men and women like I was, so I would have had to teach her the skills to ignore most people who are not as smart/driven as she is. So, yeah, I get why there are more boys in comp sci or most STEM fields, and it is not due to institutional sexism. My MIT faculty father raised me like a boy, or really, in a non-gender way because we are Scandinavian, and the Nordic culture expects every child to have the same survival skills. I have told my sons to marry, one day, someone who is smarter, has more degrees, and possibly out-earns you...can carry you up a mountain, pay the mortgage if you get laid off! This is the new "trophy" wife in northern Europe.

  5. My mother was a computer (which explains why I am so logical!). She got a degree in mathematics and was hired as a computer straight out of university. Being a computer was considered an ideal profession for young ladies—like being a secretary but for women who were good with numbers. When the place she worked for bought an electronic computer she was promoted to programmer. After quitting to become a stay-at-home mother, during which time she got a computer science degree by correspondence, she then went back into computing when all her children were in their teens. She was offered a management position at her current job but she turned it down because she prefers doing the technical stuff. During all her years in computing she has never encountered any sexism.

  6. @Shouting Thomas

    As far as I can tell, you are telling the same story Steve is, just from a different angle.

  7. Anonymous says:

    It was all about nerd-dom and boys’ toys right from the beginning.
    It was the military who were behind the computing revolution. What can be more nerdy than calculating shell trajectories or code breaking?

  8. (Possession of an IBM Selectric was a status symbol among secretaries when I started working in offices in the 1970s.)

    The man who designed the IBM Selectric:

    Eliot Fette Noyes (August 12, 1910 – July 18, 1977) was a Harvard-trained American architect and industrial designer, who worked on projects for IBM, most notably the IBM Selectric typewriter and the IBM Aerospace Research Center in Los Angeles, California. Noyes was also a pioneer in development of comprehensive corporate-wide design programs that integrated design strategy and business strategy. Noyes worked on corporate imagery for IBM, Mobil Oil, Cummins Engine and Westinghouse.
    Eliot Noyes was born in Boston, Massachusetts. Shortly after his birth, Noyes moved to Colorado where he resided until age seven. At this point, Noyes and his family moved to Cambridge, Massachusetts. Noyes’ father taught English at Harvard and his mother was an accomplished pianist. He was not always set on architecture. As a teen, he seriously contemplated becoming a painter; however by age 19 he had his mind set on architecture. He first enrolled at Harvard University in 1932 to obtain a bachelor’s degree in the Classics. Noyes’ experience at Harvard was unlike the other four members of Harvard Five. When he arrived at Harvard, the school was still under the influence of the Beaux-Arts architecture movement – hardly the modernist influence that the other four received. However, after meeting guest lecturer Le Corbusier in the school library, his architectural outlook changed entirely. He was inspired by Le Corbusier’s work and researched the Bauhaus. In his junior year at Harvard, he traveled to Iran for an archaeological expedition. Upon returning to the school, Noyes found that Harvard had undergone a complete revolution. Gropius and Breuer had already arrived there, and with them came a new modernist spirit at the school.[2] In 1938 he received his architecture degree from Harvard Graduate School of Design.

    While at Harvard, Noyes was also a member of the Harvard soaring club and flew the club’s new Schweizer Aircraft-built SGU1-7 glider.[3]

    Noyes spent twenty-one years working as consultant design director for IBM, designing the IBM Selectric typewriter in 1961 and numerous other products, while also advising the IBM internal design staff.[1] Prior to his work on the Selectric, Noyes was commissioned in 1956 by Thomas J. Watson, Jr to create IBM’s first corporate-wide design program — indeed, these influential efforts, in which Noyes collaborated with Paul Rand and Charles Eames, have been referred to as the first comprehensive design program in American business. Noyes was commissioned regularly by IBM to design various products as well as buildings for the corporation. His most famous and well known of these buildings are the IBM building in Garden City, NY (1966), the IBM Aerospace Building in Los Angeles, CA (1964), The IBM Pavilion Hemisfair in San Antonio, TX (1968) and the IBM Management Development Center in Armonk, NY (1980). Noyes also selected other notable architects such as Mies van der Rohe, Eero Saarinen, Marco Zanuso and Marcel Breuer to design IBM buildings around the world.[2]

    Noyes also redesigned the standard look for all Mobil gasoline stations during the 1960s (and hired the graphic design firm Chermayeff & Geismar to redesign the Mobil logo). His New Canaan, Connecticut residence is regarded as an important piece of Modernist architecture.[2]

    • Replies: @Jack D
    Noyes "designed" the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism - the spinning and tilting golf ball which was driven by an ingenious mechanical "computer" called a "whiffletree"- it was digital but NOT electronic. It had an electric motor to drive the mechanism but it would have worked just the same if it was driven by a foot treadle (like an old sewing machine). Every key press mechanically generated a unique two digit "code" - one of which was the number of steps to rotate the ball from its resting position and the other was the number of steps of tilt. This, and not Noyes's case, is what made the Selectric special.
  9. Here is an idea I’ve not seen explored elsewhere: at the very top levels of the software industry, verbal fluency in engineers rises in importance in a steep curve until it’s almost as important as coding skill. I would have thought this would work to women’s advantage, but I don’t see much of that. Women in tech still gravitate to the soft side.

  10. I have to admit those old photos from the IBM brochures look so appealing: no goofball coders with their little sci-fi affectations, ratty t-shirts or My Little Pony paraphernalia. One of the worst things to come out of the PC revolution was this idea that coders needed to be “characters”. I’ve come to hate “characters”. After seeing enough presentations by web developers who fill their slides with “cute” photos, swear for no obvious reason, and make arcane sci-fi references, one longs for that lost era of IBM guys in white shirts and ties. To be fair, there are some of the current batch who look like slobs but actually know their stuff and can convey information without all the geeky attempts at showmanship. I’m thinking of one coder who makes Steve Wozniak look like Alain Delon but is very gracious and business-like when actually making a presentation.

  11. More than a kernel of truth here. Business machines from A.B. Dick mimeo copiers to NCR’s cash registers and adding machines were the natural province of women retail, office and school workers. Men had slide rules, women had those key punch cards.

  12. @syonredux

    The man who designed the IBM Selectric:

    You mean “who designed the CASE of the IBM Selectric.” The working innards of the Selectric – what made it a revolutionary step in technology and every typist’s dream machine – were designed by a team of engineers, not architects. The foundational engineering of the Selectric – the rotating ball typing head – never changed over the life of the machine, although major incremental changes were made to other operational aspects. It’s my recollection that the external design, the styling, changed radically over time from the rounded, kitchen-appliance look of the earliest models to the squared-off, much more linear casing of the later versions.

  13. Someone feel free to correct me here, but isn’t the “programming” of room-sized mainframe computers less complicated than learning the sorts of languages that are required to master programming on personal computers? It’s not like these women were dealing with the same kind of issues as contemporary code monkeys?

  14. @syonredux

    I call b.s. on Eliot Noyes as “designing” the Selectric. Just look at who had the patents for the key electromechanical elements of the Selectric, mostly engineers at the Lexington, Kentucky Office Products Division. The Japanese never could reproduce the electromechanical reliability of the Selectric, that’s why there were millions of Selectrics in Japan. That golf ball moving precisely and reliably in three dimensions was one of the few products in history that could not be duplicated once the patents expired.

  15. I heard the earliest segment of this NPR piece. Some of it was laughably obtuse. The early women coders they referenced, spoken of reverently as if they were Knuth equivalents, were actually operators of mechanical calculators on a computational assembly line doing piecemeal implementation of algorithms that, to the best of my knowledge, were almost solely designed by men. Women have made major contributions to computer science – NPR mentioned two, Grace Hopper and Ada Lovelace – but as is usually the case in these pieces the overall role of women was grossly overstated. When given freedom to do what they want, very gifted men and women choose to do different things in a very predictable way. This drives feminists and other whack-job lefties out of their minds, but it’s reality and we all had better learn to live with it.

  16. IBM was the most valuable company on the New York Stock Exchange for much of the 1960s and represented extreme respectability (with a certain muted sexy Mad Men glamor):

    Mad Men and IBM typewriters:

    “Now try not to be overwhelmed by all this technology. It looks complicated, but the men who design it made it simple enough for a woman to use.” — Joan reassuring Peggy on her first day

    Typewriters had been commercially marketed ever since the 1870s, with the Hansen Writing Ball. On Mad Men, most of the typewriters seem to resemble IBM’s Selectric model, which dates back to 1961. An electric machine, the Selectric used a swiveling ball that pivoted before striking the typebars onto the ribbon.

    • Replies: @Jack D
    The Hansen writing ball never caught on, especially not in the US. The first really practical typewriter was the Remington, which introduced the grid-layout QWERTY "keyboard" which is what you used to type your message. Until the Selectric, almost all typewriters were modeled after the Remington - pressing a key caused a typebar to fly up and strike the paper - a different typebar for each key. The Selectric was a nice refinement (it produced beautiful-looking documents) but offices had been using typewriters for over 80 years by the time it came on the market. From a secretary's point of view, the Selectric required almost no retraining from their familiar typewriters.

    Your quote is from an idiotic Mashable piece entitled "Mad Men Tech: 9 Devices That Changed the 1960s Office".

    http://mashable.com/2012/03/22/mad-men-tech/

    #1 is the Selectric, which was not revolutionary at all, as I explain above.

    #2 is the Xerox machine, which WAS truly revolutionary.

    After that, it's all a stretch - jukeboxes, riding mowers, etc. Huh? Idiocracy here we come!
  17. If you want to go back further in office technology and “coding,” my great-great-grandmother was one of many girl telegraph operators. Some snippets I’ve pulled from accounts of the time:

    “Aunt Lydia was one of the girl telegraph operators of pioneer days. The Deseret Telegraph line was extended from St. George to the mining camps of Pioche and Ely, Nevada in 1872, and it was found necessary to open an office in Hebron.
    “Daniel Tyler was called from Beaver to care for the Hebron office and to teach a school in telegraph for those who cared to learn. Mr. Tyler made wooden keys for the class of four which included Jeter Snow, Zera P. Terry, Sarah Crosby and Alydia Terry, the latter being sixteen years old. The instruction carried over a period of three months from January to March, 1872. She was then placed in charge of the Hebron office where she served for several months. She was then transferred to the office in Panaca, Nevada, and at the age of nineteen went to take care of the office at Pipe Springs, on the border of Utah and Arizona.”

    “It was while working at Pipe Springs Telegraph Office that Alydia met Anson Perry Winsor, Jr., a stalwart young cattleman of that place. Her fancy was taken first with the fine way he handled his horses and the way he carried himself when riding, also of the truthful way he told her of his faults.”

    “After their marriage the young Winsors moved to Hebron, 1878, and Lydia took over the office there. Finally in 1888 the office was moved into her home, and she was operator for four years until 1892 when they moved to St. George. She taught a class in telegraphy. Among those she trained, was Alma A. Nelson, who became a first-class operator of that day and was employed for years in the general office in Salt Lake City.”

  18. Back before the PC, companies like IBM and Burroughs did not just sell a computer. They sold a person or team to set up and run your new computer. In larger organizations, they would even build out the computer room with raised floors and the required cabling. Everything about the pitch was to make it appear safe to the guy writing the check. That’s when the expression “no one ever got fired for recommending IBM” got started.

    As others have noted, the PC let men have fun with computing. The Hayes modem probably should get the credit for the rise of the male coder. Sitting around by your lonesome writing code is much more fun when you can go on-line and compete against others. The Hayes modem let every secret pirate get out of port. That was the attraction for me, least ways.

  19. When “coders were women”, coding meant you were given an exact specification of the algorithm and simply had to implement it in code, with no regard for what the algorithm was doing or why. This skill was taught in trade schools, like drafting or typing.

    You couldn’t get rich doing that then or now. The real money is in conceiving and developing algorithms. Coding is certainly harder than typing, but roughly speaking coding is to software as typing is to literature.
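    As a concrete (and entirely hypothetical) illustration of that kind of spec-driven coding, here is a made-up payroll rule transcribed clause by clause into a short Python function, with no design decisions left to the coder:

    # Hypothetical spec, invented for illustration:
    #   1. Gross pay = hours * rate for the first 40 hours.
    #   2. Hours beyond 40 are paid at 1.5 * rate.
    #   3. Round the result to the nearest cent.
    def gross_pay(hours, rate):
        regular = min(hours, 40.0) * rate                  # clause 1
        overtime = max(hours - 40.0, 0.0) * rate * 1.5     # clause 2
        return round(regular + overtime, 2)                # clause 3

    print(gross_pay(45, 10.0))   # 475.0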

  20. Er, okay. Or maybe it’s because the older programming languages were much more chatty than the current crop?

  21. “scary”

    High variance would be more precise and less patronizing.

    However justified and necessary that patronization, it may not be a battle worth fighting at this point.

  22. The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    • Replies: @JImbo
    The first PCs (mostly running CP/M) were products of the West Coast. In order to come up with a competitive machine, IBM had to sequester a group of engineers far away from HQ in Boca Raton, Florida, for a year to cobble something together out of off-the-shelf parts.
    , @EriK
    Right, because Digital Equipment Corporation was the west coast of the north shore.
    , @Jack D
    This is a great "just so" story but the truth is that IBM was instrumental in the widespread adoption of the personal computer. Every Windows computer, all X gazillion of them, is a direct descendant of the IBM PC.
    , @Reg Cæsar

    The microcomputer (as it was once called) could have only been invented on the more individualist west coast.
     
    OK, but the biggest advances took place at the West Coast branch of an East Coast-- no, make that North Coast-- firm, Xerox.
    , @polistra
    I'm not convinced that the Altair 8800 micro was all that important. It was the playground and testbed for those Silicon Valley boys, but it didn't turn into anything remotely commercial.

    DEC already had desktop computers in the late 60s, which were glorified word processors. (ie computers used by women.) IBM answered DEC with its first PC in 1975, but didn't get behind it. After Apple had popularized its own answer to the DEC desktops, IBM finally saw a market and tried its second PC in 1981, which caught on.
  23. anonymous says:

    I sometimes tell people in Silicon Valley that there was a time when computers were widely used and no software came from Silicon Valley or the few other large “centers of the universe” today (because a commercial software industry didn’t exist). Instead, every little community college in every little city in America taught programmers, often women. These programmers worked right there, in the little flyover cities where they lived, and wrote programs for the local businesses that hired them. I can tell that many people don’t really believe me.

    I think it was more than the advent of PCs. The commercial software industry got started surprisingly late. Up until the early-to-mid ’80s, software was often bundled with computers. Computing was “vertically integrated”, in the sense that you bought a complete system from one vendor, which was likely largely incompatible with other vendors’ systems. Applications up to this point were often (usually) developed “in house” by programmers who were employees of the company that used the software.

    The commercial software market took off slightly before the PC. It exploded with the arrival of the PC, in particular the “open architecture” (but widespread standard) IBM PC.

    This still didn’t push women out of the computer industry. What pushed women out (there are still some women, often immigrants or older women) was the over-expansion of Computer Science departments in US colleges (perhaps resulting from starting from a base of zero with an attitude of “we have to grow fast!”) coupled with the growth of commercial software companies and the expansion of large Wall Street banks overseas (they were among the first to offshore). This resulted in the need for an increasing number of CS students, which led to fishing for students overseas, a process at which the IEEE got particularly good by the mid-80s. That resulted in an explosion of pipeline immigration, as just about every smart male in the world realized that with a little dedication they could get into the US programming market. As all these folks came in, the environment changed. It was no longer the women-friendly office culture it had been.

  24. One thing that never gets mentioned about the reduction of women in programming is that programming itself is much harder conceptually than it used to be.

    Business programming as exemplified by the things that Cobol was designed to do was relatively easy: such things as generating reports, doing payroll, etc. Mostly it was simple input of a record, a little bit of obvious processing, and output of some line on a printed page or another record.

    But programming a PC for its uses was greatly more complicated. One had to understand how windows were managed, which, among other things, required some concept of asynchronous processes (or what at least seemed to be asynchronous processes). And the additional step of an object oriented language such as C++ quickly became standard — this too was no triviality to master.

    And generally the logic of a given operation became far more complex and unpredictable, unlike before.

    And of course the sort of programming that goes on in places like Silicon Valley, where typically the boundaries are being pushed with respect to what applications are designed to do, is the hardest sort of programming of all.

    Women who enter, or would enter, the profession are confronted with these difficulties at one stage of their education or career or another, typically hit the wall, and look for alternatives.
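    A minimal Python sketch (illustrative only, with invented record data) of the contrast described above: the batch style reads a record, does a little obvious processing, and emits a line, while the PC/GUI style registers handlers that fire whenever a simulated window system delivers an event.

    from collections import deque

    # 1. COBOL-era batch style: record in, obvious processing, line out.
    records = [("Alice", 40, 12.50), ("Bob", 38, 11.00)]   # hypothetical payroll records

    def batch_report(records):
        for name, hours, rate in records:
            print(f"{name:<10} {hours:>3} hrs  ${hours * rate:8.2f}")

    # 2. PC/GUI-era event-driven style: nothing runs top to bottom; handlers are
    #    registered and invoked whenever an event arrives, which gives the
    #    "asynchronous" feel mentioned above.
    handlers = {}
    event_queue = deque()

    def on(event_name):
        def register(fn):
            handlers[event_name] = fn
            return fn
        return register

    @on("button_click")
    def handle_click(payload):
        print("button clicked:", payload)

    @on("window_resize")
    def handle_resize(payload):
        print("redraw at new size:", payload)

    def run_event_loop():
        while event_queue:
            name, payload = event_queue.popleft()
            handlers[name](payload)

    batch_report(records)
    event_queue.extend([("window_resize", (640, 480)), ("button_click", "OK")])
    run_event_loop()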

  25. It boils down to TPS cover sheets versus Fuckin A.

    Corporate IT is trying to rein in the male/cowboy/get it fucking done types, who are being replaced with women, gays and offshores. Good in some cases, but a lot of people in the industry don’t have a real firm grasp of what they are doing.

  26. It was the first time boys/men of that age were allowed to stay inside for ridiculous amounts of time on hobby activities. Even the old Warner Brothers cartoons would lampoon the ‘egghead’ ‘sissies’ and fuzzy intellectual types.

    Then something happened mid seventies or so. Guys who would have been told to get the hell out of the house by mom were allowed to ferment in the basement playing D&D and monkeying around on computers. I remember wanting to sit quietly and finish a book and being told to ‘go outside and get the indoor stink off you’. Actually, I pointed out that this didn’t make sense because I always wound up sweaty after playing football or playing in the woods for hours, but my grandmother explained that was the way boys were supposed to smell. Not like hermits.

    Then I moved in with my parents and my friends and I could sit and play D&D or monkey with computers (my girlfriend, later wife, made a very nice program with a guy tumbling across the screen, and later formatted my characters for me for D&D). My mom and dad were early Boomers, ’45/’46, and I believe that was the generation that lost the disdain for inside activities. Probably because they were raised to believe education made you smarter/better and TV (the most brain-mushing activity until social media) was part of their upbringing.

    Boys (and sometimes girls) need to get outside and just do things. Can you envision the ‘Greatest Generation’ marching into cubicles all day long? Sure there were office drones but those were steno-pool types not your average Joe.

  27. @countenance

    The way I understand it the PC era was brought about by guys dumpster diving for IBM code used for building their PCs.

  28. “These early personal computers weren’t much more than toys. You could play pong or simple shooting games, maybe do some word processing. And these toys were marketed almost entirely to men and boys.”

    And how many women who worked in the computer industry bought these early machines, just to fool around with? My guess is………..just about zero.

    In my experience, women are not that interested in their work as such – they turn it off when they leave work at five. There are now lots of women engineers, or at least women who are called engineers. How many of them have a technical hobby of any kind? How many of them do their own plumbing, or wiring, or car repair at home?

    • Replies: @Anonymous
    The kind of "work" women really seem fascinated and captivated by seems to be house and home oriented and shopping. By house and home oriented, I mean that literally. Women are really into houses, from buying them, remodeling, interior decoration, homemaking, etc. Even the most professional women seem captivated by it. It's really the closest analogue to the male fascination with typically male physical objects like tools, weapons, cars, tech, etc. The activity of shopping, the relentless and interminable browsing, picking up, examining, etc. also seems to fascinate them. Whereas most men have a target based approach to shopping. They have a specific product in mind and target it directly at the store and try to leave as fast as possible.
    , @dcite
    I don't know about plumbing, but one of the most brilliant (in almost every way) people that I know of, is a woman physicist. She is adept at fixing/firing up/even making from scratch, computers of all types, and loves them like stuffed animals. In fact, almost anything mechanical and requiring focus and concentration, she can do. She repaired an upholstery rip by actually painstakingly re-weaving it. You could barely see the mend. She even bakes amazingly complicated cakes, and has raised 3 children very well. Several marriages though. Her engineering husband was said to have resented her. My experience is that men resent great skill in women if it is a field they themselves care about a lot and want to excel in.
    My sister is a programmer. Learned it on the job when that was still an option, and has done well. OTOH, I had to dictate her English compositions to her before school in the morning because she claimed she had no imagination.
    There are women who excel in these things, but I actually don't think men pay that much attention to what gals get up to if it doesn't personally concern them, Mr. Anon. Which is why you might have missed it.
    btw, an Indian gentleman I work with brushed aside his whole 40 yr programming career with "it's just being logical" when I expressed admiration for the facility.
    Me, I don't have the head for it, but when I was younger I was always meeting girls that did. But they tended to veer towards other fields for long term prospects.
  29. @Shouting Thomas

    He had to produce custom results on demand for a wide variety of businesses.

    Yeah that’s my perception of the mainframe->PC shift as well. It went from being primarily about perfect accuracy doing a small number of things to being about holding a giant library of functions in your head and using them to do a unique task you’d never seen before. Add to that the ability of someone programming on PC for PC to iterate code very quickly (do something cool and run it, if it doesn’t work find out why it breaks) as compared to the old mainframe era where (especially with punch cards) iteration took so long that the importance was on minimizing the number of mistakes.

    When someone understands how computing has changed, a general theory of the sexes would tell them that women would do better in the old mainframe era. Yes, even with condescending, alcohol drinking, ass pinching, non equality believing male bosses.

  30. NPR is full of it. Hardly a surprise.

    The H1-B visa and Gender Marxism drove women from the IT field.

    In the old mainframe days, the better IT departments would do annual programmer efficiency and productivity audits. The goal was to determine productivity per hour worked and compensated. Employers were actually concerned about employee turnover and knew there was a limit to the amount of uncompensated overtime they could coerce from workers.

    With the onset of the H1-B and the unnatural surplus of workers, employers were no longer concerned about burning out their programmers by demanding massive amounts of uncompensated overtime. The management goals went from trying to get 38 hours of productivity out of a 40-hour work week with occasional compensated overtime, to getting 45 hours of work from a 55-60 hour work week year round.

    See http://en.wikipedia.org/wiki/Peopleware:_Productive_Projects_and_Teams

    With the H1-B, management now had the whip hand. Incompetent, non-programmer women could now be promoted en masse into IT middle management and above, making the Marxists at the EEOC happy. With free rein, these so-called women managers could demand overtime from male programmers who feared being fired and facing an increasingly slack labor market.

    Lots of female programmers left the IT field because they could no longer negotiate working a 30-40 hour work week with telecommuting so they could also raise a family. And yes, many were competent women, and some worked their way right out of secretarial pools by taking computer programming courses at night. They may not have been the ultimate hardcore egomaniac programmer jocks, but they were of great help in auxiliary project tasks and management.

    In short, the age of Carly Fiorina and Sheryl Kara Sandberg has not been kind to average cubicle-farm would-be moms.

    Funny how the Neo-Marxists at NPR are not hip to this.

    Oh, by the way, the Karl Rove and fat cat Republican brain trust have the perfect counter strategy to Hillary.

    http://www.usnews.com/news/blogs/run-2016/2014/07/28/carly-fiorina-taking-2016-temperature

  31. I recall when growing up in Silicon Valley, there was a service known as Dial-a-Joke. I later found out it was run by Steve Wozniak before he was known. Many of the jokes were Polish jokes and were on the corny side. I recall one,

    “What does it say on the bottom of a Polish Coke bottle? Open other end. Ha Ha Ha”

    The “Ha Ha” was a characteristic raspy baritone bellowing laugh.

    http://en.wikipedia.org/wiki/Dial-A-Joke

    I think you’re on to something with the introduction of the PC relating to the decline of women in programming. Before the internet, there was something of a revolution in application software. Database applications were the most important for business and there was high demand for jobs for database programmers in the 70s and 80s. When spreadsheets and database programs such as Access and Excel became common, the need for database programmers declined since it was possible to download data into a spreadsheet and managers did their own analysis. I recall that most database programmers were women.

    I also recall that database programmers were often trained by their employer and many only had a high school diploma. Today, a career in computer science/programming demands more math.

  32. @countenance

    But ironically the IBM PC clone (other than Apple Macs) became the de facto standard PC.

  33. Anonymous says:
    @countenance

    IBM owned computing, except for a minicomputer slice of the market that DEC was in. There virtually was no-one else to deal with for most of the 1960s and 70s. And remember, at the time, no-one ever got fired for bringing IBM in to provide a computing solution.

    I still believe the IBM Selectric was the second-greatest engineering achievement of the era, next to the Apollo program. The Selectric is one of those things I don’t believe we could do anymore if we had to. The engineering creativity that Bud Beattie’s team in Lexington put into the golf ball mechanism is still impressive over 50 years later. Something as simple as the mechanism to keep more than one key from being struck at the same time was brilliant. And you are right, Noyes was just the metal-bender, but the best one of his era. If GM had him at the time they never would have lost ground to the Japanese.

  34. Anonymous says:

    As a manager from the late ’70s to 2000, I managed a handful of very talented women programmers, but only a small number compared to men. Why? Perhaps it is as innocuous a reason as my wife’s experience. Started teaching math out of college, moved to programming for the money. Went to part-time programmer after the first child. Quit when pregnant with the second child to become a stay-at-home mom. Went back to teaching years later so she would be home when the kids came home and off summers. Not an issue of talent, sexism, glass ceiling, or anything else of interest to NPR. It was simply quality of life.

    Another observation from someone who has managed hundreds of programmers and engineers. It is not exactly a social profession. At least not in the industries I worked in. More women are socially interactive than the average, hard core programmer or engineer. Call it sexism or whatever you want, at least in my experience the pool of men who are OK with working in a very much less than social environment is larger than the pool of women who are.

    This is not to say all men or women programmers are socially awkward, or that all women want to stay at home with the kids, or that there are or aren’t some cultural issues at work, or other reasons. Just that there is at least some rather natural filtering at work. To ignore such rather obvious factors is as biased as that which NPR implies.

  35. What I find so pernicious about NPR is not just that it’s a gleeful channel of pure government propaganda but also that when the narrative collapses, they don’t even acknowledge that they had to reformulate the story. It really is Orwellian. Hundreds of “news professionals” aligned by a common faith turn all at once like a flock of birds, but pretend to have always been flying that way.

    To me, morality starts by trying to make one’s own beliefs consistent with one another. NPR makes no effort to do this.

  36. There was a graph, I think on Twitter, that showed women in computer science peaking in 1985. That was the year before the H-1B visa, wasn’t it?

    • Replies: @anonymous-antimarxist
    Actually, I believe the H1-B came in 1990. But there were forerunner visa programs that served similar purposes.

    I noticed that corporate America began to see its programming work force as essentially disposable by the mid-1990s. By then the networks of Immigration Lawyers, HR hacks and lobbyists supporting the program were well established.

    The Dot-Com bubble temporarily masked what was happening. By 2001, with the massive H1-B expansion pushed through by Gene Sperling and Elena Kagan under Clinton, and with Dubya letting corporate America know that under no circumstances would H1-B visa overstays be deported, the full impact of the H1-B on programmer employment began to be felt.

    Also do not forget the L1 visa as well.
  37. @countenance

    It’s not just that. IBM consciously saw itself as a sales and customer service based company, rather than a disruptive tech company based on research and developing new tech and products. The business model was pushing older, established tech with an army of salesmen and providing responsive customer service. The Watsons prioritized salesmen and salesmen dominated the company’s management. They didn’t think much of techies, at least at their company. Watson Sr. regarded the salesman as a kind of American hero.

  38. Wasn’t FORTRAN the female programming language of choice? Then it went away — or at least seemed to.

    • Replies: @Jack D
    The Hansen writing ball never caught on, especially not in the US. The first really practical typewriter was the Remington, which introduced the grid-layout QWERTY "keyboard" that you used to type your message. Until the Selectric, almost all typewriters were modeled after the Remington - pressing a key caused a typebar to fly up and strike the paper - a different typebar for each key. The Selectric was a nice refinement (it produced beautiful-looking documents), but offices had been using typewriters for over 80 years by the time it came on the market. From a secretary's point of view, the Selectric required almost no retraining from their familiar typewriters.

    Your quote is from an idiotic "Mashable" piece entitled "Mad Men Tech: 9 Devices That Changed the 1960s Office".

    http://mashable.com/2012/03/22/mad-men-tech/

    #1 is the Selectric, which was not revolutionary at all, as I explain above.

    #2 is the Xerox machine, which WAS truly revolutionary.

    After that, it's all a stretch - jukeboxes, riding mowers, etc. Huh? Idiocracy here we come!
    , @Jack D
    No, COBOL was the verbose "woman's" language - this is the language that all those women programmers used to write programs for the Social Security Administration, big banks, etc. Fortran was used more for scientific work.
  39. I really need to do some research on this question, because I keep hearing this dubious story about the supposed golden age of female programmers and I want to know whether there’s any truth to it.

    Were these female “programmers” really doing software engineering as we think of it today? Did they understand algorithms and data structures, CPU architecture, how operating systems work, etc.? Were they doing low level system programming and/or developing complex programs using higher level, abstract languages like Java?

    Or were they doing what would today be called “scripting” using languages like COBOL?

    There’s a world of difference between someone with a CS degree from Stanford who works on Google’s search engine or develops embedded software for fighter jets, and someone who writes Excel macros or HTML and can’t explain the difference between an array and a linked list. They might both call themselves “programmers,” however, and technically they’d both be correct.

    I suspect that these female programmers back in the day were far closer to the latter. If someone can shed some light on this I’d greatly appreciate it.

    • Replies: @Anonymous
    You sort of answered your question already. Back then, there were no search engines or software in fighter jets. Most of the jobs back in the day on those mainframes were data processing and arithmetic.
    , @Hoosier girl
    In the late 1960s and early 1970s, this female worked in a data processing center at a major bank. Basically, we used the IBM 402 keypunch card readers. At that time, credit cards and some corporate & government checks were keypunch cards. The IBM 402 was a HARD-wired device, which the operator programmed. The "mainframe" we used was an IBM 360 with an IBM 1419 MICR reader. Most of it was programmed in FORTRAN, which (unlike what most of these posts suggest) was behind the "new" or existing systems at a root level. This was a three-step process: first, the logic would be written by the programming group. Then their hand-written program was sent to keypunchers, who would keypunch the cards. Then they would take the cards to the operators, who also had an IBM 402, which transferred the program to the IBM 360 through sub-floor wiring.

    Once you understand that a PC does not really stand alone, then you can understand that its roots are STILL in the original programming, which in its time was more (not less) complicated. I built my own first PC and I programmed it myself because I had a background in FORTRAN. I would eventually co-own a telecom company and helped to merge a token ring and an Ethernet, which both IBM and Nortel said could not be done, but then the software people didn't ask the hardware people how telephones transmit data through a T1 carrier.

    Considering that process, females dominated the computer field. But you have to take into account the big picture here, or what happened with each revision of the technology itself and what role that revision played in the next one.....not only in software, but also in hardware. Inside, they only got smaller and faster. The same basic logic skills are still needed to do programming.

    For example, I just worked on a conversion for healthcare. The conversion goes from FORTRAN to a user-friendly newer system with color and pictures (neither necessary for the job it is meant to accomplish). However, the conversion works through a separate translator program. The ROOT is still the FORTRAN-programmed database. So, basically, it is NOT more complicated or difficult. They are just making it difficult by adding unnecessary bells and whistles, instead of just doing another revision of the FORTRAN to increase allocated memory and add the connectivity.
    , @Hoosier girl
    PS: And to directly answer your question.....YES, females were programmers, often doing the most complex of the analytic thinking. And that was before Texas Instruments introduced the "financial calculator," which did NOT have the logic to query (or search) for accounts with overdrafts and/or negative balances in order to send out specific letters. An IBM 402 was a glorified auto-read calculator. The IBM 360 was a full-blown, multi-tasking LOGIC system. It didn't "search" on the internet, but it did multi-task and search the database to insert data into another program, which could be a range of dates, times, names, etc. Just because it now does a query online does not mean the capability did not exist in those early computer days that were not as user friendly.
  40. @countenance
    The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    The first PCs (mostly running CP/M) were products of the West Coast. In order to come up with a competitive machine, IBM had to sequester a group of engineers far away from HQ, in Boca Raton, Florida, for a year to cobble something together out of off-the-shelf parts.

    • Replies: @Stan Adams
    Very true.

    This is a digression, but it might help clear up a few points:

    By the early '80s, IBM's bureaucracy had grown stagnant. The company's various divisions saw their main competitors not as other companies but as other IBM divisions whose new designs might cannibalize the sales of existing products. (They failed to appreciate that Tom Watson had cannibalized the entire company when IBM introduced the System/360.) New machines took years to wind their way through various committees to come to market, and were often woefully underpowered and ridiculously overpriced by the time they did so. The higher-ups realized that, if the PC team were hamstrung by the company's standard operating procedures - rules designed to *stifle* innovation, not promote it - IBM would never be able to compete in such a fast-moving marketplace.

    The greatest factor behind the success of the PC (besides the fact that it was a fairly well-designed machine that had the Big Blue imprimatur) and its successors (the XT and the AT) was that IBM published all of the technical specifications and encouraged third parties to develop add-on products that extended the machine's functionality. Within months of the PC's introduction (it was announced in August 1981 and shipped in October), companies such as Tecmar were offering peripherals such as hard drives and enhanced graphics cards. Competition kept prices down and spurred constant improvement.

    Consider that, in the late '70s and early '80s, IBM was embroiled in a massive antitrust lawsuit. The company took pains to demonstrate to the FTC that it was not attempting to steamroll its competitors by introducing a "closed" machine.

    Apple, on the other hand, went to extreme lengths to discourage anyone from making any add-ons to the Macintosh. Steve Jobs insisted that the first Mac be limited to 128K of RAM. (Even in 1984, 128K was considered a ludicrously small amount for a system of the Mac's complexity. The first Macs were so feeble that they could not be used for software development - all early Mac programs were written on Lisas.) He also insisted that it be a completely closed machine, with no expansion slots. The result was a slow, expensive clunker whose "killer apps" were a paint program suitable mainly for drawing smiley faces and a word processor that crapped out after 10 pages. The Mac initially sold very poorly and might have failed completely had Aldus PageMaker and the Apple LaserWriter not come along to invent the desktop-publishing market. The Mac was largely consigned to the graphics ghetto for years to come.

    IBM's lead in the PC arena slipped in the mid-'80s, as a slew of "clones" cropped up to satisfy the insatiable public demand for PC-compatible DOS machines. In September 1986, the best of the clone-makers, Compaq, beat IBM to market with a machine based on Intel's 386 processor. By this time, a number of IBM flops - including such prominent duds as the much-maligned PCjr home computer and the execrable PC Convertible laptop - had tarnished the company's once-sterling reputation.

    In April 1987, IBM introduced the PS/2 with the MCA bus - a proprietary architecture. Its antitrust issues long settled, IBM now insisted that any companies that wished to build PS/2 clones pay hefty licensing fees. Years earlier, this ploy might have worked, but users were no longer willing to play along with IBM's monopolistic schemes. The company's sales plummeted even as the overall PC market mushroomed, in large part because the clones were now more "IBM-compatible" than IBM itself.

    I'm leaving out *a lot*. I haven't even mentioned the role that Microsoft played in all of this. But that's enough for one post.
    One of the interesting facts about this period of computing was that the most common computer language was COBOL, which was invented by Grace Hopper, one of the first female admirals in the Navy: http://en.wikipedia.org/wiki/Grace_Hopper

    The language had a reputation for being a bit chattier than Fortran, which was used in science and dominated by men. In the late 1970′s I dated a girl from Simmons College, and she and many of her female friends were COBOL programmers. Even now a 50ish female friend of mine is still running legacy COBOL code.

  42. @countenance
    The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    Right, because Digital Equipment Corporation was the west coast of the north shore.

    • Replies: @Brutusale
    "There is no reason for any individual to have a computer in his home!"--Ken Olsen, DEC CEO, 1977

    Olsen's apologists say that quote was taken out of context, but if he didn't believe it, why was he so intransigent about the PC? From a company almost as large as, and more profitable than, IBM in the late 80s to a piece of freaking Compaq 5 years later.
  43. @syonredux

    (Possession of an IBM Selectric was a status symbol among secretaries when I started working in offices in the 1970s.)
     
    The man who designed the IBM Selectric:

    Eliot Fette Noyes (August 12, 1910 – July 18, 1977) was a Harvard-trained American architect and industrial designer, who worked on projects for IBM, most notably the IBM Selectric typewriter and the IBM Aerospace Research Center in Los Angeles, California. Noyes was also a pioneer in development of comprehensive corporate-wide design programs that integrated design strategy and business strategy. Noyes worked on corporate imagery for IBM, Mobil Oil, Cummins Engine and Westinghouse.
    Eliot Noyes was born in Boston, Massachusetts. Shortly after his birth, Noyes moved to Colorado where he resided until age seven. At this point, Noyes and his family moved to Cambridge, Massachusetts. Noyes’ father taught English at Harvard and his mother was an accomplished pianist. He was not always set on architecture. As a teen, he seriously contemplated becoming a painter; however by age 19 he had his mind set on architecture. He first enrolled at Harvard University in 1932 to obtain a bachelor’s degree in the Classics. Noyes’ experience at Harvard was unlike the other four members of Harvard Five. When he arrived at Harvard, the school was still under the influence of the Beaux-Arts architecture movement – hardly the modernist influence that the other four received. However, after meeting guest lecturer Le Corbusier in the school library, his architectural outlook changed entirely. He was inspired by Le Corbusier’s work and researched the Bauhaus. In his junior year at Harvard, he traveled to Iran for an archaeological expedition. Upon returning to the school, Noyes found that Harvard had undergone a complete revolution. Gropius and Breuer had already arrived there, and with them came a new modernist spirit at the school.[2] In 1938 he received his architecture degree from Harvard Graduate School of Design.

    While at Harvard, Noyes was also a member of the Harvard soaring club and flew the club's new Schweizer Aircraft-built SGU1-7 glider.[3]

    Noyes spent twenty-one years working as consultant design director for IBM, designing the IBM Selectric typewriter in 1961 and numerous other products, while also advising the IBM internal design staff.[1] Prior to his work on the Selectric, Noyes was commissioned in 1956 by Thomas J. Watson, Jr to create IBM's first corporate-wide design program — indeed, these influential efforts, in which Noyes collaborated with Paul Rand and Charles Eames, have been referred to as the first comprehensive design program in American business. Noyes was commissioned regularly by IBM to design various products as well as buildings for the corporation. His most famous and well known of these buildings are the IBM building in Garden City, NY (1966), the IBM Aerospace Building in Los Angeles, CA (1964), The IBM Pavilion Hemisfair in San Antonio, TX (1968) and the IBM Management Development Center in Armonk, NY (1980). Noyes also selected other notable architects such as Mies van der Rohe, Eero Saarinen, Marco Zanuso and Marcel Breuer to design IBM buildings around the world.[2]

    Noyes also redesigned the standard look for all Mobil gasoline stations during the 1960s (and hired the graphic design firm Chermayeff & Geismar to redesign the Mobil logo). His New Canaan, Connecticut residence is regarded as an important piece of Modernist architecture.[2]

     

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism – the spinning and tilting golf ball, which was driven by an ingenious mechanical “computer” called a “whiffletree” – it was digital but NOT electronic. It had an electric motor to drive the mechanism, but it would have worked just the same if it were driven by a foot treadle (like an old sewing machine). Every key press mechanically generated a unique two-digit “code” – one digit was the number of steps to rotate the ball from its resting position and the other was the number of steps of tilt. This, and not Noyes’s case, is what made the Selectric special.
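
    As a rough illustration of the idea – and only an illustration, since the real whiffletree did its “computing” with levers, and the rotate/tilt values below are invented rather than taken from IBM’s actual tables – each key can be thought of as mapping to a small (rotate, tilt) pair that positions the typeball before it strikes:

    # Hypothetical sketch of a Selectric-style two-value character code (Python).
    # The (rotate, tilt) numbers are made up for illustration; the real typeball
    # held 88 characters in 4 tilt rows of 22 rotate positions, and the whiffletree
    # summed the key levers mechanically rather than looking anything up in software.
    TYPEBALL_CODE = {
        "a": (-3, 1),
        "e": (+2, 0),
        "t": (+5, 2),
        "%": (-8, 3),
    }

    def strike(char: str) -> str:
        """Describe how the ball would be positioned for one keypress."""
        rotate, tilt = TYPEBALL_CODE[char]
        return f"'{char}': rotate {rotate:+d} steps, tilt {tilt} rows, then strike"

    for c in "tea":
        print(strike(c))

    On the actual machine, of course, the “lookup” and the arithmetic were done by the lever geometry itself – no electronics anywhere in the path from key to ball.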

    • Replies: @syonredux

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism
     
    Why the scare quotes around designed? Doesn't that word accurately describe what Noyes did?

    Further info on the IBM mid-century aesthetic:

    The Selectric typewriter was introduced on 23 July 1961. Its industrial design is credited to influential American designer Eliot Noyes. Noyes had worked on a number of design projects for IBM; prior to his work on the Selectric, he had been commissioned in 1956 by Thomas J. Watson, Jr. to create IBM's first house style: these influential efforts, in which Noyes collaborated with Paul Rand, Marcel Breuer, and Charles Eames, have been referred to as the first "house style" program in American business.
     
    Interestingly, Charles Eames had a creative partnership with his wife, Bernice Alexandra ("Ray") Kaiser Eames:

    Ray-Bernice Alexandra Kaiser Eames (December 15, 1912 – August 21, 1988) was an American artist, designer, and filmmaker who, together with her husband Charles, is responsible for many classic, iconic designs of the 20th century. She was born in Sacramento, California to Alexander and Edna Burr Kaiser, and had a brother named Maurice. After having lived in a number of cities during her youth, in 1933 she graduated from Bennett Women's College in Millbrook, New York, and moved to New York City, where she studied abstract expressionist painting with Hans Hofmann. She was a founder of the American Abstract Artists group in 1936 and displayed paintings in their first show a year later at Riverside Museum in Manhattan. One of her paintings is in the permanent collection of The Whitney Museum of American Art.

    In September 1940, she began studies at the Cranbrook Academy of Art in Bloomfield Hills, Michigan. She met Charles Eames while preparing drawings and models for the Organic Design in Home Furnishings competition and they were married the following year.[7] Settling in Los Angeles, California, Charles and Ray Eames would lead an outstanding career in design and architecture.

    In 1943, 1944, and 1947, Ray Eames designed several covers for the landmark magazine, Arts & Architecture.

    In the late 1940s, Ray Eames created several textile designs, two of which, "Crosspatch" and "Sea Things", were produced by Schiffer Prints, a company that also produced textiles by Salvador Dalí and Frank Lloyd Wright. Original examples of Ray Eames textiles can be found in many art museum collections. The Ray Eames textiles have been re-issued by Maharam as part of their Textiles of the Twentieth Century collection.

    Ray Eames died in Los Angeles in 1988, ten years to the day after Charles. They are buried next to each other in Calvary Cemetery in St. Louis.

     

    The Eames created the famous Mathematica: A World of Numbers...and Beyond interactive exhibition:

    Mathematica: A World of Numbers…and Beyond is an interactive exhibition originally shown at the California Museum of Science and Industry. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.

    In March, 1961 a new science wing at the California Museum of Science and Industry[1] in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond.[2] This was the first of many exhibitions designed by the Eames Office.

    The 3,000-square-foot (280 m2) exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition.[3] Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science.[4]


    Duplicates
    In November, 1961 an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. In January 2014 the exhibit temporarily closed in order for it to undergo much needed refurbishment. [5]

    Another copy was made for the IBM Exhibit at the 1964/1965 New York World's Fair.[6] Subsequently it was briefly on display in New York City, and then installed in the Pacific Science Center in Seattle where it stayed until 1980. It was briefly re-installed in New York city at the 590 Madison Ave IBM Headquarters Building, before being moved to SciTrek in Atlanta, but that organization was shut down in 2004 due to funding cuts. The exhibit was shipped to Petaluma, CA to the daughter of Charles Eames, Lucia Eames. The exhibit is now in the hands of the Eames family, some elements have been on display at the Eames office.

    Men of Modern Mathematics poster
    In 1966, five years after the opening of the Mathematica Exhibit, IBM published a 2-by-12-foot (0.61 m × 3.66 m) timeline poster, titled Men of Modern Mathematics. It was based on the items displayed on the exhibit's History Wall, and free copies were distributed to schools. The timeline covered the period from 1000 AD to approximately 1950 AD, and the poster featured biographical and historical items, along with numerous pictures showing progress in various areas of science, including architecture. The mathematical items in this chart were prepared by Professor Raymond Redheffer[7] of UCLA. Long after the chart was distributed, mathematics departments around the world have proudly displayed this chart on their walls.[8]

    In 2012, IBM Corporation released a free iPad application, Minds of Modern Mathematics, based on the poster but updated to the present. The app was developed by IBM with the assistance of the Eames Office.[9][10]
     
    , @Dave Pinsen
    What was special about the IBM Selectric was how easy it was to correct mistakes. If you hit one key, it would go back and "erase" the last word you typed, by typing the exact same letters in the same order with white ink. With other typewriters, corrections were a more tedious process: you had to put in a white ink cartridge, backspace over your mistake, and then type over it. Or you could use whiteout and then wait for the paper to dry.
  44. A gaming blog I read had some intelligent remarks on this subject: http://blessingofkings.blogspot.com/2014/10/women-in-computer-science.html

    Basically, there were two booms in interest in programming and women moved into the field same as men but when each boom crashed, relatively more women flowed out than men. So it’s the old story of women preferring a more stable and predictable field to a boom and bust one. No sexism needed to explain it.

    • Replies: @Silicon Valley Dinosaur
    Thanks for that link. It helps me understand NPR's "...for decades, the number of women studying computer science was growing faster than the number of men". Namely that "decades" means fifteen years; rising from essentially zero in 1970, female CS majors peaked in 1985 at a level never since attained.

    Maybe in 2030 NPR will observe that the army still doesn't have many female combat soldiers, despite the fact that some time in the early 21st Century, the number of female combat soldiers was "growing faster than the number of men" (a trend that ended, perhaps, when the first all-female combat battalion was captured in the ISIS war and distributed as war booty).

    But to return to computer programming, anyone with real, recent experience in Silicon Valley can tell you that the numbers of women and blacks are now essentially nonissues. The issue should be that there are hardly any Americans anymore. In fact, if (American-born, slave-descended) blacks were to somehow become 13% of the technical workforce around here (i.e. the overall black share of the US population) they would outnumber all the *white* Americans!

  45. @countenance
    The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    This is a great “just so” story but the truth is that IBM was instrumental in the widespread adoption of the personal computer. Every Windows computer, all X gazillion of them, is a direct descendant of the IBM PC.

  46. Anonymous says:

    All the people in those old IBM adverts were models, chosen for their clean-cut, all-American good looks. White shirts and short hair on the men to indicate ‘efficiency’, sensible hairstyles and clothing on the women.
    That the leading company of that time would unashamedly use only good-looking, young, well-dressed, well-groomed white persons to project its image is unthinkable in today’s climate.

  47. But Steve, you seem to be engaged in the same individualistic exploration for truth and knowledge that is so abhorred by corporate and special interests that would rather you accept their revealed truth and knowledge. Oh yeah, and pay a tithe for it as well.

  48. @syonredux

    IBM was the most valuable company on the New York Stock Exchange for much of the 1960s and represented extreme respectability (with a certain muted sexy Mad Men glamor):
     
    Mad Men and IBM typewriters:

    “Now try not to be overwhelmed by all this technology. It looks complicated, but the men who design it made it simple enough for a woman to use.” -- Joan reassuring Peggy on her first day

    Typewriters had been commercially marketed ever since the 1870s, with the Hansen Writing Ball. On Mad Men, most of the typewriters seem to resemble IBM's Selectric model, which dates back to 1961. An electric machine, the Selectric used a swiveling ball that pivoted before striking the typebars onto the ribbon.
     

    The Hansen writing ball never caught on, especially not in the US. The first really practical typewriter was the Remington, which introduced the grid-layout QWERTY “keyboard” that you used to type your message. Until the Selectric, almost all typewriters were modeled after the Remington – pressing a key caused a typebar to fly up and strike the paper – a different typebar for each key. The Selectric was a nice refinement (it produced beautiful-looking documents), but offices had been using typewriters for over 80 years by the time it came on the market. From a secretary’s point of view, the Selectric required almost no retraining from their familiar typewriters.

    Your quote is from an idiotic “Mashable” piece entitled “Mad Men Tech: 9 Devices That Changed the 1960s Office”.

    http://mashable.com/2012/03/22/mad-men-tech/

    #1 is the Selectric, which was not revolutionary at all, as I explain above.

    #2 is the Xerox machine, which WAS truly revolutionary.

    After that, it’s all a stretch – jukeboxes, riding mowers, etc. Huh? Idiocracy here we come!

  49. Marty [AKA "mel belli"] says:

    It would be an interesting study to find out whether the introduction of PC’s in offices in the 1980′s, especially in law firms (see Shouting Thomas above), in preference to the Selectric III actually lowered productivity and cost the firms money. In the Selectric era, the way offices worked was that secretaries sat at their machines and basically didn’t move while successive typing tasks were handed to them. There was very little socializing going on. I was in law firms as late as the late ’90′s, and even then the promise held out by the PC was basically for two tasks: addressing bulk envelopes via mail merge, and automating a table of authorities in a legal brief. The first task, while accomplished effectively, was useless because you can’t drum up legal business just by sending a bunch of “we’re here!” letters over the transom. The second, the TOA, almost never worked, and would have been done in less time manually. The PC revolution did have one valuable use for plaintiff’s firms in the early years: when you forgot about a client’s file for a few years and faced a dismissal motion and thus a potential malpractice claim, you’d tell the judge that the years-long failure to prosecute was the result of a “computer glitch,” and often the court would buy it.

    • Replies: @Steve Sailer
    And in the 1980s there was the diversion of smart young fellows from the main business of the firm into getting personal computers to work (cough, cough).
  50. Anonymous says:
    @AlphaMaleBrogrammer
    I really need to do some research on this question, because I keep hearing this dubious story about the supposed golden age of female programmers and I want to know whether there's any truth to it.

    Were these female "programmers" really doing software engineering as we think of it today? Did they understand algorithms and data structures, CPU architecture, how operating systems work, etc.? Were they doing low level system programming and/or developing complex programs using higher level, abstract languages like Java?

    Or were they doing what would today be called "scripting" using languages like COBOL?

    There's a world of difference between someone with a CS degree from Stanford who works on Google's search engine or develops embedded software for fighter jets, and someone who writes Excel macros or HTML and can't explain the difference between an array and a linked list. They might both call themselves "programmers," however, and technically they'd both be correct.

    I suspect that these female programmers back in the day were far closer to the latter. If someone can shed some light on this I'd greatly appreciate it.

    You sort of answered your question already. Back then, there were no search engines or software in fighter jets. Most of the jobs back in the day on those mainframes were data processing and arithmetic.

  51. @Jack D
    Noyes "designed" the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism - the spinning and tilting golf ball which was driven by an ingenious mechanical "computer" called a "whiffletree"- it was digital but NOT electronic. It had an electric motor to drive the mechanism but it would have worked just the same if it was driven by a foot treadle (like an old sewing machine). Every key press mechanically generated a unique two digit "code" - one of which was the number of steps to rotate the ball from its resting position and the other was the number of steps of tilt. This, and not Noyes's case, is what made the Selectric special.

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism

    Why the scare quotes around designed? Doesn’t that word accurately describe what Noyes did?

    Further info on the IBM mid-century aesthetic:

    The Selectric typewriter was introduced on 23 July 1961. Its industrial design is credited to influential American designer Eliot Noyes. Noyes had worked on a number of design projects for IBM; prior to his work on the Selectric, he had been commissioned in 1956 by Thomas J. Watson, Jr. to create IBM’s first house style: these influential efforts, in which Noyes collaborated with Paul Rand, Marcel Breuer, and Charles Eames, have been referred to as the first “house style” program in American business.

    Interestingly, Charles Eames had a creative partnership with his wife, Bernice Alexandra (“Ray”) Kaiser Eames:

    Ray-Bernice Alexandra Kaiser Eames (December 15, 1912 – August 21, 1988) was an American artist, designer, and filmmaker who, together with her husband Charles, is responsible for many classic, iconic designs of the 20th century. She was born in Sacramento, California to Alexander and Edna Burr Kaiser, and had a brother named Maurice. After having lived in a number of cities during her youth, in 1933 she graduated from Bennett Women’s College in Millbrook, New York, and moved to New York City, where she studied abstract expressionist painting with Hans Hofmann. She was a founder of the American Abstract Artists group in 1936 and displayed paintings in their first show a year later at Riverside Museum in Manhattan. One of her paintings is in the permanent collection of The Whitney Museum of American Art.

    In September 1940, she began studies at the Cranbrook Academy of Art in Bloomfield Hills, Michigan. She met Charles Eames while preparing drawings and models for the Organic Design in Home Furnishings competition and they were married the following year.[7] Settling in Los Angeles, California, Charles and Ray Eames would lead an outstanding career in design and architecture.

    In 1943, 1944, and 1947, Ray Eames designed several covers for the landmark magazine, Arts & Architecture.

    In the late 1940s, Ray Eames created several textile designs, two of which, “Crosspatch” and “Sea Things”, were produced by Schiffer Prints, a company that also produced textiles by Salvador Dalí and Frank Lloyd Wright. Original examples of Ray Eames textiles can be found in many art museum collections. The Ray Eames textiles have been re-issued by Maharam as part of their Textiles of the Twentieth Century collection.

    Ray Eames died in Los Angeles in 1988, ten years to the day after Charles. They are buried next to each other in Calvary Cemetery in St. Louis.

    The Eames created the famous Mathematica: A World of Numbers…and Beyond interactive exhibition:

    Mathematica: A World of Numbers…and Beyond is an interactive exhibition originally shown at the California Museum of Science and Industry. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.

    In March, 1961 a new science wing at the California Museum of Science and Industry[1] in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers… and Beyond.[2] This was the first of many exhibitions designed by the Eames Office.

    The 3,000-square-foot (280 m2) exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition.[3] Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science.[4]

    Duplicates
    In November, 1961 an exact duplicate was made for Chicago’s Museum of Science and Industry, where it was shown until late 1980. From there it was relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. In January 2014 the exhibit temporarily closed in order for it to undergo much needed refurbishment. [5]

    Another copy was made for the IBM Exhibit at the 1964/1965 New York World’s Fair.[6] Subsequently it was briefly on display in New York City, and then installed in the Pacific Science Center in Seattle where it stayed until 1980. It was briefly re-installed in New York city at the 590 Madison Ave IBM Headquarters Building, before being moved to SciTrek in Atlanta, but that organization was shut down in 2004 due to funding cuts. The exhibit was shipped to Petaluma, CA to the daughter of Charles Eames, Lucia Eames. The exhibit is now in the hands of the Eames family, some elements have been on display at the Eames office.

    Men of Modern Mathematics poster
    In 1966, five years after the opening of the Mathematica Exhibit, IBM published a 2-by-12-foot (0.61 m × 3.66 m) timeline poster, titled Men of Modern Mathematics. It was based on the items displayed on the exhibit’s History Wall, and free copies were distributed to schools. The timeline covered the period from 1000 AD to approximately 1950 AD, and the poster featured biographical and historical items, along with numerous pictures showing progress in various areas of science, including architecture. The mathematical items in this chart were prepared by Professor Raymond Redheffer[7] of UCLA. Long after the chart was distributed, mathematics departments around the world have proudly displayed this chart on their walls.[8]

    In 2012, IBM Corporation released a free iPad application, Minds of Modern Mathematics, based on the poster but updated to the present. The app was developed by IBM with the assistance of the Eames Office.[9][10]

    • Replies: @Jack D
    No, COBOL was the verbose "woman's" language - this is the language that all those women programmers used to write programs for the Social Security Administration, big banks, etc. Fortran was used more for scientific work.
    , @Jack D
    The scare quotes around "design" are there because the meaning of design is ambiguous. One of the other commenters thought you meant that Noyes "designed" (really "engineered") the innards of the Selectric, which he didn't. Maybe you understand clearly the difference between product design and product engineering but a lot of people conflate the two. Noyes's case design was clean and pleasant and probably contributed to the success of the product, but the mechanism was what made the Selectric special. There are some products where the design of the skin is what is special about the product and the innards not so much (the Studebaker Avanti), but the Selectric is not one of them.
    , @colm
    No mention of progeny in the Eames husband-and-wife team. It is quite strange how rarely 'artistic' unions produce any progeny.
  52. Anonymous says:
    @Mr. Anon
    "These early personal computers weren’t much more than toys. You could play pong or simple shooting games, maybe do some word processing. And these toys were marketed almost entirely to men and boys."

    And how many women who worked in the computer industry bought these early machines, just to fool around with? My guess is...........just about zero.

    In my experience, women are not that interested in their work as such - they turn it off when they leave work at five. There are now lots of women engineers, or at least women who are called engineers. How many of them have a technical hobby of any kind? How many of them do their own plumbing, or wiring, or car repair at home?

    The kind of “work” women really seem fascinated and captivated by seems to be house and home oriented and shopping. By house and home oriented, I mean that literally. Women are really into houses, from buying them, remodeling, interior decoration, homemaking, etc. Even the most professional women seem captivated by it. It’s really the closest analogue to the male fascination with typically male physical objects like tools, weapons, cars, tech, etc. The activity of shopping, the relentless and interminable browsing, picking up, examining, etc. also seems to fascinate them. Whereas most men have a target based approach to shopping. They have a specific product in mind and target it directly at the store and try to leave as fast as possible.

  53. @Luke Lea
    Wasn't FORTRAN the female programming language of choice? Then it went away -- or at least seemed to.

    The Hansen writing ball never caught on, especially not in the US. The first really practical typewriter was the Remington, which introduced the grid-layout QWERTY “keyboard” that you used to type your message. Until the Selectric, almost all typewriters were modeled after the Remington – pressing a key caused a typebar to fly up and strike the paper – a different typebar for each key. The Selectric was a nice refinement (it produced beautiful-looking documents), but offices had been using typewriters for over 80 years by the time it came on the market. From a secretary’s point of view, the Selectric required almost no retraining from their familiar typewriters.

    Your quote is from an idiotic “Mashable” piece entitled “Mad Men Tech: 9 Devices That Changed the 1960s Office”.

    http://mashable.com/2012/03/22/mad-men-tech/

    #1 is the Selectric, which was not revolutionary at all, as I explain above.

    #2 is the Xerox machine, which WAS truly revolutionary.

    After that, it’s all a stretch – jukeboxes, riding mowers, etc. Huh? Idiocracy here we come!

    • Replies: @Steve Sailer
    "From a secretary’s point of view, the Selectric required almost no retraining from their familiar typewriters."

    Right, it was just better. The only things Disruptive about the Selectric were the arguments and tears among secretaries over who got a Selectric first.

  54. @Luke Lea
    Wasn't FORTRAN the female programming language of choice? Then it went away -- or at least seemed to.

    No, COBOL was the verbose “woman’s” language – this is the language that all those women programmers used to write programs for the Social Security Administration, big banks, etc. Fortran was used more for scientific work.

  55. @syonredux

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism
     
    Why the scare quotes around designed? Doesn't that word accurately describe what Noyes did?

    Further info on the IBM mid-century aesthetic:

    The Selectric typewriter was introduced on 23 July 1961. Its industrial design is credited to influential American designer Eliot Noyes. Noyes had worked on a number of design projects for IBM; prior to his work on the Selectric, he had been commissioned in 1956 by Thomas J. Watson, Jr. to create IBM's first house style: these influential efforts, in which Noyes collaborated with Paul Rand, Marcel Breuer, and Charles Eames, have been referred to as the first "house style" program in American business.
     
    Interestingly, Charles Eames had a creative partnership with his wife, Bernice Alexandra ("Ray") Kaiser Eames:

    Ray-Bernice Alexandra Kaiser Eames (December 15, 1912 – August 21, 1988) was an American artist, designer, and filmmaker who, together with her husband Charles, is responsible for many classic, iconic designs of the 20th century. She was born in Sacramento, California to Alexander and Edna Burr Kaiser, and had a brother named Maurice. After having lived in a number of cities during her youth, in 1933 she graduated from Bennett Women's College in Millbrook, New York, and moved to New York City, where she studied abstract expressionist painting with Hans Hofmann. She was a founder of the American Abstract Artists group in 1936 and displayed paintings in their first show a year later at Riverside Museum in Manhattan. One of her paintings is in the permanent collection of The Whitney Museum of American Art.

    In September 1940, she began studies at the Cranbrook Academy of Art in Bloomfield Hills, Michigan. She met Charles Eames while preparing drawings and models for the Organic Design in Home Furnishings competition and they were married the following year.[7] Settling in Los Angeles, California, Charles and Ray Eames would lead an outstanding career in design and architecture.

    In 1943, 1944, and 1947, Ray Eames designed several covers for the landmark magazine, Arts & Architecture.

    In the late 1940s, Ray Eames created several textile designs, two of which, "Crosspatch" and "Sea Things", were produced by Schiffer Prints, a company that also produced textiles by Salvador Dalí and Frank Lloyd Wright. Original examples of Ray Eames textiles can be found in many art museum collections. The Ray Eames textiles have been re-issued by Maharam as part of their Textiles of the Twentieth Century collection.

    Ray Eames died in Los Angeles in 1988, ten years to the day after Charles. They are buried next to each other in Calvary Cemetery in St. Louis.

     

    The Eames created the famous Mathematica: A World of Numbers...and Beyond interactive exhibition:

    Mathematica: A World of Numbers…and Beyond is an interactive exhibition originally shown at the California Museum of Science and Industry. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.

    In March, 1961 a new science wing at the California Museum of Science and Industry[1] in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond.[2] This was the first of many exhibitions designed by the Eames Office.

    The 3,000-square-foot (280 m2) exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition.[3] Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science.[4]


    Duplicates
    In November, 1961 an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. In January 2014 the exhibit temporarily closed in order for it to undergo much needed refurbishment. [5]

    Another copy was made for the IBM Exhibit at the 1964/1965 New York World's Fair.[6] Subsequently it was briefly on display in New York City, and then installed in the Pacific Science Center in Seattle where it stayed until 1980. It was briefly re-installed in New York city at the 590 Madison Ave IBM Headquarters Building, before being moved to SciTrek in Atlanta, but that organization was shut down in 2004 due to funding cuts. The exhibit was shipped to Petaluma, CA to the daughter of Charles Eames, Lucia Eames. The exhibit is now in the hands of the Eames family, some elements have been on display at the Eames office.

    Men of Modern Mathematics poster
    In 1966, five years after the opening of the Mathematica Exhibit, IBM published a 2-by-12-foot (0.61 m × 3.66 m) timeline poster, titled Men of Modern Mathematics. It was based on the items displayed on the exhibit's History Wall, and free copies were distributed to schools. The timeline covered the period from 1000 AD to approximately 1950 AD, and the poster featured biographical and historical items, along with numerous pictures showing progress in various areas of science, including architecture. The mathematical items in this chart were prepared by Professor Raymond Redheffer[7] of UCLA. Long after the chart was distributed, mathematics departments around the world have proudly displayed this chart on their walls.[8]

    In 2012, IBM Corporation released a free iPad application, Minds of Modern Mathematics, based on the poster but updated to the present. The app was developed by IBM with the assistance of the Eames Office.[9][10]
     

    No, COBOL was the verbose “woman’s” language – this is the language that all those women programmers used to write programs for the Social Security Administration, big banks, etc. Fortran was used more for scientific work.

  56. @syonredux

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism
     
    Why the scare quotes around designed? Doesn't that word accurately describe what Noyes did?

    Further info on the IBM mid-century aesthetic:

    The Selectric typewriter was introduced on 23 July 1961. Its industrial design is credited to influential American designer Eliot Noyes. Noyes had worked on a number of design projects for IBM; prior to his work on the Selectric, he had been commissioned in 1956 by Thomas J. Watson, Jr. to create IBM's first house style: these influential efforts, in which Noyes collaborated with Paul Rand, Marcel Breuer, and Charles Eames, have been referred to as the first "house style" program in American business.
     
    Interestingly, Charles Eames had a creative partnership with his wife, Bernice Alexandra ("Ray") Kaiser Eames:

    Ray-Bernice Alexandra Kaiser Eames (December 15, 1912 – August 21, 1988) was an American artist, designer, and filmmaker who, together with her husband Charles, is responsible for many classic, iconic designs of the 20th century. She was born in Sacramento, California to Alexander and Edna Burr Kaiser, and had a brother named Maurice. After having lived in a number of cities during her youth, in 1933 she graduated from Bennett Women's College in Millbrook, New York, and moved to New York City, where she studied abstract expressionist painting with Hans Hofmann. She was a founder of the American Abstract Artists group in 1936 and displayed paintings in their first show a year later at Riverside Museum in Manhattan. One of her paintings is in the permanent collection of The Whitney Museum of American Art.

    In September 1940, she began studies at the Cranbrook Academy of Art in Bloomfield Hills, Michigan. She met Charles Eames while preparing drawings and models for the Organic Design in Home Furnishings competition and they were married the following year.[7] Settling in Los Angeles, California, Charles and Ray Eames would lead an outstanding career in design and architecture.

    In 1943, 1944, and 1947, Ray Eames designed several covers for the landmark magazine, Arts & Architecture.

    In the late 1940s, Ray Eames created several textile designs, two of which, "Crosspatch" and "Sea Things", were produced by Schiffer Prints, a company that also produced textiles by Salvador Dalí and Frank Lloyd Wright. Original examples of Ray Eames textiles can be found in many art museum collections. The Ray Eames textiles have been re-issued by Maharam as part of their Textiles of the Twentieth Century collection.

    Ray Eames died in Los Angeles in 1988, ten years to the day after Charles. They are buried next to each other in Calvary Cemetery in St. Louis.

     

    The Eames created the famous Mathematica: A World of Numbers...and Beyond interactive exhibition:

    Mathematica: A World of Numbers…and Beyond is an interactive exhibition originally shown at the California Museum of Science and Industry. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.

    In March, 1961 a new science wing at the California Museum of Science and Industry[1] in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond.[2] This was the first of many exhibitions designed by the Eames Office.

    The 3,000-square-foot (280 m2) exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition.[3] Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science.[4]


    Duplicates
    In November, 1961 an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. In January 2014 the exhibit temporarily closed in order for it to undergo much needed refurbishment. [5]

    Another copy was made for the IBM Exhibit at the 1964/1965 New York World's Fair.[6] Subsequently it was briefly on display in New York City, and then installed in the Pacific Science Center in Seattle where it stayed until 1980. It was briefly re-installed in New York city at the 590 Madison Ave IBM Headquarters Building, before being moved to SciTrek in Atlanta, but that organization was shut down in 2004 due to funding cuts. The exhibit was shipped to Petaluma, CA to the daughter of Charles Eames, Lucia Eames. The exhibit is now in the hands of the Eames family, some elements have been on display at the Eames office.

    Men of Modern Mathematics poster
    In 1966, five years after the opening of the Mathematica Exhibit, IBM published a 2-by-12-foot (0.61 m × 3.66 m) timeline poster, titled Men of Modern Mathematics. It was based on the items displayed on the exhibit's History Wall, and free copies were distributed to schools. The timeline covered the period from 1000 AD to approximately 1950 AD, and the poster featured biographical and historical items, along with numerous pictures showing progress in various areas of science, including architecture. The mathematical items in this chart were prepared by Professor Raymond Redheffer[7] of UCLA. Long after the chart was distributed, mathematics departments around the world have proudly displayed this chart on their walls.[8]

    In 2012, IBM Corporation released a free iPad application, Minds of Modern Mathematics, based on the poster but updated to the present. The app was developed by IBM with the assistance of the Eames Office.[9][10]
     

    The scare quotes around “design” are there because the meaning of design is ambiguous. One of the other commenters thought you meant that Noyes “designed” (really “engineered”) the innards of the Selectric, which he didn’t. Maybe you understand clearly the difference between product design and product engineering but a lot of people conflate the two. Noyes’s case design was clean and pleasant and probably contributed to the success of the product, but the mechanism was what made the Selectric special. There are some products where the design of the skin is what is special about the product and the innards not so much (the Studebaker Avanti), but the Selectric is not one of them.

  57. @AnAnon
    There was a graph, I think on twitter, that showed women in computer science peaking in 1985. That was the year before the H-1B visa wasn't it?

    Actually, I believe the H-1B came in 1990, but there were forerunner visa programs that served similar purposes.

    I noticed that corporate America began to see its programming workforce as essentially disposable by the mid-1990s. By then the networks of immigration lawyers, HR hacks and lobbyists supporting the program were well established.

    The dot-com bubble temporarily masked what was happening. By 2001, with the massive H-1B expansion pushed through by Gene Sperling and Elena Kagan under Clinton, and with Dubya letting corporate America know that under no circumstances would H-1B visa overstays be deported, the full impact of the H-1B on programmer employment began to be felt.

    Do not forget the L-1 visa as well.

    • Replies: @AnAnon
    1990 is certainly when the first statistics started getting collected, but I'm sure I've seen references to it before that.
  58. Burpleson,

    “When someone understands how computing has changed, a general theory of the sexes would tell them that women would do better in the old mainframe era. Yes, even with condescending, alcohol drinking, ass pinching, non equality believing male bosses.”

    That “even” is, alas, empirically unsupported.

  59. @Polynices
    A gaming blog I read had some intelligent remarks on this subject: http://blessingofkings.blogspot.com/2014/10/women-in-computer-science.html

    Basically, there were two booms in interest in programming and women moved into the field same as men but when each boom crashed, relatively more women flowed out than men. So it's the old story of women preferring a more stable and predictable field to a boom and bust one. No sexism needed to explain it.

    Thanks for that link. It helps me understand NPR’s “…for decades, the number of women studying computer science was growing faster than the number of men.” Namely, “decades” means fifteen years: rising from essentially zero in 1970, female CS majors peaked in 1985 at a level never since attained.

    Maybe in 2030 NPR will observe that the army still doesn’t have many female combat soldiers, despite the fact that sometime in the early 21st Century, the number of female combat soldiers was “growing faster than the number of men” (a trend that ended, perhaps, when the first all-female combat battalion was captured in the ISIS war and distributed as war booty).

    But to return to computer programming, anyone with real, recent experience in Silicon Valley can tell you that the number of women, or the number of blacks, is now essentially a nonissue. The issue should be that there are hardly any Americans anymore. In fact, if (American-born, slave-descended) blacks were to somehow become 13% of the technical workforce around here (i.e., the overall black share of the US population), they would outnumber all the *white* Americans!

  60. @syonredux

    Noyes “designed” the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism
     
    Why the scare quotes around designed? Doesn't that word accurately describe what Noyes did?

    Further info on the IBM mid-century aesthetic:

    The Selectric typewriter was introduced on 23 July 1961. Its industrial design is credited to influential American designer Eliot Noyes. Noyes had worked on a number of design projects for IBM; prior to his work on the Selectric, he had been commissioned in 1956 by Thomas J. Watson, Jr. to create IBM's corporate design program. These influential efforts, in which Noyes collaborated with Paul Rand, Marcel Breuer, and Charles Eames, have been referred to as the first "house style" program in American business.
     
    Interestingly, Charles Eames had a creative partnership with his wife, Bernice Alexandra ("Ray") Kaiser Eames:

    Ray-Bernice Alexandra Kaiser Eames (December 15, 1912 – August 21, 1988) was an American artist, designer, and filmmaker who, together with her husband Charles, is responsible for many classic, iconic designs of the 20th century. She was born in Sacramento, California to Alexander and Edna Burr Kaiser, and had a brother named Maurice. After having lived in a number of cities during her youth, in 1933 she graduated from Bennett Women's College in Millbrook, New York, and moved to New York City, where she studied abstract expressionist painting with Hans Hofmann. She was a founder of the American Abstract Artists group in 1936 and displayed paintings in their first show a year later at Riverside Museum in Manhattan. One of her paintings is in the permanent collection of The Whitney Museum of American Art.

    In September 1940, she began studies at the Cranbrook Academy of Art in Bloomfield Hills, Michigan. She met Charles Eames while preparing drawings and models for the Organic Design in Home Furnishings competition and they were married the following year.[7] Settling in Los Angeles, California, Charles and Ray Eames would lead an outstanding career in design and architecture.

    In 1943, 1944, and 1947, Ray Eames designed several covers for the landmark magazine, Arts & Architecture.

    In the late 1940s, Ray Eames created several textile designs, two of which, "Crosspatch" and "Sea Things", were produced by Schiffer Prints, a company that also produced textiles by Salvador Dalí and Frank Lloyd Wright. Original examples of Ray Eames textiles can be found in many art museum collections. The Ray Eames textiles have been re-issued by Maharam as part of their Textiles of the Twentieth Century collection.

    Ray Eames died in Los Angeles in 1988, ten years to the day after Charles. They are buried next to each other in Calvary Cemetery in St. Louis.

     

    The Eameses created the famous Mathematica: A World of Numbers...and Beyond interactive exhibition:

    Mathematica: A World of Numbers…and Beyond is an interactive exhibition originally shown at the California Museum of Science and Industry. Duplicates have since been made, and they (as well as the original) have been moved to other institutions.

    In March, 1961 a new science wing at the California Museum of Science and Industry[1] in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond.[2] This was the first of many exhibitions designed by the Eames Office.

    The 3,000-square-foot (280 m2) exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition.[3] Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science.[4]


    Duplicates
    In November 1961, an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. In January 2014 the exhibit temporarily closed in order for it to undergo much-needed refurbishment.[5]

    Another copy was made for the IBM Exhibit at the 1964/1965 New York World's Fair.[6] Subsequently it was briefly on display in New York City, and then installed in the Pacific Science Center in Seattle, where it stayed until 1980. It was briefly re-installed in New York City at the 590 Madison Ave IBM Headquarters Building, before being moved to SciTrek in Atlanta, but that organization was shut down in 2004 due to funding cuts. The exhibit was shipped to Petaluma, CA, to Lucia Eames, the daughter of Charles Eames. The exhibit is now in the hands of the Eames family; some elements have been on display at the Eames Office.

    Men of Modern Mathematics poster
    In 1966, five years after the opening of the Mathematica Exhibit, IBM published a 2-by-12-foot (0.61 m × 3.66 m) timeline poster, titled Men of Modern Mathematics. It was based on the items displayed on the exhibit's History Wall, and free copies were distributed to schools. The timeline covered the period from 1000 AD to approximately 1950 AD, and the poster featured biographical and historical items, along with numerous pictures showing progress in various areas of science, including architecture. The mathematical items in this chart were prepared by Professor Raymond Redheffer[7] of UCLA. Long after the chart was distributed, mathematics departments around the world were still proudly displaying it on their walls.[8]

    In 2012, IBM Corporation released a free iPad application, Minds of Modern Mathematics, based on the poster but updated to the present. The app was developed by IBM with the assistance of the Eames Office.[9][10]
     

    No mention of progeny from the Eames husband-and-wife team. It is quite striking how rarely ‘artistic’ unions produce any progeny.

  61. @Jack D
    Noyes "designed" the outer skin of the typewriter, which was pleasant enough, but what really made the Selectric special was its mechanism - the spinning and tilting golf ball which was driven by an ingenious mechanical "computer" called a "whiffletree"- it was digital but NOT electronic. It had an electric motor to drive the mechanism but it would have worked just the same if it was driven by a foot treadle (like an old sewing machine). Every key press mechanically generated a unique two digit "code" - one of which was the number of steps to rotate the ball from its resting position and the other was the number of steps of tilt. This, and not Noyes's case, is what made the Selectric special.

    What was special about the IBM Selectric was how easy it was to correct mistakes. If you hit one key, it would go back and “erase” the last word you typed by typing the exact same letters in the same order with white ink. With other typewriters, corrections were a more tedious process: you had to put in a white-ink cartridge, backspace over your mistake, and then type over it. Or you could use whiteout and then wait for the paper to dry.

  62. I would definitely have loved to bang Ada Lovelace. I just get the feeling that she’d be mad, bad and delightful to know. Good pillow talk, too.

  63. @Dave Pinsen

    IIRC, that feature appeared around about 1977 and they were chasing Smith Corona, who had it earlier (or were mass marketing earlier). The previous versions of the Selectric did not have that.

  64. People have been mentioning the IBM Selectric typewriter, but mention should also be made of the work done by the Italians at Olivetti. They developed the first programmable desktop computer, the Programma 101, which was used by NASA for the Apollo 11 landing and was ripped off by Hewlett-Packard to build their 9100A. I remember a mention of this HP desktop computer in the biography of Steve Jobs. Apparently he was very impressed by one during a tour of Hewlett-Packard he was given as part of his high school electronics club.

    Here’s a 3min clip on the Olivetti Programma 101. Notice how the design team didn’t have to look like Hobbits or ageing skateboarders to do groundbreaking work:
    Programma 101- Memory of Future: http://www.youtube.com/watch?v=lpkqdbz1R_s

  65. @David

    On a slightly more refined level, that’s what Tom Stoppard’s “Arcadia” is about: being in love with Ada Lovelace.

  66. @Jack D
    The Hansen writing ball never caught on, especially not in the US. The first really practical typewriter was the Remington, which introduced the grid-layout QWERTY "keyboard" you used to type your message. Until the Selectric, almost all typewriters were modeled after the Remington - pressing a key caused a typebar to fly up and strike the paper - a different typebar for each key. The Selectric was a nice refinement (it produced beautiful-looking documents), but offices had been using typewriters for over 80 years by the time it came on the market. From a secretary's point of view, the Selectric required almost no retraining from their familiar typewriters.

    Your quote is from an idiotic Mashable piece entitled "Mad Men Tech: 9 Devices That Changed the 1960s Office".

    http://mashable.com/2012/03/22/mad-men-tech/

    #1 is the Selectric, which was not revolutionary at all, as I explain above.

    #2 is the Xerox machine, which WAS truly revolutionary.

    After that, it's all a stretch - jukeboxes, riding mowers, etc. Huh? Idiocracy here we come!

    “From a secretary’s point of view, the Selectric required almost no retraining from their familiar typewriters.”

    Right, it was just better. The only things Disruptive about the Selectric were the arguments and tears among secretaries over who got a Selectric first.

  67. @Marty
    It would be an interesting study to find out whether the introduction of PCs in offices in the 1980s, especially law firms (see Shouting Thomas above), in preference to the Selectric III actually lowered productivity and cost the firms money. In the Selectric era, the way offices worked was that secretaries sat at their machines and basically didn't move while successive typing tasks were handed to them. There was very little socializing going on. I was in law firms as late as the late '90s, and even then the promise held out by the PC was basically for two tasks: addressing bulk envelopes via mail merge, and automating a table of authorities in a legal brief. The first task, while accomplished effectively, was useless because you can't drum up legal business just by sending a bunch of "we're here!" letters over the transom. The second, the TOA, almost never worked, and would have been done in less time manually. The PC revolution did have one valuable use for plaintiff's firms in the early years: when you forgot about a client's file for a few years and faced a dismissal motion and thus a potential malpractice claim, you'd tell the judge that the years-long failure to prosecute was the result of a "computer glitch," and often the court would buy it.

    And in the 1980s there was the diversion of smart young fellows from the main business of the firm into getting personal computers to work (cough, cough).

  68. @Dave Pinsen

    The Correcting Selectric was not introduced until 1973, by which point the Selectric had been on the market for over a decade.

  69. You guys make me feel old.

    My first RAM cost me $500 for 16 KB.

    My current memory (512MB) would have cost me $16 billion in 1977.
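    Taking the $500-for-16 KB price at face value, the scaling is easy to check (a quick Python calculation; at that rate 512 MB comes to about $16 million, while $16 billion corresponds to roughly 512 GB):

        # Back-of-the-envelope: scale the 1977 price of $500 per 16 KB.
        price_per_kb = 500 / 16                  # $31.25 per KB

        print(512 * 1024 * price_per_kb)         # 512 MB -> 16,384,000      (~$16 million)
        print(512 * 1024 * 1024 * price_per_kb)  # 512 GB -> 16,777,216,000  (~$16.8 billion)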

  70. I find it amazing that people seem to think that IBM Mainframes are not an advanced technology because of COBOL.

    In case anybody did not know this, COBOL is an application programming language. It is short for COmmon Business-Oriented Language. It was designed to deal with data that is usually discrete and, if not, does not extend beyond two decimal places. This is what most business data is. Accounting data does not go beyond the penny ($00.01) in its notation, and inventory data is usually specified in whole numbers. Developing an application language designed to deal with this data specifically, plus being self-documenting in an English-like syntax, meant anyone could read COBOL code and understand how their data was being manipulated. It’s an excellent computer language for what it is designed to do and far superior to the various junk software they use today.
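    The point about money data is easy to see in any language. Here is a minimal Python sketch, with the decimal module standing in for what a COBOL picture clause such as PIC 9(7)V99 declares directly:

        from decimal import Decimal, ROUND_HALF_UP

        # Binary floating point drifts on cent amounts...
        print(0.10 + 0.20)                       # 0.30000000000000004

        # ...while decimal fixed-point stays exact to the penny, which is
        # the assumption business-oriented languages like COBOL build in.
        price = Decimal("19.99")
        total = (price * 3).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
        print(total)                             # 59.97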

    If you needed an application programming language to handle scientific and engineering applications then you used FORTRAN. Again, FORTRAN is an application programming language.

    Both COBOL and FORTRAN were built using ASSEMBLER. ASSEMBLER is also the language used to build OS/390, the operating system that runs IBM Mainframes.

    Hearing PC programmers brag about how good their programming is because of linked lists, arrays and tables is hilarious. They are essentially adapting a systems programming language (C or C++) designed for building operating systems like UNIX to designing custom-made business applications from scratch. This is like using Assembler to build an accounting program from scratch. Who does that? That’s like designing your own compiler each and every time to build a program.

    Then there is the “verbose” charge. You do understand that anything above hexadecimal notation is “verbose,” right? That it is entirely irrelevant how obscure or language-like a programming language is, right? You are not getting any more speed or precision using C++ over the English-like COBOL, because COBOL compilers are just as fast as C compilers.

    In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user. This is why corporate America prefers to outsource programmers and treat them like assembly-line workers. They don’t want PC obscurantism ruining their businesses.

    • Replies: @Ivy
    If their Assembler approach doesn't work, maybe you could put a Hex on them, as they clearly need to Shift-Right-Logic :)

    Fortran stems from Formula Translation, indicative of its scientific-application roots. A lot of work was done on it at Waterloo in Ontario, hence Watfor (Waterloo Fortran) and Watfiv (Waterloo Fortran IV).

    Please take a moment to send kind thoughts to the brave people in Ottawa.
  71. @map

    “In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user.”

    That’s what the lady running the mainframes at the company I worked for in the 1980s constantly told me as she sabotaged my plans for PCs. She had a point, too.

    • Replies: @map
    I actually got into computer programming through mainframes in the late '90s. I trained to work on the Y2K stuff and later maintained programs for a large bank. The system was built on COBOL and DB2, with TSO tying everything together and JCL executing all of the jobs. I really learned to appreciate the power and simplicity of this technology that was refined over 40 years.

    One thing that I became acutely aware of is how the 4GL universe vilified the 3GL world. The OO guys would constantly vilify the old-fogey COBOL people as obsolete types who built spaghetti code that was impossible to maintain.

    In fact, the system I used was built by Andersen Consulting in the mid-'90s and was used to manage SWIFT data. Far from spaghetti code, this COBOL implementation was using CALL...USING features that were very similar to object-oriented design.

    I learned that software operates on a "tear-down" mentality in order to sell product. It is a very dishonest and disingenuous business.
    , @Laban
    “In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user.”

    With virtualisation and thin clients we are indeed going full circle. For "data centre" read "server farm".
  72. @JImbo
    The first PCs (mostly running CP/M) were products of the West Coast. In order to come up with a competitive machine, IBM had to sequester a group of engineers far away from HQ in Boca Raton, Florida, for a year to cobble something together out of off-the-shelf parts.

    Very true.

    This is a digression, but it might help clear up a few points:

    By the early ’80s, IBM’s bureaucracy had grown stagnant. The company’s various divisions saw their main competitors not as other companies but as other IBM divisions whose new designs might cannibalize the sales of existing products. (They failed to appreciate that Tom Watson had cannibalized the entire company when IBM introduced the System/360.) New machines took years to wind their way through various committees to come to market, and were often woefully underpowered and ridiculously overpriced by the time they did so. The higher-ups realized that, if the PC team were hamstrung by the company’s standard operating procedures – rules designed to *stifle* innovation, not promote it – IBM would never be able to compete in such a fast-moving marketplace.

    The greatest factor behind the success of the PC (besides the fact that it was a fairly well-designed machine that had the Big Blue imprimatur) and its successors (the XT and the AT) was that IBM published all of the technical specifications and encouraged third parties to develop add-on products that extended the machine’s functionality. Within months of the PC’s introduction (it was announced in August 1981 and shipped in October), companies such as Tecmar were offering peripherals such as hard drives and enhanced graphics cards. Competition kept prices down and spurred constant improvement.

    Consider that, in the late ’70s and early ’80s, IBM was embroiled in a massive antitrust lawsuit. The company took pains to demonstrate to the FTC that it was not attempting to steamroll its competitors by introducing a “closed” machine.

    Apple, on the other hand, went to extreme lengths to discourage anyone from making any add-ons to the Macintosh. Steve Jobs insisted that the first Mac be limited to 128K of RAM. (Even in 1984, 128K was considered a ludicrously small amount for a system of the Mac’s complexity. The first Macs were so feeble that they could not be used for software development – all early Mac programs were written on Lisas.) He also insisted that it be a completely closed machine, with no expansion slots. The result was a slow, expensive clunker whose “killer apps” were a paint program suitable mainly for drawing smiley faces and a word processor that crapped out after 10 pages. The Mac initially sold very poorly and might have failed completely had Aldus PageMaker and the Apple LaserWriter not come along to invent the desktop-publishing market. The Mac was largely consigned to the graphics ghetto for years to come.

    IBM’s lead in the PC arena slipped in the mid-’80s, as a slew of “clones” cropped up to satisfy the insatiable public demand for PC-compatible DOS machines. In September 1986, the best of the clone-makers, Compaq, beat IBM to market with a machine based on Intel’s 386 processor. By this time, a number of IBM flops – including such prominent duds as the much-maligned PCjr home computer and the execrable PC Convertible laptop – had tarnished the company’s once-sterling reputation.

    In April 1987, IBM introduced the PS/2 with the MCA bus – a proprietary architecture. Its antitrust issues long settled, IBM now insisted that any companies that wished to build PS/2 clones pay hefty licensing fees. Years earlier, this ploy might have worked, but users were no longer willing to play along with IBM’s monopolistic schemes. The company’s sales plummeted even as the overall PC market mushroomed, in large part because the clones were now more “IBM-compatible” than IBM itself.

    I’m leaving out *a lot*. I haven’t even mentioned the role that Microsoft played in all of this. But that’s enough for one post.

  73. @Art Deco

    Interesting. TIL.

  74. This rings familiar, yes?

    “So, it’s like Society then engaged in a Giant Conspiracy to undermine the Rousseauan paradise of the gender equal computing industry before The Evil Woz came along and ruined everything by inventing the personal computer.”

    http://mamarracho.tumblr.com/post/116024738/and-then-that-pussy-cobain-came-along-and-ruined

    I lift structure from pop culture all the time and I don’t review movies for a living. I wonder how often you slip these in?

  75. Anonymous says:

    Why is this a mystery? A woman, or even a man, with the brains to program a computer could make way more money way easier as a manager. A better question is why don’t more men realize this?

  76. @Stan Adams

    In Ridley Scott’s famous “1984” Super Bowl commercial introducing the Mac, Big Brother looks increasingly like Steve Jobs.

  77. anonymous says:

    Old-time computer guy here. (I learned to program, sort of, on an NCR Century 200 mainframe; the first place I worked at still had a running 1958 Philco 2000 mainframe, one of the first transistorized machines, perhaps the first real commercial transistorized computer. BTW, it was a 48-bit machine, all tapes, no disks or drums.) I’ve been in the computer business ever since, all across the technology spectrum.

    “Computer science research has always been dominated by men.”

    Dominated numerically, but not exclusively. Someone pointed out Ada Lovelace invented the programming loop about 100 years ahead of her time, writing numerical analysis code without a real computer (and publishing a very influential paper (appendix) about computing); Grace Hopper invented the subroutine library and perhaps the compiler concept, I think… I think it was the women computers who wired up the ENIAC who actually developed the register-based simplification of the async ENIAC that became the synchronous von Neumann architecture when he wrote it up… A lot of these women had advanced degrees in math.

    Also, “Computer Science” has today become a branch of math. (And this may have had a huge effect on the entry of women into the field.) It wasn’t clear this would happen in the beginning. Some departments evolved from EE departments, some from math departments, and an influential third from “library science” or “informatics” departments (big data, google-type text-analysis, etc.). There always seemed to be a fair number of women in library science. Pity that there never really were “Software Engineering” departments, although CMU, Wang, and a few other places later tried to address that, though mostly seem to have failed.

    “…from essentially zero in 1970…”

    For what it’s worth I just checked, at one of the first schools to offer a CS degree, in 1971 about half the CS graduates were women. And the claim is that all went on to successful careers in the field. The thing was that a lot of schools established their CS Departments in the 1970s. CS was a new degree in 1970, with not that many existing institutions that granted degrees.

    “Someone feel free to correct me here, but isn’t the “programming” of room-sized mainframe computers less complicated than learning the sorts of languages that are required to master programming on personal computers? It’s not like these women were dealing with the same kind of issues as contemporary code monkeys?”

    No. In any environment you can find a “code monkey” job, just designing form layouts or whatever. It is my belief from doing both that, surprisingly, there is almost no difference in the mental complexity of dealing with:

    (1) JCL; CICS; TSO; assembly; all this interfaced to Cobol, Fortran, and PL/1 code; dealing with mainframe I/O systems (ISAM, etc., sort of like SQLite today); mainframe forms management systems; dealing with lots of different kinds of early databases (and database wars back before the relational/SQL standard); dealing with networking using HASP RJE; and dealing with the lack of standard character sets and floating-point formats, etc. The “block-mode” mainframe terminals today are browsers, but still… (Actually, you can still find a lot of this stuff running in an IBM mainframe shop, and there’s a chance that a lot of the backend data processing that affects your life is still done this way.)

    (2) All the large frameworks one works in today in almost any area (Linux system programming; Windows programming, large web frameworks with front-side languages and back-end languages, message buses and middleware, large scale-out web-backends, etc..) It’s actually amazing how little things have changed at a “how hard this is” level. Oh, the details are all different. But the relative complexity of the tools seems essentially unchanged.

    We should have made things a lot easier by now. A lot of the tools that are out there are an attempt to make things easier… some have helped; everybody these days uses pretty interactive debuggers that make things, in theory, easier than working with core dumps, for example. (A lot of the H1-B types are convinced at first that they just need to be a great debugger jockey and they can figure it all out without needing a “bigger picture”…) But the underlying complexity due to asynchronous events and concurrency is pretty much the same.

    (3) Throw in the minicomputer ecosystem, different from the other two and with its own challenges…

    If anything, older systems may have been a little harder. Less emphasis on higher-level languages, more need to know low-level things about the OS and filesystems (for instance, individual tracks on disk belonging to the app’s files were formatted for the file I/O workload of the application, as specified by the programmer). A large reason Unix became popular was that it was seen as a much simpler and smaller environment.

    Indeed, we all expected things would have gotten a lot better and easier by 2014. Heck, we thought we’d have AI by now, and I’m still using editors I first used in the mid-’70s! Nearly 50 years of tool development designed to make the job easier, and we haven’t had that much unequivocal success. To some extent the “difficulties” of the current environment are an artificial manifestation of the immaturity of web technology as a programming platform. Each technology seems to have to go through the same cycle to end up at about the same point.

    The women programmers who I knew (some whom I trained and some who trained me) dealt with all of the mainframe issues I mentioned.

    It’s important to note that not all women were programmers. You had a large number of women doing keypunch, which needed no programming skill whatsoever. You had a large number of women computer operators. (These were like system admins, but they were much more in the real-time loop of running the machine.) In truth, many of these operators worked for a (usually but not always) male boss who was the lead system programmer for the installation. (Mainframe OSes were not open source, but they were distributed in source code and patched, debugged, etc., in source code, usually by a dedicated employee.) This “guy” usually spec’d and ordered all the new equipment, basically ran the “glass cathedral,” and told users (and other management types) how things were going to be. This was usually seen as man’s work, negotiating in a largely man’s world. They were quite vulnerable to savvy computer sales ladies.

    “Were these female “programmers” really doing software engineering as we think of it today? Did they understand algorithms and data structures, CPU architecture, how operating systems worked, etc.?”

    The core programmers, yes, unequivocally. (As an example, I think it was a woman system programmer, Radia Perlman, who had a lot to do with getting the original Ethernet to work in the field.)

    “Or were they doing what would today be called “scripting” using languages like COBOL?”

    Nobody would call Cobol a decent scripting language. If anything, you’d consider it a database-oriented and record-oriented language (though it preceded databases). Although Cobol was invented by women, that wasn’t really widely known by the end of the 60s, when women coders had become common. There may have been slightly more women Cobol coders, but there were also female Fortran programmers. (I never noticed a great deal of difference between male and female Fortran coders, except perhaps for the amount of math they knew if they actually had to do real math.) PL/1 seemed sexually neutral. (PL/1 was surprisingly influential.)

    I’m not making a paragon out of women in computing. My point is they were just there, like everyone else, and it was a nice predictable white-collar job for a married woman and, if one wanted to look at it that way, a place where a young woman would meet a lot of promising young men. (It’s not so much this way any more, either in school or in the bowels of Silicon Valley… well, that’s not really true for non-whites; there are a lot of arranged marriages in Silicon Valley between Indian men and women who are both programmers or engineers. H1-B status and occupation is very important in the arranged marriage market.)

    There was still the male/female difference. A very good woman programmer once asked me, “You and N- are the only two guys in the office who just love to program even if you don’t need to. Why is that?”

    “…what really made the Selectric special was its mechanism – the spinning and tilting golf ball ..”

    IBM computer consoles were often modified Selectrics. I’ve spent a huge amount of time using Selectrics as terminals. (And Selectrics with Mag Card readers!) The replaceable spinning ball was really useful on terminals. You could replace the type-ball in seconds and switch from an “English” typewriter/terminal to an APL terminal. (The APL language had its own entire character set, with a lot of Greek and math symbols in it; on some mainframes it was heavily used. These days you’d use R, which borrows a lot from APL.) I’ve got to admit to loving APL on a Selectric.
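    For anyone who never saw APL, here is a loose Python/NumPy analogy of the whole-array, no-explicit-loops style that APL pioneered and R inherited (the APL expressions in the comments are standard one-origin APL, shown only for flavor):

        import numpy as np

        # APL-style "whole array at once" operations, no explicit loops.
        x = np.arange(1, 11)           # APL: ⍳10        (the vector 1 2 ... 10)
        total = x.sum()                # APL: +/x        (plus-reduction)
        mean = total / x.size          # APL: (+/x)÷⍴x   (sum divided by length)
        print(total, mean)             # 55 5.5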

    “…Back then, there were no search engines or software in fighter jets. Most of the jobs back in the day on those mainframes was data processing and arithmetic. …”

    I wouldn’t go this far. A lot of the early text processing research was done by Gerard Salton and his group in the early 60s, the folks who later did a lot of work on the SMART information retrieval system (mostly in the 60s and 70s). These folks were working in the library-science thread of things (“Salton was perhaps most well known for developing the now widely used vector space model for Information Retrieval.”). He wrote the original book on the subject, Automatic Text Processing: The Transformation, Analysis, and Retrieval of Information by Computer.

    Two large systems where his ideas were used, and that were very much search engines (for what today we’d use Google), were Lexis, for lawyers to do legal search on all the laws and case history (“During the 1970s, LexisNexis pioneered the electronic accessibility of legal and journalistic documents. As of 2006, the company has the world’s largest electronic database for legal and public-records related information.”), and Nexis, for newspapers to search news stories. They differed from today in that they were not free.
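    Salton’s basic vector space idea is small enough to sketch: treat each document as a vector of term counts and rank it against a query by cosine similarity. A toy Python version (nothing here is taken from SMART itself, just the general idea):

        import math
        from collections import Counter

        docs = ["women programmed the first digital computers",
                "ibm sold mainframe computers to business",
                "the selectric was an electric typewriter"]
        query = "mainframe computers"

        def vec(text):
            return Counter(text.lower().split())    # term -> count

        def cosine(a, b):
            dot = sum(a[t] * b[t] for t in a)
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb) if na and nb else 0.0

        q = vec(query)
        for d in sorted(docs, key=lambda d: cosine(vec(d), q), reverse=True):
            print(round(cosine(vec(d), q), 3), d)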

    About fighter jets: by the early 70s there was a lot of software in things like the F-111 and the A-5 Vigilante (probably introduced around 1960). It was its computer that allowed the F-111 to hug the ground at high speed.

    In the 1950s there was a huge national program, SAGE, to control fighter jets such as the F-106 from ground-based computers (the computers on the ground could fly and fight the planes, at least in theory, with the exception of landing and takeoff, I think; the planes could carry nuclear missiles to shoot down Soviet bomber formations).

    The case can be made that SAGE was the single most important reason for the explosion of the computer industry in the 60s. Many basic things, like modems, were invented as part of the SAGE project. (There were other reasons, like IBM’s increasing electronic implementation of its tab shop equipment, but SAGE really jump-started the economy of the industry.)

    • Replies: @The most deplorable one

    Dominated numerically, but not exclusively. Someone pointed out Ada Lovelace invented the programming loop about 100 years ahead of her time, writing numerical analysis code without a real computer (and publishing a very influential paper (appendix) about computing)
     
    It is funny how everyone knows stuff that is not true.

    On the question of who the first computer programmer was, there is no confusion what so ever and it was not Ada Lovelace. The Menabrea Memoir that Ada had translated already contained examples of programmes for the Analytical Engine that Babbage had used to illustrate his Turin lectures and had actually developed several years before. The notes contain further examples from the same source that Babbage supplied to the authoress. The only new programme example developed for the notes was the one to determine the so-called Bernoulli numbers. Quite who contributed what to this programme is open to dispute. In his autobiography, written several years after Ada’s death, Babbage claims that Ada suggested the programme, which he then wrote, although noting that she had spotted a serious error in the original. The correspondence suggests that Ada was much more actively involved in the development of the programme and should perhaps be given more credit than Babbage allowed her. Whatever the truth of the matter Ada Lovelace was neither a mathematician nor the first computer programmer.
     
  78. IBM emphasized how anti-Disruptive its computers were

    Very true, with rare exceptions (behind the scenes, things were turbulent enough that they had to play up the animal spirits, however obliquely, in certain marketing channels). More often their tack was similar to a life insurance company’s, with the Greek columns, cloudless sky, and pantsuits — or some kind of opposite-of-sexy cologne your business sprayed on to become 50% more successful. In fact I think they had those ridiculous “globalism” TV spots, a typical one being a mustachioed pizza guy in Sicily or wherever, with Big Blue’s expertise imparting good fortune to his operations down in the village.

    Today it seems really quaint against the tenor of whatever Gavin Volure Sunstream nonsense is cooked up to get Sand Hill Road salivating. I don’t recall IBM ever trying the game-changing, map-shredding, wall-tumbling Dread Pirate Roberts shtick you can sample in vintage Enron commercials.

  79. @EriK
    Right, because Digital Equipment Corporation was the west coast of the north shore.

    “There is no reason for any individual to have a computer in his home!”–Ken Olsen, DEC CEO, 1977

    Olsen’s apologists say that quote was taken out of context, but if he didn’t believe it, why was he so intransigent about the PC? From a company almost as large as, and more profitable than, IBM in the late 80s to a piece of freaking Compaq 5 years later.

  80. anonymous says:

    At least one thing I garbled in the mention of SAGE. SAGE was a national system of connected computers and radars, intended to protect the US from Soviet bombers. (It was inspired by the system the British used to win the Battle of Britain.) The ground computers controlled planes such as the F-106. The SAGE computers themselves were the physically largest computers ever built. Apparently 24 were deployed. They operated very reliably for a long time.

    ICBMs made SAGE instantly obsolete. (Maybe it was SAGE that had made it necessary for the Soviets to spend so much money on developing and building ICBMs.)

    The hulk of a SAGE radar station is still standing guard over Silicon Valley and the Bay Area. Here is a large pic (the Almaden "Monolith").

  81. @Steve Sailer

    Yes, indeed.

    I’ve always found it amusing that people portray Apple as a bulwark of iconoclastic individualism struggling mightily against the Borg-like Microsoft. At his best and at his worst, Steve Jobs was a consummate control freak who tolerated no dissension.

    The great irony of the Apple/Microsoft rivalry is that the latter company owes much of its success over the years to its early dominance of the Mac applications market. Not one of Microsoft’s DOS applications ever came close to beating its leading competitors (Word lagged far behind WordPerfect, Multiplan behind 1-2-3, and so on), but the Mac versions of Word, Excel, and PowerPoint owned their markets almost literally from the days they were introduced. This success provided Microsoft not only with a steady stream of revenue, but with expertise in creating GUI software.

    In the early ’90s, when the PC market shifted from DOS to Windows, Microsoft was able to introduce a suite of Windows applications that looked and worked much the same as their Mac counterparts*. Companies such as WordPerfect and Lotus stumbled badly as they rushed to produce graphical versions of their market-leading DOS programs. By the mid-’90s, once-hotly-contested program categories such as word processors and spreadsheets were totally controlled by Microsoft.

    *Many people recall Microsoft’s disastrous attempt to port the Windows version of Word to the Mac, an effort that produced the hideously buggy and slow Word 6.0. But even before that colossal blunder, Microsoft’s Windows products largely resembled their Mac counterparts – the early PC versions of Excel and PowerPoint, in particular, were almost totally identical to their corresponding Mac versions in terms of interfaces and feature sets.

  82. –who, by the way, was a GENIUS and a billionaire; but then he has sex with *one* computer and that’s his legacy??

  83. anonymous says:

    Sigh. Sadly, I still got the link to the Almaden Monolith wrong. It’s cursed, I tell ya!

    But this allows me to mention the biggest thing that made older software in some ways much harder to write: memory. There is so much more memory available today. Much, much more. And virtual memory. This usually makes things easier. (Heh, heh, no Overlay Description Languages for defining multi-rooted memory-mapped overlay trees.)

    I think the biggest mainframe I worked on might have had 374K or something like that. And the funny thing was, you could support hundreds of terminals on a system of that size… (An important reason was that, by and large, the terminals didn't have byte-level interrupts. But a browser running Javascript today doesn't create byte-level interrupts back on the server either…)

    Here are some interesting pictures of the monolith and the inside of an old Cold War SAGE radar station. (“Recent Photos of Almaden AFS, CA”)

  84. The most deplorable one [AKA "Fourth doorman of the apocalypse"] says:
    @anonymous
    Old time computer guy here (learned to program, sort of, on an NCR-200 Century mainframe, first place I worked at still had a running 1958 Philco 2000 mainframe (one of the first transistorized machines, perhaps the first real commercial transistorized computers, BTW it was a 48-bit machine, all tapes, no disks or drums.) Been in the computer business ever since, all across the technology spectrum.

    "Computer science research has always been dominated by men."

    Dominated numerically, but not exclusively. Someone pointed out Ada Lovelace invented the programming loop about 100 years ahead of her time, writing numerical analysis code without a real computer (and publishing a very influential paper (appendix) about computing); Grace Hooper invented the subroutine library and perhaps the compiler concept I think... I think it was the women computers who wired up the ENIAC who actually developed the register-based simplification of the async ENIAC that became the synchronous von Neumann architecture when he wrote it up... A lot of these women had advanced degrees in math.

    Also, "Computer Science" has today become a branch of math. (And this may have had a huge effect on the entry of women into the field.) It wasn't clear this would happen in the beginning. Some departments evolved from EE departments, some from math departments, and an influential third from "library science" or "informatics" departments (big data, google-type text-analysis, etc.). There always seemed to be a fair number of women in library science. Pity that there never really were "Software Engineering" departments, although CMU, Wang, and a few other places later tried to address that, though mostly seem to have failed.

    "...from essentially zero in 1970..."

    For what it's worth I just checked, at one of the first schools to offer a CS degree, in 1971 about half the CS graduates were women. And the claim is that all went on to successful careers in the field. The thing was that a lot of schools established their CS Departments in the 1970s. CS was a new degree in 1970, with not that many existing institutions that granted degrees.

    "Someone feel free to correct me here, but isn’t the “programming” of room-sized mainframe computers less complicated than learning the sorts of languages that are required to master programming on personal computers? It’s not like these women were dealing with the same kind of issues as contemporary code monkeys?"

    No. In any environment you can find a "code monkey" job, just designing form layouts or whatever. It is my belief from doing both that, surprisingly, there is almost no difference in the mental complexity of dealing with:

    (1) JCL; CICS; TSO; assembly; all this interfaced to Cobol, Fortran, and PL/1 code; dealing with mainframe I/O systems (ISAM, etc, sort of like SQLite today); mainframe forms management systems; dealing with lots of different kinds of early databases (and database wars back before the relation/SQL standard); dealing with networking using HASP RJE, and dealing with the lack of standard character and floating point standards, etc.. The "block-mode" mainframe terminals today are browsers, but still... (Actually, you can still find a lot of this stuff running in an IBM mainframe shop and there's a chance that a lot of the backend data processing that effects your life is still done this way.)

    (2) All the large frameworks one works in today in almost any area (Linux system programming; Windows programming, large web frameworks with front-side languages and back-end languages, message buses and middleware, large scale-out web-backends, etc..) It's actually amazing how little things have changed at a "how hard this is" level. Oh, the details are all different. But the relative complexity of the tools seems essentially unchanged.

    We should have made things a lot easier by now. A lot of those tools that are out there is an attempt to make things easier... some have helped, everybody these days uses pretty interactive debuggers that make things, in theory, easier than working with core-dumps, for example. (A lot of the H1-B types are convinced at first that they just need to be a great debugger jockey and they can figure it all out without needed a "bugger picture"...) But the underlying complexity due to asynchronous events and concurrency is pretty much the same.

    (3) Throw in the minicomputer eco-system, different from the other two and which had it's own challenges...

    If anything, older systems may have been a little harder. There was less emphasis on higher-level languages and more need to know low-level things about the OS and filesystems (for instance, the individual disk tracks belonging to an app's files were formatted for the file I/O workload of that application, as specified by the programmer). A large reason Unix became popular was that it was seen as a much simpler and smaller environment.

    Indeed, we all expected things would have gotten a lot better and easier by 2014. Heck, we thought we'd have AI by now, and I'm still using editors I first used in the mid-70s! Nearly 50 years of tool development designed to make the job easier, and we haven't had that much unequivocal success. To some extent the "difficulties" of the current environment are an artificial manifestation of the immaturity of web technology as a programming platform. Each technology seems to have to go through the same cycle to end up at about the same point.


    The women programmers whom I knew (some of whom I trained and some of whom trained me) dealt with all of the mainframe issues I mentioned.

    It's important to note that not all women were programmers. You had a large number of women doing keypunch, which needed no programming skill whatsoever. You had a large number of women computer operators. (These were like system admins, but they were much more in the real-time loop of running the machine.) In truth, many of these operators worked for a (usually but not always) male boss who was the lead system programmer for the installation. (Mainframe OSes were not open source, but they were distributed in source code and patched, debugged, etc., in source by a dedicated employee, usually.) This "guy" usually spec'd and ordered all the new equipment and basically ran the "glass cathedral" and told users (and other management types) how things were going to be. This was usually seen as man's work, negotiating in a largely man's world. They were quite vulnerable to savvy computer sales ladies.

    "Were these female “programmers” really doing software engineering as we think of it today? Did they understand algorithms and data structures, CPU architecture, how operating systems worked, etc.?"

    The core programmers, yes, unequivocally. (As an example, I think it was a woman system programmer, Radia Perlman, who had a lot to do with getting the original Ethernet to work in the field.)


    "Or were they doing what would today be called “scripting” using languages like COBOL?"

    Nobody would call Cobol a decent scripting language. If anything you'd consider it a database-oriented and record-oriented language (though it preceded databases). Although Cobol was invented by women, that wasn't really widely known by the end of the 60s, when women coders had become common. There may have been slightly more women Cobol coders, but there were also female Fortran programmers. (I never noticed a great deal of difference between male and female Fortran coders, except perhaps in the amount of math they knew if they actually had to do real math.) PL/1 seemed sex-neutral. (PL/1 was surprisingly influential.)


    I'm not making a paragon out of women in computing. My point is they were just there, like everyone else; it was a nice, predictable white-collar job for a married woman and, if one wanted to look at it that way, a place where a young woman would meet a lot of promising young men. (It's not so much this way any more, either in school or in the bowels of Silicon Valley... well, that's not really true for non-whites; there are a lot of arranged marriages in Silicon Valley between Indian men and women who are both programmers or engineers. H1-B status and occupation is very important in the arranged-marriage market.)

    There was still the male/female difference. A very good woman programmer once asked me, "You and N- are the only two guys in the office who just love to program even if you don't need to. Why is that?"


    "...what really made the Selectric special was its mechanism – the spinning and tilting golf ball .."

    IBM computer consoles were often modified Selectrics. I've spent a huge amount of time using Selectrics as terminals (and Selectrics with Mag Card readers!). The replaceable spinning ball was really useful on terminals: you could replace the type-ball in seconds and switch from an "english" typewriter/terminal to an APL terminal. (The APL language had its own entire character set, with a lot of Greek and math symbols in it, and on some mainframes it was heavily used. These days you'd use R, which borrows a lot from APL.) I've got to admit to loving APL on a Selectric.
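    (For readers who never saw it: APL works on whole arrays with single symbols, a style that R and NumPy later absorbed. Below is a rough Python rendering of a few one-liner APL idioms; the APL shown in the comments is standard notation, and the Python is just an analogue, not a claim about how anyone's Selectric-era code looked.)

        # APL-style whole-array operations, rendered in plain Python.
        n = 10
        v = list(range(1, n + 1))           # APL: ⍳10        -> 1 2 3 ... 10
        total = sum(v)                      # APL: +/⍳10      -> 55 (sum reduction)
        squares = [x * x for x in v]        # APL: (⍳10)*2    (* is "power" in APL)
        mean = sum(squares) / len(squares)  # classic APL average idiom: (+/X)÷⍴X
        print(total, mean)                  # 55 38.5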

    "...Back then, there were no search engines or software in fighter jets. Most of the jobs back in the day on those mainframes was data processing and arithmetic. ..."

    I wouldn't go this far. A lot of the early text-processing research was done by Gerard Salton and his group in the early 60s, the folks who later did a lot of work on the SMART information retrieval system (mostly in the 60s and 70s). These folks were working in the library-science thread of things ("Salton was perhaps most well known for developing the now widely used vector space model for Information Retrieval."). He wrote the original book on the subject, "Automatic Text Processing: The Transformation, Analysis, and Retrieval of Information by Computer".

    Two large systems where his ideas were used, and that were very much search engines (doing what we'd use Google for today), were Lexis, for lawyers to do legal search on the laws and case history ("During the 1970s, LexisNexis pioneered the electronic accessibility of legal and journalistic documents. As of 2006, the company has the world's largest electronic database for legal and public-records related information."), and Nexis, for searching news stories. They differed from today's engines in that they were not free.
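    (To make the "vector space model" concrete: the idea is simply to represent each document as a vector of term weights and rank documents by the cosine of the angle between the query vector and each document vector. A minimal toy sketch in Python follows; the documents and weighting are made up for illustration and are not anything from the SMART system itself.)

        # Toy vector space model: term-frequency vectors + cosine ranking.
        import math
        from collections import Counter

        def vectorize(text):
            # Crude term-frequency vector: {term: count}.
            return Counter(text.lower().split())

        def cosine(u, v):
            # Cosine of the angle between two sparse term vectors.
            dot = sum(u[t] * v[t] for t in u if t in v)
            norm = math.sqrt(sum(c * c for c in u.values()))
            norm *= math.sqrt(sum(c * c for c in v.values()))
            return dot / norm if norm else 0.0

        docs = {
            "case1": "court holds the contract void for fraud",
            "case2": "patent claim construction on appeal",
        }
        query = vectorize("contract fraud")
        ranking = sorted(docs, key=lambda d: cosine(query, vectorize(docs[d])),
                         reverse=True)
        print(ranking)   # documents ordered by similarity to the query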

    About fighter jets: by the early 70s there was a lot of software in things like the F-111 and the A-5 Vigilante (probably introduced around 1960). It was its computer that allowed the F-111 to hug the ground at high speed.

    In the 1950s there was a huge national program, SAGE, to control fighter jets such as the F-106 from ground-based computers (the computers on the ground could fly and fight the planes, at least in theory, with the exception of landing and takeoff, I think; the planes could carry nuclear missiles to shoot down Soviet bomber formations).

    The case can be made that SAGE was the single most important reason for the explosion of the computer industry in the 60s. Many basic things, like modems, were invented as part of the SAGE project. (There were other reasons, like IBM's increasing electronic implementation of its tab-shop equipment, but SAGE really jump-started the economics of the industry.)

    Dominated numerically, but not exclusively. Someone pointed out Ada Lovelace invented the programming loop about 100 years ahead of her time, writing numerical analysis code without a real computer (and publishing a very influential paper (appendix) about computing)

    It is funny how everyone knows stuff that is not true.

    On the question of who the first computer programmer was, there is no confusion whatsoever, and it was not Ada Lovelace. The Menabrea Memoir that Ada had translated already contained examples of programmes for the Analytical Engine that Babbage had used to illustrate his Turin lectures and had actually developed several years before. The notes contain further examples from the same source that Babbage supplied to the authoress. The only new programme example developed for the notes was the one to determine the so-called Bernoulli numbers. Quite who contributed what to this programme is open to dispute. In his autobiography, written several years after Ada’s death, Babbage claims that Ada suggested the programme, which he then wrote, although noting that she had spotted a serious error in the original. The correspondence suggests that Ada was much more actively involved in the development of the programme and should perhaps be given more credit than Babbage allowed her. Whatever the truth of the matter, Ada Lovelace was neither a mathematician nor the first computer programmer.
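    (For readers wondering what that Bernoulli-number programme actually had to compute: the Bernoulli numbers satisfy a simple recurrence, and a modern sketch of the calculation fits in a few lines of Python. This is just the standard recurrence in exact rational arithmetic, not Lovelace's or Babbage's method, tabular notation, or indexing convention.)

        # Bernoulli numbers from the standard recurrence
        #   sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1,   with B_0 = 1.
        from fractions import Fraction
        from math import comb

        def bernoulli(n):
            B = [Fraction(1)]                                   # B_0
            for m in range(1, n + 1):
                s = sum(comb(m + 1, k) * B[k] for k in range(m))
                B.append(-s / (m + 1))                          # B_m
            return B

        print([str(b) for b in bernoulli(8)])
        # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']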

    • Replies: @candid_observer

  85. @countenance
    The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    OK, but the biggest advances took place at the West Coast branch of an East Coast– no, make that North Coast– firm, Xerox.

  86. @The most deplorable one

    Dominated numerically, but not exclusively. Someone pointed out Ada Lovelace invented the programming loop about 100 years ahead of her time, writing numerical analysis code without a real computer (and publishing a very influential paper (appendix) about computing)
     
    It is funny how everyone knows stuff that is not true.

    On the question of who the first computer programmer was, there is no confusion what so ever and it was not Ada Lovelace. The Menabrea Memoir that Ada had translated already contained examples of programmes for the Analytical Engine that Babbage had used to illustrate his Turin lectures and had actually developed several years before. The notes contain further examples from the same source that Babbage supplied to the authoress. The only new programme example developed for the notes was the one to determine the so-called Bernoulli numbers. Quite who contributed what to this programme is open to dispute. In his autobiography, written several years after Ada’s death, Babbage claims that Ada suggested the programme, which he then wrote, although noting that she had spotted a serious error in the original. The correspondence suggests that Ada was much more actively involved in the development of the programme and should perhaps be given more credit than Babbage allowed her. Whatever the truth of the matter Ada Lovelace was neither a mathematician nor the first computer programmer.
     

    Yeah, the whole Ada Lovelace thing is just one more convenient fiction with the Correct Moral.

    Just a little investigation shows just how grandiose and confused she was.

    Even after some considerable study under the tutelage of Augustus De Morgan, the woman couldn’t grasp as simple a concept as the expression of the equation y = x^2 in analytic geometry via a curve:

    [De Morgan] then returned to [Ms. Lovelace] an exercise in which she had calculated some values of the variable x^2 (1, 9/4, 4, 25/4, 9, and so forth), drawn vertical lines proportional to these values, and connected the upper ends of these lines with a curve. A note in her hand says that the vertical lines represent the function x^2, but she cannot see their relation to the curve, or why he says that the curve represents x^2. A note in his hand says that this curve “and no other belongs to y = x^2.” Armed with her new insight into her own thought processes, she wrote back,

    I am afraid I do not understand what you were kind enough to write about the Curve; and I think, for this reason, – that I do not know what the term equation to a curve means. – Probably with some study, I should deduce the meaning myself; but having plenty else to attend to of more immediate consequence, I do not like to give my time to a mere digression of this sort. – I should like much at some future period, (when I have got rid of the common algebra & Trigonometry which at present detain me), to attend particularly to this subject. At present, you will observe, I have four distinct things to carry on at the same time: – the Algebra; – Trigonometry; – Chapter 2nd of the Differential Calculus; – & the mere practice in Differentiation.

    From Ada: A Life and a Legacy, Dorothy Stein, p. 74

    http://monoskop.org/images/e/e7/Stein_Dorothy_Ada_A_Life_and_a_Legacy.pdf

    In a way, what’s remarkable is how even back then a number of distinguished men seemed hellbent to make her out to be some kind of genius when they would have had to be fools to believe it.

  87. @map
    I find it amazing that people seem to think that IBM Mainframes are not an advanced technology because of COBOL.

    In case anybody did not know this, COBOL is an application programming language. It is short for COmmon Business-Oriented Language. It was designed to deal with data that is usually discrete and, if not, does not extend beyond two decimal places. This is what most business data is. Accounting data does not go beyond the penny ($0.01) in its notation, and inventory data is usually specified in whole numbers. Developing an application language designed to deal with this data specifically, plus being self-documenting in an English-like language, meant anyone could read COBOL code and understand how their data was being manipulated. It's an excellent computer language for what it is designed to do and far superior to the various junk software they use today.
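    (The point about penny-level precision is easy to demonstrate: binary floating point cannot represent most decimal fractions exactly, while decimal fixed-point arithmetic of the kind COBOL's PICTURE clauses describe can. A rough modern illustration using Python's decimal module; the prices and field names are invented for the example.)

        # Why business languages compute money in decimal, not binary floating point.
        from decimal import Decimal, ROUND_HALF_UP

        print(0.10 + 0.20)            # 0.30000000000000004  (binary float drift)

        # Decimal behaves like a COBOL fixed-point money field (e.g. PIC 9(7)V99):
        unit_price = Decimal("19.99")
        quantity = 3
        line_total = (unit_price * quantity).quantize(
            Decimal("0.01"), rounding=ROUND_HALF_UP)
        print(line_total)             # 59.97, exact to the penny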

    If you needed an application programming language to handle scientific and engineering applications then you used FORTRAN. Again, FORTRAN is an application programming language.

    Both COBOL and FORTRAN were built using ASSEMBLER. ASSEMBLER is also the language used to build OS/390, the operating system that runs IBM Mainframes.

    Hearing PC programmers brag about how good their programming is because of linked lists, arrays, and tables is hilarious. They are essentially adapting a systems programming language (C or C++), designed for building operating systems like UNIX, to designing custom-made business applications from scratch. This is like using Assembler to build an accounting program from scratch. Who does that? That's like designing your own compiler each and every time you build a program.

    Then there is the "verbose" charge. You do understand that anything above hexadecimal notation is "verbose," right? That it is entirely irrelevant how obscure or language-like a programming language is, right? You are not getting any more speed or precision using C++ over the English-like COBOL, because COBOL compilers are just as fast as C compilers.

    In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user. This is why corporate America prefers to outsource programmers and treat them like assembly-line workers. They don't want PC obscurantism ruining their businesses.

    If their Assembler approach doesn’t work, maybe you could put a Hex on them, as they clearly need to Shift-Right-Logical :)

    Fortran stems from Formula Translation, indicative of its scientific-application roots. A lot of work was done on it at Waterloo in Ontario, hence Watfor (Waterloo Fortran) and Watfiv (Waterloo Fortran IV).

    Please take a moment to send kind thoughts to the brave people in Ottawa.

  88. anonymous says:

    “It is funny how everyone knows stuff that is not true.”

    That might also be true of the link you quote. Trust but verify, someone once said.

    It doesn’t seem like it’s known to be not true, it’s more a conjecture by one group of historians. This is apparently a summary of the controversy, one side of which is represented by the link you quote.

    Ada not a mathematician? Well, maybe.

    She apparently had a real mathematical education for her time. Her mother was hoping an education in math would counter Lord Byron’s “irrationality”. (Sounds like the lot of them were worried about madness, probably all bipolar?)

    One of her teachers was Augustus De Morgan. Just what is the definition of mathematician that whoever wrote the link you cite is using?

    “Her mother’s obsession with rooting out any of the insanity of… Lord Byron was one of the reasons that Ada was taught mathematics from an early age. She was privately schooled in mathematics and science by William Frend, William King, and Mary Somerville, noted researcher and scientific author of the 19th century. One of her later tutors was mathematician and logician Augustus De Morgan. From 1832, when she was seventeen, her remarkable mathematical abilities began to emerge, and her interest in mathematics dominated the majority of her adult life. In a letter to Lady Byron, De Morgan suggested that her daughter’s skill in mathematics could lead her to become “an original mathematical investigator, perhaps of first-rate eminence.”

    If that de Morgan quote is true, I’ll go with that. Ada was at least a potential, if not practicing, mathematician. The fact that that quote says she “was neither a mathematician…” has me suspicious of the rest of it.

    Few women (none?) of her position were. It apparently never occurred to her that women could write papers on their own. (Actually, maybe all of the aristocracy of the time had some sort of taboo against publishing.)

    Here is the wikipedia’s take (not necessarily accurate, but as good a starting place as any):

    “Explaining the Analytical Engine’s function was a difficult task, as even other scientists did not really grasp the concept and the British establishment was uninterested in it. …

    The notes are longer than the memoir itself and include…, in complete detail, a method for calculating a sequence of Bernoulli numbers with the Engine, which would have run correctly had the Analytical Engine been built…

    …Lovelace is now widely credited with being the first computer programmer and her method is recognised as the world’s first computer program. Her work was well received at the time; Michael Faraday described himself as a fan of her writing.”

    No one seems to question that she understood the potential of programs and programming. She’s known for a number of quotes like this (there also are some more specific ones out there):

    “…[The Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations…

    …Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

    So she’s thinking like a programmer.

    At the end of the wikipedia section on the controversy, there’s an interesting quote by two pro-Ada historians:

    “[Lovelace] was certainly capable of writing the program herself given the proper formula; this is clear from her depth of understanding regarding the process of programming and from her improvements on Babbage’s programming notation….

    …From this letter, two things are clear. First, including a program that computed Bernoulli numbers was Ada’s idea. Second, Babbage at the very least provided the formulas for calculating Bernoulli numbers… Letters between Babbage and Ada at the time seem to indicate that Babbage’s contributions were limited to the mathematical formula and that Ada created the program herself.”

    I find this believable. One was the numerical analyst, one the programmer. That’s pretty much what you’d see today. Ada spent the better part of a year on it and apparently invented the programming notation, so yeah, you could say she was just the programming grunt… but that’s the part we’re interested in.

    One thing I’ll add. It probably pays to be very cautious taking as gospel what modern historians write about members of the historical British aristocracy. There’s a lot of neo-Marxist “and of course the aristocracy were nothing but a bunch of drunken and useless debauched idlers who never had an idea in their head that wasn’t silly or evil.” Ada is maybe a bit inconvenient for them. They’re often after the old take-down, anyway they can get it, kind of like the modern mainstream media when it deals with those it doesn’t like. Really, trust me, they were all just greedy pigs.

    So regards the link you quote, I’d go for a verdict of “not proven”.

    • Replies: @candid_observer

  89. anonymous says:

    “….Just a little investigation shows just how grandiose and confused she was.

    …In a way, what’s remarkable is how even back then a number of distinguished men seemed hellbent to make her out to be some kind of genius when they would have had to be fools to believe it.”

    I don’t really see it. Everyone forgot about it for a hundred years until Howard Aiken was shown an old piece of the Analytic Engine in a Harvard attic. (I think they then rediscovered her notes.) They then basically built the machine, with WWII technology, and this was the electro-mechanical computer that actually worked as a computer during the war. (It is what Grace Hooper programmed.) Grace Hooper and Aiken knew of her notes and program. So they followed her. That was probably more what established Ada than anything she did at the time or anything the men of her age did.

    She was almost certainly bipolar and grandiose, but what did she write about programming that was grandiose or that grandiosely ascribed accomplishments to herself? I don’t think she herself made any claims as to being the first programmer or whatever. And her writing does not appear confused.

    I think it’s more likely that that’s a quote from one of those modern historians who have already decided the arrow of history, the good guys and the bad guys, and they are trying to push that arrow along.

    • Replies: @Steve Sailer, @CJ
  90. @anonymous
    "It is funny how every knows stuff that is not true."

    That might also be true of the link you quote. Trust but verify, someone once said.

    It doesn't seem like it's known to be not true, it's more a conjecture by one group of historians. This is apparently a summary of the controversy, one side of which is represented by the link you quote.


    Ada not a mathematician? Well, maybe.

    She apparently had a real mathematical education for her time. Her mother was hoping an education in math would counter Lord Byron's "irrationality". (Sounds like the lot of them were worried about madness, probably all bipolar?)

    One of her teachers was Augustus De Morgan. Just what is the definition of mathematician that whoever wrote the link you cite is using?

    "Her mother's obsession with rooting out any of the insanity of... Lord Byron was one of the reasons that Ada was taught mathematics from an early age. She was privately schooled in mathematics and science by William Frend, William King, and Mary Somerville, noted researcher and scientific author of the 19th century. One of her later tutors was mathematician and logician Augustus De Morgan. From 1832, when she was seventeen, her remarkable mathematical abilities began to emerge, and her interest in mathematics dominated the majority of her adult life. In a letter to Lady Byron, De Morgan suggested that her daughter's skill in mathematics could lead her to become "an original mathematical investigator, perhaps of first-rate eminence."

    If that de Morgan quote is true, I'll go with that. Ada was at least a potential, if not practicing, mathematician. The fact that that quote says she "was neither a mathematician..." has me suspicious of the rest of it.

    Few women (none?) of her position were. It apparently never occurred to her that women could write papers on their own. (Actually, maybe all of the aristocracy of the time had some sort of taboo against publishing.)

    Here is the wikipedia's take (not necessarily accurate, but as good a starting place as any):

    "Explaining the Analytical Engine's function was a difficult task, as even other scientists did not really grasp the concept and the British establishment was uninterested in it. ...

    The notes are longer than the memoir itself and include..., in complete detail, a method for calculating a sequence of Bernoulli numbers with the Engine, which would have run correctly had the Analytical Engine been built...

    ...Lovelace is now widely credited with being the first computer programmer and her method is recognised as the world's first computer program. Her work was well received at the time; Michael Faraday described himself as a fan of her writing."


    No one seems to question that she understood the potential of programs and programming. She's known for a number of quotes like this (there also are some more specific ones out there):

    "...[The Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations...

    ...Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."


    So she's thinking like a programmer.

    At the end of the wikipedia section on the controversy, there's an interesting quote by two pro-Ada historians:

    "[Lovelace] was certainly capable of writing the program herself given the proper formula; this is clear from her depth of understanding regarding the process of programming and from her improvements on Babbage's programming notation....

    ...From this letter, two things are clear. First, including a program that computed Bernoulli numbers was Ada's idea. Second, Babbage at the very least provided the formulas for calculating Bernoulli numbers... Letters between Babbage and Ada at the time seem to indicate that Babbage's contributions were limited to the mathematical formula and that Ada created the program herself."


    I find this believable. One was the numerical analyst, one the programmer. That's pretty much what you'd see today. Ada spent the better part of a year on it and apparently invented the programming notation, so yeah, you could say she was just the programming grunt... but that's the part we're interested in.



    One thing I'll add. It probably pays to be very cautious taking as gospel what modern historians write about members of the historical British aristocracy. There's a lot of neo-Marxist "and of course the aristocracy were nothing but a bunch of drunken and useless debauched idlers who never had an idea in their head that wasn't silly or evil." Ada is maybe a bit inconvenient for them. They're often after the old take-down, anyway they can get it, kind of like the modern mainstream media when it deals with those it doesn't like. Really, trust me, they were all just greedy pigs.

    So regards the link you quote, I'd go for a verdict of "not proven".

    I suggest you read the quote I took above from the book by Stein, and pore over its significance.

    She couldn’t grasp the meaning of the curve y=x^2, which any of us would take, in a fellow student of analytic geometry, as the sign of an incorrigible dunce. Doesn’t that speak for itself as to her abilities as a mathematician? And she dismisses her incomprehension as of no significance, because she’s so deeply immersed in such topics as differential calculus (as if she could understand that without getting the meaning of the y=x^2 curve).

    Yes, De Morgan and to an extent Babbage made Lovelace out to be something quite unusual, but one suspects they had reasons of their own — let’s say, reasons of the heart at least in De Morgan’s case — to do so.

    • Replies: @Steve Sailer

  91. @anonymous
    "....Just a little investigation shows just how grandiose and confused she was.

    ...In a way, what’s remarkable is how even back then a number of distinguished men seemed hellbent to make her out to be some kind of genius when they would have had to be fools to believe it."


    I don't really see it. Everyone forgot about it for a hundred years until Howard Aiken was shown an old piece of the Analytic Engine in a Harvard attic. (I think they then rediscovered her notes.) They then basically built the machine, with WWII technology, and this was the electro-mechanical computer that actually worked as a computer during the war. (It is what Grace Hooper progammed.) Grace Hooper and Aiken knew of her notes and program. So they followed her. That was probably more what established Ada than anything she did at the time or the men of her age.

    She was almost certainly bipolar and grandiose, but what did she write about programming that was grandiose or that grandiosely ascribed accomplishments to herself? I didn't think she herself make any claims as to being the first programmer or whatever. And her writing does not appear confused.

    I think it's more likely that that's a quote from one of those modern historians who have already decided the arrow of history, the good guys and the bad guys, and they are trying to push that arrow along.

    “Everyone forgot about it for a hundred years”

    That the work of Babbage and Lovelace in the first half of the 19th Century on computing was forgotten is particularly striking since they were giant celebrities in their own day. Lovelace was the only legitimate child of Lord Byron, who had been the most famous man in Europe in the decade after Napoleon. Babbage was a fixture at the best London parties. Dickens based a character on him.

    • Replies: @Steve Sailer

  92. @Steve Sailer
    "Everyone forgot about it for a hundred years"

    That the work of Babbage and Lovelace in the first half of the 19th Century on computing was forgotten is particularly striking since they were giant celebrities in their own day. Lovelace was the only legitimate child of Lord Byron, who had been the most famous man in Europe in the decade after Napoleon. Babbage was a fixture at the best London parties. Dickens based a character on him.

    In Stoppard’s “Arcadia,” the Babbage character consoles the Lovelace character:

    “We shed as we pick up, like travellers who must carry everything in their arms, and what we let fall will be picked up by those behind. The procession is very long and life is very short. We die on the march. But there is nothing outside the march so nothing can be lost to it. The missing plays of Sophocles will turn up piece by piece, or be written again in another language. Ancient cures for diseases will reveal themselves once more. Mathematical discoveries glimpsed and lost to view will have their time again. You do not suppose, my lady, that if all of Archimedes had been hiding in the great library of Alexandria, we would be at a loss for a corkscrew?”

  93. @candid_observer
    I suggest you read the quote I took above from the book by Stein, and pore over its significance.

    She couldn't grasp the meaning of the curve y=x^2 -- which any of us would take in a fellow student in analytic geometry as the sign of an incorrigible dunce. Doesn't that speak for itself as to her abilities as a mathematician? And she dismisses her incomprehension as of no significance, because she's so deeply immersed in such topics as differential calculus (as if she could understand that without getting the meaning of the y=x^2 curve).

    Yes, De Morgan and to an extent Babbage made Lovelace out to be something quite unusual, but one suspects they had reasons of their own -- let's say, reasons of the heart at least in De Morgan's case -- to do so.

    Ada Lovelace, the Muse of Nerds …

    There are much worse things to be.

  94. Anonymous says:

    I suggest you read the quote I took above from the book by Stein, and pore over its significance.

    Okay, will do.

    What do you think of the following (from “Ada and the first computer”, Scientific American 280, Kim, Eugene Eric; Toole, Betty Alexandra (May 1999), by way of wikipedia’s references). Do you think they are right? How much math did she even have to know? Babbage had been working with Bernoulli numbers for years with the Difference Engine. So why wouldn’t he have just given her the formulas and why wouldn’t she have just accepted them and got on with the job of computing them?

    I’m cautious about reading too much into one letter from what sounds like a correspondence course taken from who knows what part of what you might call her no-doubt haphazard education by modern standards. Maybe she’s completely missing the notions of analytic geometry and approaching differential calculus from a completely symbolic manipulation point of view…

    “…her improvements on Babbage’s programming notation….

    …From this letter, two things are clear. First, including a program that computed Bernoulli numbers was Ada’s idea. Second, Babbage at the very least provided the formulas for calculating Bernoulli numbers… Letters between Babbage and Ada at the time seem to indicate that Babbage’s contributions were limited to the mathematical formula and that Ada created the program herself.”

    • Replies: @Steve Sailer

  95. A lot of stuff here doesn’t gibe with my reality; I worked in corporate IT as an apps and systems programmer (mostly consulting, but also in-house) through the 80s, 90s, and early oughts. Anonymous 26's post hits most of the actual reasons.

    The death of in-house applications, as well as the death of the dream of an easy work-flow data model. More than anything, this had to do with the refusal of the tax code to treat in-house development as equivalent to purchasing software: the latter was tax deductible, the former wasn't. Over time, as the heyday of corporate America faded (probably the obsession with stock prices, but there may have been other reasons), so did the willingness to fund in-house development. Y2K was not a big glitch after all, but it was corporate IT’s last big hiring spree.

    In the 70s and 80s, corporate IT college degrees were all over the map. You had lots of history and English majors who were very good coders (and if you asked them why they hadn’t degreed in CS, they all said, regardless of gender, that they hated the math). Starting with the Internet era, which coincided with the influx of H1B coders with PhDs willing to live 8 to a house, people with that background were simply beneath consideration. So at the same time corporate IT was dying out, the new field required largely useless degrees that many people (particularly women) weren’t interested in.

    While women were never overrepresented on the coding side, there were many more in corporate IT than there are now. And while some were super-techies, most of the best ones I knew were highly specialized application geeks who had to work late hours and on call hours and put in tons of time on business-critical applications that cost millions an hour if they went down. They weren’t 9-5 jobs; you worked hours on end as needed and then long lunches during slow times. But they were doing this work for large corporations. Women do tend to be risk averse. And yes, it’s true most of them don’t code for fun.

    It’s not like anything replaced corporate IT applications. They just made do with less. Because Steve’s nemesis was right.

    Anonymous 81's post also corresponds with my memory.

    Most IT shops had four major divisions:

    Applications (Business Systems): applications that drove the business. Usually built-inhouse.

    Systems Programming: ran the operating systems, installed and integrated the purchased products, usually wrote code (yes, usually scripting applications) to pull everything together. Many people (raises hand) made a nice living designing the apps these folks needed to run things: problem and change management applications, source code management, etc.

    Database Administration: special case of Systems Programming, because managing business data required both knowledge of the in-house apps and the operating system as well as being an expert in the (always more than one) DBMS that ran the business.

    Operations: Sysadmin is a weak shadow of what ops used to be. They ran the batch cycle, the system maintenance jobs, were the first line of troubleshooting.

    I was talking to a friend of mine. Female, a decade younger, super techie who does indeed code for fun, but didn’t get into tech until about 15 years ago. After working in several startups, she told me that her company recently realized that they had a significant investment in in-house code and needed someone to manage the writing and support of this code. So now they were starting a new department of people responsible for the mission critical code, as well as the people who monitor it. I kept a straight face, to my great pride. Who knows, maybe IT is coming back under a different name.

  96. Anonymous says:

    I suggest you read the quote I took above from the book by Stein, and pore over its significance.

    Okay, will do.

    What do you think of the following (from “Ada and the first computer”, Scientific American 280, Kim, Eugene Eric; Toole, Betty Alexandra (May 1999), by way of wikipedia’s references):

    “…her improvements on Babbage’s programming notation….

    …From this letter, two things are clear. First, including a program that computed Bernoulli numbers was Ada’s idea. Second, Babbage at the very least provided the formulas for calculating Bernoulli numbers… Letters between Babbage and Ada at the time seem to indicate that Babbage’s contributions were limited to the mathematical formula and that Ada created the program herself.”

    Do you think they are right? How much math did she even have to know? Babbage had been working with Bernoulli numbers for years with the Difference Engine. So why wouldn’t he have just given her the formulas and why wouldn’t she have just accepted them and got on with the job of computing them?

    I’m cautious about reading too much into one letter from what sounds like a correspondence course taken from who knows what part of what you might call her no-doubt haphazard education by modern standards. Maybe she’s completely missing the notions of analytic geometry and approaching differential calculus from a completely symbolic manipulation point of view…

  97. @Anonymous
    I suggest you read the quote I took above from the book by Stein, and pore over its significance.


    Okay, will do.

    What do you think of the following (from "Ada and the first computer", Scientific American 280, Kim, Eugene Eric; Toole, Betty Alexandra (May 1999), by way of wikipedia's references). Do you think they are right? How much math did she even have to know? Babbage had been working with Bernoulli numbers for years with the Difference Engine. So why wouldn't he have just given her the formulas and why wouldn't she have just accepted them and got on with the job of computing them?

    I'm cautious about reading too much into one letter from what sounds like a correspondence course taken from who knows what part of what you might call her no-doubt haphazard education by modern standards. Maybe she's completely missing the notions of analytic geometry and approaching differential calculus from a completely symbolic manipulation point of view...


    "...her improvements on Babbage’s programming notation….

    ...From this letter, two things are clear. First, including a program that computed Bernoulli numbers was Ada’s idea. Second, Babbage at the very least provided the formulas for calculating Bernoulli numbers… Letters between Babbage and Ada at the time seem to indicate that Babbage’s contributions were limited to the mathematical formula and that Ada created the program herself.”

    James Gleick’s 2011 book “The Information” contains selections of the correspondence between Babbage and Lovelace. (Gleick’s 1987 book on chaos theory helped inspire Stoppard’s “Arcadia,” which features characters vaguely inspired by Babbage and Lovelace.) My impression is that Lovelace was the first of the two to perceive the open-ended potential of computing.

    Of course, that might have been historically disastrous, since Babbage notoriously failed to get his Analytic Engine done. Maybe if he’d never met Lovelace, he would have eventually delivered a more limited machine that actually worked. And then somebody else would have delivered a better machine and so forth and so on over the next 100 years. Paul Johnson at some point speculates about how his beloved England could have launched the Information Age a half century early if only Babbage hadn’t muffed his big chance (he got a lot of money from Parliament, but didn’t deliver on it).

    • Replies: @candid_observer
    My impression is that Lovelace was the first of the two to perceive the open-ended potential of computing.

    It's important to bear in mind what the perception of "the open-ended potential of computing" really amounts to here.

    I remember reading about Lovelace's speculations -- she believed that the machine might be able to write music, or weave designs, or translate languages. Babbage apparently never brought up such possibilities. But anyone who has ever written a new application and explained it to someone naive on the outside knows how cheap and easy such speculation is. They might offer up such groundbreaking "innovations" as: Why can't you just make the application respond to voice? Why can't you just put electrodes on your head and have the application controlled by your brainwaves? Etc. The person writing the application knows how overwhelmingly difficult it would be to do such things, and dismisses the possibility as something that could only come about in some very distant future, if at all. He, as someone trying to sell the idea of the application to others, knows he can't get caught up in speculation on which he has no possibility of delivering.

    People like Lovelace are content simply to speculate; people like Babbage ask themselves instead: how could one make this thing actually happen? As any inventor knows, you don't really have an invention until you can see how you can make it work. It's the difference between science fiction and technology.

    And I really very much doubt that the Information Age might have been launched 50 years earlier if Babbage had been better understood. As a purely mechanical device, it was probably doomed to be a cumbersome toy. My guess is that any genuinely efficient and useful computer would have required enough electrical sophistication in its relays, etc., that it would simply have been impossible technically much earlier than its actual introduction.

  98. anonymous says:

    “I kept a straight face, to my great pride. Who knows, maybe IT is coming back under a different name.”

    I’ve had the same experience. These days I guess they call it “DevOps” and I hear it’s the cat’s meow.

  99. @Steve Sailer
    "In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user."

    That's what the lady running the mainframes at the company I worked for in the 1980s constantly told me as she sabotaged my plans for PCs. She had a point, too.

    I actually got into computer programming through mainframes in the late 90s. I trained to work on the Y2K stuff and later maintained programs for a large bank. The system was built on COBOL and DB2, with TSO tying everything together and JCL executing all of the jobs. I really learned to appreciate the power and simplicity of this technology that had been refined over 40 years.

    One thing that I became acutely aware of is how the 4GL universe vilified the 3GL world. The OO guys would constantly dismiss the old-fogey COBOL people as obsolete types who built spaghetti code that was impossible to maintain.

    In fact, the system I used was built by Andersen Consulting in the mid-90s and was used to manage SWIFT data. Far from spaghetti code, this COBOL implementation used CALL…USING features in a modular way that was very similar in spirit to object-oriented design.
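    (For non-mainframe readers: CALL ... USING hands working-storage records to a separately compiled subprogram, by reference by default, which is what lets a big COBOL system be decomposed into maintainable modules. A loose Python analogue follows; it is not COBOL semantics, and the record layout and names are invented for illustration.)

        # Loose analogue of COBOL's  CALL "VALIDATE-PAYMENT" USING PAYMENT-RECORD:
        # a mutable record handed to a separately defined "subprogram".
        from dataclasses import dataclass
        from decimal import Decimal

        @dataclass
        class PaymentRecord:                  # stand-in for a working-storage record
            amount: Decimal
            currency: str
            status: str = "PENDING"

        def validate_payment(rec: PaymentRecord) -> None:
            # The "subprogram" updates the caller's record in place, much as a
            # called COBOL program updates its USING parameters (by reference).
            rec.status = "OK" if rec.amount > 0 else "REJECTED"

        rec = PaymentRecord(Decimal("125.00"), "USD")
        validate_payment(rec)
        print(rec.status)                     # OK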

    I learned that software operates on a “tear-down” mentality in order to sell product. It is a very dishonest and disingenuous business.

  100. anonymous says:

    Although the story of electronic computers is reasonably well known (the ENIAC and following machines), they didn’t really get operational in WWII.

    The computer that was operational, and was heavily used during WWII, was the electro-mechanical Harvard Mark I. It was basically a Babbage Analytic Engine made with electric motors, relays, clutches, and shafts that acted as registers containing lots of gears that acted as digits. (I think its “bus” was a quarter-horsepower electric motor powering a driveshaft, kind of like what you find in an old mill.)

    This was the computer that Grace Hooper wrote software for, developed libraries on, and where she invented the term “debugging”. This was the computer on which the first sizeable body of software was developed, even before there were electronic digital computers. This computer actually did Manhattan Project work.

    The computer “…was the first operating machine that could execute long computations automatically.”

    Howard Aiken, a Harvard physicist, built it, and here is the connection to Ada (and the first programmer):

    “…after two rejections, he was shown a demonstration set that Charles Babbage’s son had given to Harvard university 50 years earlier. This led him to study Babbage and to add references of the analytical engine to his proposal ; the resulting machine “brought Babbage’s principles of the analytical engine almost to full realization, while adding important new features.”

    At this point, Grace Hopper: Admiral of the Cyber Sea, Kathleen Broom Williams, picks up the story:

    “Aiken… told her (Grace Hooper) she was going to write a book… she went to work (against her wishes) writing a manual for the Mark I. …

    …At this point Hopper’s computing education under Aiken took an interesting twist. He insisted that she read Charles Babbage’s autobiography for most of the basic computing concepts. Aiken also wanted her to familiarize herself with the work of Lady Ada Lovelace, whom Hopper always remembered for having written the first loop.”

    Grace Hooper was tremendously influential. I mean, inventing the term “debugging” really puts you in on the ground floor. And after studying Babbage and Lovelace, while trying to build one of these machines under the pressures of WWII, for Grace Hooper, writing “the first loop” made Ada the first programmer, even if her program had never run.

    • Replies: @Steve Sailer
    , @Hard Line Realist

    Grace Hooper was tremendously influential. I mean, inventing the term “debugging” really puts you in on the ground floor.
     
    Are you telling me that you agree that coming up with a name for something that engineers and designers have done for a very long time makes you a giant in the field? Debugging is the name for the process of figuring out what is wrong with some process or product or design, regardless of whether it has software or not. (Note, that engineers have always had a term for that process. They didn't need Hopper to invent it for them.)

    If that is the case, I have to wonder about your intelligence.

  101. @anonymous
    Although the story of electronic computers is reasonably well known (the ENIAC and following machines) they didn't really get operational in WWII.

    The computer that was operational, and was heavily used during WWII, was the electro-mechanical Harvard Mark I. It was basically a Babbage Analytic Engine made with electric motors, relays, clutches, and shafts that acted as registers containing lots of gears that acted as digits. (I think it's "bus" was a quarter horsepower electric motor powering a driveshaft, kind of like what you find in an old mill.)

    This was the computer that Grace Hooper wrote software for, developed libraries on, and where she invented the term "debugging". This was the computer on which the first sizeable body of software was developed, even before there were electronic digital computers. This computer actually did Manhattan Project work.

    The computer "...was the first operating machine that could execute long computations automatically."

    Howard Aiken, a Harvard physicist, built it and here is the connection to Ada and (the first programmer):

    "...after two rejections, he was shown a demonstration set that Charles Babbage’s son had given to Harvard university 50 years earlier. This led him to study Babbage and to add references of the analytical engine to his proposal ; the resulting machine “brought Babbage’s principles of the analytical engine almost to full realization, while adding important new features.”

    At this point, Grace Hopper: Admiral of the Cyber Sea, Kathleen Broom Williams, picks up the story:

    "Aiken... told her (Grace Hooper) she was going to write a book... she went to work (against her wishes) writing a manual for the Mark I. ...

    ...At this point Hopper's computing education under Aiken took an interesting twist. He insisted that she read Charles Babbage's autobiography for most of the basic computing concepts. Aiken also wanted her to familiarize herself with the work of Lady Ada Lovelace, whom Hopper always remembered for having written the first loop."


    Grace Hooper was tremendously influential. I mean, inventing the term "debugging" really puts you in on the ground floor. And after studying Babbage and Lovelace, while trying to build one of these machines under the pressures of WWII, for Grace Hooper, writing "the first loop" made Ada the first programmer, even if her program had never run.

    Hopper, not Hooper.

    • Replies: @Steve Sailer
  102. I was under the impression that computers were re-invented without awareness of Babbage and Lovelace’s work a century before. On the other hand, much of what I thought I knew from my youth about the history of computers in the first half of the 1940s was more or less disinformation put out to cover up the Enigma project during WWII. For example, I can recall reading in an early 1970s reference book written by the McWhirter twins of Guinness Book fame that computers were invented during WWII for calculations related to artillery trajectories. I now know that the Enigma decipherment project was officially secret until 1974.

  103. @Steve Sailer
    Hopper, not Hooper.

    Forgive me for sounding snarky. This is very interesting information you bring up. But, please,
    “Hopper.”

  104. @anonymous
    "....Just a little investigation shows just how grandiose and confused she was.

    ...In a way, what’s remarkable is how even back then a number of distinguished men seemed hellbent to make her out to be some kind of genius when they would have had to be fools to believe it."


    I don't really see it. Everyone forgot about it for a hundred years until Howard Aiken was shown an old piece of the Analytic Engine in a Harvard attic. (I think they then rediscovered her notes.) They then basically built the machine, with WWII technology, and this was the electro-mechanical computer that actually worked as a computer during the war. (It is what Grace Hooper programmed.) Grace Hooper and Aiken knew of her notes and program. So they followed her. That was probably more what established Ada than anything she did at the time or the men of her age.

    She was almost certainly bipolar and grandiose, but what did she write about programming that was grandiose or that grandiosely ascribed accomplishments to herself? I didn't think she herself made any claims as to being the first programmer or whatever. And her writing does not appear confused.

    I think it's more likely that that's a quote from one of those modern historians who have already decided the arrow of history, the good guys and the bad guys, and they are trying to push that arrow along.

    Hopper, not Hooper.

  105. “These days I guess they call it “DevOps” and I hear it’s the cat’s meow.”

    Hahaha. At some point in our conversation, I said, “So it’s a bit like the old corporate IT sho—”

    “Don’t say that. Don’t even *think* it.” Yes. Must be invented here.

    Anyway, for anyone interested, the middle chunk of this long essay has a description of how I got into IT and what I did, which goes through a lot of the kind of system (not business) in-house apps that had to get built.

    The mission-critical apps folk had very high standing in the company, but couldn’t get much money at any other company. People like me, on the other hand, had relatively low status in any one company (although if you got good and trusted, that could change), but could charge a comfortable hourly consulting rate because so few wanted to do that kind of work. (I’m talking now of my work in process and workflow design and apps.)

    https://educationrealist.wordpress.com/2012/08/11/learning-math/

  106. “anonymous says:

    Sigh. Sadly I still got the link to the Alamaden Monolith wrong. It’s cursed, I tell ya!”

    Thanks for the links about SAGE and the Almaden AF Station. Growing up in the Bay Area, my eyes scanned over that mountain many times, without realizing what was up there. That was an interesting little bit of history of that old-timey technology. Again, thanks.

  107. @Steve Sailer
    James Gleick's 2011 book "The Information" contains selections of the correspondence between Babbage and Lovelace. (Gleick's 1987 book on chaos theory helped inspire Stoppard's "Arcadia," which features characters vaguely inspired by Babbage and Lovelace.) My impression is that Lovelace was the first of the two to perceive the open-ended potential of computing.

    Of course, that might have been historically disastrous, since Babbage notoriously failed to get his Analytic Engine done. Maybe if he'd never met Lovelace, he would have eventually delivered a more limited machine that actually worked. And then somebody else would have delivered a better machine and so forth and so on over the next 100 years. Paul Johnson at some point speculates about how his beloved England could have launched the Information Age a half century early if only Babbage hadn't muffed his big chance (he got a lot of money from Parliament, but didn't deliver on it).

    My impression is that Lovelace was the first of the two to perceive the open-ended potential of computing.

    It’s important to bear in mind what the perception of “the open-ended potential of computing” really amounts to here.

    I remember reading about Lovelace’s speculations — she believed that the machine might be able to write music, or weave designs, or translate languages. Babbage apparently never brought up such possibilities. But anyone who has ever written a new application and explained it to someone naive on the outside knows how cheap and easy such speculation is. They might offer up such groundbreaking “innovations” as: Why can’t you just make the application respond to voice? Why can’t you just put electrodes on your head and have the application controlled by your brainwaves? Etc. The person writing the application knows how overwhelmingly difficult it would be to do such things, and dismisses the possibility as something that could only come about in some very distant future, if at all. He, as someone trying to sell the idea of the application to others, knows he can’t get caught up in speculation on which he has no possibility of delivering.

    People like Lovelace are content simply to speculate; people like Babbage ask themselves instead: how could one make this thing actually happen? As any inventor knows, you don’t really have an invention until you can see how you can make it work. It’s the difference between science fiction and technology.

    And I really very much doubt that the Information Age might have been launched 50 years earlier if Babbage had been better understood. As a purely mechanical device, it was probably doomed to be a cumbersome toy. My guess is that any genuinely efficient and useful computer would have required enough electrical sophistication in its relays, etc., that it would simply have been impossible technically much earlier than its actual introduction.

  108. I just wanted to point out that the linked article is hilariously wrong in its most basic assumptions.

    Women coding in the early days = Data entry today.

    Secretaries “coding” back-in-the-day does not equal programming. They are two completely different jobs.

  109. @Steve Sailer
    "In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user."

    That's what the lady running the mainframes at the company I worked for in the 1980s constantly told me as she sabotaged my plans for PCs. She had a point, too.

    “In a lot of ways, the PC revolution is merely re-inventing the mainframe, only with a lot less stability and confidence for the end-user.”

    With virtualisation and thin clients we are indeed going full circle. For “data centre” read “server farm”.

  110. @anonymous-antimarxist
    Actually I believe H1-B came in 1990. But there were a forerunner visa programs that served similar purposes.

    I noticed that corporate America began to see its programming work force as essentially disposable by the mid-1990s. By then the networks of Immigration Lawyers, HR hacks and lobbyists supporting the program were well established.

    The Dot-Com bubble temporary masked what was happening. By the 2001, the massive H1-B expansion pushed through by Gene Sperling and Elena Kagan under Clinton and then Dubya letting corporate America know that under no circumstances would H1-B visa overstays be deported, the full impact of H1-B on programmer employment began to be felt.

    Also do not forget the L1 visa as well.

    1990 is certainly when the first statistics started getting collected, but I’m sure I’ve seen references to it before that.

  111. Anonymous says:

    The comparison is inapposite. The “coding” done by women at terminals back then was largely preparing business reports. Women still do that type of “coding”. It is called “creating Excel spreadsheets” and “writing Excel macros”. “Coding” in the old sense has been democratized and has been available on every secretary’s desktop for years.

  113. @Shouting Thomas
    If you've read my links, I'll continue to explain what, it seems to me, actually happened.

    I moved on to become a sophisticated coder and multimedia artist in the late 80s.

    So, what actually happened with the introduction of PCs was... the same old story. The Wild Wild West opened up when PCs entered the market in a big way. Anti-trust legislation broke up IBM. Systems were very different in every office. A coder wasn't working, as he was during the IBM era, with a stagnant and unchanging system that could be taught in a classroom. He had to produce custom results on demand for a wide variety of businesses.

    When a stagnant system characterized by monopoly breaks up, the Wild Wild West becomes the reality. Only men and whores want to be in the Wild Wild West.

    Sooo… funny! Finally, a great example that makes sense. As a mother of 3 gifted STEM boys, coders, and general renegades, who are also “worldly,” social and athletic, I can relate to this idea. I have always been a renegade myself, but have only found a very small group of women over the last 35 years to share my struggles/triumphs with. Bold and driven women, whether in STEM or not (art world for me), are rare, and people should just accept that and get over themselves (it seems that people who aren’t good at math/coding or have no creative talent are always seething about illusory “fairness” in the workplace). No one has dared to ever hold me back, or stifle my opinions, but I have actually had more negative comments from women who were/are intimidated by my intelligence, tenacity, artistic/creative ability.

    And, weirdly, I am relieved that my sons can just stay on their “trailblazing” (good idea, your “Wild West” analogy) paths without being hindered by petty envy from anyone because they are just “faster” and ahead of the curve all the time. Had I had a daughter, she would be under the scrutiny of both men and women like I was – so I would have had to teach her the skills to ignore most people who are not as smart/driven as she is. So, yeah, I get why there are more boys in comp sci or most STEM fields, and it is not due to institutional sexism. My MIT faculty father raised me like a boy, or really, in a non-gender way, because we are Scandinavian and the Nordic culture expects every child to have the same survival skills. I have told my sons to marry, one day, someone who is smarter, has more degrees, and possibly out-earns you… can carry you up a mountain, pay the mortgage if you get laid off! This is the new “trophy” wife in northern Europe.

  115. My nemesis during this era was D., the woman in charge of the huge staff that ran the mainframe, who hated microcomputers.

    Ah, the good old days. The friction between stuffy mainframe types and wild and rowdy microcomputer enthusiasts was always a big deal among techies when I was growing up. It seems so quaint now that technology has essentially blurred away all distinction between the two worlds.

  116. @Mr. Anon
    "These early personal computers weren’t much more than toys. You could play pong or simple shooting games, maybe do some word processing. And these toys were marketed almost entirely to men and boys."

    And how many women who worked in the computer industry bought these early machines, just to fool around with? My guess is...........just about zero.

    In my experience, women are not that interested in their work as such - they turn it off when they leave work at five. There are now lots of women engineers, or at least women who are called engineers. How many of them have a technical hobby of any kind? How many of them do their own plumbing, or wiring, or car repair at home?

    I don’t know about plumbing, but one of the most brilliant (in almost every way) people that I know of is a woman physicist. She is adept at fixing, firing up, even making from scratch computers of all types, and loves them like stuffed animals. In fact, almost anything mechanical and requiring focus and concentration, she can do. She repaired an upholstery rip by actually painstakingly re-weaving it. You could barely see the mend. She even bakes amazingly complicated cakes, and has raised 3 children very well. Several marriages, though. Her engineering husband was said to have resented her. My experience is that men resent great skill in women if it is in a field they themselves care about a lot and want to excel in.
    My sister is a programmer. She learned it on the job when that was still an option, and has done well. OTOH, I had to dictate her English compositions to her before school in the morning because she claimed she had no imagination.
    There are women who excel in these things, but I actually don’t think men pay that much attention to what gals get up to if it doesn’t personally concern them, Mr. Anon. Which is why you might have missed it.
    BTW, an Indian gentleman I work with brushed aside his whole 40-year programming career with “it’s just being logical” when I expressed admiration for the facility.
    Me, I don’t have the head for it, but when I was younger I was always meeting girls who did. But they tended to veer towards other fields for long-term prospects.

  117. anonymous says:

    “Are you telling me that you agree that coming up with a name for something that engineers and designers have done for a very long time makes you a giant in the field?”

    No, I’m saying she was in the first group of unequivocal real programmers who led directly to where software is today, as indicated by the fact that we still use some of the terminology she originated. The point has nothing to do with the particular term. (You are familiar with the story of the moth in the relay?)

    She is considered a giant in the field because she “…invented the first compiler for a computer programming language. …popularized the idea of machine-independent programming languages…” and “…served as the technical consultant to the committee” that defined Cobol, which historically has been the most widely used programming language. She also “…pioneered the implementation of standards for testing computer systems…”.

    My estimation of her is hardly mine alone. If you are unaware of her, the Wikipedia page says she was the first American to be elected a Distinguished Fellow of the British Computer Society, and in 1971 the ACM (Association for Computing Machinery) created the Grace Murray Hopper Award. (Visit the ACM page here.) A similar award they give is the Turing Award.

    The real point is that she, Howard Aiken, Richard Milton Bloch, and Robert Campbell have a strong claim to be the first group of practicing programmers on a reliably working general-purpose computer. As such, who they say the first programmers were, the ones they followed and learned from, might be more important than what anybody said about themselves or what historians have written. The second group gets to tell us who was first. Who did they learn from? It seems Aiken made them read Charles Babbage and Ada Lovelace. Grace Hopper apparently considered Ada Lovelace the first programmer because of her program listing that contained a loop.

  118. anonymous says:

    “Women coding in the early days = Data entry today.”

    No, keypunch was literally data entry. Keypunch tended to be something like 90% female. Occasionally you ran into some guy working his way through school or temping, or whatever. For a few years it was a great place to put African-American males with not a lot of education. All male programmers could keypunch when they had to, but they rarely did except when debugging.

    I’m not sure where you get the idea that you couldn’t find women writing code at almost any level of the industry, from low-level system programming to the top of the stack. It wasn’t that big a deal at the time. As top system programmers they were probably always in the minority, often substantially, but they were there. There’s still a little of that to be found around.

    Object-oriented programming is a big deal these days, for instance. Barbara Liskov’s work in the 70s and 80s resulted in: “Liskov received the 2008 Turing Award from the ACM, in March 2009, for her work in the design of programming languages and software methodology that led to the development of object-oriented programming.”

  119. anonymous says:

    “People like Lovelace are content simply to speculate; people like Babbage ask themselves instead: how could one make this thing actually happen?”

    You can read her “translator’s notes” here: “Sketch of The Analytical Engine Invented by Charles Babbage”, by L. F. MENABREA of Turin, Officer of the Military Engineers, from the Bibliothèque Universelle de Genève, October 1842, No. 82, with notes upon the Memoir by the Translator, ADA AUGUSTA, COUNTESS OF LOVELACE.

    (Caps as in original title page.)

  120. Everyone seems to be missing the truly revolutionary thing about the IBM Selectric: the interchangeable ball!! You could switch from pica to elite, type italics, math symbols, accented characters, etc., simply by popping in a new type ball. This was not eclipsed until the laser printer in the mid-80s.

    On the computing gender wars… I only have anecdata that a lot fewer women than men got interested in figuring out how to make their cursor blink, how to make the PDP-11 ask you for a cookie before you could enter a command, or just in general got fascinated with making the computer do what one wanted. FWIW…

  121. The most deplorable one [AKA "Fourth doorman of the apocalypse"] says:

    She is considered a giant in the field because she “…invented the first compiler for a computer programming language. …popularized the idea of machine-independent programming languages…” and “…served as the technical consultant to the committee” that defined Cobol, which historically has been the most widely used programming language. She also “…pioneered the implementation of standards for testing computer systems…”.

    As always, people know things that are just not true.

    See, for example, the wiki page on compiler.

    Towards the end of the 1950s, machine-independent programming languages were first proposed. Subsequently several experimental compilers were developed. The first compiler was written by Grace Hopper, in 1952, for the A-0 programming language. The A-0 functioned more as a loader or linker than the modern notion of a compiler. The first autocode and its compiler were developed by Alick Glennie in 1952 for the Mark 1 computer at the University of Manchester and is considered by some to be the first compiled programming language. The FORTRAN team led by John Backus at IBM is generally credited as having introduced the first complete compiler in 1957. COBOL was an early language to be compiled on multiple architectures, in 1960.[2]

    See that caveat on A-0. It is not even what I (or people who understand this stuff) would call a computer language. Probably just a language for an FSM.

    People have their sacred cows.

  122. The most deplorable one [AKA "Fourth doorman of the apocalypse"] says:

    Indeed, by looking at the description of the A-0 System:

    The subroutines were identified by a numeric code and the arguments to the subroutines were written directly after each subroutine code. The A-0 system converted the specification into machine code that could be fed into the computer a second time to execute the said program.

    it can be seen that the A-0 compiler was little more than an assembler. There were plenty of those already.
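
    For what it’s worth, here is a toy sketch in C of the loader-like model that description suggests. Everything in it (the numeric codes, the routines, the little “program”) is invented for illustration and is not based on any actual A-0 listing; the point is just that the “compilation” step amounts to mapping numeric codes and their arguments onto a library of prebuilt subroutines and running them in order.

    #include <stdio.h>

    /* A toy "library" of prebuilt subroutines, indexed by numeric code.
       The codes and routines are invented; the real A-0 library apparently
       lived on tape. */
    typedef void (*routine)(double);

    static void op_load (double arg) { printf("load  %g\n", arg); }
    static void op_add  (double arg) { printf("add   %g\n", arg); }
    static void op_print(double arg) { (void)arg; printf("print\n"); }

    static routine library[] = { op_load, op_add, op_print };

    /* A "program" is just a list of (code, argument) pairs, as in the
       description above; translating it is little more than patching in
       the right library routine for each code. */
    struct step { int code; double arg; };

    int main(void) {
        struct step program[] = { {0, 2.0}, {1, 3.0}, {2, 0.0} };
        for (size_t i = 0; i < sizeof program / sizeof program[0]; i++)
            library[program[i].code](program[i].arg);
        return 0;
    }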

  123. anonymous says:

    “…it can be seen that the A-0 compiler was little more than an assembler. There were plenty of those already.”

    Take a look at the history of Flow-matic and how it led to Cobol (which for a very long time was the most widely used programming language):

    “FLOW-MATIC, originally known as B-0 (Business Language version 0), was the first English-like data processing language. It was developed for the UNIVAC I at Remington Rand under Grace Hopper during the period from 1955 until 1959. …

    …In late 1953 she proposed that data processing problems should be expressed using English keywords… In early 1955, she and her team… implemented a prototype.

    …FLOW-MATIC was the first programming language to express operations using English-like statements. …

    …FLOW-MATIC was the first system to distinctly separate the description of data from the operations on it. …

    …Flow-Matic was a major influence in the design of COBOL, since only it and its direct descendent AIMACO were in actual use at the time. Several elements of Flow-Matic were incorporated into COBOL.”

    Here’s the example chunk of code in the wikipedia article (not a complete program):

    (0) INPUT INVENTORY FILE-A PRICE FILE-B ; OUTPUT PRICED-INV FILE-C UNPRICED-INV
    FILE-D ; HSP D .

    (1) COMPARE PRODUCT-NO (A) WITH PRODUCT-NO (B) ; IF GREATER GO TO OPERATION 10 ;
    IF EQUAL GO TO OPERATION 5 ; OTHERWISE GO TO OPERATION 2 .

    (2) TRANSFER A TO D .

    (3) WRITE-ITEM D .

    (4) JUMP TO OPERATION 8 .

    (5) TRANSFER A TO C .

    (6) MOVE UNIT-PRICE (B) TO UNIT-PRICE (C) .

    (7) WRITE-ITEM C .

    (8) READ-ITEM A ; IF END OF DATA GO TO OPERATION 14 .

    (9) JUMP TO OPERATION 1 .

    (10) READ-ITEM B ; IF END OF DATA GO TO OPERATION 12 .

    (11) JUMP TO OPERATION 1 .

    (12) SET OPERATION 9 TO GO TO OPERATION 2 .

    (13) JUMP TO OPERATION 2 .

    (14) TEST PRODUCT-NO (B) AGAINST ZZZZZZZZZZZZ ; IF EQUAL GO TO OPERATION 16 ;
    OTHERWISE GO TO OPERATION 15 .

    (15) REWIND B .

    (16) CLOSE-OUT FILES C ; D .

    (17) STOP . (END)

    I think you might be missing how important languages oriented at what they used to call DP (data processing) were, as opposed to high-level assembly (system programming languages) or languages intended for scientific computing. Languages good for managing records and solving what we would today call database problems became the core of the computing industry. I think for maybe the first 30 years something like 80% to 90% of programming was done in Cobol.

    If a prototype of this was running in 1955, that probably beats Fortran (1956 or 1957).

    If you remain unconvinced, write a detailed paper about why you’re right and submit it to the IEEE Annals of the History of Computing. I don’t think you’d find anyone really cares about the sex of who did what, but there’s plenty of interest in the stuff that was done. This is very well plowed ground, though.
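
    For anyone who doesn’t read FLOW-MATIC, the listing above is the classic sorted two-file match: inventory records that find a matching price record go to the priced output, the rest go to the unpriced output. Here is roughly the same control flow sketched in C; the file names and one-record-per-line formats are invented just to make it concrete, and the end-of-file bookkeeping of operations (12) through (16) is simplified away.

    #include <stdio.h>
    #include <string.h>

    /* inventory.txt: one PRODUCT-NO per line (FILE-A).
       prices.txt: PRODUCT-NO and UNIT-PRICE per line (FILE-B).
       Both are assumed sorted by product number, as the listing assumes. */
    static int read_inv(FILE *f, char prod[13]) {
        return fscanf(f, "%12s", prod) == 1;
    }
    static int read_price(FILE *f, char prod[13], double *price) {
        return fscanf(f, "%12s %lf", prod, price) == 2;
    }

    int main(void) {
        FILE *inv      = fopen("inventory.txt", "r"); /* FILE-A */
        FILE *prc      = fopen("prices.txt", "r");    /* FILE-B */
        FILE *priced   = fopen("priced.txt", "w");    /* FILE-C */
        FILE *unpriced = fopen("unpriced.txt", "w");  /* FILE-D */
        if (!inv || !prc || !priced || !unpriced) return 1;

        char pa[13], pb[13];
        double price = 0.0;
        int have_a = read_inv(inv, pa);
        int have_b = read_price(prc, pb, &price);

        while (have_a) {
            int cmp = have_b ? strcmp(pa, pb) : -1;      /* prices exhausted: treat as unpriced */
            if (cmp > 0) {
                have_b = read_price(prc, pb, &price);    /* operation (10): advance the price file */
            } else if (cmp == 0) {
                fprintf(priced, "%s %.2f\n", pa, price); /* operations (5)-(7): priced inventory */
                have_a = read_inv(inv, pa);
            } else {
                fprintf(unpriced, "%s\n", pa);           /* operations (2)-(3): no price for this item */
                have_a = read_inv(inv, pa);
            }
        }
        return 0;
    }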

  124. anonymous says:

    “The subroutines were identified by a numeric code and the arguments to the subroutines were written directly after each subroutine code. The A-0 system converted the specification into machine code that could be fed into the computer a second time to execute the said program.

    …it can be seen that the A-0 compiler was little more than an assembler.”

    It’s not just generating binary from mnemonic codes, as with an assembler. It sounds more like a form of direct-threaded code or subroutine-threaded code, with the system providing both the code generator and the “interpreter” (run-time driver), complicated by the fact that there apparently was a large library (for its time) of routines on tape. Patching in the routines from tape, and generating fixups for them to presumably access their arguments, sounds like the “compilation” process. There have been plenty of compilers (in particular for small systems) that used similar run-time models; for instance, the old DEC FOR compiler on the PDP-11 was a popular and long-lasting example. Heck, Microsoft’s P-code, used in many of their early products and in Visual Basic, is another example.

    See threaded code: “So-called “subroutine-threaded code” (also “call-threaded code”) consists of a series of machine-language “call” instructions (or addresses of functions to “call”, as opposed to direct threading’s use of “jump”). Early compilers for ALGOL, Fortran, Cobol and some Forth systems often produced subroutine-threaded code.”
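
    If it helps make that concrete, here is a minimal sketch in C of the threaded-code shape described in that quote; the “words” and the little stack are invented for illustration. The “compiled” program is nothing but a list of routine addresses, and a trivial run-time driver walks through them; subroutine threading would instead emit an actual machine call instruction per address.

    #include <stdio.h>

    /* Each "word" is a routine; the compiled program is just an array of
       their addresses, walked by a tiny run-time driver (inner interpreter). */
    typedef void (*word)(void);

    static int stack[16];
    static int sp = 0;

    static void lit2(void) { stack[sp++] = 2; }
    static void lit3(void) { stack[sp++] = 3; }
    static void add (void) { int b = stack[--sp]; int a = stack[--sp]; stack[sp++] = a + b; }
    static void show(void) { printf("%d\n", stack[--sp]); }

    int main(void) {
        /* "Threaded code" for: push 2, push 3, add, print. */
        word program[] = { lit2, lit3, add, show };
        for (size_t i = 0; i < sizeof program / sizeof program[0]; i++)
            program[i]();   /* step to the next address and call it */
        return 0;
    }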

    Whatever it was, the A-2 version seems to have been widely used by 1953: “The A-2 system was developed at the UNIVAC division of Remington Rand in 1953 and released to customers by the end of that year. Customers were provided the source code for A-2 and invited to send their improvements back to UNIVAC.”

    The Cambridge Centre for Computing History puts it like this:

    “…The A-2 compiler was the first compiler to be used extensively, paving the way to the development of programming languages. …

    …she was met with the common opinion that a computer couldn’t write its own programs. It took two years for the compiler to become accepted…

    …” …nobody would touch it because, they carefully told me, computers could only do arithmetic; they could not do programs. It was a selling job to get people to try it. I think with any new idea, because people are allergic to change, you have to get out and sell the idea” …”

  125. The most deplorable one [AKA "Fourth doorman of the apocalypse"] says:
    @anonymous
    "...it can be seen that the A-0 compiler was little more than an assembler. There were plenty of those already."


    Take a look at the history of Flow-matic and how it led to Cobol (which for a very long time was the most widely used programming language):

    "FLOW-MATIC, originally known as B-0 (Business Language version 0), was the first English-like data processing language. It was developed for the UNIVAC I at Remington Rand under Grace Hopper during the period from 1955 until 1959. ...

    ...In late 1953 she proposed that data processing problems should be expressed using English keywords... In early 1955, she and her team... implemented a prototype.

    ...FLOW-MATIC was the first programming language to express operations using English-like statements. ...

    ...FLOW-MATIC was the first system to distinctly separate the description of data from the operations on it. ...

    ...Flow-Matic was a major influence in the design of COBOL, since only it and its direct descendent AIMACO were in actual use at the time. Several elements of Flow-Matic were incorporated into COBOL."

    Here's the example chunk of code in the wikipedia article (not a complete program):

    (0) INPUT INVENTORY FILE-A PRICE FILE-B ; OUTPUT PRICED-INV FILE-C UNPRICED-INV
    FILE-D ; HSP D .

    (1) COMPARE PRODUCT-NO (A) WITH PRODUCT-NO (B) ; IF GREATER GO TO OPERATION 10 ;
    IF EQUAL GO TO OPERATION 5 ; OTHERWISE GO TO OPERATION 2 .

    (2) TRANSFER A TO D .

    (3) WRITE-ITEM D .

    (4) JUMP TO OPERATION 8 .

    (5) TRANSFER A TO C .

    (6) MOVE UNIT-PRICE (B) TO UNIT-PRICE (C) .

    (7) WRITE-ITEM C .

    (8) READ-ITEM A ; IF END OF DATA GO TO OPERATION 14 .

    (9) JUMP TO OPERATION 1 .

    (10) READ-ITEM B ; IF END OF DATA GO TO OPERATION 12 .

    (11) JUMP TO OPERATION 1 .

    (12) SET OPERATION 9 TO GO TO OPERATION 2 .

    (13) JUMP TO OPERATION 2 .

    (14) TEST PRODUCT-NO (B) AGAINST ZZZZZZZZZZZZ ; IF EQUAL GO TO OPERATION 16 ;
    OTHERWISE GO TO OPERATION 15 .

    (15) REWIND B .

    (16) CLOSE-OUT FILES C ; D .

    (17) STOP . (END)

     

    I think you might be missing how important languages oriented at what they used to call DP (data processing) were, as opposed to high-level assembly (system programming languages) or languages intended for scientific computing. Languages good for managing records and solving for what we today would call database problems became the core of the computing industry. I think for maybe the first 30 years something like 80% to 90% of programming was done in Cobol.

    If a prototype of this was running in 1955, that probably beats Fortran (1956 or 1957).

    If you remain unconvinced, write a detailed paper about why you're right and submit it to the IEEE Annals of the History of Computing. I don't think you'd find anyone really cares about the sex of who did what, but there's plenty of interest in the stuff that was done. This is very well plowed ground, though.

    Take a look at the history of Flow-matic and how it led to Cobol (which for a very long time was the most widely used programming language):

    “FLOW-MATIC, originally known as B-0 (Business Language version 0), was the first English-like data processing language. It was developed for the UNIVAC I at Remington Rand under Grace Hopper during the period from 1955 until 1959. …

    Damn the torpedoes, full speed ahead!

    You showed another example of a high-level assembler. I used one of those back in the ’80s when writing code on Telenet packet switches. We were honest back then and called them high-level assemblers.

    Languages good for managing records and solving for what we today would call database problems became the core of the computing industry.

    Still trying to use changes in the meanings of words to assert that Hopper was teh super woman, I see.

    I think you have no understanding of the orders of magnitude difference between the sorts of reports they did then and what is done now, and indeed, you seem to have little understanding of what real system programming is.

  126. anonymous says:

    “I think you have no understanding of the orders of magnitude difference between the sorts of reports they did then and what is done now, and indeed, you seem to have little understanding of what real system programming is.”

    Sadly, if you did your first system programming back in the ’80s, I’ve probably got over a decade more system programming experience… a fair amount of DP experience on early databases as well… pre-relational CODASYL databases, databases written in Fortran for portability… did you ever use Informatics Mark IV? I believe it was the first commercial software product, period. It was basically a precursor to database systems (pre-CODASYL) that worked on files. It was kind of cool because everything that was done on it was done with paper forms, like IRS tax forms… Anyhow, I used it, so I can say I used the first commercial software product and know a fair amount about the types of reports that were done then and now. (I suppose it’s also sad that I’m still dealing with modern database-to-browser lashups… are we sure anyone has made any real progress in this business?) And I’ve done a lot of very heavily used OS code and kernel code, across decades. I’ve written assemblers. I’ve programmed in real high-level assemblers (ever used DEC’s SUPMAC?). Are you familiar with the difference between high-level assemblers and compilers that produce threaded code?

    I’m trying to figure out why you are so hostile to simple statements of fact about Grace Hopper. She was one of the first real practicing programmers in the world, because she worked with Aiken on the Mark I. Indeed, she wrote the manual for the Mark I. Is any of this deniable? She studied Babbage and Lovelace and called Lovelace the first programmer because she had studied her code. That doesn’t seem deniable. She developed the family of compilers that intellectually led to Cobol; this started with A-0 in 1951-52. This took years of work, and her early compiler work required selling the idea of programming “in English”, which was a hard sell at first (and which was one of her most steadfast positions). These compilers were widely used (for the time) and influential. Because of this work she was one of the two technical advisers to the project that defined Cobol. Flow-Matic and Cobol code don’t look all that obviously different; you can see the resemblance. Cobol appeared in 1959 and soon dominated DP. It didn’t materialize out of a vacuum.

    I really don’t see the problem.

    She was in the right place at the right time. No one is calling her a superwoman, just someone who played a key role in the evolution of software in the 20th century.

    You might be under the impression that her reputation was something created by re-writing history or to forward an agenda, as is all too common these days. I don’t think so. I saw her talk a couple of times, mostly at things like IEEE venues. In the early to mid-70s she was treated kind of like Linus Torvalds, and for the same reasons, I suppose.

  127. anonymous says:

    I have a half-baked theory about what happened to women in the computer industry around 1984.

    In the ’60s and ’70s women went into computing for reasons similar to why they went into accounting. Indeed, there was a fair amount of overlap between women in accounting and women in computing. Accounting was a stable 9-5 white-collar job, a good career path. In the 60s and most of the 70s the same was true of computing. To work in computing was almost by definition to work for a large stable company.

    There are still a lot of women in accounting:


    “Women are 49.4% of all auditors, accountants, and investment professionals in Canada.

    Women are 60.9% of all accountants and auditors in the United States.

    In a 2011 study, women were half of newly hired accounting graduates at CPA firms, and 40% of all CPAs.”

    With the arrival of microprocessors and the PC (say, around 1982), computing became the Wild West. All of a sudden everybody was in it for the big bucks. Pressure-cooker environments and 100-hour week “death marches” became common. One heard stories. A lot of smart and competitive guys who probably wouldn’t have entered the field before came in, maybe to some extent crowding out women. But the real issue was that the new environment was much higher risk than before and much less stable. Much more uncertainty. The expected “workpace” of a career in the field was no longer the same.

    So the women went back into accounting (and other similar professions).

  128. anonymous says:

    “Thanks for the links about SAGE and the Almaden AF Station.”

    There have been efforts by various government agencies to tear it down as a hazard. It’s kind of like Moffett Field’s Hangar One (the dirigible hangar); a long-time landmark in the Bay Area that everybody takes for granted and no government even wants to pay enough to tear it down, but now they seem to be getting around to it:

    “Mount Umunhum radar tower will stay standing — at least for five years”, Eric Kurhi, Mercury News, posted 17-Oct-2012, update 20-Oct-2012.

    There’s an effort to save it: the Umunhum Conservancy, “Dedicated to preserving and restoring Mt. Umunhum and its historic Radar Tower.”

    The radar antennas on top of what is now the Monolith (and the other similar radars that were part of SAGE) were the largest movable radar antennas ever made, I believe. They were the size of small radio telescopes.

    Here’s a pic of the Monolith with its antenna.

    It made the cover of a local phone book back in 1967.

  129. @anonymous

    “I have a half-baked theory about what happened to women in the computer industry around 1984.”

    Very good.

  130. @countenance
    The way I understand it, IBM was dragged into the PC era. IBM could have never invented or thought of the concept on its own, because it was an east coast corporation that was used to dealing with the culture of east coast corporations. The microcomputer (as it was once called) could have only been invented on the more individualist west coast.

    I’m not convinced that the 8080-based Altair micro was all that important. It was the playground and testbed for those Silicon Valley boys, but it didn’t turn into anything remotely commercial.

    DEC already had desktop computers in the late 60s, which were glorified word processors (i.e., computers used by women). IBM answered DEC with its first PC in 1975, but didn’t get behind it. After Apple had popularized its own answer to the DEC desktops, IBM finally saw a market and tried its second PC in 1981, which caught on.

  131. anonymous says:

    Here’s a story that gets to the heart of why women were right to leave programming when its status as a middle-class profession became questionable (“lock the boys up in the back room, throw them a pizza or burger every so often, and let them gut it out… promise those H1-Bs the opportunity to marry each other and have an anchor baby, and they’ll do just about anything to spend 5 years slaving before the mast…”):

    “Feds set to destroy H-1B records: Records that are critical to research and take up a microscopic amount of storage are set for deletion”, Patrick Thibodeau, Computerworld, Oct 27, 2014:

    “…In a notice posted last week, the U.S. Department of Labor said that records used for labor certification, whether in paper or electronic, “are temporary records and subject to destruction” after five years, under a new policy.

    There was no explanation for the change, and it is perplexing to researchers. The records under threat are called Labor Condition Applications (LCA), which identify the H-1B employer, worksite, the prevailing wage, and the wage paid to the worker.

    …”Throwing information away is anathema to the pursuit of knowledge and akin to willful stupidity or, worse, defacing Buddhist statues,” said…

    …”It undermines our ability to evaluate what the government does and, in today’s world, retaining electronic records like the LCA is next to costless…

    …A full year’s worth of LCA data is less than 1GB.


    …escaped notice until the Labor Department posted a note (See Oct. 17 note titled “H-1B legacy records no long available.”)

    …says he would not rule out the idea that industry lobbyists, unhappy with some of the research, are behind the limit on records.

    …The new record retention policy may add to the criticism the U.S. has already faced over its immigration record keeping. Researchers have faulted government records for being error filled, inconsistently available and difficult to work with.”

    From the article it appears that the agency that keeps records on H1-Bs may be deliberately way-behind-the-times (mostly paper records), if not wilfully incompetent.

    It sure sounds like the government is really working overtime on buying itself a new population… “Trust us, we’re from the government and we’re here to help.”

  132. anonymous says:

    Someone (Hi, Steve!) should look at the fertility of female H1-Bs, compared to native female US programmers (software engineers, call them whatever you will).

    Of course, it is becoming more apparent that the data might be hard to come by. That man behind the curtain sure seems to have a lot of curtains he’s trying to wrap us all in.

  133. @AlphaMaleBrogrammer
    I really need to do some research on this question, because I keep hearing this dubious story about the supposed golden age of female programmers and I want to know whether there's any truth to it.

    Were these female "programmers" really doing software engineering as we think of it today? Did they understand algorithms and data structures, CPU architecture, how operating systems work, etc.? Were they doing low level system programming and/or developing complex programs using higher level, abstract languages like Java?

    Or were they doing what would today be called "scripting" using languages like COBOL?

    There's a world of difference between someone with a CS degree from Stanford who works on Google's search engine, or develops embedded software for fighter jets, compared to someone who writes Excel macros or HTML and can't explain the difference between an array and a linked list. They might both call themselves "programmers" however, and technically they'd both be correct.

    I suspect that these female programmers back in the day were far closer to the latter. If someone can shed some light on this I'd greatly appreciate it.

    In the late 1960s and early 1970s, this female worked in a data processing center at a major bank. Basically, we used the IBM 402 keypunch card readers. At that time, credit cards and some corporate & government checks were keypunch cards. The IBM 402 was a HARD-wired device, which the operator programmed. The “mainframe” we used was an IBM 360 with an IBM 1419 MICR reader. Most of it was programmed in FORTRAN, which (unlike most of these posts) was behind the “new” or existing systems at a root level. This was a three-step process: first, the logic would be written by the programming group. Then their handwritten program was sent to the keypunchers, who would keypunch the cards. Then they would take the cards to the operators, who also had an IBM 402, which then transferred the program to the IBM 360 through sub-floor wiring.

    Once you understand that a PC does not really stand alone, then you can understand that its roots are STILL in the original programming, which in its time was more (not less) complicated. I built my own first PC and I programmed it myself because I had a background in FORTRAN. I would eventually co-own a telecom company and helped to merge a token ring and an Ethernet network, which both IBM and Nortel said could not be done, but then the software people didn’t ask the hardware people how telephones transmit data through a T1 carrier.

    Considering that process, females dominated the computer field. But you have to take into account the big picture here, or what happened with each revision of the technology itself and what role that revision played in the next revision… not only in software, but also hardware. Inside, they only got smaller and faster. They still require the same basic logic skills to program.

    For example, I just worked on a conversion for healthcare. The conversion goes from FORTRAN to a user-friendly newer system with color and pictures (neither necessary for the job it is to accomplish). However, the conversion works through a separate translator program. The ROOT is still the FORTRAN-programmed database. So, basically, it is NOT more complicated or difficult. They are just making it difficult by adding unnecessary bells and whistles, instead of just doing another revision of the FORTRAN to increase allocated memory and add the connectivity.

  134. @AlphaMaleBrogrammer

    PS: And to directly answer your question… YES, females were programmers and often did the most complex of the analytic thinking. And that was before Texas Instruments introduced the “financial calculator”, which did NOT have the logic to query (or search) for accounts with overdrafts and/or negative balances to send out specific letters. An IBM 402 was a glorified auto-read calculator. The IBM 360 was a full-blown and multi-tasked LOGIC system. It didn’t “search” on the internet, but it did multi-task and search the database to insert data into another program, which could be a range of dates, times, names, etc. Just because it now does a query online does not mean it did not exist in those early computer days that were not as user-friendly.


Comments are closed.
