What Happened To Women In Computer Science? 

For decades, the number of women in computer science grew faster than the number of men — until 1984. At that point, the percentage of women began to plunge (even as the share of women in fields like mechanical engineering, math and physics kept rising).

So what happened? What was going on in 1984?

NPR’s Planet Money tried to untangle this question, and the answer is complex.

One of the big changes to happen around 1984 was the introduction of small personal computers into the home. Early computers weren’t much more than toys (think Pong and Space Invaders), and they were marketed almost exclusively to boys.

In the 1990s, UCLA researcher Jane Margolis interviewed hundreds of computer science students at Carnegie Mellon University, which had one of the best programs in the country. She found that families were much more likely to buy computers for boys than for girls — even when the girls were the ones who were interested in computers. 

The pattern was pretty consistent. One student told a story of having to ask her brother for the key to use the computer because it was actually locked away from her in his room. This may be an extreme example, but Margolis never heard the reverse — no stories of boys having to go into their sister’s room to use the computer.

This was a big deal when those kids went to college. As personal computers became more common, computer science professors increasingly assumed that their students had grown up playing with computers.

By the mid-1990s, the Carnegie Mellon computer science program was 93% men. Half the women who entered the program ended up quitting. As Margolis explains:

“Because if you’re in a culture that is so infused with this belief that men are just better at this and they fit in better — a lot can shake your confidence. You can be sitting next to a male student who could say, ‘You don’t know that? …And you’re a computer science major?’” 

And these types of slights add up.

In her research, Margolis discovered that a lot of the women who were dropping out were great at computer science — more than half were on the dean’s list.

So how do we get women back into computer science?

Margolis did her research with Allan Fisher, the Dean of the Computer Science program at Carnegie Mellon. The two ended up using what they had learned to make adjustments to the program.

They paid a lot more attention to teaching and added an intro course for students who didn’t have a lot of informal computer science experience.

And it worked. In five years, they turned the program around: 42% of computer science students were women, and the dropout rate was the same for men and women.

Top Image: Planet Money

Bottom Image: Two women operating the ENIAC’s main control panel

IBM makes huge quantum computing advance

A new study published in Nature details how IBM researchers have found a way to detect the errors that have been holding back quantum computing. Where a normal computer uses bits to represent data as a 1 or a 0, a quantum ‘qubit’ can be 1, 0, or both at once — a state known as a superposition. The problem so far has been that a qubit in superposition can suddenly flip to being a plain 1 or 0 (a bit flip), or another type of error can occur, known as a phase flip.
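The two error types can be made concrete with a small sketch (my own toy illustration in Python/NumPy, not anything from the IBM study): a qubit’s state is a pair of amplitudes, a bit flip is the Pauli X matrix, and a phase flip is the Pauli Z matrix.

```python
import numpy as np

# A qubit state is a pair of amplitudes for the 0 and 1 outcomes.
# An equal superposition of 0 and 1:
plus = np.array([1, 1]) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]])   # bit flip: swaps the 0 and 1 amplitudes
Z = np.array([[1, 0], [0, -1]])  # phase flip: negates the 1 amplitude

bit_flipped = X @ plus    # same vector here, since both amplitudes are equal
phase_flipped = Z @ plus  # (1, -1)/sqrt(2): a genuinely different state

# Yet the measured 0/1 probabilities (|amplitude|^2) are identical,
# which is why a phase flip can't be caught by simply reading the bit.
probs_plus = np.abs(plus) ** 2            # both entries 0.5 (up to rounding)
probs_phase = np.abs(phase_flipped) ** 2  # likewise both 0.5
```

The point of the sketch: a phase flip leaves the 0/1 probabilities untouched, so detecting it requires a different kind of check than the one that catches bit flips.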

Until now it has only been possible to detect each type of error on its own, but not both at the same time without disturbing the calculation. The IBM team has shown that two independent measurement qubits can reveal both kinds of error on two other qubits that are being used to process data, without reading out the data itself.
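The “two checks at once” idea can also be sketched numerically (again a toy illustration of the general principle, not the IBM implementation): put two data qubits in a state whose Z-type and X-type parity checks both read +1, then note that a bit flip trips only the first check and a phase flip only the second.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])   # bit flip
Z = np.array([[1, 0], [0, -1]])  # phase flip

# Two data qubits in the Bell state (|00> + |11>)/sqrt(2);
# basis order is |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

ZZ = np.kron(Z, Z)  # what a Z-type parity check measures
XX = np.kron(X, X)  # what an X-type parity check measures

def parity(state, check):
    """Expectation value <state|check|state>: +1 means 'no error seen'."""
    return float(state.conj() @ check @ state)

# A bit flip on the first qubit trips the ZZ check but not the XX check;
# a phase flip does the opposite. Together the two checks identify the
# error type without revealing the data the qubits hold.
bit_err = np.kron(X, I2) @ bell
phase_err = np.kron(Z, I2) @ bell
```

Running both checks on `bit_err` gives (ZZ, XX) = (-1, +1), while `phase_err` gives (+1, -1): each error type flips exactly one of the two parity checks.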

Internet translation (with cats instead of qubits):

If I have two cats resting with their eyes closed, they could be either asleep, awake, or just resting with their eyes shut. To complicate things further, they could also be in the process of waking up or going to sleep, which upsets our calculations if we’re using them for a school science fair project called ‘how many hours a day do cats sleep’. If we went and checked either cat we’d probably get an idea of what they were up to, but we’d definitely wake them up (and risk a scratch to the face).

IBM has just published a study showing how they can use two magical ‘measurement cats’ that are linked to the ‘data cats’ in our science fair project. One measurement cat shows when the data cats are just waking up, and one measurement cat shows when the data cats are just going to sleep. By combining all of this data, they’ve just won the science fair.


PetPix Stereo

Hackaday covers a project by Michael Hill at VCF East X that takes an old Commodore PET computer and manages to produce a stereoscopic display: PETSCII VR!

What would happen if Oculus-quality virtual reality was created in the 80s on the Commodore PET? [Michael Hill] knows, because he created a stereoscopic video headset using a PET.

… This year, he’s doubling the number of screens, and sending everything to two iPhones in a Google Cardboard-like VR headset. Apart from the optics, the setup is pretty simple: cameras get image data, it’s sent over to a PET, and a stream of characters are sent back.

More Here

The only thing stronger than your imagination is your imagination connected to the billions of other imaginations all over the world, connected to smart machines that continue to get smarter, faster.
—  Rita King
Year of Light: Computer processor based on the brain uses light to transmit signals

Traditional computers manipulate electrons to transform our keystrokes and Google searches into meaningful actions. But as parts of the computer processor shrink to only a few atoms across, electrons become unpredictable and our ability to shuttle them across long and short distances diminishes.

The Lightwave Communications Laboratory at Princeton University is trying a different approach to designing ultrafast communication networks and signal processing devices. Instead of electrons, they use light!

Keep reading

What if all governmental agencies switched to Linux and OpenOffice tomorrow? What if the military paid a hundred of its own IT guys to design a more reliable intelligence tracking system that didn’t require disclosing access parameters to outside developers? Suddenly, Microsoft’s make-work vanishes. No longer does the overgrown autistic boy with the propeller beanie get paid a billion dollars a month to bag groceries as part of a special program to make Bill and Melinda feel like contributing members of society. The great 20th and 21st century Computer Trust breaks apart, once the government’s communist software wing stops standardizing all professional operating systems under the Microsoft method.
—  High Arka

Green computer server heats homes for free

The most ecologically sound forms of green energy are those which would otherwise have been the waste product of another process. We have already seen start-up Bio-Bean turning used coffee grounds into energy to power London’s coffee shops and LucidPipes harvesting green energy from Portland’s water pipes. Now, Netherlands-based Nerdalize is using the heat produced from computer servers to warm the homes they are installed in. READ MORE…

“…Now it’s computers and more computers and soon everybody will have one. 3 year olds will have computers and everybody will know everything about everybody else long before they meet them and so they won’t want to meet them. Nobody will want to meet anybody else ever again and everybody will be a recluse like I am now.”

An excerpt from one of Charles Bukowski’s poems, regarding media convergence, written several years before the advent of social networks and handheld telecommunication devices. Bukowski received a Macintosh IIsi computer and a laser printer from his wife, Linda, as Christmas presents in 1990.