Bots on Amazon’s task service may be souring psych studies

It's not uncommon for scientists to use Amazon's Mechanical Turk for research — they can quickly gather survey data in return for a small outlay to participants. Academics are quickly discovering the limits of the task service with the modern bot-ha…

MIT can secure cloud-based AI without slowing it down

It's rather important to secure cloud-based AI systems, especially when they use sensitive data like photos or medical records. To date, though, that hasn't been very practical — encrypting the data can render machine learning systems so slow…

VR optics could help old folks keep the world in focus

The complex optics involved with putting a screen an inch away from the eye in VR headsets could make for smartglasses that correct for vision problems. These prototype “autofocals” from Stanford researchers use depth sensing and gaze tracking to bring the world into focus when someone lacks the ability to do it on their own.

I talked with lead researcher Nitish Padmanaban at SIGGRAPH in Vancouver, where he and the others on his team were showing off the latest version of the system. It’s meant, he explained, to be a better solution to the problem of presbyopia, which is basically when your eyes refuse to focus on close-up objects. It happens to millions of people as they age, even people with otherwise excellent vision.

There are, of course, bifocals and progressive lenses that bend light in such a way as to bring such objects into focus — purely optical solutions, and cheap as well, but inflexible, and they only provide a small “viewport” through which to view the world. And there are adjustable-lens glasses as well, but they must be adjusted slowly and manually with a dial on the side. What if you could make the whole lens change shape automatically, depending on the user’s need, in real time?

That’s what Padmanaban and colleagues Robert Konrad and Gordon Wetzstein are working on, and although the current prototype is obviously far too bulky and limited for actual deployment, the concept seems totally sound.

Padmanaban previously worked in VR, and mentioned what’s called the convergence-accommodation problem. Basically, the way our eyes converge and refocus when we shift our attention from far to near in real life doesn’t happen properly (if at all) in VR, and that can produce pain and nausea. Having lenses that automatically adjust based on where you’re looking would be useful there — and indeed some VR developers were showing off just that only 10 feet away. But it could also apply to people who are unable to focus on nearby objects in the real world, Padmanaban thought.

This is an old prototype, but you get the idea.

It works like this. A depth sensor on the glasses collects a basic view of the scene in front of the person: a newspaper is 14 inches away, a table three feet away, the rest of the room considerably more. Then an eye-tracking system checks where the user is currently looking and cross-references that with the depth map.

Having been equipped with the specifics of the user’s vision problem, for instance that they have trouble focusing on objects closer than 20 inches away, the apparatus can then make an intelligent decision as to whether and how to adjust the lenses of the glasses.

In the case above, if the user is looking at the table or the rest of the room, the glasses assume whatever normal correction the person requires to see — perhaps none. But if they shift their gaze to focus on the paper, the glasses immediately adjust the lenses (perhaps independently per eye) to bring that object into focus in a way that doesn’t strain the person’s eyes.
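
To make that decision step concrete, here is a minimal sketch of the sense-decide-adjust loop in Python. Everything in it (the function names, the 20-inch near limit, the thin-lens diopter math) is an illustrative assumption, not code or figures from the Stanford prototype.

```python
NEAR_LIMIT_M = 0.5  # assumption: the user can't focus closer than ~20 inches

def extra_power_needed(distance_m: float) -> float:
    """Lens power (in diopters) to supplement the user's own focusing.

    Thin-lens approximation: focusing at distance d takes 1/d diopters,
    and the glasses supply whatever the eye can't manage on its own.
    """
    if distance_m >= NEAR_LIMIT_M:
        return 0.0  # within the user's own range: baseline correction only
    return 1.0 / distance_m - 1.0 / NEAR_LIMIT_M

def update_lenses(depth_map, gaze_xy, set_lens_power):
    """One iteration of the loop; the whole cycle runs in about 150 ms."""
    distance_m = depth_map[gaze_xy]          # 1. what is the user looking at?
    power = extra_power_needed(distance_m)   # 2. how much help is needed there?
    set_lens_power(power)                    # 3. drive the adjustable lenses
```

Fed a gaze point that lands on the newspaper at 14 inches (about 0.36 m), the sketch asks for roughly 0.8 diopters of extra power; pointed at the table three feet away, it asks for none.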

The whole process of checking the gaze, reading the depth of the selected object and adjusting the lenses takes a total of about 150 milliseconds. That’s long enough that the user might notice it happening, but the whole act of redirecting and refocusing one’s gaze takes perhaps three or four times that long — so the changes in the device will be complete by the time the user’s eyes would normally be at rest again.
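
A quick back-of-the-envelope check of that timing budget (only the 150-millisecond cycle comes from the researchers; the rest is arithmetic):

```python
DEVICE_CYCLE_MS = 150  # the prototype's full sense-adjust cycle
refocus_window_ms = (3 * DEVICE_CYCLE_MS, 4 * DEVICE_CYCLE_MS)
print(refocus_window_ms)  # (450, 600): the lenses settle before the eyes do
```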

“Even with an early prototype, the Autofocals are comparable to and sometimes better than traditional correction,” reads a short summary of the research published for SIGGRAPH. “Furthermore, the ‘natural’ operation of the Autofocals makes them usable on first wear.”

The team is currently conducting tests to measure more quantitatively the improvements derived from this system, and to test for any possible ill effects, glitches or other complaints. They’re a long way from commercialization, but Padmanaban suggested that some manufacturers are already looking into this type of method and that, despite its early stage, it’s highly promising. We can expect to hear more from them when the full paper is published.

Material with eagle-like grasp could lead to grippier robot hands

Most materials bulge out when you squeeze them, pushing the energy outside. But that's not always what you want — wouldn't it sometimes be better for them to collapse and hold the energy inside? These exotic materials (known as auxetics) already e…

NASA launches Parker Solar Probe in mission to ‘touch’ the Sun

After a few delays, the Sun-chasing Parker Solar Probe is on its way. NASA launched the spacecraft aboard a ULA Delta IV Heavy rocket at 3:31 AM Eastern this morning (August 12th) and confirmed that the vessel was healthy at 5:33 AM. The probe still…

NASA’s Parker Solar Probe launches tonight to ‘touch the sun’

NASA’s ambitious mission to go closer to the Sun than ever before is set to launch in the small hours between Friday and Saturday — at 3:53 AM Eastern from Kennedy Space Center in Florida, to be precise. The Parker Solar Probe, after a handful of gravity assists and preliminary orbits, will enter a stable orbit around the enormous nuclear fireball that gives us all life and sample its radiation from less than 4 million miles away. Believe me, you don’t want to get much closer than that.

If you’re up late tonight (technically tomorrow morning), you can watch the launch live on NASA’s stream.

This is the first mission named after a living researcher, in this case Eugene Parker, who in the ’50s advanced a number of proposals and theories about the way stars give off energy. He’s the guy who gave us the solar wind, and his research was hugely influential in the study of the sun and other stars — but it’s only now that some of his hypotheses can be tested directly. (Parker himself visited the craft during its construction, and will be at the launch. No doubt he is immensely proud and excited about this whole situation.)

“Directly” means going as close to the sun as technology allows — which leads us to the PSP’s first major innovation: its heat shield, or thermal protection system.

There’s one good thing to be said for the heat near the sun: it’s a dry heat. Because there’s no water vapor or other gas in space to heat up, find some shade and you’ll be quite comfortable. So the probe is essentially carrying the most heavy-duty parasol ever created.

It’s a sort of carbon sandwich, with superheated carbon composite on the outside and a carbon foam core. Altogether it’s less than a foot thick, but it reduces the temperature the probe’s instruments are subjected to from 2,500 degrees Fahrenheit to 85 — actually cooler than it is in much of the U.S. right now.

Go on – it’s quite cool.

The car-sized Parker will orbit the sun and constantly rotate itself so the heat shield is facing inward and blocking the brunt of the solar radiation. The instruments mostly sit behind it in a big insulated bundle.

And such instruments! There are three major experiments or instrument sets on the probe.

WISPR (Wide-Field Imager for Parker Solar Probe) is a pair of wide-field telescopes that will watch and image the structure of the corona and solar wind. This is the kind of observation we’ve made before — but never from up close. We generally see these phenomena from the neighborhood of the Earth, nearly 100 million miles away. You can imagine that cutting out 90 million miles of cosmic dust, interfering radiation and other nuisances will produce an amazingly clear picture.

SWEAP (Solar Wind Electrons Alphas and Protons investigation) looks out to the side of the craft to watch the flows of electrons as they are affected by solar wind and other factors. And on the front is the Solar Probe Cup (I suspect this is a reference to the Ray Bradbury story, “The Golden Apples of the Sun”), which is exposed to the full strength of the sun’s radiation; a tiny opening allows charged particles in, and by tracking how they pass through a series of charged windows, the instrument can sort them by type and energy.
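
For the curious, here is a rough sketch of that sorting principle, which resembles what's usually called a retarding potential analyzer: a particle only clears a grid held at voltage V if its kinetic energy per unit charge exceeds V. The function names, voltages and particle list are hypothetical, not anything from the actual instrument.

```python
def passes_grid(energy_ev: float, charge_e: int, grid_volts: float) -> bool:
    """True if a particle (kinetic energy in eV, charge in units of e)
    clears a retarding grid held at grid_volts."""
    return energy_ev / charge_e > grid_volts

def sweep_counts(particles, grid_voltages):
    """Count the particles that pass at each grid voltage; differencing
    successive counts bins the population by energy."""
    return [sum(1 for e, q in particles if passes_grid(e, q, v))
            for v in grid_voltages]

# Protons (charge 1) at 250, 750 and 1,500 eV against a rising sweep:
print(sweep_counts([(250, 1), (750, 1), (1500, 1)], [100, 500, 1000]))
# -> [3, 2, 1]: each voltage step filters out the slower particles
```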

FIELDS is another instrument set that takes the full heat of the sun. Its antennas are the ones sticking out from the sides — they have to be exposed in order to directly sample the electric field surrounding the craft. A set of “fluxgate magnetometers,” clearly a made-up name, measures the magnetic field at an incredibly high rate: two million samples per second.

They’re all powered by solar panels, which seems obvious, but actually it’s a difficult proposition to keep the panels from overloading that close to the sun. They hide behind the shield and just peek out at an oblique angle, so only a fraction of the radiation hits them.
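
The geometry behind that oblique angle fits in a couple of lines: the flux a flat panel intercepts scales with the cosine of the angle between its surface normal and the incoming sunlight. The 80-degree tilt below is an assumed illustration, not a mission parameter.

```python
import math

def flux_fraction(incidence_deg: float) -> float:
    """Fraction of head-on solar flux a flat panel intercepts when
    sunlight arrives incidence_deg away from the panel's normal."""
    return math.cos(math.radians(incidence_deg))

print(f"{flux_fraction(80):.2f}")  # 0.17: a steep tilt cuts the flux about sixfold
```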

Even then, they’ll get so hot that the team needed to implement the first-ever active water cooling system on a spacecraft. Water is pumped through the cells and back behind the shield, where it is cooled by, well, space.

The probe’s mission profile is a complicated one. After escaping the clutches of the Earth, it will swing by Venus, not to get a gravity boost, but “almost like doing a little handbrake turn,” as one official described it. The maneuver slows the probe down and sends it closer to the sun — and it’ll repeat it over seven Venus flybys in all, each pass bringing it closer and closer to the sun’s surface, ultimately arriving in a stable orbit 3.83 million miles above the surface — that’s 95 percent of the way from the Earth to the sun.
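
Those two figures agree with each other, taking the Earth-sun distance as roughly 93 million miles:

```python
EARTH_SUN_MILES = 93e6       # mean Earth-sun distance, approximately
PERIHELION_MILES = 3.83e6    # the final orbit, per the figure above

print(f"{(EARTH_SUN_MILES - PERIHELION_MILES) / EARTH_SUN_MILES:.1%}")
# -> 95.9%, i.e. "95 percent of the way" as stated
```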

On the way it will hit a top speed of 430,000 miles per hour, which will make it the fastest spacecraft ever launched.

Parker will make 24 total passes through the corona, and during these times communication with Earth may be interrupted or impractical. If a solar cell is overheating, do you want to wait 20 minutes for a decision from NASA on whether to pull it back? No. This close to the sun, even a slight miscalculation could reduce the probe to a cinder, so the team has imbued it with more than the usual autonomy.
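
That 20-minute figure is easy to sanity-check: even at light speed, a command round trip between the sun's neighborhood and Earth eats up most of it before anyone on the ground so much as looks at the telemetry.

```python
SPEED_OF_LIGHT_MILES_S = 186_282   # miles per second
EARTH_SUN_MILES = 93e6

one_way_min = EARTH_SUN_MILES / SPEED_OF_LIGHT_MILES_S / 60
print(f"one way: {one_way_min:.1f} min, round trip: {2 * one_way_min:.1f} min")
# -> one way: 8.3 min, round trip: 16.6 min, plus decision time on the ground
```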

It’s covered in sensors in addition to its instruments, and an onboard AI will be empowered to make decisions to rectify anomalies. That sounds worryingly like a HAL 9000 situation, but there are no humans on board to kill, so it’s probably okay.

The mission is scheduled to last seven years, after which time the fuel used to correct the craft’s orbit and orientation is expected to run out. At that point it will continue as long as it can before drift causes it to break apart and, one rather hopes, become part of the sun’s corona itself.

The Parker Solar Probe is scheduled for launch early Saturday morning, and we’ll update this post when it takes off successfully or, as is possible, is delayed until a later date in the launch window.

NASA puts $44 million toward cryogenics and mid-air spacecraft retrieval

NASA has announced a set of public-private partnerships with several U.S. space companies, totaling an impressive $44 million. Blue Origin, Astrobotic Technology, United Launch Alliance and more are the recipients of up to $10 million each for a variety of projects aimed at exploring and utilizing space safely and efficiently.

The 10 awards are for “tipping point” technologies, as NASA calls them, that are highly promising but need funding for a ground or flight demonstration — in other words, to get them out of the lab.

ULA is the big winner here, taking home $13.9 million split among three projects — $10 million will go toward a cryogenic vehicle fluid management system that could simplify and improve lunar landers. The rest of the money is split between another cryogenic fluid project for long-duration missions, and a project to “demonstrate mid-air retrieval capabilities up to 8,000 pounds… on a vehicle returning to Earth from orbital velocity.” Really, that last one is the cheapest?

Blue Origin has $13 million coming its way, primarily for… yet another cryogenic fluid management system for lunar landers. You can see where NASA’s priorities are — putting boots on the regolith. The remainder goes to testing a suite of advanced sensors that could make lunar landings easier. The company will be testing both these systems on its New Shepard vehicle from as high as 100 km.

The other big $10 million prize goes to Astrobotic Technology, which will, like Blue Origin, be working on a sensor suite for Terrain Relative Navigation. It’s basically adding intelligence to a craft’s landing apparatus so it can autonomously change its touchdown location, implement safety measures and so on, based on the actual local observed conditions.
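
As a toy illustration of that idea, and only that (real TRN matches descent imagery against onboard maps; this grid and its scoring are purely hypothetical), the selection step might look like picking the least hazardous cell in an observed map:

```python
def safest_site(hazard_map):
    """Return (row, col) of the lowest-hazard cell in a 2D grid of
    scores (0 = flat and clear, 1 = rocks, slopes, shadows)."""
    return min(
        ((r, c) for r, row in enumerate(hazard_map) for c, _ in enumerate(row)),
        key=lambda rc: hazard_map[rc[0]][rc[1]],
    )

# The planned center cell looks risky, so the lander diverts to (0, 2).
print(safest_site([[0.9, 0.7, 0.1],
                   [0.8, 0.5, 0.4],
                   [0.9, 0.9, 0.6]]))
```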

The Mars 2020 Rover will be using its own TRN system, and the ones funded here will be different and presumably more advanced, but this gif from NASA does a good job illustrating the tech.

Several other endeavors were selected by NASA for funding, and you can find them — and more technical details for the ones mentioned above — at the partnership announcement page.

Your smartphone is a ‘security blanket’ for your social life

Do you instinctively reach for your smartphone whenever you're in an awkward chat with unfamiliar people, or just feel like you're the odd one out? There's likely a good reason for it. Researchers have published a study indicating that people treat…