News & Updates

Airport security line. (credit: TSA)

Congressional Republicans want to impose “net neutrality” rules that allow Internet service providers to charge online services and websites for priority access to consumers. Making the case for paid prioritization Tuesday, US Rep. Marsha Blackburn (R-Tenn.) said that paying for priority access would be similar to enrolling in TSA Precheck.

“In real life, all sorts of interactions are prioritized every day,” Blackburn said in her opening statement at a subcommittee hearing on paid prioritization. Blackburn continued:

Many of you sitting in this room right now paid a line-sitter to get priority access to this hearing. In fact, it is commonplace for the government itself to offer priority access to services. If you have ever used Priority Mail, you know this to be the case. And what about TSA Precheck? It just might have saved you time as you traveled here today. If you define paid prioritization as simply the act of paying to get your own content in front of the consumer faster, prioritized ads or sponsored content are the basis of many business models online, as many of our members pointed out at the Facebook hearing last week.

Dividing online services into those that pay for TSA Precheck-like priority access and those that don’t might not appeal to consumers, though. While TSA Precheck lets travelers zoom through security, everyone else is stuck in a long, slow-moving line and met with frequent obstacles. If anything, the comparison lends credence to the pro-net neutrality argument that allowing paid fast lanes would necessarily push all other online services into “slow lanes.”

Source: http://feeds.arstechnica.com/arstechnica/index/

They were big, but we showed up, and they’re now gone. (credit: Mauricio Anton)

When the first modern humans ventured beyond Africa during the late Pleistocene, roughly 120,000 years ago, they stepped into a world filled with giants: the six-ton giant ground sloth in South America, the two- to three-ton woolly rhino in Europe and northern Asia, the 350- to 620-pound sabertooth cat in North America, and the six-ton woolly mammoth in Eurasia and North America. It’s hard to imagine a world filled with animals that large. The giants of the Pleistocene quickly vanished, and the animals that survived were generally a half to a third the size of those that went extinct. A new study indicates that the late Pleistocene decrease in mammal size coincided with the geographical spread of humans around the world—and the authors say that’s not just happenstance.

Human involvement in the disappearance of the Pleistocene megafauna is still the subject of intense debate, but this is hardly the first time we’ve been implicated. To provide a different perspective on these extinctions, a team of biologists led by Felisa Smith of the University of New Mexico, Albuquerque, decided to look for changes in the pattern of extinctions since the beginning of the Cenozoic era, 65 million years ago—the end of the dinosaurs and the beginning of the rise of mammals. Species go extinct all the time at a steady background rate of about one to five species per year. If that rate, or the kinds of animals dying off, changed after humans started colonizing the world beyond Africa, that could imply we had something to do with it.

The biologists examined two large datasets. One listed the global distribution and body size of animal species in the late Pleistocene and Holocene, starting 125,000 years ago. The other listed similar information for species spanning the whole Cenozoic. Starting at around 125,000 years ago, the datasets traced a decrease in both the mean and the maximum body size of mammals on every continent, coinciding with the spread of humans into each region. Wherever humans went, mammals got smaller, and big ones tended to die off.

Source: http://feeds.arstechnica.com/arstechnica/index/

No, no you can’t. (credit: Nathan Mattise)

Google’s App Engine may not have been designed to provide a way for developers to evade censors, but for the past few years it has offered one, via a technique known as domain fronting. By wrapping communications to a service with a request to an otherwise innocuous domain or IP address range such as Google’s, application developers can conceal requests to domains otherwise blocked by state or corporate censors. It’s a method that has been used both for good and ill—adopted by Signal, the anti-Chinese censorship service GreatFire.org, plugins for the Tor anonymizing network, some virtual private network providers, and by an alleged Russian state-funded malware campaign to obfuscate Tor-based data theft.
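
To make the mechanics concrete, here is a minimal sketch of the trick in Python. The hostnames are hypothetical, and this is not the client code of Signal or any of the services above; the point is only that the DNS lookup and TLS handshake a censor can observe name the innocuous front domain, while the real destination appears solely in the Host header inside the encrypted tunnel:

```python
import requests

# Hypothetical hostnames, for illustration only.
FRONT = "www.google.com"            # what the censor sees: DNS query + TLS SNI
HIDDEN = "blocked-app.appspot.com"  # the real backend, named only inside TLS

# The URL targets the front domain, so the connection and certificate check
# (and anything an on-path observer logs) involve FRONT. The Host header,
# carried inside the encrypted channel, asks the provider's edge servers to
# route the request to HIDDEN instead -- the routing quirk Google has now closed.
response = requests.get(f"https://{FRONT}/", headers={"Host": HIDDEN}, timeout=10)
print(response.status_code)
```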

But on April 13, members of the Tor Project noticed that domain fronting had stopped working. According to a report by The Verge’s Russell Brandom, the reason is a change to Google’s network architecture that had been in the works for some time. A Google representative told Brandom that domain fronting had never been officially supported by Google and that it only worked until last week “because of a quirk of our software stack… as part of a planned software update, domain fronting no longer works. We don’t have any plans to offer it as a feature.”

Ars attempted to contact Google but had received no response as of press time. [Update, 4:40 PM EDT: Google sent us the same statement it gave Brandom in response to our query.]

Source: http://feeds.arstechnica.com/arstechnica/index/

(credit: Wellness GM)

A drug that treats a variety of white blood cell cancers typically costs about $148,000 a year, and doctors can customize and quickly adjust dosing by changing how many small-dose pills patients take each day—generally up to four. At least, that was the case until now.

Last year, doctors presented results from a small pilot trial hinting that smaller doses could work just as well as the larger ones—dropping patients from three pills a day to just one. Taking just one pill a day could dramatically reduce costs, to around $50,000 a year. And it could lessen unpleasant side effects, such as diarrhea, muscle and bone pain, and tiredness. But just as doctors were gearing up for more trials of the lower dosages, the drug’s makers revealed plans that torpedoed those efforts: they were tripling the price of the drug and changing pill dosages.

The drug, ibrutinib (brand name Imbruvica), typically came in 140mg capsules, and patients took anywhere from 140mg to 560mg per day depending on their cancer and individual medical situation. (There were also 70mg capsules for patients on certain treatment combinations or with liver complications.) The pills treat a variety of cancers involving a type of white blood cell called B cells, including mantle cell lymphoma, for which the approved dose is four 140mg pills per day, and chronic lymphocytic leukemia, approved at three 140mg pills per day. Each 140mg pill costs somewhere around $133—for now.
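
The arithmetic behind those figures is easy to check. A quick back-of-the-envelope sketch, using the article’s rough per-pill price:

```python
# Back-of-the-envelope check of the article's numbers (approximate prices).
PRICE_PER_PILL = 133   # rough cost of one 140mg pill, per the article
DAYS_PER_YEAR = 365

for pills_per_day in (3, 1):
    annual = PRICE_PER_PILL * pills_per_day * DAYS_PER_YEAR
    print(f"{pills_per_day} pill(s)/day: ~${annual:,}/year")

# 3 pill(s)/day: ~$145,635/year  -> close to the quoted ~$148,000
# 1 pill(s)/day: ~$48,545/year   -> close to the quoted ~$50,000
```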

Source: http://feeds.arstechnica.com/arstechnica/index/

This video comparison shows how RetroArch emulation can actually react to button inputs more quickly than original NES hardware.

We’ve previously written about how difficult it is to perfectly emulate classic video game consoles even with powerful modern computer hardware. Now, the coders behind the popular RetroArch multi-emulator frontend are working to make their emulation better than perfect, in a way, by removing some of the input latency that was inherent in original retro gaming hardware.

While early game consoles like the Atari 2600 sample and process user inputs between frames, consoles since the NES usually run that game logic while a frame is rendering. That means the game can’t output its reaction to a new input until, at the earliest, the frame after the button is pressed. In some games, the actual delay can be two to four frames (or more), which starts to become noticeable lag at the usual 60 frames per second (about 17ms per frame).
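
Those frame counts translate directly into milliseconds. A quick calculation:

```python
# How many milliseconds of input lag N frames represent at 60fps.
FPS = 60
frame_ms = 1000 / FPS  # ~16.7ms per frame

for lag_frames in (1, 2, 4):
    print(f"{lag_frames} frame(s) of lag ~= {lag_frames * frame_ms:.0f}ms")

# 1 frame(s) of lag ~= 17ms
# 2 frame(s) of lag ~= 33ms
# 4 frame(s) of lag ~= 67ms
```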

An experimental Input Lag Compensation mode being rolled out in new versions of RetroArch fixes this issue by basically fast-forwarding a few hidden frames behind the scenes before displaying that first “reaction” frame in the expected spot. So in a game like Sonic the Hedgehog, which has two frames of input lag, the game will quickly emulate two additional, hidden frames after every new input. Then, the emulator actually shows the third post-input frame (where Sonic first shows a visible reaction) timed for when the first post-input frame would naturally appear, cutting out the delay a player would usually see.
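
In rough terms, one displayed frame of that technique looks something like the sketch below. This is a simplified illustration against a hypothetical emulator-core API (save_state, load_state, run_frame), not RetroArch’s actual implementation:

```python
def run_ahead_iteration(core, display, joypad, lag_frames=2):
    """Produce one displayed frame, hiding `lag_frames` frames of input lag.

    `core`, `display`, and `joypad` are hypothetical emulator objects used
    only to illustrate the idea described above.
    """
    inputs = joypad.read()

    # Advance the real timeline by one frame, without showing it.
    core.run_frame(inputs, render=False)
    snapshot = core.save_state()

    # Fast-forward through the remaining hidden "lag" frames...
    for _ in range(lag_frames - 1):
        core.run_frame(inputs, render=False)

    # ...then display the frame where the reaction first becomes visible,
    # in the time slot where the first post-input frame would have appeared.
    display.show(core.run_frame(inputs, render=True))

    # Rewind so the persistent game state stays just one frame ahead,
    # keeping emulation speed and audio in sync with real time.
    core.load_state(snapshot)
```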

Source: http://feeds.arstechnica.com/arstechnica/index/

(credit: Amazon)

Amazon just released a new way for Alexa users to customize their experience with the virtual assistant. New Alexa Skill Blueprints allow users to create their own personalized Alexa skills, even if they don’t know how to code. These “blueprints” act as templates for making questions, responses, trivia games, narrative stories, and other skills with customizable answers unique to each user. Amazon already has a number of resources for developers to make the new skills they want, but until now, users have had to work within the confines of pre-made Alexa skills.

Currently, more than 20 templates are available on the new Alexa Skill Blueprints website, all ready for Alexa users to personalize with their own content. Let’s say you want to make a personalized trivia game for your family and friends: choosing the Trivia blueprint brings up more information about how that particular blueprint works, including audio examples and instructions for filling out the template. Click “Make Your Own” to write your own trivia questions and possible answers, and to choose which answer is correct for each question. You can even add sound effects like applause to make the game feel more real. Once you name your trivia game, it will be accessible within minutes on all of the Alexa devices associated with your Amazon account.
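
Under the hood, what the web form collects amounts to simple structured data. Purely as an illustration (this is not Amazon’s actual blueprint format, just the shape of the information described above):

```python
# Hypothetical representation of a filled-in Trivia blueprint --
# illustrative only, not Amazon's real schema.
trivia_skill = {
    "name": "Smith Family Trivia",  # what you ask Alexa to open
    "questions": [
        {
            "prompt": "Where did we go on vacation last summer?",
            "answers": ["Denver", "Orlando", "Seattle"],  # possible answers
            "correct": "Orlando",                         # the right one
            "sound_effect": "applause",                   # optional flourish
        },
        # ...one entry per question you write into the template
    ],
}
```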

Source: http://feeds.arstechnica.com/arstechnica/index/

Velodyne’s lidars aren’t the only game in town any more. (credit: Velodyne)

David Hall invented modern three-dimensional lidar more than a decade ago for use in the DARPA Grand Challenge competitions. His company, Velodyne, has dominated the market for self-driving car lidar ever since. Last year, Velodyne opened a factory that it said had the capacity to produce a million lidar units in 2018—far more than any other maker of high-end lidars.

Now Velodyne is starting to see some serious competition. Last week, lidar startup Luminar announced that it was beginning volume production of its own lidar units. The company expects to produce 5,000 units per quarter by the end of 2018.

Meanwhile, Israeli startup Innoviz is also getting ready to manufacture its InnovizPro lidar in significant volume. The company declined to give Ars exact production numbers, only telling us it has orders for thousands of units. Innoviz believes it can scale up manufacturing quickly to satisfy that demand.

Source: http://feeds.arstechnica.com/arstechnica/index/

(credit: Prayitno)

Yet another Los Angeles city councilman has taken Waze to task for creating “dangerous conditions” in his district, and the politician is now “asking the City to review possible legal action.”

“Waze has upended our City’s traffic plans, residential neighborhoods, and public safety for far too long,” LA City Councilman David Ryu said in a statement released Wednesday. “Their responses have been inadequate and their solutions, non-existent. They say the crises of congestion they cause is the price for innovation—I say that’s a false choice.”

In a new letter sent to the City Attorney’s Office, Ryu formally asked Los Angeles’ top attorney to examine Waze’s behavior.

Source: http://feeds.arstechnica.com/arstechnica/index/

The Moto G6, Moto G6 Play, Moto E5 Plus, and Moto E5 Play. (Not exactly to scale.)

NEW YORK CITY—Motorola is taking the wraps off its mid- to low-end lineup today. The company is launching four phones at once—the Moto G6, Moto G6 Play, Moto E5 Plus, and the Moto E5 Play. And no matter what Motorola does with these devices, there’s almost no competition in the sub-$300 price range (especially here in the US), making all of these phones worthy of consideration just because of their price point.

Announcing four phones at once (some with multiple configurations!) can get really confusing, so let’s start with a giant spec sheet comparing them all. Right off the bat, there are some notable similarities: all four phones have headphone jacks, microSD slots, fingerprint readers, a “water repellent” coating, Android 8.0 Oreo, and all the usual connectivity options except for NFC.

                MOTO G6                MOTO G6 PLAY           MOTO E5 PLUS           MOTO E5 PLAY
STARTING PRICE  $249                   $199                   unknown                unknown
SCREEN          5.7″ 2160×1080 LCD     5.7″ 1440×720 LCD      6″ 1440×720 LCD        5.2″ 1280×720 LCD
CPU             Snapdragon 450         Snapdragon 427         Snapdragon 435         Snapdragon 425 or 427
                (eight 1.8GHz          (four 1.4GHz           (eight 1.4GHz          (four 1.4GHz
                Cortex-A53 cores,      Cortex-A53 cores,      Cortex-A53 cores,      Cortex-A53 cores,
                14nm)                  28nm)                  28nm)                  28nm)
GPU             Adreno 506             Adreno 308             Adreno 505             Adreno 308
RAM             3GB or 4GB             2GB or 3GB             3GB                    2GB
STORAGE         32GB or 64GB           16GB or 32GB           32GB                   16GB
PORTS           USB-C,                 Micro USB,             Micro USB,             Micro USB,
                headphone jack         headphone jack         headphone jack         headphone jack
BATTERY         3,000mAh               4,000mAh               5,000mAh               2,800mAh
BACK MATERIAL   Gorilla Glass 3        Clear plastic          Clear plastic          Opaque plastic

Source: http://feeds.arstechnica.com/arstechnica/index/

A new job listing posted by Facebook suggests that the social network is forming a team to build its own hardware chips, joining other tech titans like Google, Apple, and Amazon in becoming more self-reliant.

According to the post, Facebook is looking for an expert in ASIC and FPGA—two forms of custom silicon—to help it evaluate, develop, and drive next-generation technologies within Facebook…


Source: http://feeds.feedburner.com/TheHackersNews