
Tuesday, August 8, 2017

How Microsoft wants to bring broadband to rural Americans

Old, unused TV signals could soon become the rural broadband of the future — but the TV stations of today have some qualms about the idea.
People have been making claims like this for “white spaces” connectivity since the early 2000s. But this time around, it’s Microsoft (MSFT) that’s putting its money and influence behind the effort.
And nearly nine years after the Federal Communications Commission approved the concept, that company and other white-spaces advocates can finally point to real-world results.


Surfing on the airwaves

 

The notion the FCC began studying in 2002 goes like this: Since broadcasters don’t use all of the available television airwaves — and since that spectrum, expanded by a recent FCC auction, reaches long distances — let’s allow internet providers to use them.
That could deliver downloads of 10 megabits per second or faster up to 10 miles from a transmitter — at half the deployment cost of LTE wireless.
But because those openings aren’t uniform nationwide, the FCC couldn’t simply offer one block of spectrum. Instead, it’s had to create a database for white-spaces devices to verify they’re limiting themselves to vacant airwaves.
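In outline, the device-side check works something like the toy sketch below. The registrations and protection radii here are invented for illustration; real devices query an FCC-certified database (the lookup protocol was later standardized as PAWS, RFC 7545):

```python
# Toy sketch of the lookup a white-spaces device performs before it
# transmits: ask a geolocation database which TV channels are protected
# near the device's coordinates, then use only the vacant ones. The
# registrations and radii below are invented for illustration.

from math import radians, sin, cos, asin, sqrt

# Hypothetical registrations: (channel, transmitter lat, lon, protected radius in km)
TV_REGISTRATIONS = [
    (8, 36.77, -78.64, 80.0),    # imaginary broadcaster A
    (22, 37.50, -79.80, 60.0),   # imaginary broadcaster B
]

ALL_CHANNELS = range(2, 37)  # simplified candidate set of TV channels

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def vacant_channels(lat, lon):
    """Channels with no protected TV transmitter covering this location."""
    occupied = {
        ch for ch, tlat, tlon, radius_km in TV_REGISTRATIONS
        if km_between(lat, lon, tlat, tlon) <= radius_km
    }
    return [ch for ch in ALL_CHANNELS if ch not in occupied]

# A point inside broadcaster A's protected contour: channel 8 is off-limits,
# channel 22 (whose transmitter is far away) remains available.
free = vacant_channels(36.78, -78.65)
print(8 in free, 22 in free)  # prints: False True
```

The key design point is that the intelligence lives in the database, not the device: as stations move, sign on, or sign off, the database updates and devices simply re-query.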
After testing this concept for more than a decade in the U.S. and abroad, Microsoft wants to take it nationwide.
In a July 10 report and a July 11 speech by president and chief legal officer Brad Smith, the company outlined how white-spaces technology can bring broadband to 80% of the 23.4 million rural Americans lacking it by July 4, 2022.
To get there, Microsoft will invest in white-spaces providers, offer free licensing of 39 patents covering the technology, and support digital-skills training from the National 4-H Council and other groups.
As for the remaining 20% of unconnected rural Americans, Microsoft thinks satellite (historically plagued by data caps) suffices for more isolated users, while denser populations merit fiber-optic and fixed-wireless connections.  
Microsoft puts this vision’s capital and initial operating costs at $8 to $12 billion, although its share will be considerably less: It estimates that its direct investments will bring white-spaces broadband to 2 million people by 2022.
That still amounts to a significant white-spaces endorsement from a big company — something the technology has lacked.
“We’re now at a point where things have gelled,” said Harold Feld, a senior vice president at the tech-policy group Public Knowledge who backs white spaces as “the duct tape of rural broadband” that can patch gaps in coverage. “What you need is something to jump-start it.”

A start in Virginia

 

For on-the-ground proof of this potential, Microsoft points to the test it’s backed in Charlotte and Halifax counties in southern Virginia.
In that rural area — where OpenSignal’s crowdsourced coverage map shows spotty LTE even along major roads — Microsoft and partner Mid-Atlantic Broadband Communities Corp. have brought broadband faster than DSL, though still slower than cable, to some 130 households, with faster and more widespread access expected this year.
Mid-Atlantic Broadband CEO Tad Deriso said those users — whose options before amounted to “satellite, dial-up or nothing” — had downloads of “right around 5 megabits per second” from as far as four and a half miles away. Uploads are slower, as with most wired broadband outside fiber.
Deriso said that ongoing channel-bonding tweaks should get those speeds as high as 15 to 25 megabits per second. The top of that range meets the FCC’s benchmark for broadband: 25 Mbps downloads.
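The back-of-the-envelope math behind channel bonding is simple: aggregating several vacant TV channels multiplies usable throughput. A minimal sketch, using the pilot's reported ~5 Mbps single-channel rate (illustrative only; real per-channel rates vary with distance and interference):

```python
# Back-of-the-envelope view of channel bonding: the aggregate rate scales
# with the number of vacant TV channels bonded together. The 5 Mbps
# per-channel figure mirrors the single-channel downloads reported in the
# Virginia pilot; it is illustrative, not a guaranteed rate.

PER_CHANNEL_MBPS = 5

def bonded_throughput(n_channels, per_channel=PER_CHANNEL_MBPS):
    """Idealized aggregate download rate for n bonded vacant channels."""
    return n_channels * per_channel

for n in (1, 3, 5):
    print(f"{n} channel(s): ~{bonded_throughput(n)} Mbps")
# Bonding three to five channels spans the 15-25 Mbps range cited above.
```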
Mid-Atlantic Broadband did get help from public investments in broadband: Twelve of its 16 transmitter sites are at schools with fiber-optic links.
Deriso said the technology has lived up to its advance billing in keeping broadband out of the way of TV broadcasts — unlike a test MBCC and another vendor ran a few years earlier.
“It’s been easy sailing,” he said. “We’ve had no interference issues with the broadcast stations.”
The technology will get a bigger test later this year, when partner internet provider B2X Online starts selling access. B2X CEO Warren Kane said it will offer three tiers: free access to whitelisted educational sites, $10 a month for downloads up to 2 Mbps, and $40 a month for 5 Mbps or more. While he said he’s still deciding what exact speeds to advertise, neither paid service will include a data cap.
Deriso said he sees no other viable broadband system in his firm’s rural context: “We’re all-in on this type of technology in our little footprint down here.”

Microsoft’s big request: one more channel

 

The tricky part of Microsoft’s agenda is its request that the FCC make one additional white-spaces channel available in major markets.
The National Association of Broadcasters — long skeptical of the white-spaces concept — has not been amused. In a press release, spokesman Dennis Wharton called it “the height of arrogance” and asked why the company hadn’t bid on that spectrum in the FCC’s recent auction.
“It’s Microsoft that is asking for something new from the FCC,” he said in a follow-up email that noted broadcasters aren’t seeking new spectrum for their pending conversion to a next-generation TV standard, ATSC 3.0, that will let them offer data services. “We vigorously oppose that idea, because of the negative impact on TV broadcasters.”
Broadcasters particularly fear that smaller stations relaying network signals to isolated areas will get squeezed out — a point Craig Fugate, who led the Federal Emergency Management Agency under President Obama, made in a Friday op-ed warning that rural residents could miss storm warnings.
Feld said this extra channel wasn’t needed in rural areas but would allow economies of scale nationwide — in turn driving down white-spaces equipment costs, today on the order of $800 for a receiver.
That political argument will not be settled quickly. But by the time it is, white-spaces technology should no longer be a blank space on the broadband map.


Monday, July 10, 2017

Donald Trump backtracks on Russia joint cybersecurity unit

Donald Trump has backtracked on a proposal to work with Russia to create an "impenetrable" cybersecurity unit to prevent election hacking.
Hours after promoting the idea on Sunday, the US president said that he did not think it could actually happen.
The idea of a partnership with Russia was ridiculed by senior Republicans.
It comes after Mr Trump's first face-to-face talks with Russian President Vladimir Putin in Germany on Friday, in which the pair discussed the issue.
Mr Trump described the outcome of the talks as positive and suggested closer co-operation between the two nations.
"Putin and I discussed forming an impenetrable cybersecurity unit so that election hacking, and many other negative things, will be guarded and safe," he said.
The initial proposal immediately prompted derision from Democrats, as well as some Republicans who questioned why the US would work with Russia after the Kremlin's alleged meddling in the 2016 US election.
Mr Trump shifted his position on Sunday night.
"The fact that President Putin and I discussed a cybersecurity unit doesn't mean I think it can happen. It can't," he tweeted.
However, he stressed that another issue discussed in his talks with Mr Putin, a ceasefire in south-western Syria, had come into effect.
Treasury Secretary Steve Mnuchin had sought to defend the proposed cyber unit after Mr Trump's initial announcement.
Speaking on ABC's This Week programme, he described it as a "significant accomplishment" for Mr Trump.
"What we want to make sure is that we co-ordinate with Russia," he added.
However, Republican Senator Marco Rubio suggested that such an initiative would be like partnering with Syrian President Bashar al-Assad on chemical weapons.
Republican Senator Lindsey Graham said: "It's not the dumbest idea I've ever heard, but it's pretty close."
A special prosecutor is investigating whether Trump associates colluded with alleged Russian efforts to influence the 2016 US election.
Both Mr Trump and Mr Putin said the allegations had been discussed.
However, the two sides described the content of the meeting differently.
Mr Trump said he "strongly pressed" the issue with Mr Putin, who had "vehemently denied" interfering in the US election.
He also said it was time to work more "constructively" with Russia.
President Putin said he believed President Trump had accepted his assurances that Moscow had not interfered in the vote.
However, US Secretary of State Rex Tillerson said interference in the 2016 election remained an impediment to better relations with Russia, while the US ambassador to the UN, Nikki Haley, said the US "can't trust Russia" and "won't ever trust Russia".

Thursday, May 25, 2017

Nvidia Embraces Deep Neural Nets With Volta


At this year's GPU Technology Conference (GTC), Nvidia's premier conference for technical computing with graphics processors, the company reserved the top keynote for its CEO Jensen Huang. Over the years, GTC has grown from a segment of a larger, mostly gaming-oriented and somewhat scattershot conference called "nVision" into one of the key conferences mixing academic and commercial high-performance computing.
Jensen's message was that GPU-accelerated machine learning is growing to touch every aspect of computing. While it's becoming easier to use neural nets, the technology still has a way to go to reach a broader audience. It's a hard problem, but Nvidia likes to tackle hard problems.



The Nvidia strategy is to push machine learning into every market. To accomplish this, the company is investing in the Deep Learning Institute, a training program that spreads the deep-learning neural net programming model to a new class of developers.
Much as Sun promoted Java with an extensive series of courses, Nvidia wants all programmers to understand neural net programming. With deep neural networks (DNNs) deployed across many segments, and with support from all major cloud service providers, deep learning (DL) can be everywhere -- accessible any way you want it, and integrated into every framework.
DL also will come to the Edge; IoT will be so ubiquitous that we will need software writing software, Jensen predicted. The future of artificial intelligence is about the automation of automation.
Nvidia's conference is all about building a pervasive ecosystem around its GPU architectures. The ecosystem influences the next GPU iteration as well: with early GPUs for high-performance computing and supercomputers, the market demanded more precise, double-precision (fp64) floating-point math, and Nvidia was the first to add an fp64 unit to its GPUs.
GPUs are the predominant accelerator for machine learning training, but they also can be used to accelerate the inference (decision) execution process. Inference doesn't require as much precision, but it needs fast throughput. For that need, Nvidia's Pascal architecture can perform fast, 16-bit floating-point math (fp16).
Nvidia's newest architecture, Volta, addresses the need for faster neural net processing by incorporating a processing unit dedicated to DNN tensor math. The Volta GPU already has more cores and processing power than the fastest Pascal GPU, but the tensor core pushes DNN performance even further. The first Volta chip, the V100, is designed for the highest performance.
The V100 packs a massive 21 billion transistors, built in TSMC's 12nm FFN high-performance manufacturing process. The 12nm process -- a shrink of the 16nm FF process -- allows reuse of design models from 16nm, which reduces design time.
Even with the shrink, at 815 mm² Nvidia pushed the size of the V100 die to the very limits of the optical reticle.
The V100 builds on Nvidia's work with the high-performance Pascal P100 GPU, including the same mechanical layout, electrical connects, and the same power requirements. This makes the V100 an easy upgrade from the P100 in rack servers.
For traditional GPU processing, the V100 has 5,120 CUDA (compute unified device architecture) cores. The chip is capable of 7.5 Tera FLOPS of fp64 math and 15 Tera FLOPS of fp32 math.
Feeding data to the cores requires an enormous amount of memory bandwidth. The V100 uses second-generation high-bandwidth memory (HBM2) to feed the chip at 900 GB/sec from its 16 GB of on-package memory.
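A rough roofline-style calculation, using only the figures quoted here (900 GB/sec of bandwidth, 120 Tera FLOPS of fp16 tensor math), shows why even that bandwidth isn't enough on its own and why operands must be reused on-chip:

```python
# Roofline-style sanity check using the article's own figures: 900 GB/s of
# HBM2 bandwidth versus 120 TFLOPS of fp16 tensor math. Conclusion: operands
# must be reused heavily on-chip (in registers and shared memory), since the
# memory system alone cannot feed the math units.

HBM2_BYTES_PER_S = 900e9   # 900 GB/s memory bandwidth
TENSOR_FLOPS = 120e12      # 120 TFLOPS of tensor math
BYTES_PER_FP16 = 2

# If every operand came fresh from HBM2, the memory could deliver:
operands_per_s = HBM2_BYTES_PER_S / BYTES_PER_FP16   # 450e9 fp16 values/s

# A multiply-accumulate does 2 FLOPs per 2 fresh operands (one from each
# input matrix), so a purely memory-fed chip tops out near:
memory_bound_flops = operands_per_s                  # ~0.45 TFLOPS

reuse_needed = TENSOR_FLOPS / memory_bound_flops
print(round(reuse_needed))  # each operand must be reused ~267 times
```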
While the V100 supports the traditional PCIe interface, the chip expands the capability by delivering 300 GB/sec over six NVLink interfaces for GPU-to-GPU connections or GPU-to-CPU connections (presently, only IBM's POWER 8 supports Nvidia's NVLink wire-based communications protocol).
However, the real change in Volta is the addition of the tensor math unit. Each tensor core takes 16-bit floating-point inputs and performs a 4x4x4 matrix multiply-and-accumulate in a single clock cycle.
Internal computations in the tensor unit are performed with fp32 precision to preserve accuracy over many accumulations. With 640 tensor cores, the V100 can perform 120 Tera FLOPS of tensor math. This will make Volta very fast for deep neural net training and inference.
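The tensor core's fused operation, D = A x B + C on 4x4 tiles, can be emulated in a few lines of NumPy; this is a sketch of the arithmetic only, not of the hardware:

```python
# Sketch of the tensor core's fused operation, D = A @ B + C, on 4x4 tiles:
# A and B hold fp16 values, while the multiply and accumulate run at fp32
# precision so rounding error doesn't compound across many accumulations.
# NumPy stands in for the hardware here; a real tensor core performs the
# whole tile operation in a single clock cycle.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)).astype(np.float16)  # fp16 input tile
B = rng.standard_normal((4, 4)).astype(np.float16)  # fp16 input tile
C = rng.standard_normal((4, 4)).astype(np.float32)  # fp32 accumulator tile

# Emulate the mixed-precision path: promote the fp16 inputs, then
# multiply-accumulate entirely in fp32.
D = A.astype(np.float32) @ B.astype(np.float32) + C

print(D.dtype, D.shape)  # float32 (4, 4)
```

Keeping the accumulator in fp32 is what lets training converge with fp16 inputs: individual products lose a little precision, but the running sum does not.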
Because Nvidia already has built an extensive DNN framework with its CuDNN libraries, software will be able to use the new tensor units right out of the gate with a new set of libraries.
Nvidia will extend its support for DNN inference with TensorRT, which compiles trained neural net models for real-time execution. The V100 already has a home waiting for it in Oak Ridge National Laboratory's Summit supercomputer.

Nvidia Drives AI Into Toyota

Bringing DL to a wider market also drove Nvidia to build a new computer for autonomous driving. The Xavier processor is the next-generation processor powering the company's Drive PX platform.
Toyota has chosen this platform as the basis for its future production autonomous cars. Nvidia couldn't reveal when we'll see Toyota cars using Xavier on the road, but there will be various levels of autonomy, including copiloting for commuting and "guardian angel" accident avoidance.
Unique to the Xavier processor is the DLA, a deep learning accelerator that offers 10 Tera operations of performance. The custom DLA will improve power and speed for specialized functions such as computer vision.
To spread the DLA's impact, Nvidia will open-source the accelerator's instruction set and RTL for any third party to integrate. In addition to the DLA, the Xavier system-on-chip will include Nvidia's custom 64-bit ARM core and a Volta GPU.
Nvidia continues to execute on its high-performance computing roadmap and is starting to make major changes to its chip architectures to support deep learning. With Volta, Nvidia has made the most flexible and robust platform for deep learning, and it will become the standard against which all other deep-learning platforms are judged.

Saturday, May 20, 2017

How to make a blog

See what it looks like

FCC website 'targeted by attack' after John Oliver comments

The US Federal Communications Commission (FCC) website was deliberately attacked on 8 May, the regulator has said.
The incident began hours after comedian John Oliver criticised FCC plans to reverse US net neutrality rules.
Mr Oliver urged people to post to the site's online commenting system, protesting against the proposals.
The FCC said that issues with the site were caused by orchestrated attacks, not high volumes of traffic.
"These actors were not attempting to file comments themselves; rather they made it difficult for legitimate commenters to access and file with the FCC," chief information officer Dr David Bray said in an official statement.
"While the comment system remained up and running the entire time, these distributed denial of service (DDoS) events tied up the servers and prevented them from responding to people attempting to submit comments."

'Trolling the trolls'

In his Sunday night show Last Week Tonight, Mr Oliver called on viewers to visit a website that would direct them to the correct page on the FCC site to leave their comments.
"Every internet group needs to come together… gamers, YouTube celebrities, Instagram models, Tom from MySpace if you're still alive. We need all of you," he said.
His plea came after FCC chairman Ajit Pai said in April that he would review rules made in 2015 that require broadband companies to treat all online traffic equally.



Last December, Mr Pai said in a speech that the net neutrality laws were "holding back investment, innovation, and job creation".
"Mr Pai is essentially trolling the trolls," Chris Marsden, professor of internet law at the University of Sussex, told the BBC.
"If you bait John Oliver, you reap what you sow."
The FCC will vote on Mr Pai's proposals to revoke the legislation on 18 May.