Intel and the IOC ink 7-year Olympics tech deal for VR, drones and more Posted Jun 21, 2017 by Ingrid Lunden (@ingridlunden)
Intel has been targeting the sports industry as part of its bigger push into emerging technologies (and finding customers for the resulting new services), and today the company announced its latest play in that area. Intel has signed a wide-ranging deal with the International Olympic Committee to develop technology for the Olympic Games, including applications and platforms for VR (including the first live VR broadcast of Olympic events), 3D and 360-degree content; AI analytics; drones; and 5G services, all to bring the Olympics to more people in a more direct way, or, in the words of Intel CEO Brian Krzanich, to let fans “experience a prime seat while being anywhere in the world.”
The efforts will start with the next Olympics in Pyeongchang in 2018, and go through to 2024.
The news and some potential applications in these areas were unveiled at a press conference today, where IOC president Thomas Bach and Krzanich signed the deal live and Krzanich was told he will be an honorary torch bearer (apparently a surprise to him).
The value of the deal is not being made public, but the Olympics have been known to bring in billions of dollars in sponsorships and broadcast-rights payments. It was estimated that for one Olympics, the 2012 edition in London, several worldwide sponsors (including Acer and Samsung) ponied up $100 million each to the IOC.
The rise of digital networks and new media technologies has compounded that opportunity, offering more ways to present that coverage and to reach more sports fans. Intel’s work on developing more of these services will feed directly into expanding that machine.
“Intel’s vision is that building a better world is our business. Our vision is building a better world through sport,” said Bach. “So bringing together these two visions will allow us to make great progress with regard to experience through games, as well as promoting the values we are sharing. The Olympic games are about excellence in sport but also about connecting people and sharing this common experience of what it means to feel the Olympic spirit.
“In our digital age we are living in now, people are making this experience in a different way. Some decades ago you could experience this only in a stadium, then broadcasting came up. And now with the digital age and cutting edge technology from Intel, athletes, spectators and all can share this experience in a different and innovative way. This is the vision of the Olympic agenda for 2020.”
“We are excited to join the Olympic Movement and integrate Intel’s innovative technologies to advance the Olympic Games experience for fans around the world,” added Krzanich. “Through this close collaboration with the Olympic family, we will accelerate the adoption of technology for the future of sports on the world’s largest athletic stage.”
Intel has made a number of moves to ramp up its involvement in sports as a target vertical for the company’s efforts in emerging areas like VR. They have included the acquisitions of Voke VR for immersive sports tech and Replay Technologies for 3D video tech; providing a March Madness experience in VR; and helping provide tech for the X Games in Aspen and the NBA All-Star Weekend.
Compared to the Olympics, the king of all sports events, all of these feel like prep work.
It’s not clear which other companies will be working with the IOC on technology for the upcoming games. Tech companies that have supported the Olympics in the past with technology implementations include Facebook, NBC and many more. But larger sponsorship deals are less common, and pricier. Indeed, as we saw with IBM, which was a sponsor for 38 years before backing away in 1998, those collaborations can be costly and potentially not worth the effort.
Intel, which appears to be replacing McDonald’s as the primary global sponsor, is the 13th partner for the IOC, the companies said today. Other tech companies that are currently involved with sponsoring the Olympics include Alibaba, which will be the “cloud services” and “e-commerce platform services” sponsor until 2028.
A recap of some of the areas where Intel says it will be working:
Technological and content support for Olympic Broadcasting Services’ host broadcaster operations, as well as for the Olympic Channel.
Intel’s 5G platforms to be used at the Olympic Games for communications, presumably around IoT projects.
Intel True VR to offer the first live virtual reality broadcast of the Olympic Winter Games.
Intel drone light show technology (used elsewhere to great effect).
Intel 360 replay technology.
IOC and Intel Announce Worldwide TOP Partnership Through 2024
As a Worldwide TOP Partner, Intel Will Work with the IOC to Reimagine the Future of the Olympic Games with New Levels of Fan Interaction Through Leading-Edge Technology
Intel and the IOC are partnering to bring Intel’s leading technology to enhance the Olympic Games through 2024.
Intel will focus primarily on infusing its 5G platforms; VR, 3D and 360-degree content development platforms; artificial intelligence platforms; and drones, along with other silicon solutions, to enhance the Olympic Games.
Intel will work across the Olympic Movement and with other Olympic partners to integrate technology into many facets of the Olympic Games.
NEW YORK, June 21, 2017 – The International Olympic Committee (IOC) and Intel today announced a long-term technology partnership at an official signing ceremony in New York with IOC President Thomas Bach and Brian Krzanich, Intel’s chief executive officer.
Intel will join “The Olympic Partner” (TOP) worldwide sponsorship program, becoming a Worldwide TOP Partner through 2024. This partnership will transform the Olympic Games and the Olympic experience.
“As a result of Olympic Agenda 2020, the IOC is forging groundbreaking partnerships,” said Bach. “Intel is a world leader in its field, and we’re very excited to be working with the Intel team to drive the future of the Olympic Games through cutting-edge technology. The Olympic Games provide a connection between fans and athletes that has inspired people around the world through sport and the Olympic values of excellence, friendship and respect. Thanks to our new innovative global partnership with Intel, fans in the stadium, athletes and audiences around the world will soon experience the magic of the Olympic Games in completely new ways.”
“We are excited to join the Olympic Movement and integrate Intel’s innovative technologies to advance the Olympic Games experience for fans around the world,” said Krzanich. “Through this close collaboration with the Olympic family, we will accelerate the adoption of technology for the future of sports on the world’s largest athletic stage.”
The Olympic Games offer an unparalleled global platform to showcase what Intel technology can do to transform the future of sports. The first Olympic Games activation will take place at the Olympic Winter Games Pyeongchang 2018, in South Korea, in February 2018, where Intel technology will provide real-time virtual reality viewing of the Olympic Winter Games.
Working with the IOC, Intel’s contributions to the Olympic Movement will include technology developments that will be rolled out as the multiyear partnership evolves, for example:
Technological and content support for Olympic Broadcasting Services’ host broadcaster operations, as well as for the Olympic Channel, the multi-platform destination where fans can discover, engage and share in the power of sport and the excitement of the Olympic Games all year round.
Intel’s 5G platforms will be used at the Olympic Games to demonstrate how 5G will transform communications over the next decade. Starting in 2018, Intel will partner on what is expected to be the first 5G showcase, setting the stage for the global deployment of 5G.
Intel® True VR will offer the first live virtual reality broadcast of the Olympic Winter Games, providing fans the opportunity for a more immersive experience from their own homes.
Intel drone light show technology will create never-seen-before images in the sky.
Intel 360 replay technology will allow fans to experience the greatest, most memorable Olympic moments from every angle at the Olympic venues.
In the future, TV viewers at home will be able to experience what it’s like to be at the Olympic Games with a front-row seat, or choose from many different viewing points in the Olympic venues. The power to choose what they want to see and how they want to experience the Olympic Games will be in the hands of the fans.
As a Worldwide TOP Partner, Intel will support the National Olympic Committees and their teams around the world, as well as the IOC and the Organizers of the Olympic Games. Intel’s global activation rights will include the Olympic Winter Games Pyeongchang 2018, the Olympic Games Tokyo 2020, the Olympic Winter Games Beijing 2022 and the Olympic Games in 2024 in a city yet to be selected.
“Our strategic partnership with Intel is a clear demonstration of the enduring appeal of the Olympic Games and the trust the world’s leading companies have in the Olympic Movement,” said Tsunekazu Takeda, the IOC’s marketing commission chair. “The Olympic Movement is looking forward to working with Intel to achieve its vision of building a better world through sport.”
How Intel Lit Up the Super Bowl With Drones—and Why
By Jacob Brogan
These aren’t regular drones: They’re patriotic drones.
Here at Future Tense, we’re on the record with our belief that no Super Bowl is complete unless it features jetpacks. In that respect—if in few others—Lady Gaga’s enthusiastic, inclusive halftime performance arguably disappointed.
Still, the production incorporated a compelling bit of contemporary technology: In a reportedly pre-recorded sequence, a swarm of 300 tightly coordinated drones lit up the sky, circling around one another in patterns choreographed as tightly as anything happening on stage. While Gaga mugged at the camera, the devices came together, forming the shape of a massive American flag.
As a tag at the end of the show made thuddingly clear, that display came courtesy of Intel, the company that developed and deployed the technology. The company and its collaborators sometimes refer to the individual devices as “spaxels,” a portmanteau of “space” and “pixels.” It’s a helpful term, in that it gets at what Intel is really up to here from a technological perspective. The system works much like an immensely complicated low-resolution computer monitor: Wired explains that each of the flying robots responds to a central computer, “oblivious to what the hundreds of machines around it are doing.”
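That "spaxel" idea can be sketched in a few lines of code: the central computer treats the swarm as a low-resolution display, deriving a position and LED color for each drone from an image, while any individual drone only ever sees its own assignment. This is an illustrative sketch under assumed conventions, not Intel's actual software; every function name and parameter here is hypothetical.

```python
# Hypothetical sketch of the "spaxel" concept: map the lit pixels of a
# grayscale bitmap onto per-drone targets (a 3D waypoint plus a color).
# The central computer would then send each drone only its own target.

def image_to_spaxels(pixels, spacing=2.0, altitude=50.0, threshold=128):
    """pixels: list of rows of 0-255 brightness values. Returns one
    target dict per lit pixel -- i.e., one per drone needed."""
    targets = []
    rows = len(pixels)
    for r, row in enumerate(pixels):
        for c, value in enumerate(row):
            if value >= threshold:            # only bright pixels get a drone
                x = c * spacing
                y = (rows - 1 - r) * spacing  # flip rows so the image is upright
                targets.append({"pos": (x, y, altitude),
                                "color": (255, 255, 255)})
    return targets

# A tiny 3x3 "plus" sign needs five drones, one per lit pixel:
plus = [[0, 255, 0],
        [255, 255, 255],
        [0, 255, 0]]
assignments = image_to_spaxels(plus)
print(len(assignments))  # 5
```

Scaling the same mapping to a 300-drone flag is, conceptually, just a bigger bitmap.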
The halftime performance comes on the heels of a handful of previous events in which Intel demonstrated the technology. The first notable production, held, Intel explains, at “a private, secure location” near Hamburg, Germany, in November 2015, involved 100 drones. Though artists had programmed the devices’ flight patterns in advance, each group of 25 devices was also controlled by a single pilot. The company pared that back to a single pilot for a subsequent 100-drone performance in the United States in 2016, and then for another with 500 drones later that year.
Intel’s drones, dubbed Shooting Stars by the company, resemble consumer quadcopters, but the devices it’s been using lately differ in a handful of key ways, including the addition of a cage around the rotor blades, presumably to guard against the eventuality that they might collide with one another. Since the tech that allows the drones to receive instructions is mostly hidden away in their chassis, the devices’ other dominant feature is a large LED light at the bottom. “All this drone can do is light up the sky, but this is something it can do really, really well,” says Daniel Gurdan, an engineer involved with the project, in a video from that recent 500-drone performance.
Though most of Intel’s prior productions took place at secluded outdoor venues, the company’s ambitions were already clear in its past promotional videos. “Our goal is to do this over stadiums, to do this over events that have large populations,” Intel CEO Brian Krzanich intones in one, over footage of the tightly coordinated robots expanding and contracting like the luminescent wings of some gargantuan angel.
Well, congratulations, Krzanich! You sort of did it.
It’s impossible to overstate the rhetorical importance of that accomplishment for Intel: The company regularly stresses that it has collaborated with the Federal Aviation Administration to receive exemptions for its airborne performances. Those efforts are of a piece with its other attempts to sway regulators, including a recent demonstration of drone technology on Capitol Hill. That Intel received permission to pull off such a feat in Houston, above a packed arena, is almost as impressive as the technological accomplishment, speaking to the legal progress it’s made.
Ultimately, though, the real goal of its Sunday night performance likely had as much (or more) to do with overcoming public discomfort as it did with swaying regulators. Almost every article about the show repeats the official Intel line, indicating, as Wired does, that these colorful devices “will one day revolutionize search-and-rescue, agriculture … and more.” For what it’s worth, academic drone researchers—unaffiliated with Intel—have described similar possibilities to me. But it’s worth noting that the Department of Defense has tested still-more sophisticated drone swarm systems that can be deployed from F/A-18 Hornets. While it’s generally important not to conflate consumer drones with military ones—the two share little more than a name—these related applications suggest the distinct technologies might be on a collision course.
For now, though, Intel would probably prefer we set such considerations and concerns aside. And what better way to calm us than to light up the sky in the red, white, and blue of the American flag?
News about the drone light show that Intel put on at the Super Bowl this year was tough to miss. Articles such as “The Secret Behind Those Super Bowl Half-Time Show Drones” and “Intel Flew 300 Drones in Sync to Create an Epic Light Show at the Super Bowl” detailed exactly how Intel® Shooting Star™ drones were able to create massive images that lit up the sky, and explained how this was able to happen from a technical and legal perspective.
While it was impressive to learn details like the four billion color combinations the drones were capable of creating, and how a single operator could control the entire swarm, the show didn’t make major waves in the commercial space. After all, this kind of application is definitely interesting, but not something that appears especially relevant for commercial operators. What possible use could a utility or search and rescue professional have for a drone that can flash a bunch of lights?
I found out that answer and more after connecting with Natalie Cheung from Intel, who is focused on growing the light show business for Intel. I went into the conversation not knowing if there would be much to focus on from a commercial perspective, but Natalie quickly explained the many reasons commercial drone operators should be interested in and looking at this technology. We also discuss the development of dynamic capabilities for these drone fleets, what kind of specific commercial opportunities might be created, other capabilities that these drones might soon be able to employ and plenty more.
Jeremiah Karpowicz: How have things changed for you since we last caught up at the Commercial UAV Expo?
Natalie Cheung: At Commercial UAV we talked about all the different commercial products we were focused on. At the time, I was the marketing director for the drone team, and because it was a commercial event, I was focused more on the Intel Falcon 8+ and the MAVinci Sirius Pro. I’ve always been working on light shows though. It was something I started 18 or 19 months ago. Since then, I’ve transitioned to become fully focused on light shows and growing that as a business. I’m very focused on how we can expand the innovations with these light shows.
What does that expansion look like? Are you focused on seeing that growth happen in terms of entertainment or in terms of potential commercial applications?
Let me start out by telling you why we’re doing drone light shows, and then we can walk through the process.
The reason we’re doing these light shows is to further innovate on technology for drones. We started doing light shows because we see the concept of multiple drones controlled by one pilot or PC being useful in other categories. Imagine if you’re using a drone to find a lost hiker. Having one drone out there looking for that person is good, but having multiple drones is even better. It’s not just about the extra hardware though. Having all of that essential data controlled by one computer is critical because it simplifies what could otherwise be a long and difficult process.
Think of that same concept in a different sector. Having multiple drones inspect one bridge or cell tower makes the entire process faster and more efficient. That kind of workflow also makes it easier to continue operations throughout the day. That’s the main reason we’re innovating in this space and why we’re focused on showcasing this multiple drones per pilot technology.
It’s not the only reason though, because we’re also very interested in changing the perceptions some people have developed around the technology in a practical sense. Many people simply think of a drone as a flying camera, but we want to show people that using a drone doesn’t just mean having a camera. Using a drone can mean that drone carries a payload like an LED, or something else entirely.
So you’re trying to get people to think differently about UAV technology itself, aren’t you?
We really want to change the definition of “drone,” how drones can be seen, and how the technology can be used in other industries. Until we started doing this, putting an LED on a drone without a camera and purposefully building a drone for this kind of show had never been done. Our Shooting Star drones are purpose-built for a light show. Each is built with safety and endurance in mind; its only purpose is to fly low, with precision, and to sync the lights. There’s no camera, and the LED is configured with the drone to display different levels of brightness. It has 4 billion color combinations, with red, green, blue and white. It’s super lightweight and safe, and that’s the purpose behind the construction of this drone.
The Intel portfolio has a number of different drones built for different industries, and we want people in all of those industries to think about how the technology can meet their needs, whatever they happen to be.
Intel and Walt Disney Parks and Resorts are collaborating on “Starbright Holidays – An Intel Collaboration” at the Walt Disney World Resort in Florida. The event at the Disney Springs entertainment district features 300 Intel Shooting Star drones in a choreographed aerial performance. (Credit: Intel Corporation)
Have you seen those perceptions begin to change?
Yes, but what’s really exciting is to see how they expand to audiences that aren’t focused on the drone industry. When we were at Disney Springs at Walt Disney World Resort producing “Starbright Holidays – an Intel collaboration,” I think the most memorable part was hearing a child refer to the drones as “light fairies”. We never envisioned touching people like that with a drone.
At the Super Bowl, there were over 110 million viewers, and I’m sure that no one had ever seen drones fly like that. Some people didn’t even believe it was drones. They thought it was lasers or something else. So it really does open up people’s minds around how that definition of the technology can be expanded.
My goal is to expand these light shows everywhere, so people can see how interesting and different they can be, and how they can paint different pictures in the sky using a full three-dimensional canvas.
Rather than changing perceptions, this kind of usage might very well define the creation of enterprise fleets. Can you talk a little bit about how the drone light show fleets operate?
A lot of people assume that our fleet of drones can talk to each other, and that’s a bit of a misconception. Our drones are able to talk back to our system, but the path and animation of each drone are preprogrammed. The colors and where each drone will go are set prior to flight. So in that sense we have more control, and we have more autonomy, because we understand where every drone will go during the flight. We press a button and they all launch and perform the show.
When we talk about commercial fleets there’s a slight difference, because commercial applications will in most cases need to be a bit more complicated. The important difference is that our shows are not dynamic at this point. Everything is completely controlled, and we understand what’s going to happen within each timeframe.
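A fully preprogrammed, non-dynamic show of the kind Cheung describes can be illustrated with a minimal keyframe player: every drone's positions and colors are fixed before launch, and playback is just deterministic interpolation between timed keyframes. The data format and names below are assumptions for illustration, not Intel's actual show format.

```python
# Minimal sketch of a preprogrammed show for one drone: a sorted list of
# (time, position, color) keyframes, played back by pure interpolation.
# Nothing is decided in flight -- the whole show is fixed before launch.

def lerp(a, b, t):
    """Linearly interpolate between two same-length tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def state_at(keyframes, t):
    """Return the (position, color) at time t. Deterministic: the same
    t always yields the same state, which is what makes the show safe
    to rehearse and verify in advance."""
    if t <= keyframes[0][0]:
        return keyframes[0][1], keyframes[0][2]
    for (t0, p0, c0), (t1, p1, c1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return lerp(p0, p1, f), lerp(c0, c1, f)
    return keyframes[-1][1], keyframes[-1][2]

show = [
    (0.0,  (0.0, 0.0, 0.0),  (0.0, 0.0, 0.0)),    # on the ground, LED off
    (10.0, (0.0, 0.0, 50.0), (255.0, 0.0, 0.0)),  # climb to 50 m, fade to red
]
pos, color = state_at(show, 5.0)
print(pos)  # (0.0, 0.0, 25.0) -- halfway up the climb
```

A real show would carry one such schedule per drone, all driven from the same clock by the central system.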
Is that dynamic control being developed?
I think that’s where a lot of the industry wants to go. It’s something that everyone’s looking at, and it’s a question of at what touch point will this be used and where will it be used effectively.
That’s part of the reason it’s been such a great experience with these light shows. We’re creating a usage and a system that is going to be relevant in the present as well as the future, as the technology changes and develops.
That ties back into changing people’s perceptions, because even though we know synchronized drones can be used for things like emergency response, these fleets can also be used to do something like create a floating LED screen, can’t they? Are those kinds of innovative solutions something you’re actively looking at?
If you think about the type of work that we’re doing, you’re basically putting a controlled flying light in the sky. Once that’s up there, you can create and do different things with it. You can create that LED screen like you mentioned, but it really depends on what the artist or operator wants to do with it. And that absolutely ties back into how operators’ perceptions can and are changing.
We started off with 100 drones in the sky, where we could do things like create the Intel logo, but it was just an outline. With 500, we could create a crystal-clear logo, dynamically sampled, with all the drones spread out to fill the whole Intel logo. That’s just a simple example of the innovation we’re working on, and the developments we’re hoping to enable.
Speaking of innovations, just before we connected I took a look at the display from Coachella. It seems like these artists are already looking to push past what was achieved at the Super Bowl, doesn’t it?
I’ll be honest: the Super Bowl was really a once-in-a-lifetime opportunity, and I don’t know what could beat that. But we’re always looking at different ways to expand and show drone light shows to different audiences.
Coachella is one of the largest music festivals in the world, and this was just such a great opportunity to reach a whole different type of audience, one that had never really seen music, art and drones integrated together to create a very different experience. It was all about the lights being synched up precisely to the music so that they could follow the different beats.
In the Coachella video, we made it very in tune with Coachella itself. We had the palm trees and wind turbines to represent Palm Springs, and then we had the iconic Ferris wheel that Coachella is known for. We really want to tailor these shows to each audience and each group. We really thought about what kind of music to put in there, how the lights should work, and how to make it appeal to that audience.
We want to get this technology in front of people who have never seen these shows or even flown a drone. We want to let that influence how they view and think of the technology. We want to expand people’s knowledge of what drones can do.
What kind of technical expansion are you focused on with this technology? How will that be applicable to commercial enterprise operators?
We’re learning a lot through the drone light shows. One prime example has been going from Drone 100 to the Intel Shooting Star fleet. We’ve learned how to make everything smaller and more efficient. Having four or five people on site now, versus the 16-20 we had before, is a good example of that. We’ve also reduced the number of times we have to be on site, and made the animation cycle more efficient by using algorithms instead of manually plotting each drone in the sky.
Those are just a few of the key learnings we’ve had in going from 100 drones to a whole fleet. We’re still learning a lot about what we can improve and also what sort of technical innovations we can integrate into our fleet for the next version.
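The shift from manually plotting each drone to algorithmic animation can be framed as an assignment problem: given the drones' current positions and the next formation's target points, decide which drone flies where. The greedy nearest-target heuristic below is purely illustrative of that framing, not Intel's actual algorithm; a production system would more likely compute an optimal assignment (e.g. via the Hungarian algorithm) and add collision-free path planning on top.

```python
# Illustrative sketch: pair each drone with its nearest unclaimed target
# point in the next formation, instead of hand-plotting every transition.

import math

def assign_targets(drones, targets):
    """drones, targets: equal-length lists of (x, y, z) points.
    Returns a list where entry i is the target index for drone i."""
    remaining = set(range(len(targets)))
    assignment = []
    for d in drones:
        # Greedy choice: closest target that no earlier drone has claimed.
        best = min(remaining, key=lambda i: math.dist(d, targets[i]))
        remaining.remove(best)
        assignment.append(best)
    return assignment

drones  = [(0, 0, 0), (10, 0, 0)]
targets = [(9, 0, 50), (1, 0, 50)]
print(assign_targets(drones, targets))  # [1, 0]: each drone takes the closer point
```

Even this naive heuristic shows why algorithmic plotting scales where manual plotting cannot: regenerating the transitions for a 500-drone formation is a recomputation, not a redesign.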
What has you excited about the future of these drone light shows in 2017 and beyond?
My focus is on making drone light shows as commonplace as extravagant fireworks, and on how we can get there. That’s something my whole team is looking at: how we can branch out where drone light shows go and make them as easy to put on as fireworks, while still maintaining the green and safety aspects.
I’m really excited about giving more people the chance to see these light shows live this year, though. You really do have to see them live. The videos are great, but we’ve noticed that being there and seeing the shows really changes how people perceive drones, and helps showcase how else the technology can be used in the commercial space.
Audi Autonomous Car System PR Report: Intel Inside New Audi Autonomous Car System
Sensors that provide information to the autonomous driving systems surround the 2018 Audi A8. (Credit: Audi)
Intel processing power will be part of the autonomous driving systems on the 2018 Audi A8, it was revealed today.
The Silicon Valley Business Journal reported that Audi will use processors from Intel’s Programmable Solutions Group (PSG) and from its Wind River subsidiary as part of the self-driving system that will allow for Level 3 autonomous driving, among the most sophisticated technology that will be on the road.
The Business Journal quoted Michael Hendricks, automotive director for PSG:
“If you look across all these different solutions you’ve got the FPGA (field-programmable gate array) … and the operating system. So a lot of Intel content (is in) the world’s first Level 3 autonomous driving system.”
Intel created its Programmable Solutions Group from the former Altera Corp. Intel closed on its acquisition of Altera in December 2015. PSG is responsible for Intel’s field programmable gate array technology. FPGAs allow for great flexibility in the programming of hardware and software, and are often used in the Internet of Things and the data center.
Wind River, an Intel company, supplies the VxWorks operating system for the highly scalable, safety-related electronic control unit. As the underlying software platform, VxWorks maintains and monitors the safety of critical applications.
The Business Journal also reported on Hendricks’ comments:
The Altera portion is doing the object fusion, map fusion, parking, pre-crash, processing and functional safety aspects of the self-driving car system.
As previously announced, Mobileye is also in the solution. In March, Intel announced its intention to purchase Mobileye; that acquisition has yet to close.
Intel Predicts Autonomous Driving Will Spur New ‘Passenger Economy’ Worth $7 Trillion
Study Estimates the Value of Goods and Services in the Early Years of the ‘Passenger Economy’ Will Be More Than Twice the Size of the ‘Sharing Economy’
NEWS HIGHLIGHTS
Intel predicts a new “Passenger Economy” will emerge to support the idle time when drivers become riders.
The economic opportunity will grow from $800 billion to $7 trillion as autonomous vehicles become mainstream.
Mobility-as-a-Service will disrupt long-held patterns of car ownership, maintenance, operations and usage.
SANTA CLARA, Calif., June 1, 2017 – Today, Intel Corporation revealed the findings from a new study that explores the yet-to-be-realized economic potential when today’s drivers become idle passengers. Coined the “Passenger Economy” by Intel and prepared by analyst firm Strategy Analytics, the study predicts an explosive economic trajectory growing from $800 billion in 2035 to $7 trillion by 2050.
History has proven that technology is the catalyst for massive societal transformation and that businesses need to adapt or risk failure, or worse, extinction. New digital business models ushered in by personal computing, the internet, ubiquitous connectivity and smartphones gave birth to whole new economies. Autonomous driving will do the same.
“Companies should start thinking about their autonomous strategy now,” said Intel CEO Brian Krzanich. “Less than a decade ago, no one was talking about the potential of a soon-to-emerge app or sharing economy because no one saw it coming. This is why we started the conversation around the Passenger Economy early, to wake people up to the opportunity streams that will emerge when cars become the most powerful mobile data generating devices we use and people swap driving for riding.”
Autonomous driving and smart city technologies will enable the new Passenger Economy, gradually reconfiguring entire industries and inventing new ones thanks to the time and cognitive surplus it will unlock.
Read the Report: Accelerating the Future: The Economic Impact of the Emerging Passenger Economy
“Not unlike the space race of the 1960s, today’s announcement is a rallying cry to the world to put its best minds on this challenge,” said Greg Lindsay, urbanist and mobility futurist. “The future of mobility, economic advancement and the emergence of new growth opportunities like the Passenger Economy demand ongoing dialogue. I am excited to partner with Intel, take this discussion on the road and look at solutions through the lens of the diverse industries that will shape our future – from automakers to investors and policy makers to startups.”
The new report frames the value of the economic opportunity through both a consumer and business lens and begins to build use cases designed to enable decision-makers to develop actionable change strategies.
Press Kit: Autonomous Driving at Intel
“Autonomous technology will drive change across a range of industries and define a new landscape, the first green shoots of which will appear in the business-to-business sector,” said study co-author Harvey Cohen, president, Strategy Analytics. “The emergence of pilotless vehicle options will first appear in developed markets and will reinvent the package delivery and long-haul transportation sectors. This will relieve driver shortages around the world and account for two-thirds of initial projected revenues.”
The research firm further points out that autonomously operated vehicle commercialization will gain steam by 2040 – generating an increasingly large share of the projected value and heralding the emergence of instantaneously personalized services.
Key report highlights include:
- Business use of Mobility-as-a-Service (MaaS) is expected to generate $3 trillion in revenue, or 43 percent of the total Passenger Economy.
- Consumer use of Mobility-as-a-Service offerings is expected to account for $3.7 trillion in revenue, or nearly 55 percent of the total Passenger Economy.
- $200 billion in revenue is expected to be generated from rising consumer use of new, innovative applications and services that will emerge as pilotless vehicle services expand and evolve.
- Conservatively, 585,000 lives could be saved by self-driving vehicles in the era of the Passenger Economy from 2035 to 2045.
- Self-driving vehicles are expected to free more than 250 million hours of consumers' commuting time per year in the most congested cities in the world.
- Reductions in public safety costs related to traffic accidents could amount to more than $234 billion over the Passenger Economy era from 2035 to 2045.

Highlights of future scenarios explored in the study include:

- Car-venience: From onboard beauty salons to touch-screen tables for remote collaboration, fast-casual dining, remote vending, mobile health care clinics and treatment pods, and even platooning pod hotels, vehicles will become transportation experience pods.
- Movable movies: Media and content producers will develop custom content formats to match short and long travel times.
- Location-based advertising: Location-based advertising will become more keenly relevant, and advertisers and agencies will be presented with a new realm of possibilities for presenting content, brands and location.
- Mobility-as-a-perk: Employers, office buildings, apartment complexes, university campuses and housing estates will offer MaaS to add value, to distinguish their offerings from competitors, or as part of a compensation package.

The Passenger Economy report was sponsored by Intel and developed by Strategy Analytics. To read the full report and see additional materials, visit the Intel newsroom's Autonomous Driving Press Kit.
About Strategy Analytics
Strategy Analytics, Inc. provides the competitive edge with advisory services, consulting and actionable market intelligence for emerging technology, mobile and wireless, digital consumer and automotive electronics companies. With offices in North America, Europe and Asia, Strategy Analytics delivers insights for enterprise success. www.StrategyAnalytics.com.
BMW Group, Intel and Mobileye Announce Delphi as a Development Partner and System Integrator for their Autonomous Driving Platform
The BMW Group, Intel and Mobileye cooperation intends to integrate and industrialize Level 3 to Level 5 automated driving technology for multiple automotive OEMs. Delphi will leverage its expertise in automated driving and system integration to assist the cooperation in the development and initial deployment of their automated driving technology.
BMW displays one of the first of approximately 40 highly automated vehicles that were announced by BMW, Intel and Mobileye during a one-day autonomous driving workshop on Wednesday, May 3, 2017, at Intel's Silicon Valley Center for Autonomous Driving in San Jose, California. (Credit: Intel Corporation)
Munich, May 16, 2017 – The BMW Group, Intel and Mobileye (“Cooperation Partners”) announce their intention to onboard Delphi as a development partner and system integrator for their state-of-the-art autonomous driving platform. The four partners intend to jointly deploy a cooperation model to deliver and scale the developed solutions to the broader OEM automotive industry and potentially other industries.
Delphi has already provided a prototype compute platform to the BMW Group and is working together with Intel and Mobileye in the areas of perception, sensor fusion and high performance automated driving computing.
In July 2016, the BMW Group, Intel, and Mobileye announced that they are joining forces to make self-driving vehicles become a reality and are collaborating to bring solutions for highly and fully automated driving into series production by 2021. The Cooperation Partners have since developed a scalable architecture that can be adopted by other automotive developers and carmakers to pursue state-of-the-art designs and create differentiated brands.
Press Kit: Autonomous Driving at Intel
System integrators, such as Delphi, are critical for the go-to-market strategy of the joint solution to reach multiple automotive OEMs quickly. A key role for Delphi will be the integration of the solution delivered by BMW Group, Intel and Mobileye into OEM vehicle architectures. Additionally, Delphi may also provide required hardware components such as sensors as well as specific customization efforts and applications for differentiation.
This engagement between Delphi and the Cooperation Partners is non-exclusive. The Cooperation Partners are in the process of onboarding additional integration and development partners to support future OEM customer needs.
“From the very beginning we designed our cooperation as a non-exclusive platform for this technology of the future. With the onboarding of Delphi we significantly strengthen our development of automated driving and take a further step in spreading this technology across the industry,” stated Klaus Fröhlich, Member of the Board of Management of BMW AG for Development.
“The partnership between BMW, Intel and Mobileye continues to break new ground in the auto industry,” said Intel CEO Brian Krzanich. “In less than one year the joint teams have made substantial progress toward delivering a scalable platform for autonomous driving and are on track to deliver 40 pilot cars in the second half of this year. Adding Delphi as an integration partner will help accelerate the introduction of autonomous cars from multiple carmakers and offer differentiation to customers.”
“Collaboration and inclusion across multiple automakers and suppliers is the best approach to developing a safe, cost-efficient and fast-to-market solution for autonomous driving,” said Professor Amnon Shashua, Mobileye’s co-founder, chairman and CTO. “Delphi’s expertise in the field, as well as long history of integrating complex systems, makes them a very appropriate choice to join this cooperation.”
“This is a great opportunity for Delphi to use its technical depth and experience with automated driving and electrical architecture to help the cooperation develop and deploy at scale. Our close working relationship with all three partners serves as a solid foundation for success,” said Kevin Clark, president and CEO of Delphi.
The BMW Group
With its four brands BMW, MINI, Rolls-Royce and BMW Motorrad, the BMW Group is the world’s leading premium manufacturer of automobiles and motorcycles and also provides premium financial and mobility services. As a global company, the BMW Group operates 31 production and assembly facilities in 14 countries and has a global sales network in more than 140 countries. In 2016, the BMW Group sold approximately 2.367 million cars and 145,000 motorcycles worldwide. The profit before tax was approximately € 9.67 billion on revenues amounting to € 94.16 billion. As of 31 December 2016, the BMW Group had a workforce of 124,729 employees. The success of the BMW Group has always been based on long-term thinking and responsible action. The company has therefore established ecological and social sustainability throughout the value chain, comprehensive product responsibility and a clear commitment to conserving resources as an integral part of its strategy.
Mobileye N.V. is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving. Our technology keeps passengers safer on the roads, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. Our proprietary software algorithms and EyeQ® chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye’s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a RoadBook™ of localized drivable paths and visual landmarks using REM™; and provide mapping for autonomous driving. Our products are or will be integrated into car models from more than 25 global automakers. Our products are also available in the aftermarket.
Delphi Automotive PLC (NYSE: DLPH) is a high-technology company that integrates safer, greener and more connected solutions for the automotive and transportation sectors. Headquartered in Gillingham, U.K., Delphi operates technical centers, manufacturing sites and customer support services in 46 countries. Visit delphi.com.
This press release contains certain forward-looking statements. Words such as “believes,” “intends,” “expects,” “projects,” “anticipates,” and “future” or similar expressions are intended to identify forward-looking statements. These statements are only predictions based on our current expectations and projections about future events. You should not place undue reliance on these statements. Many factors may cause our actual results to differ materially from any forward-looking statement, including the risk factors and other matters set forth in the public filings of each of the parties to this press release. Neither party undertakes any obligation to update or revise any forward-looking statement, whether as a result of new information, future events or otherwise, except as may be required by law.
Intel Announces Satisfaction of Antitrust Clearance Condition for Proposed Acquisition of Mobileye
Business Wire, August 1, 2017. SANTA CLARA, Calif.--(BUSINESS WIRE)--
Intel Corporation (INTC) today announced that on July 31, 2017, the Korea Fair Trade Commission approved the previously announced tender offer to purchase all of the outstanding ordinary shares of Mobileye N.V. (MBLY). The tender offer is being made pursuant to the Purchase Agreement, dated as of March 12, 2017, by and among Intel, Cyclops and Mobileye (the “Purchase Agreement”).
As a result of the approval of the tender offer from the Korea Fair Trade Commission, all required antitrust clearances have now been obtained.
Intel also announced that Intel and Mobileye have agreed that the tender offer will expire at 5:00 p.m., New York City time, on August 7, 2017.
Intel Showcases Application of AI for Space Research at NASA FDL Event
Today, Intel hosted the NASA Frontier Development Lab (FDL) Wrap-Up Event at its Santa Clara campus, concluding an eight-week summer program. Intel is a key partner in FDL and provided support to ongoing research that is exploring the use of artificial intelligence (AI) to solve a range of challenges within the fields of space weather, space resources and planetary defense.
Through its work with NASA FDL, Intel is addressing critical knowledge gaps by using AI to help establish humanity as a space-faring civilization and solve problems that potentially affect all of us here on Earth.
As part of the program, Intel supported and mentored researchers who used Intel® Nervana™ deep learning technology to tackle the complex problem of building detailed maps of the lunar poles: detecting craters and other features within the dramatic shadows of the polar regions, and resolving the image artifacts and registration issues involved in producing a cohesive map. Done by hand, this task takes many weeks for just a small section of the moon.
The team demonstrated that deep learning could achieve the same results as a human expert at vastly improved speed, suggesting that detailed mapping of all rocky objects in the solar system could be automated using deep learning techniques. The approach will also help future commercial space missions by providing reliable terrain maps for missions looking for water and other volatiles. Intel supplied the researchers with full Intel Nervana Cloud access and neon™ software, as well as mentorship from Intel engineers.
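The release does not describe the researchers' actual model or data. Purely as an illustrative stand-in (every function, image and number below is hypothetical, not from the FDL work), the toy sketch below captures the flavor of the detection problem: a crater under polar lighting shows up as a bright sunlit rim around a dark shadowed bowl, the kind of local pattern a trained network filter learns to respond to. Here a hand-written ring template is scored at every pixel of a small synthetic image.

```python
# Toy crater detector on a synthetic "lunar" image (all values hypothetical).
# A trained CNN would learn rim/shadow filters; this sketch hard-codes one.

def make_synthetic_crater(size=15, center=(7, 7), radius=3):
    """Synthetic grayscale image: dark terrain, bright ring (sunlit rim),
    very dark interior (shadowed bowl)."""
    img = [[0.1] * size for _ in range(size)]
    cy, cx = center
    for y in range(size):
        for x in range(size):
            d2 = (y - cy) ** 2 + (x - cx) ** 2
            if radius ** 2 <= d2 <= (radius + 1) ** 2:
                img[y][x] = 0.9   # sunlit rim
            elif d2 < radius ** 2:
                img[y][x] = 0.05  # shadowed bowl
    return img

def ring_score(img, y, x, radius=3):
    """Mean rim brightness minus mean bowl brightness for a candidate
    crater centered at (y, x); high when a bright ring surrounds shadow."""
    rim, bowl = [], []
    size = len(img)
    for j in range(size):
        for i in range(size):
            d2 = (j - y) ** 2 + (i - x) ** 2
            if radius ** 2 <= d2 <= (radius + 1) ** 2:
                rim.append(img[j][i])
            elif d2 < radius ** 2:
                bowl.append(img[j][i])
    if not rim or not bowl:
        return 0.0
    return sum(rim) / len(rim) - sum(bowl) / len(bowl)

def detect_crater(img, radius=3):
    """Return the (y, x) position with the highest ring score."""
    size = len(img)
    best = max((ring_score(img, y, x, radius), (y, x))
               for y in range(size) for x in range(size))
    return best[1]

img = make_synthetic_crater()
print(detect_crater(img))  # -> (7, 7), the true crater center
```

The real task is much harder (sensor noise, overlapping craters, extreme shadow contrast, image registration), which is why a learned model, trained on labeled imagery, replaces the fixed template in practice.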
Intel believes AI is driving the next big wave of computing and revolutionizing the way businesses operate and how people engage in every aspect of life. As a data company, it is imperative that Intel deliver solutions that create, use and analyze the massive amounts of data that are generated each and every minute. There is an incredible opportunity to make AI accessible to every industry, and Intel is working to do that with focus and support from across the company.
“Intel Nervana is uniquely designed to enable researchers and data scientists to use AI to solve some of the world’s biggest challenges, and it’s ideal for a problem such as accelerating space travel,” said Naveen Rao, corporate vice president and general manager, Artificial Intelligence Products Group, Intel. “From the moment we heard about this challenge, we were committed to applying our expertise and technology solutions to the groundbreaking work being done on applications of AI for space research. Congratulations to the research teams, and to the Intel mentors, who are advancing technology that could take us to Mars and beyond.”
At the event, the Intel-sponsored team shared its findings, which will be applied to near-term space missions. Additionally, Rao will participate in a thought leadership panel, in which industry experts will discuss the application of AI to space.