Golden Gate Ventures closes new $100M fund for Southeast Asia

Singapore’s Golden Gate Ventures has announced the close of its newest (and third) fund for Southeast Asia at a total of $100 million. The fund hit a first close in the summer, as TechCrunch reported at the time, and has now reached full capacity. Seven-year-old Golden Gate said its LPs include existing backers Singapore sovereign fund Temasek, Korea’s Hanwha, Naver — the owner of messaging app Line — and EE Capital. Investors backing the firm for the first time through this fund include Mistletoe — the fund from Taizo Son, brother of SoftBank founder Masayoshi Son — Mitsui Fudosan, IDO Investments, CTBC Group, Korea Venture Investment Corporation (KVIC) and Ion Pacific.

Golden Gate was founded by the formerly Silicon Valley-based trio of Vinnie Lauria, Jeffrey Paine and Paul Bragiel. It has investments across five markets in Southeast Asia — with a particular focus on Indonesia and Singapore — and its portfolio includes Singapore’s Carousell, automotive marketplace Carro, P2P lending startup Funding Societies, payment enabler Omise and health tech startup Alodokter. Golden Gate’s previous fund was $60 million and closed in 2016. The firm’s exits so far include the sale of Redmart to Lazada (although not a blockbuster), Priceline’s acquisition of Woomoo, Line’s acquisition of Temanjalan and the sale of Mapan (formerly Ruma) to Go-Jek. It claims that its first two funds have achieved distributions to paid-in capital (DPI) of 1.56x and 0.13x, and IRRs of 48 percent and 29 percent, respectively.

“When I compare the tech ecosystem of Southeast Asia (SEA) to other markets, it’s really hit an inflection point — annual investment is now measured in the billions. That puts SEA on a global stage with the US, China, and India. Yet there is a youthfulness that reminds me of Silicon Valley circa 2005, shortly before social media and the iPhone took off,” Lauria said in a statement.

A report from Google and Temasek forecasts that Southeast Asia’s digital economy will grow from $50 billion in 2017 to over $200 billion by 2025 as internet penetration continues to rise across the region thanks to increased smartphone ownership. That opportunity to reach a cumulative population of over 600 million consumers — more of whom are online today than the entire population of the U.S. — is feeding optimism around startups and tech companies.

Golden Gate isn’t alone in raising a fund to explore those possibilities; there’s plenty of VC activity in the region. Rivals include Openspace, which was formerly known as NSI Ventures and just closed a $135 million fund; Qualgro, which is raising a $100 million vehicle; and Golden Equator, which paired up with Korea Investment Partners on a joint $88 million fund. Temasek-affiliated Vertex closed a $210 million fund last year, which remains a record for Southeast Asia. Golden Gate also has a dedicated crypto fund, LuneX, which is in the process of raising $10 million.
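For readers unfamiliar with the fund metrics quoted above, DPI and IRR have standard definitions worth spelling out. A quick reference sketch follows; the cash-flow variables are generic, not Golden Gate’s actual figures:

```latex
% DPI: cash actually returned to LPs per dollar of capital paid in.
% A DPI of 1.56x means LPs have already received $1.56 back
% for every $1 they put into the fund.
\[ \mathrm{DPI} = \frac{\text{cumulative distributions to LPs}}{\text{paid-in capital}} \]

% IRR: the annualized discount rate r at which the fund's dated
% cash flows C_t (negative for capital calls, positive for
% distributions) net out to zero.
\[ \sum_{t=0}^{T} \frac{C_t}{(1+r)^{t}} = 0 \]
```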

New iPhones courageously ditch the free headphone dongle

Apple is under the impression that its “courage” has already paid off and that it no longer needs to ship a headphone dongle with its new phones. Mission accomplished! The new iPhone XS and XR models will not include the Lightning to 3.5mm headphone jack adapter; users will have to buy it separately for $9. The iPhone 8 will also no longer include the dongle moving forward, The Verge reported. Courage. On the bright side, the dongle is only $9, and if you’ve been an iPhone owner for the past few years, you’ve got one already. To be clear, a lot of phones have been moving in the headphone jack-less direction, and Apple set a nice precedent by including the dongle with its past models. That said, the Pixel 2 included the dongle, so Apple is again leading the way here with an unpopular move.

The iPhone Xs and Xs Max get dual-SIM capability

There are many reasons why dual-SIM capabilities make sense, and that’s why many Android smartphones let you insert two SIM cards. Apple is now entering the world of dual SIM with a physical SIM tray plus an eSIM for most of the world, and two physical SIM cards in China.

For the eSIM, you won’t be able to buy a second SIM card at the airport and put it in the phone. Instead, just like on the iPad, you’ll have to subscribe to a plan using your iPhone. Few telecom companies support eSIM just yet; Apple showed the logos of Verizon, T-Mobile, AT&T, Bell, EE, Vodafone, Airtel, Deutsche Telekom, Truphone, GigSky and Jio. Let’s hope this move convinces more carriers to switch to eSIM. Being able to sign into your mobile plan just like you would sign into your Spotify account sounds like a dream.

If you use two SIM cards, you’ll be able to manage two phone numbers, use two plans and more. This is particularly useful if you live in a fragmented region: many countries have regional telecom companies, so you need to swap your SIM card when traveling back and forth between two cities. In China, Apple can’t embed an eSIM into its devices, so the company is going to release a special iPhone Xs and Xs Max for China that lets you insert two physical SIM cards at once, back-to-back.
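On the developer side, iOS 12’s Core Telephony framework reflects the change: carriers are exposed per service rather than as a single value. A minimal sketch of listing whatever plans are active on a dual-SIM device (the printed labels are my own; error handling omitted):

```swift
import CoreTelephony

// Sketch: enumerate the cellular plans active on a dual-SIM iPhone.
// In iOS 12, carriers are keyed by a service identifier instead of
// assuming a single SIM, which is how both plans become visible.
let networkInfo = CTTelephonyNetworkInfo()
if let providers = networkInfo.serviceSubscriberCellularProviders {
    for (serviceID, carrier) in providers {
        // carrierName is optional; a plan slot may be empty.
        print("Service \(serviceID): \(carrier.carrierName ?? "no carrier")")
    }
}
```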

The iPhone Xr is the new budget iPhone

Apple just announced a new budget iPhone to go along with the iPhone Xs. It brings many of the goodies found on the new, more expensive iPhone Xs but for much less, and it’s available in a variety of colors. This phone replaces the iPhone 8 as the least expensive iPhone available.

Like the iPhone Xs and the iPhone X before it, the Xr is a full-screen phone, minus a notch at the top that houses the phone’s camera and FaceID sensors. Long live the Home Button; it’s no longer available on any iPhone model. The screen is a 6.1-inch LCD, unlike the OLED found in the iPhone Xs, and Apple calls it a Liquid Retina display, with a 1792 x 828 resolution at 326 ppi. Even with a 6.1-inch screen, the phone itself is smaller than the previous iPhone 8 Plus.

Inside is Apple’s new A12 Bionic chip that supports improved battery life, neural networks and advanced processing. The body is made out of 7000 series aerospace-grade aluminum with more durable glass, and comes in white, black, blue, coral and yellow. The case also has IP67 protection to keep it safe from dust and water.

Unlike the iPhone X and iPhone Xs, the iPhone Xr has a single-lens camera: a 12MP sensor with a fast f/1.8 aperture lens and a True Tone flash. Even though there’s only one lens, the iPhone Xr can still take portrait mode photos like the iPhone X and iPhone Xs, and it even has the adjustable bokeh found in the iPhone Xs. Compared with the iPhone 8 Plus, the iPhone Xr has an hour and a half longer battery life.

So long then, iPhone home button…

… it was nice pressing you. Well, at least some of the thousands and thousands of times. Apple has finally abandoned a feature that’s been a staple of its smartphones since the very start, over a decade ago: a physical home button. The trio of almost-all-screen iPhones unboxed today at its Cupertino HQ go all in on looks and swipes, with nothing but a sensor-housing notch up top to detract from their smoothly shining faces.

Last year Apple only ditched the button on its premium iPhone X handset, retaining physical home buttons on cheaper iPhones. But this year it’s a clean sweep, with buttons dropped across the board. If you want to go home on the new iPhone Xs, iPhone Xs Max or iPhone Xr (as the trio of new iPhones are confusingly named), well, there’s a gesture for that: an up swipe from the bottom edge of the screen, specifically. Or a look and that gesture if your phone is locked. This is because Apple has also gone all in on its facial biometric authentication system, Face ID, for its next crop of iPhones — throwing out the predecessor Touch ID biometric in the process.

“Customers love it!” enthused Apple’s marketing chief, Phil Schiller, talking up Face ID from the stage, after CEO Tim Cook had reintroduced the tech by collapsing it all to: “Your phone knows what you look like and your face becomes your password.”

“There’s no home button,” confirmed Schiller, going over the details of the last of the three new iPhones to be announced — and also confirming Face ID is indeed on board the least pricey iPhone Xr. “You look at it to unlock it… you look at it to pay with Apple Pay,” he noted.

So hey there Face ID, goodbye Touch ID. Like any fingerprint biometric, Touch ID is fallible. Having been doing a lot of DIY lately, I can report it simply hasn’t worked at all for my battered fingertips for more than a month now. Nor does it work well if you have dry skin or wet hands and so on. It can also be hacked with a bit of effort, such as via silicone spoofs. Still, Touch ID does have its fans, given its relative simplicity, and also because you can register multiple digits to share biometric access to a single iPhone with an S.O. (Or, well, your cat.) Apple has mitigated the device-sharing issue by adding support in iOS 12 for registering two faces per device with Face ID. (We haven’t tested if it’ll register a cat yet.)

However, the bigger complaint from privacy advocates is that turning a person’s facial features into their security and authentication key normalizes surveillance. That’s certainly harder to work around or argue against. Apple will be hoping its general pro-privacy stance helps mitigate concerns on that front. But exactly how the millions of third-party apps running on its platform make use of the facial biometric feature is a whole other issue.

Elsewhere, debate has focused on whether Face ID makes an iPhone more vulnerable to being force-unlocked against its owner’s will. The technology does require active interaction from the registered face in question for it to function, though — a sort of ‘eyes-on’ check and balance. It’s probably not perfect, but neither was a fingerprint biometric — which could arguably be more easily forcibly taken from someone in custody or asleep. But it’s irrefutable that biometrics come with trade-offs. None of these technologies is perfect in security terms. Arguably the biggest problem is that there’s no way to change your biometric ‘password’ if your data leaks — having your fingerprints or face surgically swapped is hardly a viable option.
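As for those third-party apps: ones that use Apple’s standard LocalAuthentication route never see face data at all, only a pass/fail result, and the same code path covers Touch ID and Face ID. A minimal sketch of gating an action behind whichever biometric the device offers (the function name and strings here are illustrative):

```swift
import LocalAuthentication

// Sketch: gate a sensitive action behind the device biometric.
// LAContext reports .faceID or .touchID via biometryType, so one
// code path covers both generations of hardware.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check the device actually has an enrolled biometric first.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        if success {
            let kind = context.biometryType == .faceID ? "Face ID" : "Touch ID"
            print("Authenticated via \(kind)")
        }
        completion(success)
    }
}
```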
Yet despite such concerns, the march towards consumer authentication systems that are robust without being hopelessly inconvenient has continued to give biometrics uplift. And fingerprint readers, especially, are now pretty much standard issue across much of the Android device ecosystem (which may also be encouraging Apple to step up and away now, as it seeks to widen the gap with the cheaper competition). In its first year of operation, the Face ID system does appear to have been impressively resilient, too — barring a few cases of highly similar-looking family members and identical twins. Apple is certainly projecting confidence now, going all in on the tech across all its iPhones.

If you’re inconsolable about the loss of the Home Button, it’s not entirely extinct on Apple hardware yet: the iPad retains it, at least for now. And if it’s Touch ID you’re hankering for, Apple added the technology to the MacBook Pro’s Touch Bar (on 2016 models and later). Yet the days of poking at a physical button as a key crux of mobile computing do now look numbered. Contextual computing — and all it implies — is the name of the game from here on in. That is going to raise increasingly nuanced questions about the erosion of user agency and control, alongside major privacy considerations and related data ethics issues, at the same time as ramping up technological complexity in the background. So no pressure then!

At the end of the day there was something wonderfully simple about having a home button always sitting there — quietly working to take people back to a place they felt comfortable. It was inclusive. Accessible. Reassuring. For some, an unnecessary blemish on their rectangle of glass, for sure, but for others an important touchstone to get them where they needed to go. Hopefully Apple won’t forget everything that was wrapped around the home button. It would certainly be a shame if its spirit of inclusiveness also fell by the wayside.

Photo by Kim Kulish/Corbis via Getty Images

Apple introduces the iPhone Xs and iPhone Xs Max

Another year, another set of brand spankin’ new iPhones. But this year, little has been left to the imagination, as leaks have continued to spring up over the course of the past few months. Today, however, the new iPhone becomes official. Apple has introduced new models of its premium iPhone, the iPhone Xs and iPhone Xs Max, which come in three finishes: gold, silver and space grey. So let’s take a look at the details.

Design

The new iPhone doesn’t look all that different from the iPhone X, but that is always the case with the “S” years. The phones come in gold, silver and space grey and are made with surgical-grade steel, as well as a new glass formulation for durability. The Apple team has also upgraded the dust and water resistance of the iPhone, bumping it to an IP68 rating, with water resistance up to 2 meters deep for several minutes. Schiller added that the phone was tested in many liquids, including orange juice, tea, wine and beer.

Display

The new display on the iPhone Xs is a Super Retina OLED display with 60 percent greater dynamic range than the previous generation. Displays come in two sizes — 5.8-inch and 6.5-inch — with 458 pixels per inch. The bigger phone is called the iPhone Xs Max. Unfortunately, on both models, that notch is still hanging out at the top of the phone, but not without good reason. Housed in that sliver of bezel is an infrared camera, flood illuminator, ambient light sensor, proximity sensor, speaker, microphone, front camera and dot projector. Much of this, of course, allows FaceID to continue on this next generation of iPhones. A faster secure enclave and faster algorithms have improved FaceID on the iPhone Xs, with Phil Schiller saying it’s the most secure facial authentication in a smartphone ever.

Specs

Perhaps the biggest spec upgrade on the iPhone Xs is the new A12 Bionic chip, the industry’s first 7nm chip, with 6.9 billion transistors. It has a 6-core CPU, with two high-performance cores that are 15 percent faster and 40 percent lower power than the A11’s. There’s also a new 4-core GPU in the A12 that’s 50 percent faster, with tessellation and multilayer rendering. Plus, there is a new neural engine with an 8-core dedicated machine learning processor. So how does that translate to real-world use? Well, the new iPhone Xs is capable of 30 percent faster app opens thanks to that A12 Bionic chip. As is standard with Apple, the company gave some other examples of how this processor will change the way we operate on our phones, including software upgrades from iOS 12 like AR, Memoji and Siri shortcuts. Apple also did a demo from Bethesda showing off the A12 powering the new Elder Scrolls Blades game. In the storage department, the iPhone Xs comes with up to 512GB of storage.

The best security and privacy features in iOS 12 and macOS Mojave

September is Apple hardware season, when we expect new iPhones, a new Apple Watch and more. But what makes the good stuff run is the software within. First revealed earlier this year at the company’s annual WWDC developer event in June, iOS 12 and macOS Mojave focus on a running theme: security and privacy for the masses. Ahead of Wednesday’s big reveal, here’s all the good stuff to look out for.

macOS Mojave

macOS Mojave will be the sixth iteration of the Mac operating system, named after a location in California, where Apple is based. It comes with dark mode, file stacks and group FaceTime calls.

Safari now prevents browser fingerprinting and cross-site tracking

What does it do? Safari will use a new “intelligent tracking prevention” feature to stop advertisers from following you from site to site. Even social networks like Facebook know which sites you visit, because so many embed Facebook’s tools — like the comments section or the “Like” button.

Why does it matter? Tracking prevention will prevent ad firms from building a unique “fingerprint” of your browser, making it difficult to serve you targeted ads — even when you’re in incognito mode or private browsing. That’s an automatic boost for personal privacy, as these companies will find it more difficult to build up profiles on you.

Camera, microphone, backups now require permission

What does it do? Just as when an app asks you for access to your contacts and calendar, Mojave will now ask for permission before an app can access your FaceTime camera and microphone, as well as location data, backups and more.

Why does it matter? By expanding this feature, it’s much more difficult for apps to switch on your camera without warning or record from your microphone without you noticing. That’s going to prevent surreptitious ultrasonic ad tracking and surveillance by malware that hijacks your camera. Asking permission for access to your backups — which are often unencrypted — will also prevent malware or hackers from quietly stealing your data.

iOS 12

iOS 12 lands on more recent iPhones and iPads, but will bring significant performance boosts to older supported devices, along with new Maps, smarter notifications and updated ARKit.

Password manager will warn of password reuse

What does it do? iOS 12’s built-in password manager, which stores all your passwords for easy access, will now tell you if you’re using the same password across different sites and apps.

Why does it matter? Password reuse is a real problem. If you use the same password on every site, it only takes one site breach to grab your password for every other site you use. iOS 12 will let you know if you’re using a weak password or the same password on different sites. Your passwords are easily accessible with your fingerprint or your passcode.

Two-factor codes will be auto-filled

What does it do? When you are sent a two-factor code — such as a text message or a push notification — iOS 12 will take that code and automatically enter it into the login box.

Why does it matter? Two-factor authentication is good for security — it adds an extra layer of protection on top of your username and password. But adoption is low because two-factor is cumbersome and frustrating. This feature keeps the security intact while making the process more seamless and less annoying.

USB Restricted Mode makes hacking more difficult

What does it do? This new security feature will lock any accessories out of your device — including USB cables and headphones — when your iPhone or iPad has been locked for more than an hour.

Why does it matter? This optional feature — first added in iOS 11.4.1 but likely to be more widely adopted with iOS 12 — will make it more difficult for law enforcement (and hackers) to plug in to your device and steal your sensitive data. Because your device is encrypted, not even Apple can get your data, but some devices — like GrayKey — can brute-force your password. This feature will render those devices largely ineffective.

Apple’s event starts Wednesday at 10am PT (1pm ET).
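Incidentally, the two-factor autofill described above requires a small opt-in from app developers. A minimal sketch, assuming a plain UIKit login screen (the view controller and layout are hypothetical; .oneTimeCode is the iOS 12 content type that triggers the behavior):

```swift
import UIKit

// Hypothetical login screen showing the iOS 12 opt-in for
// two-factor code autofill.
class TwoFactorViewController: UIViewController {
    private let codeField = UITextField(frame: CGRect(x: 20, y: 100,
                                                      width: 280, height: 44))

    override func viewDidLoad() {
        super.viewDidLoad()
        // Marking the field as .oneTimeCode lets iOS 12 surface a
        // code arriving by SMS directly in the QuickType bar, so the
        // user taps once instead of switching apps to copy it over.
        codeField.textContentType = .oneTimeCode
        codeField.keyboardType = .numberPad
        view.addSubview(codeField)
    }
}
```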

Questions about Apple’s new Maps, answered

Earlier today we revealed that Apple is rebuilding Maps from the ground up. These are some questions from readers that came up when we went live. You can ask more questions here and I’ll try to add them.

What part of Maps will be new? The actual map. Apple is building it from scratch, with its own data rather than relying on external partners.

What does that mean in terms of what I’ll see? New foliage markers, showing you more accurately where ground cover like grass and trees exists. Pools, parking lots, exact building shapes, sports areas like baseball diamonds, tennis and basketball courts, and pedestrian pathways that are commonly walked but previously unmapped. There are also some new features, like the ability to determine where the entrances to buildings are based on maps data.

Will it look visually different? Only with regard to additional detail. Maps is not getting a visual ‘overhaul’ yet (it was implied that it will eventually), but you’ll notice differences immediately.

Does it use information from iPhones? Yes. It uses anonymized segments of trips you take, called probe data, to determine things like ‘is this a valid route?’ or to glean traffic congestion information.

Can I be identified by this data — does Apple know it’s me making the trips? No. The only device that knows about your entire trip is your personal device. When information and/or requests are sent to Apple, a rotating random identifier is assigned to chunks of data, which are segmented for additional safety before transmission. Basically, all Apple will ever see is a random slice of any person’s trip, without beginning or end, that it uses to update its maps and traffic info. Not only can it not tell who the data came from, Apple says it cannot even reconstruct a trip from it — no matter who asks.

Can I opt out? Yes. It will not happen if you do not turn location services on, and it can be toggled off in the Privacy settings for Maps. It’s not a new setting; it’s just the existing maps setting.

Will it use more data or battery? Apple says no. It’s saying that the amount of both resources used is so negligible as to be swallowed up in normal efficiency gains.

When is it coming to the rest of the world? The Bay Area in beta next week and Northern California this fall was as much as I got. However, Apple SVP Eddy Cue did say that Apple’s overall maps team is global:

“We’ve got a dedicated team — we started this four years ago — across a variety of fields from ML, to map design, to you name it. There’s thousands of people working on this all around the globe from here in the Bay Area, to Seattle, Austin, New York. We have people in other countries, in cities like Berlin, Paris, Singapore, Beijing, Malmö, Hyderabad. This team’s dispersed around the globe. It’s important to have that when you’re trying to create and do this. We’re trying to look at how people use our devices all around the world. Our focus is to build these maps for people on the go.”

Does this mean street view mode is coming? Maybe. Apple did not announce anything related to a street-level view. With the data that it is gathering from the cars, it could absolutely accomplish this, but no news yet.

What about businesses? The computer vision system Apple is using can absolutely recognize storefronts and business names, so I’d expect that to improve.

Will building shapes improve in 3D? Yes. Apple has tools specifically to allow its maps editors to measure building heights in the 3D views and to tweak the shapes of the buildings to make them as accurate as possible. The measuring tools also serve to nail down how many floors a building might have, for internal navigation.
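To make the anonymization answer above concrete, here is a toy model. This is a hypothetical sketch, not Apple’s actual code; the types, chunk size and use of UUIDs are invented purely to show the shape of the idea: drop the trip’s endpoints, slice the interior into segments, and tag each segment with its own rotating random identifier.

```swift
import Foundation

// Hypothetical illustration of the described scheme. Each retained
// chunk of a trip gets a fresh random ID, so no two chunks can be
// linked to each other, to a device, or to a whole trip.
struct ProbeSegment {
    let id: UUID                           // rotating identifier, new per chunk
    let points: [(lat: Double, lon: Double)]
}

func anonymize(trip: [(lat: Double, lon: Double)],
               chunkSize: Int = 20) -> [ProbeSegment] {
    // Drop the start and end of the trip so no segment reveals
    // where the journey began or finished.
    guard trip.count > 2 * chunkSize else { return [] }
    let interior = trip[chunkSize..<(trip.count - chunkSize)]

    // Slice the remainder into fixed-size chunks, each with its own UUID.
    return stride(from: interior.startIndex,
                  to: interior.endIndex,
                  by: chunkSize).map { start in
        let end = min(start + chunkSize, interior.endIndex)
        return ProbeSegment(id: UUID(), points: Array(interior[start..<end]))
    }
}
```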

Apple is rebuilding Maps from the ground up

I’m not sure if you’re aware, but the launch of Apple Maps went poorly. After a rough first impression, an apology from the CEO, several years of patching holes with data partnerships and some glimmers of light with long-awaited transit directions and improvements in business, parking and place data, Apple Maps is still not where it needs to be to be considered a world-class service. Maps needs fixing.

Apple, it turns out, is aware of this, so it’s rebuilding the maps part of Maps. It’s doing this by using first-party data gathered by iPhones with a privacy-first methodology, and by its own fleet of cars packed with sensors and cameras. The new product will launch in San Francisco and the Bay Area with the next iOS 12 beta and will cover Northern California by fall.

Every version of iOS will get the updated maps eventually, and they will be more responsive to changes in roadways and construction, more visually rich depending on the specific context they’re viewed in, and feature more detailed ground cover, foliage, pools, pedestrian pathways and more. This is nothing less than a full reset of Maps, and it’s been four years in the making, which is when Apple began to develop its new data-gathering systems. Eventually, Apple will no longer rely on third-party data to provide the basis for its maps, which has been one of its major pitfalls from the beginning.

“Since we introduced this six years ago — we won’t rehash all the issues we’ve had when we introduced it — we’ve done a huge investment in getting the map up to par,” says Apple SVP Eddy Cue, who now owns Maps, in an interview last week. “When we launched, a lot of it was all about directions and getting to a certain place. Finding the place and getting directions to that place. We’ve done a huge investment of making millions of changes, adding millions of locations, updating the map and changing the map more frequently. All of those things over the past six years.”

But, Cue says, Apple has room to improve on the quality of Maps, something most users would agree with, even with recent advancements. “We wanted to take this to the next level,” says Cue. “We have been working on trying to create what we hope is going to be the best map app in the world, taking it to the next step. That is building all of our own map data from the ground up.”

In addition to Cue, I spoke to Apple VP Patrice Gautier and over a dozen Apple Maps team members at its mapping headquarters in California this week about its efforts to rebuild Maps, and to do it in a way that aligned with Apple’s very public stance on user privacy.

If, like me, you’re wondering whether Apple thought of building its own maps from scratch before it launched Maps, the answer is yes. At the time, there was a choice to be made about whether or not it wanted to be in the business of maps at all. Given that the future of mobile devices was becoming very clear, it knew that mapping would be at the core of nearly every aspect of its devices, from photos to directions to location services provided to apps. Decision made, Apple plowed ahead, building a product that relied on a patchwork of data from partners like TomTom, OpenStreetMap and other geo data brokers. The result was underwhelming.

Almost immediately after Apple launched Maps, it realized it was going to need help, and it signed on a bunch of additional data providers to fill the gaps in location, base map, point-of-interest and business data. It wasn’t enough.

“We decided to do this just over four years ago.
We said, ‘Where do we want to take Maps? What are the things that we want to do in Maps?’ We realized that, given what we wanted to do and where we wanted to take it, we needed to do this ourselves,” says Cue.

Because maps are so core to so many functions, success wasn’t tied to just one function. Maps needed to be great at transit, driving and walking — but also as a utility used by apps for location services and other functions. Cue says that Apple needed to own all of the data that goes into making a map, and to control it from a quality as well as a privacy perspective.

There’s also the matter of corrections, updates and changes entering a long loop of submission to validation to update when you’re dealing with external partners. The Maps team would have to be able to correct roads, pathways and other updating features in days or less, not months. Not to mention the potential competitive advantages it could gain from building and updating traffic data from hundreds of millions of iPhones, rather than relying on partner data. Cue points to the proliferation of devices running iOS, now numbering in the millions, as a deciding factor to shift its process.

“We felt like because the shift to devices had happened — building a map today in the way that we were traditionally doing it, the way that it was being done — we could improve things significantly, and improve them in different ways,” he says. “One is more accuracy. Two is being able to update the map faster based on the data and the things that we’re seeing, as opposed to driving again or getting the information where the customer’s proactively telling us. What if we could actually see it before all of those things?”

I query him on the rapidity of Maps updates, and whether this new map philosophy means faster changes for users.

“The truth is that Maps needs to be [updated more], and even are today,” says Cue. “We’ll be doing this even more with our new maps, [with] the ability to change the map real-time and often. We do that every day today. This is expanding us to allow us to do it across everything in the map. Today, there’s certain things that take longer to change.

“For example, a road network is something that takes a much longer time to change currently. In the new map infrastructure, we can change that relatively quickly. If a new road opens up, immediately we can see that and make that change very, very quickly around it. It’s much, much more rapid to do changes in the new map environment.”

So a new effort was created to begin generating its own base maps, the very lowest building block of any really good mapping system. After that, Apple would begin layering on living location data, high-resolution satellite imagery and brand-new, intensely high-resolution image data gathered from its ground cars until it had what it felt was a ‘best in class’ mapping product. There is really only one big company on earth that owns an entire map stack from the ground up: Google. Apple knew it needed to be the other one. Enter the vans.

Apple vans spotted

Though the overall project started earlier, the first glimpse most folks had of Apple’s renewed efforts to build the best Maps product was the vans that started appearing on the roads in 2015 with ‘Apple Maps’ signs on the side. Capped with sensors and cameras, these vans popped up in various cities and sparked rampant discussion and speculation. The new Apple Maps will be the first time the data collected by these vans is actually used to construct and inform its maps.
This is their coming-out party. Some people have commented that Apple’s rigs look more robust than the simple GPS-plus-camera arrangements on other mapping vehicles — going so far as to say they look more along the lines of something that could be used in autonomous vehicle training. Apple isn’t commenting on autonomous vehicles, but there’s a reason the arrays look more advanced: they are.

Earlier this week I took a ride in one of the vans as it ran a sample route to gather the kind of data that would go into building the new maps. Here’s what’s inside. In addition to a beefed-up GPS rig on the roof, four LiDAR arrays mounted at the corners and 8 cameras shooting overlapping high-resolution images, there’s also the standard physical measuring tool attached to a rear wheel that allows for precise tracking of distance and image capture. In the rear there is a surprising lack of bulky equipment. Instead, it’s a straightforward Mac Pro bolted to the floor, attached to an array of solid state drives for storage. A single USB cable routes up to the dashboard, where the actual mapping capture software runs on an iPad.

While mapping, a driver… drives, while an operator takes care of the route, ensuring that an assigned coverage area is fully driven, and monitoring image capture. Each drive captures thousands of images as well as a full point cloud (a 3D map of space defined by dots that represent surfaces) and GPS data. I later got to view the raw data presented in 3D, and it absolutely looks like the quality of data you would need to begin training autonomous vehicles. More on why Apple needs this level of data detail later.

When the images and data are captured, they are encrypted on the fly and recorded onto the SSDs. Once full, the SSDs are pulled out, replaced and packed into a case, which is delivered to Apple’s data center, where a suite of software eliminates private information like faces, license plates and other info from the images. From the moment of capture to the moment they’re sanitized, the images are encrypted with one key in the van and the other key in the data center. Technicians and software that are part of its mapping efforts down the pipeline from there never see unsanitized data. This is just one element of Apple’s focus on the privacy of the data it is utilizing in the new Maps.

Probe data and privacy

Throughout every conversation I have with any member of the team throughout the day, privacy is brought up and emphasized. This is obviously by design, as Apple wants to impress upon me as a journalist that it’s taking this very seriously indeed, but it doesn’t change the fact that privacy is evidently built in from the ground up, and I could not find a false note in any of the technical claims or the conversations I had.

Indeed, from the data security folks to the people whose job it is to actually make the maps work well, the constant refrain is that Apple does not feel that it is being held back in any way by not hoovering up every piece of customer-rich data it can, storing it and parsing it. The consistent message is that the team feels it can deliver a high-quality navigation, location and mapping product without the directly personal data used by other platforms.

“We specifically don’t collect data, even from point A to point B,” notes Cue. “We collect data — when we do it — in an anonymous fashion, in subsections of the whole, so we couldn’t even say that there is a person that went from point A to point B. We’re collecting the segments of it.
As you can imagine, that’s always been a key part of doing this. Honestly, we don’t think it buys us anything [to collect more]. We’re not losing any features or capabilities by doing this.”

The segments that he is referring to are sliced out of any given person’s navigation session. Neither the beginning nor the end of any trip is ever transmitted to Apple. Rotating identifiers, not personal information, are assigned to any data or requests sent to Apple, and it augments the ‘ground truth’ data provided by its own mapping vehicles with this ‘probe data’ sent back from iPhones. Because only random segments of any person’s drive are ever sent, and that data is completely anonymized, there is never a way to tie any trip back to a single individual. The local system signs the IDs, and only it knows whom an ID refers to. Apple is working very hard here to not know anything about its users. This kind of privacy can’t be added on at the end; it has to be woven in at the ground level.

Because Apple’s business model does not rely on it serving, say, an ad for a Chevron on your route to you, it doesn’t need to tie advertising identifiers to users. Any personalization or Siri requests are all handled on-board by the iOS device’s processor. So if you get a drive notification that tells you it’s time to leave for your commute, that’s learned, remembered and delivered locally, not from Apple’s servers.

That’s not new, but it’s important to note given the new thing to take away here: Apple is flipping on the power of having millions of iPhones passively and actively improving their mapping data in real time. In short: traffic, real-time road conditions, road systems, new construction and changes in pedestrian walkways are about to get a lot better in Apple Maps.

The secret sauce here is what Apple calls probe data: essentially little slices of vector data that represent direction and speed, transmitted back to Apple completely anonymized, with no way to tie them to a specific user or even to any given trip. It’s reaching in and sipping a tiny amount of data from millions of users instead, giving it a holistic, real-time picture without compromising user privacy.

If you’re driving, walking or cycling, your iPhone can already tell this. Now, if it knows you’re driving, it can also send relevant traffic and routing data in these anonymous slivers to improve the entire service. This only happens if your Maps app has been active; say you check the map, look for directions, etc. If you’re actively using your GPS for walking or driving, then the updates are more precise and can help with walking improvements like charting new pedestrian paths through parks, building out the map’s overall quality. All of this, of course, is governed by whether you opted into location services, and it can be toggled off using the Maps location toggle in the Privacy section of Settings. Apple says that this will have a near-zero effect on battery life or data usage, because you’re already using the ‘maps’ features when any probe data is shared, and it’s a fraction of the power being drawn by those activities.

From the point cloud on up

But maps cannot live on ground truth and mobile data alone. Apple is also gathering new high-resolution satellite data to combine with its ground truth data for a solid base map. It’s then layering satellite imagery on top of that to better determine foliage, pathways, sports facilities, building shapes and pathways.
After the downstream data has been cleaned up of license plates and faces, it gets run through a bunch of computer vision programming to pull out addresses, street signs and other points of interest. These are cross-referenced with publicly available data like addresses held by the city, and with new construction of neighborhoods or roadways that comes from city planning departments.

But one of the special-sauce bits that Apple is adding to the mix of mapping tools is a full-on point cloud that maps the world around the mapping van in 3D. This offers all kinds of opportunities to better understand what items are street signs (retro-reflective rectangular object about 15 feet off the ground? Probably a street sign) or stop signs or speed limit signs. It seems like it could also enable positioning of navigation arrows in 3D space for AR navigation, but Apple declined to comment on ‘any future plans’ for such things.

Apple also uses semantic segmentation and Deep Lambertian Networks to analyze the point cloud coupled with the image data captured by the car and from high-resolution satellites in sync. This allows 3D identification of objects, signs, lanes of traffic and buildings, and separation into categories that can be highlighted for easy discovery.

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher resolution and easier to see, visually. And it’s synchronized with the ‘panoramic’ images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to ‘see’ through brush or tree cover that would normally obscure roads, buildings and addresses.

This is hugely important when it comes to the next step in Apple’s battle for supremely accurate and useful maps: human editors. Apple has had a team of tool builders working specifically on a toolkit that can be used by human editors to vet and parse data, street by street. The editors’ suite includes tools that allow human editors to assign specific geometries to Flyover buildings (think Salesforce Tower’s unique ridged dome) so that they are instantly recognizable. It lets editors look at real images of street signs shot by the car right next to 3D reconstructions of the scene and computer vision detection of the same signs, instantly recognizing whether they are accurate or not.

Another tool corrects addresses, letting an editor quickly move an address to the center of a building, determine whether it is misplaced and shift it around. It also allows for access points to be set, making Apple Maps smarter about the ‘last 50 feet’ of your journey. You’ve made it to the building, but what street is the entrance actually on? And how do you get into the driveway? With a couple of clicks, an editor can make that permanently visible.

“When we take you to a business and that business exists, we think the precision of where we’re taking you to, from being in the right building,” says Cue. “When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address name is a certain street, but really, the entrance in the building is on another street. They’ve done that because they want the better street name.
Those are the kinds of things that our new Maps really is going to shine on. We’re going to make sure that we’re taking you to exactly the right place, not a place that might be really close by.”

Water, swimming pools (new to Maps entirely), sporting areas and vegetation are now more prominent and fleshed out, thanks to new computer vision and satellite imagery applications. So Apple had to build editing tools for those as well. Many hundreds of editors will be using these tools, in addition to the thousands of employees Apple already has working on maps, but the tools had to be built first, now that Apple is no longer relying on third parties to vet and correct issues. And the team also had to build computer vision and machine learning tools that allow it to determine whether there are issues to be found at all.

Anonymous probe data from iPhones, visualized, looks like thousands of dots, ebbing and flowing across a web of streets and walkways, like a luminescent web of color. At first, chaos. Then, patterns emerge. A street opens for business, and nearby vessels pump orange blood into the new artery. A flag is triggered, and an editor looks to see if a new road needs a name assigned. A new intersection is added to the web, and an editor is flagged to make sure that the left-turn lanes connect correctly across the overlapping layers of directional traffic. This has the added benefit of massively improved lane guidance in the new Apple Maps.

Apple is counting on this combination of human and AI flagging to allow editors to first craft base maps and then also maintain them, as the ever-changing biomass wreaks havoc on roadways, addresses and the occasional park.

Here there be Helvetica

Apple’s new Maps, like many other digital maps, display vastly differently depending on scale. If you’re zoomed out, you get less detail. If you zoom in, you get more. But Apple has a team of cartographers on staff who work on more cultural, regional and artistic levels to ensure that its maps are readable, recognizable and useful. These teams have goals that are at once concrete and a bit out there — in the best traditions of Apple pursuits that intersect the technical with the artistic.

The maps need to be usable, but they also need to fulfill cognitive goals on cultural levels that go beyond what any given user might know they need. For instance, in the US, it is very common to have maps that have a relatively low level of detail even at a medium zoom. In Japan, however, the maps are absolutely packed with details at the same zoom, because that increased information density is what is expected by users.

This is the department of details. They’ve reconstructed replicas of hundreds of actual road signs to make sure that the shield on your navigation screen matches the one you’re seeing on the highway road sign. When it comes to public transport, Apple licensed all of the typefaces that you see on your favorite subway systems, like Helvetica for NYC. And the line numbers are in the exact same order that you’re going to see them on the platform signs. It’s all about reducing the cognitive load it takes to translate the physical world you have to navigate through into the digital world represented by Maps.

Bottom line

The new version of Apple Maps will be in preview next week, with just the Bay Area of California going live. It will be stitched seamlessly into the ‘current’ version of Maps, but the difference in quality level should be immediately visible, based on what I’ve seen so far.
Expect better road networks; more pedestrian information; sports areas like baseball diamonds and basketball courts; more land cover, including grass and trees, represented on the map; and buildings with shapes and sizes that are more accurate. A map that feels more like the real world you’re actually traveling through.

Search is also being revamped to make sure that you get more relevant results (on the correct continents) than ever before. Navigation, especially pedestrian guidance, also gets a big boost. Parking areas and building details to get you the last few feet to your destination are included as well.

What you won’t see, for now, is a full visual redesign. “You’re not going to see huge design changes on the maps,” says Cue. “We don’t want to combine those two things at the same time because it would cause a lot of confusion.”

Apple Maps is getting the long-awaited attention it really deserves. By taking ownership of the project fully, Apple is committing itself to actually creating the map that users expected of it from the beginning. It’s been a lingering shadow on iPhones, especially, where alternatives like Google Maps have offered more robust feature sets that are easy to compare against the native app but impossible to access at the deep system level. The argument has been made ad nauseam, but it’s worth saying again: if Apple thinks that mapping is important enough to own, it should own it. And that’s what it’s trying to do now.

“We don’t think there’s anybody doing this level of work that we’re doing,” adds Cue. “We haven’t announced this. We haven’t told anybody about this. It’s one of those things that we’ve been able to keep pretty much a secret. Nobody really knows about it. We’re excited to get it out there. Over the next year, we’ll be rolling it out, section by section in the US.”
