Who is going to drive?

Edward Loh – Editor in Chief

Motor Trend, January 2016

In between the execution of our annual Car of the Year and Truck of the Year programs, I spent a weekend at Rennsport V, the fifth installment of all things Porsche, up at Mazda Raceway Laguna Seca. Throughout the event, I bumped into legendary Porsche racers, guys like Jacky Ickx, Vic Elford, Derek Bell, Hans Stuck, and Jochen Mass, along with a few of the newer endurance racing studs: Mark Webber, Earl Bamber, and Patrick Long. Porsche even arranged for a ride with Brendon Hartley, factory driver of the second-place finisher at Le Mans, in the new turbocharged 911 Carrera S.
The next day, I took a side trip to nearby Mountain View, California, for the polar opposite of Rennsport: a press conference on Google’s self-driving car (SDC) program. As I had flown into the Bay Area, transit between venues involved hired cars of two different types.
The first was a typical black car service, driven by a talkative young man who liked to speed, tailgate, and brake late, all while frequently consulting the mobile phone on his lap.
My second ride was an UberX, a brand-new Toyota Corolla so dealer fresh it still had a paper license plate and a handwritten tag on the key noting car color and VIN. Such a shame the driver rear-ended another car just before arriving at my destination.
At the Googleplex, I glided around the parking lot in one of Google’s sensor-studded autonomous pods (page 18) before listening intently to project leader Chris Urmson describe the chief attribute and concern of Google’s SDC program: safety. Later, Google founder Sergey Brin dropped in unannounced to give us 30 minutes of insight into the future of cars, self-driving and otherwise. During the Q&A session, a journalist asked whether Brin was surprised by the number of accidents Google cars have been involved in. “What has surprised me is the frequency, actually, of the number of times we’ve been rear-ended. … Humans are just not paying attention,” Brin said, noting that the majority involved human drivers rear-ending the SDCs. “That’s not the end of the world, but that speaks to the challenge, with all the phones and the other distractions of our modern age, to drive. In those situations, the car is probably much better equipped to drive than the distracted human.”
I left the press conference early, racing to beat rush-hour traffic to San Francisco. While I boarded my flight, Jason Cammisa called from the Tesla Model X launch (page 142) a few miles away in Fremont. A Model S with a beta version of the new autopilot software (page 145) was waiting to drive me down to L.A. if I wanted. I declined; I was so exhausted, it would have been dangerous no matter who was driving.
After a blissfully uneventful Uber from LAX to HQ, I wandered into my dark office to find that Nate Martinez had left me the key to a 526-hp Mustang GT350. My short drive home was a riot – a fitting end to a surreal few days.
I frequently mulled over this sequence of events during the creation of this issue, which will, for the first time ever, reveal Motor Trend’s 2016 Car, Truck, SUV, and Person of the Year. Our choices will be controversial – they always are – but in the broader context of the future of transportation, they’re just footnotes.
Cars, trucks, and SUVs have never been more powerful, more efficient, or more complicated than they are today. They have never been safer, easier to drive, higher performing, or more connected. And yet the automotive industry and car culture as we know it are under siege from all sides.
We are quickly approaching an intersection in the transportation landscape. The question isn’t just which way are we going – but who is going to drive?

No driver? No problem

Edward Loh, Editor in Chief – Motor Trend, January 2016
Google hosted an update on its self-driving car (SDC) project in early October that included ride-alongs in two test mules. Motor Trend podcaster-in-chief Charlie Vogelheim and I ventured to Google headquarters in Mountain View, California, for the event, which started with rides and concluded with a presentation by the SDC team.
Our first ride was in a modified Lexus RX450h, one of three apparently identical SDCs Google made available for our junket. Google started with Toyota Priuses when it began experimenting with self-driving technologies, and the Lexus SUVs appear to host Google’s latest self-driving hardware and software alongside traditional driver controls, namely a logically redundant steering wheel and gas and brake pedals.
Our Lexus came with two co-drivers, one behind the wheel, the other in the front passenger seat monitoring a laptop, so Charlie and I rode in the second row. Aside from a short but wide-aspect-ratio monitor mounted high on the center console, the interior of the Lexus appeared stock, except for a large red button mounted next to the shifter – ostensibly to be hit in emergencies. Despite the light modifications, Google requested no interior photos of any of the vehicles we rode in.
The exterior was more heavily modified (and photographable), most notably with an ice-bucket-sized spinning array atop a roof-mounted rack. “The Lexus has a plethora of sensors hung around the perimeter of the vehicle,” quipped Charlie at first sight. “The vehicle is as much a test bed of sensors as a display of autonomous capabilities.” To sense its environment, Google’s self-driving cars rely on camera, radar, laser (LIDAR), and global positioning (GPS) systems mounted at various positions on the vehicle. Cameras are generally used for monitoring short- to medium-range distances and evaluating conditions such as changing traffic signals, objects in the immediate vicinity, and parking situations. Radar can be used in short-, medium-, and long-distance applications; because it uses radio waves, radar is largely unaffected by weather conditions. LIDAR, the spinning getup on the roof, reflects laser beams in a 360-degree field of view, and GPS provides precise location information.
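The division of labor just described (cameras for short-to-medium range, radar when weather degrades the others, LIDAR for the 360-degree long view) can be sketched as a toy selector. This is purely illustrative, written in Python of my choosing; the names are hypothetical and this is not Google’s actual software:

```python
# Toy illustration of the sensor roles described above.
# All names are hypothetical; this is NOT Google's real stack.

def preferred_sensor(range_m: float, bad_weather: bool) -> str:
    """Pick the sensor best suited to a target distance and the weather."""
    if bad_weather:
        return "radar"   # radio waves are largely unaffected by weather
    if range_m < 50:
        return "camera"  # short-to-medium range: signals, nearby objects
    return "lidar"       # the spinning 360-degree laser array on the roof

print(preferred_sensor(10, False))   # a nearby traffic signal
print(preferred_sensor(120, False))  # a distant obstacle
print(preferred_sensor(120, True))   # rain or fog
```

A real car fuses all of these continuously rather than picking one, but the sketch captures why the suite is redundant by design.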
So how does all this, uh, ride?
Well, despite how chaotic it may sound in the Motor Trend podcast Charlie and I attempted to record, I found our roughly 12-minute journey interesting, if a little underwhelming.
After gawking, then boarding and buckling ourselves in, we were off – slowly at first and then to a full stop, our longest delay of the day, as we waited to merge to the right out of the Google parking lot. It was a somewhat busy street, and the view to the left was obstructed by a curve in the road, so our Google Lexus made plenty sure that the road was clear before pulling out.
Once underway, it was business as usual. The drivers told us that the top speed of the self-driving RX450h is 35 mph and that the vehicles are programmed to obey all traffic laws and posted signs.
That seemed the case as we puttered around the suburban neighborhoods that surround the Googleplex HQ. Charlie and I did note that at one point the speedometer indicated that we were doing more than 25 mph in a residential area that shifted to a school zone.
“Good to know that it would only go through a school zone at 15 … err, 25 … err, 28 mph,” Charlie said. Our drivers had no comment.
As we cruised along neatly trimmed neighborhoods dotted with mid-century Eichler homes, there was time to evaluate that short and wide multicolor display mounted high on the center of the dash. We’re bombarded by screens in cars these days, but the slim yet spare screen in the Google Lexus is surprisingly refreshing. The road ahead glowed green against a dark field that occasionally rendered oncoming roads and objects in purplish-pink. Trees and cars and other obstacles we approached were rendered every few seconds in whitish squiggles. The pulsing manner in which the environment was drawn, apparently by the LIDAR unit spinning above our heads, was reminiscent of scenes from old World War II submarine movies, where the sweep of the radar reveals the position of nearby ships and terrain. The road ahead and a simple rendering of the surrounding area were just about all the information the screen conveyed during our trip: no speeds or traffic data and little in the way of street names or signage.

There is precious little to report otherwise. The ride and handling were smooth, quiet, unremarkable; turns out a Lexus at 35 mph is a Lexus at 35 mph no matter who is (or isn’t) driving. There were no abrupt stops or changes of direction, no jerky or unsmooth driving that would indicate anything other than a human chauffeur was behind the wheel. Charlie did have mixed feelings about the way the lane changes were initiated. “The [auditory] lane change announcements were both informative and intrusive,” he said. “Imagine doing the same as you were driving.”
The only other distractions from the self-driving car experience: a bit of fan noise and heat emanating from computer hardware hidden behind the rear seats – and the human co-drivers. They were both friendly enough and happy to answer our rapid-fire, basic questions, but their mere presence took a little of the autonomous magic away. While Charlie noted how odd it was to see the steering wheel move by itself as the vehicle negotiated corners and intersections, I found it equally strange to watch the co-driver hover his hands near the wheel, as if bracing for imminent disaster.
Is that really necessary? I thought to myself. Some of this blasé attitude is, no doubt, a product of my own familiarity with the current state of semi-autonomous driving systems. A week before this Google ride, at our annual Car of the Year testing, we marveled at the new Chevy Malibu’s adaptive cruise control and lane keep assist systems’ ability to hold an 80-mph cruising speed around a 6-mile oval while the driver took his hands off the wheel for the entire ride. Sure, asking a vehicle to maintain speed and lane position on a closed highway is less complicated and risky than asking a car to drive itself around a crowded Silicon Valley suburb, but Google should take comfort; the trust in its technology, and others’, is out there.
Or perhaps I’m speaking too soon and this is but false bravado. Is a true driverless car something to fear? We soon found out.
Sure, the Google SDC looks cute in a dopey sort of way, but that’s the design – a nonthreatening and approachable design dictated by safety. Make no mistake, what Google is trying to do by creating its own self-driving car is very different from what has made it a leader in the technology space. With such thoughts swirling, Charlie and I embraced the sinister goofiness, opened the door, and climbed into the small electric-powered two-seater.
“Very welcoming, plenty of room,” Charlie said as he shut the door. For this portion of the ride-along, the Google self-driving car team had journalists staged atop one of the Googleplex parking garages, queuing for two-minute rides along a predetermined course.
That’s by design, too. The Google team worked with a number of well-known auto industry suppliers to build the prototypes and early versions of the car, but plans for consumer sales have yet to be revealed. Instead, Google hinted that its first application might be in some sort of mass-reaching, shared-ride service. That would explain the paucity and simplicity of the controls; between the two seats were familiar switches for the windows, seat heaters, and reading lights, and a glowing “Go” button that toggles to “Pull Over” once depressed. Behind it lay a red-trimmed button under a plastic shield, presumably for emergency situations.
In front of both passengers was a large and deep bin for holding bags and gear. On the dash above the bin was a screen similar to the one found in the Lexus. Behind that was the car’s large windshield, which we later learned is made from flexible plastic to protect pedestrians from impact. On the A-pillars, two rectangular screens showed views from the front corners of the car – not sure if they will remain when it heads into production, as they seemed very prototype-ish. Also curious was a vertical and horizontal slit in the footwell area. It seemed the ideal spot for some sort of pedal, perhaps to pop out in an emergency? Who knows.
After belting in and handing over our podcast tools, we were ready to go. In the real world, you’d push the button and tell Google where you’d like to go, perhaps via your phone (the team is still working this out), but since this was a preprogrammed route, we just pushed the Go button and slowly rolled out.
The Google team set up a simple route complete with obstacles meant to simulate a few real-world situations. After negotiating a 180-degree U-turn into a straightaway, the vehicle slowed for a pedestrian (a Google employee) who abruptly crossed our path. We negotiated a couple more turns and yielded again as another Googler pulled out in front of us in a Ford Fusion. (It’s also white, also a hybrid – notice a theme here?) After slowing and safely passing, we yielded again for a bicyclist who approached from the side.
After rounding the last bend, we sidled up to some cones and slowed to a stop at a predetermined spot a few paces behind where the next riders were to be picked up.
“The ride was benign, sterile, very similar to an airport tram or Disney ride that calmly accelerates to a predetermined low speed,” Charlie said after the ride. “We’ve all been on the ride a hundred times, yet this time there wasn’t a rail. The evasive moves and stops were unremarkable and predictable. I would like to have experienced a panic stop.”
I wished for more, as well. Google’s self-driving technology is impressive; our short rides in its two cars clearly demonstrated that there is plenty more capability beyond what the team showed us. But modesty is understandable given what is at stake. Google is the highest-profile non-car company developing autonomous driving technology and is considered by many to be the smartest guys in the room. Any mishap in the early rollout to the press and public would represent a setback not just for Google but for the entire emerging category. Still, there are moments when you just can’t help but be impressed.
“The most compelling moment was when we got out and the empty car proceeded for the next rider,” Charlie said. “Something about the vehicle moving without an occupant drove home the autonomous character.”
Subsequently, in October 2016, Google hosted a new get-together, and the best way to start is to take a look at the video showing what it is all about.

Home Brew Computers and Games

As can be read on Wikipedia, homebrew computers are generally a reference to the era of games before they became what they are today, when the industry came of age. With the help of my son, Pedro de Souza Campos, an electrical and computer engineer and a hands-on person who made some bucks in college assembling personal computers, and who harbors an unfulfilled dream of becoming a game designer, let’s delve a little into the subject, adding to the Wikipedia article above. The general perspective can be seen as follows:

Graphics and Games

Intel processor history

70s and mid-80s

Intel was really destined for the personal computer, which became real as a product with the advent of the 8086/8088 processors, the 8088 being selected for the IBM PC, which exercised an organizing influence on the market, or the environment. Prior to the IBM PC, this market was created by machines built around the Zilog Z80 and the MOS 6502. As a matter of fact, this market, or environment, was arguably invented by the Timex Sinclair 1000.

At this point in time the line between games and home computers was blurred, because there was a perception that one of the uses of home computers would be gaming. But before the existence of what today in Windows is the Office bundle, you had to perform all those tasks somehow. And that is what Tandy did with the TRS-80, sold through its Radio Shack stores. The first TRS-80 was not well resolved (hence its nickname, “Trash-80”), but the Model II was a huge success, selling in the millions. Its graphics package, when memory was expanded to 64 KB, was phenomenal and became a hit with its 16 colors, making it clearly a better option than other Zilog Z80 machines.
At this time, the games market was dominated by Atari with its 8-bit VCS 2600 console from 1977, and by some minor players such as ColecoVision and Intellivision. Actually, video games would end up pushing technology ahead in other areas.

Video Games

Video games timeline

The existence of the Zilog Z80 and MOS 6502 gave birth to intelligent video game consoles; the Golden Age of arcade video games came to an end, and the second generation of video game consoles shaped a market that would sell in the hundreds of thousands and eventually millions. A comprehensive list and details about them can be seen in the articles above.
The best-selling console of the second generation is by far the Atari 2600, at 30 million units. As of 1990, the Intellivision had sold 3 million units, around 1 million more than Odyssey² sales, and the ColecoVision’s total sales stood at 2 million units by April 1984 – eight times the Fairchild Channel F’s first-year figure of 250,000 units.
After this excellent start the video game market “crashed,” and the basic reason was that home computers started to come of age. In this period home computers, such as the TK 85 and Commodore VIC-20, were BASIC-language oriented, but with the advent of the Intel 8087 coprocessor, home computers became more powerful and more able to deal with graphics and other needs of video games – not to mention that clock speed was no longer the measure of computer power. In its place, the industry started to measure performance in FLOPS, according to IEEE 754-1985, a standard the Intel 8087’s design helped shape. Many revolutionary processors also appeared, giving birth to machines such as the Apple II, Commodore PET, TRS-80 Model I, and IBM XT, which handled color graphics extremely well, along with the numerical problems so necessary to video games.
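Since IEEE 754 comes up above, here is a small sketch (in Python, my addition, not part of the original text) that exposes the sign, exponent, and fraction fields the standard defines for a 64-bit float:

```python
import struct

def float_fields(x: float) -> str:
    """Split a 64-bit IEEE 754 double into sign | exponent | fraction bits."""
    # Reinterpret the double's 8 bytes as an unsigned 64-bit integer.
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    s = f"{bits:064b}"
    return f"{s[0]} | {s[1:12]} | {s[12:]}"

# 1.0 is stored as sign 0, biased exponent 1023 (01111111111), fraction 0.
print(float_fields(1.0))
print(float_fields(-2.5))
```

The biased exponent and implicit leading 1 of the fraction are exactly the conventions the 8087 generation of coprocessors implemented in hardware.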
This was perhaps the golden era of opportunities, because the home computer was starting to come of age, and perhaps the greatest opportunity any company ever lost was IBM’s, when it decided to insist on its MS-DOS instead of going with the then-incipient Windows, which, according to urban tales, was offered to IBM by Bill Gates, whose company created it.
In this period the Nintendo Entertainment System (commonly abbreviated as NES) was also launched – an 8-bit home video game console developed and manufactured by Nintendo.
The computer processor was becoming the microprocessor, and a whole bunch of them became available.
Microsoft also marketed its MSX standard in Japan, which never reached the USA and was superseded in Japan by Nintendo’s Family Computer. As already mentioned, these new processors and chipsets brought amazing machines such as the Commodore 64, Apple II, Tandy TRS-80, and IBM PC XT.
At the same time, home entertainment was coming of age. CDs, LaserDiscs, and later DVDs were particularly useful for video games.

Mid-80s to 90s

When the technology per se didn’t allow it, developers turned, with the help of CDs, to full-motion video (FMV), a video game narration technique that relies upon pre-recorded video files (rather than sprites, vectors, or 3D models) to display action in the game. While many games featured FMV as a way to present information during cutscenes, games that were primarily presented through FMV were referred to as full-motion video games or interactive movies.
The introduction of the Motorola MC68000 processor was a leap ahead, and computers with high graphics-processing capacity such as the Commodore Amiga, Atari ST, and Apple IIGS came into existence. 16-bit processing started to become available, bringing down to home use a much more sophisticated mathematical capacity and what was then called direct memory access. The video game industry pushed the technology to the limit with the Sega Genesis in 1988, with its MC68000 16-bit processor nominally clocked at 8 MHz. Motorola’s version was called the MC68HC000, while Hitachi’s was the HD68HC000; the 68HC000 was eventually offered at speeds of 8–20 MHz. Nintendo followed with its Super Famicom and SNES, introduced in Japan with a 16-bit WDC-derived CPU clocked under 4 MHz. Despite inferior speed, its graphics unit was revolutionary, offering for the first time 3D processing, zoom, and rotation – something never seen before. The computer market was reaching its limits with the Amiga, the Atari ST, and the Apple Macintosh carrying ticket prices over 3,000 dollars then, about what a Camaro would cost. That’s when the personal computer as such came of age, with IBM leading the segment. That’s when larger-capacity disk drives (up to 1.44 MB) became available, and CD-ROM technology became popular, allowing larger and more complicated applications to be offered (around 700 MB).

90s to 2000s

Improving over the Intel 80286 and its flagship product, the IBM PC AT, Compaq was the first to use the Intel 80386, in 1986, and it marked the first CPU change to the PC platform that was not initiated by IBM. It was reverse engineering, but done legally. An IBM 386 machine eventually reached the market seven months later, but by that time Compaq was the 386 supplier of choice and IBM had lost its image of technical leadership. This paradigm break was followed on May 22, 1990, by Microsoft, which announced its Windows 3.0. These computers evolved to use the Intel 80287 and 80387 coprocessors, which added floating-point power together with fully linear access to memory, accelerating mathematical processing. Graphics evolved from palettes to 16-color EGA and 256-color VGA. Storage improved with the use of 10 and 20 MB hard drives. This led to mass sales of notebooks thanks to the improved portability. The progress continued with the introduction of the 80486 and Motorola’s equivalent for Apple, the Motorola 68040. It was a time of enormous progress, perhaps the biggest for such a span of time, and the family of processors based on the 80486 expanded and upgraded to the following:

  • i486SL-NM: i486SL based on i486SX
  • IntelRapidCAD: a specially packaged Intel 486DX and a dummy floating point unit (FPU) designed as pin-compatible replacements for an Intel 80386 processor and 80387 FPU.
  • i487SX (P23N): i486DX with one extra pin, sold as an FPU upgrade to i486SX systems; when the i487SX was installed, it ensured an i486SX was present on the motherboard but disabled it, taking over all of its functions.
  • i486 OverDrive (P23T/P24T): i486SX, i486SX2, i486DX2 or i486DX4.

Because of a court ruling, Intel dropped the “80” from the “80487”: the chip was called the i487SX (P23N) and was marketed as a floating-point coprocessor for Intel i486SX machines. The next family would have a Latin-derived name, Pentium, which inaugurated the new naming system Intel adopted. A 50 MHz 80486 executes around 40 million instructions per second on average and is able to reach 50 MIPS peak performance. The Pentium machines would reach 300 MHz in the near future. Intel also introduced the Pentium MMX, designed to run faster when playing multimedia applications. According to Intel, a PC with this processor runs a multimedia application up to 60% faster than one with the same clock speed but without MMX.
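The MIPS figures above follow from a simple relation between clock rate and average cycles per instruction. A quick back-of-the-envelope check (Python, my addition; the 1.25 cycles-per-instruction average is an illustrative assumption, not a published spec):

```python
def mips(clock_mhz: float, cycles_per_instruction: float) -> float:
    """MIPS = millions of clock cycles per second / avg cycles per instruction."""
    return clock_mhz / cycles_per_instruction

# A 50 MHz 80486 averaging ~1.25 cycles per instruction lands on the
# ~40 MIPS quoted above; at 1 cycle per instruction it peaks at 50 MIPS.
print(mips(50, 1.25))  # 40.0
print(mips(50, 1.0))   # 50.0
```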
1994 saw the appearance of a new player in the computer games industry: Sony, which had broken up a partnership with Nintendo. Sony introduced its PlayStation, built around an architectural breakthrough, RISC, powered by its 32-bit R3000A processor running at about 33 MHz. For the first time, processing polygons in three dimensions became possible with only 2 MB of memory.
In 1994–1996 the Internet changed from a scientific and governmental research network to a commercial and consumer marketplace, and it exploded. IBM ruled the market and Apple almost went bankrupt, reinventing itself with the new iMac, very strong in graphics applications. Microsoft introduced Windows 98, and Apple invested in the verticalization of its operating system across all of its platforms. The MP3 music compression format was born, together with the DVD, with its high-compression video storage technology.

2000s to 2010s

64-bit architecture sets in and video games expand to 64 bits. Intel takes over with its Pentium and Celeron chips with speeds up to 1 GHz. Storage goes up into the gigabyte sizes, with dual-layer DVDs reaching their peak at 8.5 GB. RAM expands to 1 and 2 GB. Notebooks become even smaller, and their batteries last more hours than ever. IDE comes to an end and USB takes over.

Apple releases its iPhone with its Cortex-based processors; later, its mobile devices use its own A-series chips.
Mechanical storage devices start disappearing in favor of solid state, with pen drives displacing HDDs.
New video game platforms come up, such as the Sega Dreamcast (Hitachi SH-4 architecture), the PlayStation 2 (still RISC, its R5900 core now called the Emotion Engine, at about 300 MHz), and the Nintendo 64 (NEC VR4300 at roughly 94 MHz).
The Sega Dreamcast was marketed as a 128-bit machine, though there were doubts whether it really was. Take a look at this video about it.
Microsoft launches its Xbox for games, with a 32-bit Intel processor at a 733 MHz clock – essentially a PC. Performance is now measured under floating-point criteria. Blu-ray discs come into existence, alongside HD DVD, with two layers managing 50 GB and 1080p Full HD images.

Cathode-ray-tube TVs come to an end, replaced by large-screen television technology. Internet speeds go up into the megabit class, allowing large-scale use of new social network programs such as Facebook, WhatsApp, and many others. Not to mention that pre-recorded media such as movies go up to the cloud in 1080p Full HD quality. The Personal Computer Division of IBM is bought by the Chinese company Lenovo, and the IBM logo disappears from the ThinkPad. IBM follows up by announcing that it is now a service-dedicated company and no longer a computer manufacturing organization. Arcades come to an end, as the technology used in PCs and dedicated platforms manages to deliver similar quality. Microsoft introduces its most popular Windows ever, XP. Wi-Fi and Bluetooth take over the streaming scene.

2010s on to 2020s

Decade of Mobility – giants announce the end or decline of their manufacturing operations. In 1965, manufacturing accounted for 53 percent of the economy; by 1988 it accounted for only 39 percent, and in 2004 just 9 percent.
It is no different in the computer business, including cell phones, now that they have become miniaturized computers capable of doing anything the big computers of a few decades ago used to do.
Lenovo buys Motorola – take a look why. Nokia is bought by Microsoft – take a look why. Microsoft tries unsuccessfully to launch the Windows platform on the cell phone scene. Apple launches the iPad and the touch screen. Samsung and Apple dominate 90% of the cell phone market. Digital technology mobility becomes available with the advent of 3G. Microsoft goes touch screen with its Surface technology.
The CD and DVD rental business model is terminated.
Blockbuster goes out of business. Norway announces that it will shut down its FM network.
Cloud-based computing takes over. Radio Shack goes bankrupt. Steve Jobs dies. Apple becomes the most valuable brand on the planet.
The video game market is left to three companies: Sony with its PlayStation 4, Microsoft with its Xbox One, and Nintendo, “kind of lost” between its Wii U sales fiasco and its new Switch concept. 4K technology comes to TV, with roughly 4,000 horizontal pixels of resolution.
Electric cars are not only succeeding, as Tesla shows, but increasingly look like the only way forward, with Volvo announcing that its internal combustion engines will be completely replaced by electrified ones by 2020. Cars become more “intelligent” with onboard technology. This is where we stand now, and a perfect way to peek into the future is the January 2016 issue of Motor Trend, when they announced their coveted Car, Truck, SUV, and Person of the Year, from which I separated the report on the Google self-driving car (SDC).

Who is going to drive?