Google’s self-driving cars are programmed to exceed speed limits by up to 10mph, according to a report on the BBC News website.
Dmitri Dolgov, the project’s lead software engineer, said that when surrounding vehicles were breaking the speed limit, going more slowly could actually present a danger, and the Google car would accelerate to keep up.
The UK Government will conduct a trial of driverless cars on public roads in three cities from January 2015.
Commenting on Google self-drive cars’ ability to exceed the speed limit, a DfT spokesman told BBC News: "There are no plans to change speed limits, which will still apply to driverless cars."
According to BBC News, in a separate development earlier this week the US Government announced that it wants all cars and light trucks to be equipped with technology that could “prevent collisions”. The technology comprises radio signals emitted by vehicles which would allow them to "talk" to each other, and alert drivers to potential accidents.
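The vehicle-to-vehicle alerting idea described above can be sketched in a few lines: each vehicle broadcasts a short message with its position and speed, and a receiving vehicle projects the time until the gap closes. This is an illustrative sketch only; the message fields, the 3-second threshold and the straight-road assumption are mine, not the US Government proposal's actual specification.

```python
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    """A simplified vehicle-to-vehicle broadcast: position along a straight
    road (metres) and speed (metres per second, positive = forwards)."""
    position_m: float
    speed_mps: float

def time_to_collision(lead: SafetyMessage, follower: SafetyMessage) -> float:
    """Seconds until the follower reaches the lead vehicle, or infinity
    if the gap is not closing."""
    gap = lead.position_m - follower.position_m
    closing_speed = follower.speed_mps - lead.speed_mps
    if closing_speed <= 0:
        return float("inf")  # follower is not catching up
    return gap / closing_speed

def should_alert(lead: SafetyMessage, follower: SafetyMessage,
                 threshold_s: float = 3.0) -> bool:
    """Warn the driver when the projected collision is under the threshold."""
    return time_to_collision(lead, follower) < threshold_s
```

For example, a follower 20m behind a slower vehicle and closing at 10 m/s is two seconds from impact and would trigger the alert; the same closing speed across a 60m gap would not.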
I don’t know what on earth Mr Dolgov was thinking of when he made this comment, but because of it I’ve gone from being open-minded on driverless cars to feeling severe trepidation, as they’ll only be as good as the humans who designed and programmed them. I think Idris may be right about it ending in tears. One thing’s for certain, as soon as the first accident occurs involving one, the press will be waiting to pounce!
Hugh Jones, Cheshire
I know it’s not very exciting to state this but please remember that exceeding the speed limit is actually illegal. If you need to break the law to achieve an overtake then you are not doing anything to improve road safety; you are probably just being impatient. If there is some chance of meeting an oncoming vehicle then the overtake shouldn’t be started in the first place.
Willie, Glasgow
I’m sure I have acknowledged in previous posts on this topic that the task of programming such vehicles is significant and it is up to these specialists to show that it can be done safely. And like the Moon landings, the act of trying to achieve this would probably yield huge benefits in technology advancement even if ultimately the idea itself has no future. As a cyclist I would welcome cars which couldn’t be made to squeeze past me but would wait patiently behind until a proper overtake could be performed, and which would give me due priority at junctions and roundabouts. Duncan is right, “driverless” is a misleading term; we are really talking about a form of remote control. One advantage I see in this is that decision-making becomes dispassionate, not corrupted by circumstances as humans may be. An end to road rage, perhaps?
Tim Philpot, Wolverhampton
How on earth do they react to cyclists? Would they overtake just because the cyclist is slower, or would they be able to read the road situation, i.e. see an oncoming vehicle or know that the cyclist is indicating a right turn?
Peter Cornwell, Norwich
It will all end in tears.
Idris Francis, Fight Back With Facts, Petersfield
Remember that there is no such thing as a ‘driverless’ car. The human driver at the sharp end has simply been replaced by a human programmer at the blunt end.
Duncan MacKillop, Stratford on Avon
Anyone who’s been tailgated while driving at the speed limit knows the dilemma … speed up, and accept any risk arising, or don’t, and accept a different risk. I suggest accurately calculating in real time which risk is the greater is often a task beyond brains or processors, not least because it would require knowledge of the idiosyncratic reactions of the random selection of other humans involved which in turn would vary depending on their condition at that precise moment. But is it worse for a computer to make a mistake than a human? And why is compensation for the misdeeds of others the primary focus? Eric’s example illustrates perfectly that a human makes flawed decisions and then has to compensate for them. In that situation perhaps a driverless car would a) not feel the need to hurry, b) not commit to overtaking if a successful outcome could not be anticipated, and c) if anything did happen while overtaking which was outside its range of expectation, have a better chance than a human of instantly calculating the most effective evasive action.
There is an obvious point beyond this. We are contemplating a situation where one driverless car is introduced into an otherwise human field. Now consider a situation with a high proportion of driverless cars all sharing protocols and intercommunicating. I can see numerous ways this would reduce the potential for conflict arising in the first place.
Tim Philpot, Wolverhampton
There are probably no more than a dozen people in the world who have done a comprehensive input and process analysis of the driving/riding task. On the other hand, most of the authors of the books and publications out there seem to concentrate more on process outputs and facts. Process outputs and facts are nice to know, but they can’t help us to solve any problems, as all they can do is show us where to start looking.
Duncan MacKillop, Stratford on Avon
Common law and common sense suggest you cannot design beyond the limit of the system, e.g. a Google car has a blow-out at 80mph and hits a safety barrier designed for a 70mph impact. An open and shut case of negligence?
Pete, Liverpool
I am sure we have had discussions here previously regarding safety versus speed limits. An example where exceeding the limit is judicious would be a committed overtake where an oncoming vehicle appears and the safest escape is to accelerate to whatever speed is necessary to complete the manoeuvre as quickly as possible. Of course there are issues there with misjudgement/surprise, but I would be surprised if anyone who has been driving for a few years has never found themselves in a similar situation.
Eric Bridgstock, Independent Road Safety Research, St Albans
Duncan
Apart from the bit about Mr Dolgov being “one of the few people in the world that has actually spent time studying how it is that we actually do drive a car”, I’m delighted to say that I’m in full agreement with your post!
Nick Rawlings, editor, Road Safety News
I suspect that Mr Dolgov is one of the very few people in the world who has actually spent time studying how it is that we drive a car. If he’s any good as a software engineer, which I somehow suspect he is, then he would have analysed every action, every reaction, every negotiation, every error, every ambiguity, every near-miss, every success and every failure that goes into making even the simplest of car journeys.
Rather than looking at what people should have done in particular situations, he would no doubt have looked at what they did do, considering the options open to them at the time. Once he had done his in-depth analysis of the system he would have formed ideas about the task and seen whether he could create a robust piece of software that handled the thousands of variables in these tasks as well as a human could handle them.
I would suggest that if Mr Dolgov thinks that his driverless cars should sometimes break the speed limit to keep everything reasonably safe then I for one would tend to believe him.
Duncan MacKillop, Stratford on Avon
Moving away from the speed issue for a moment, I would like to know how a driverless car will cope with a pedestrian waiting at a zebra crossing. Will it recognise this and stop? And how will it recognise a red, amber and green traffic light? I would imagine these issues have been thought about…have they?
Alan Kennedy, Durham
Based on Mr Dolgov’s strange logic and questionable understanding of driving, what other ‘law-bending’ has been programmed in I wonder? Accelerating through ‘red lights’? If I was a passenger in one of his cars, I think I would be more on edge than I normally am when I’m being driven by someone – which is saying something.
Hugh Jones, Cheshire
It would appear that the Google team has identified conditions and circumstances where they have concluded that it is safer to exceed the speed limit than not to. From a pure safety viewpoint, it is essential that safety is given priority over legal compliance. I would like to see their arguments and evidence.
Eric Bridgstock, Independent Road Safety Research, St Albans
Who pays a speeding fine in this case? Is it the passenger of the driverless car, or the programmer of the system that set it to do so?
Mark, London
“Follow the pack” is poor logic and the excuse of the inattentive driver. If automated and assisted cars obey the speed limit then only a small percentage will be needed to achieve general compliance. If they can exceed the limit, what prevents a speed “runaway”? Communication is useful but opens a potential route for the hacking of systems, which has already been done.
Mark, Caerphilly
So the “driverless car” exceeds the speed limit, but the speed limits “still apply to driverless cars”. So I take it the ticket for exceeding the speed limit will be prepared by an automated Police Officer operated remotely from a control room somewhere and delivered by Robocop!
John Scruby, Rotherham
One step forwards and two steps back, it would seem. Reading Mr Dolgov’s comment, you wonder whether those programming these vehicles themselves have the necessary driving experience and ability for the job of programming driverless cars. Shouldn’t we be reassured of their own driving credentials first before entrusting ourselves to their cars?
Hugh Jones, Cheshire