RISKS-LIST: RISKS-FORUM Digest  Saturday 15 April 1989  Volume 8 : Issue 57

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  H.D. Thoreau on Risks of Believing Computations (Jim Haynes)
  Airbus 320 (Brian Randell)
  1,000 Pilots Face Ban (Dermot Williams)
  RFI and elevators (Robert A. Morris)
  Electronic Truant Officers (Carolyn M. Kotlas, Michael R. Hoffman, Ed Robertson)
  Re: Computer CAN attempt to defraud you (Hugh Davies)
  Computer maliciousness (Peter da Silva)

----------------------------------------------------------------------

Date: Thu, 13 Apr 89 22:08:56 -0700
From: haynes@ucscc.UCSC.EDU (Jim Haynes)
Subject: H.D. Thoreau on Risks of Believing Computations (RISKS-8.56)

This reminded me of an anecdote in one of the early books about computers,
used to illustrate Babbage's insistence that the Difference Engine should
produce as output the plates for printing the results, lest errors be
introduced in copying or typesetting them.  (It might have been the book
"Faster, Faster" by W. J. Eckert.)

There was a brief period in which England and Spain were on good terms with
each other.  An English admiral invited a Spanish admiral aboard his flagship
for a visit, during which he presented the visitor with a beautifully bound
copy of the English navigational tables.  After the visit the Spanish fleet
sailed away and was never heard from again.  It seems the English never used
their own navigational tables, knowing them to be full of errors; they always
used the French tables.

    [Turn the tables on the fleet afoot?  PGN]

------------------------------

Date: Fri, 14 Apr 89 18:28:29 BST
From: Brian Randell
Subject: Airbus 320

I recently obtained - from Bev Littlewood - a copy of an article in
France-Soir of 18 February about computer-related problems of the Airbus 320.
In case it can, despite the passage of time, still add usefully to the
information that has appeared in the Anglo-Saxon (French for UK plus US!)
press, I am providing an almost complete translation, in which I have
endeavoured to retain the flavour and style of the original article.  (My
apologies for the amateur nature of the translation and for the inadequacy of
the dictionary that I had available to me at the time!)

Brian Randell, Computing Laboratory, University of Newcastle upon Tyne

AIRBUS 320: THE COMPUTER REFUSES TO PASS ON THE PILOT'S INSTRUCTIONS

One of the incidents which has caused Aerospatiale to send the machines back
to the "marbre" [drawing board].

Less than one year after it first went into service, the Airbus A320, the
most sophisticated civil airliner in existence, has to go back to the
"marbre" [drawing board].  A simple revision after thousands of hours in the
air?  Not just that!  Since going into service last April, the plane has been
"chouchouté" [pampered] by its builder, Airbus-Industrie, and the airlines.
All the improvements capable of being made to the latest Airbus are made
under the control of the DGAC (Direction Générale de l'Aviation Civile).
"The A320, like all new machines, is in its period of debugging
["déverminage"]," emphasized Daniel Tenebaum, the boss of the DGAC.  "It is
above all a question of removing faults which have appeared since it first
went into service."

STRAIGHT AT A MOUNTAIN

During critical phases - landing and take-off - the computer system shows
only the most dangerous alarms.  "This is a wretched problem", explained an
Air France captain.  "Certain failures are recorded by the computer, but the
pilots are informed of them only later."  Paul Baud, the flight trials
director of Airbus-Industrie, explains: "To ease the task of the pilot, only
the problems which relate directly to critical phases are communicated to
him: fire in the engines, the baggage hold, or the toilets."

But there are worse ones.  The computer system (nicknamed the "Little
Genius") sometimes escapes from the control of the crew.  "I am going to land
at Geneva in my A320, and it happens that the altitude indicators show
"hauteurs farfelues" [wildly wrong heights].  Luckily the airport urgently
advised me of this.  Otherwise we would have flown straight into the
mountain!"  Immediately, the captain demanded that the computer be replaced
as not completely reliable.

Worse still: the case of the pilot who saw with horror his computer
indicating "full fuel load" as he started his descent to the airport in West
Berlin.  And the famous "Little Genius" refused to let him take over manual
control.  "We very nearly had a catastrophe", said the pilot.  The cause, it
seems, is the electrical power supply.  "We have gone through all the
electrical resistances with a fine-tooth comb" ["Nous avons passé au crible
l'ensemble des résistances"], insists Paul Baud.  "There was nonetheless a
failure.  Henceforth, the on-board computers will be fitted with
higher-performance diodes."

A FAULTY COMPUTER

They will even improve the transformer-rectifiers.  These supply the A320's
automation systems by converting alternating current into direct current.
Nevertheless, Airbus-Industrie points out that all the flight-critical
mechanisms are duplicated.  A defective computer is thus immediately replaced
by its twin.

[Paragraphs about complaints regarding noise levels in the A320, and plans
regarding improved sound-proofing.]

------------------------------

Date: Fri, 14 Apr 89 19:01:33 GMT
From: Dermot Williams
Subject: 1,000 Pilots Face Ban

From Dublin's EVENING HERALD of Thursday 13th April, without permission:

  "1,000 Pilots Face Ban"

  The US Federal Aviation Administration said it planned to suspend or revoke
  the licences of more than 1,000 pilots who lied about past alcohol or drugs
  convictions.  The FAA said about 10 per cent of them were commercial
  airline pilots and the rest were private pilots.  The FAA said it got the
  names of more than 6,000 pilots through a computer match of medical
  applications, criminal records and state motor vehicle records.

Any of the pilots on the list care to comment?  Do you feel that this is a
fair or foul use of computer databases?

Dermot Williams, University College Dublin, Dept. of Computer Science

    [In an effort to make sure we stick to the computer risks, and not
    compete with the aviation BBoards on technical nuances, I suggest that
    some of the pending submissions to RISKS might better be redirected
    elsewhere.  This item is clearly a computer database problem, not an
    aviation problem.  PGN]

------------------------------

Date: Thu, 13 Apr 89 22:09:59 EDT
From: Robert Morris
Subject: RFI and elevators

Dave Horsfall writes in RISKS-8.54:

    I have a little hand-held (amateur) transceiver, generating just 3 watts
    on 147 MHz from a "rubber duck" antenna - very inefficient.  When I'm in
    the mood, I trigger it next to various bits of electronic equipment, just
    to test their RF susceptibility.  ...

This is distressing behavior from a licensed amateur radio operator.
In the US, this might subject one to revocation of the license and possibly
criminal penalties if the action caused damage or injury.  In the US, amateur
radio transmissions are restricted in purpose, and testing the RFI rejection
of commercial equipment is not one of those purposes.  Even if the
manufacturer were wholly negligent about RFI rejection, the amateur
"investigator" of that fact could reasonably be expected to understand the
consequences of probing this inadequate security.  For example, I rather
doubt that anyone would make such an investigation of, say, someone's
pacemaker.

In my opinion, amateur radio operators have approximately the same
responsibility as did the author of the Internet worm.  They have substantial
technical knowledge and good reason to believe that their action could cause
malfunction, and in this case, possible injury.

Robert A. Morris  KA1BWN

    [Robert Morris was a signer of the Declaration of Independence.  We have
    now had at least FOUR different namesakes contributing to or discussed in
    RISKS.  I hope no one is confused.  PGN]

------------------------------

Date: 14 Apr 89 12:40:53 GMT
From: kotlas@uncecs.edu (Carolyn M. Kotlas)
Subject: Electronic Truant Officers (Re: RISKS-8.56)

My daughter's high school (and several others in this area) has had such a
notification system in place for several years.  I don't know how much of a
part the school's computers play in this, but the notification takes the form
of a telephone call to the home, during which a generic recording is played,
something along the lines of "Your child was reported absent in one or more
of his/her classes today."

The source of problems (or "risks") in this system is human, not
computer-based.  Every time I received the recording, my daughter's absence
was excused, usually because of a field trip that had been approved by the
school; so if there is poor coordination between teachers and administration,
parents will receive false alarms.  (Which, like too many cries of "Wolf!",
may lessen a parent's belief in any real reports of absences.)  Also, since
the calls are usually generated at a predictable time on the evening of the
absence, truants could just take the call for the parent and report it as a
wrong number.  (I've also heard of people without children getting these
calls, either because of typos in student records or misdialing of the
number.)  So much for an infallible reporting system here.

--Carolyn Kotlas, UNC-ECS, Research Triangle Park, NC

------------------------------

Date: Fri, 14 Apr 89 13:13:43 EDT
From: h44394@leah.Albany.EDU (Michael R. Hoffman)
Subject: Re: Electronic Truant Officers

When I was in my Sophomore year at The Bronx High School of Science, they
implemented a computerized attendance scheme.  Every student was given a
number (welcome to the real world... forget your name, you are now ####)
which was used to trace the student through their years at school.  When
attendance was taken in classes and in Homeroom, the teachers would fill out
"bubble" sheets, which were fed through a scanner to the central computer,
located somewhere in Manhattan!

They quickly found many problems with the system.  Teachers filling in the
wrong bubbles, the computer crashing, students forgetting their numbers, and
students who would cut Homeroom (which meant "Absent for Day") yet were not
marked absent in certain classes all really screwed up the school
administration.  And, as with most other computer systems, there were ways
around it.
Supposedly being the "brightest students in the country" (YEAH, right!! :-}),
you can imagine the fun we had beating it.  In a word, computerized school
attendance systems are a JOKE!  And they don't help with convincing the
students that they are REAL people, not just cogs in the system.

------------------------------

Date: 14 Apr 89 23:00:24 GMT
From: Ed Robertson
Subject: Electronic Truant Officers (Re: RISKS-8.56)

One evening last week the phone rang, and I answered to hear a sepulchral
electronic voice announce that my son, who I knew was in school, had been
absent from all of his classes that week.  The best part of this system, from
the school's point of view, is that there's not even any chance to question
that electronic voice.

Edward Robertson, Computer Science Dept, Indiana U., Bloomington, IN 47405-4101

------------------------------

Date: 14 Apr 89
From: "hugh_davies.WGC1RX"@Xerox.COM
Subject: Re: Computer CAN attempt to defraud you

linden@Sun.COM (Peter van der Linden) asserts that a computer can defraud
you.  His story about the pie factory is seriously flawed.

1) The computer is just a tool.  You are being defrauded by the management of
the pie factory, in the same way that you are defrauded by the taxi driver
who short-changes you, rather than by his taximeter.

2) Weights and measures are controlled by legislation.  It may be immoral to
take advantage of loopholes in that legislation, but it is not dishonest.  If
the law is unsatisfactory, get it changed.

3) In the pie factory case, before automation, 50% of consumers of a 4oz. pie
were getting *more* than they had paid for.  I wonder how many of them wrote
to the factory to offer more money?

4) Computer weighing systems do *not* allow "an accuracy hitherto
unobtainable".  (I wrote potato chip weighing systems for two years for a
living.)  What they generally do is allow repeatability in weighing, i.e., a
narrowing of the distribution curve of the weights dispensed, which then
allows a slight reduction in the 'set-weight', at an enormous saving to the
producer, spread over millions of items, but with minimal impact on the
consumer of any single item.

5) Peter says "if the pie was a "4oz" pie, the bakers were permitted to range
from 3.5 to 4.5oz".  This sounds unlikely to me.  I am not familiar with
American weights and measures legislation, but the law is usually formulated
either so that *no* pie may weigh less than 4oz - which means that the
average pie must actually weigh 4oz plus at least twice the standard
deviation of pie weight (how much more depends on how assiduous you want to
be in avoiding prosecution!) - or so that there is some kind of limit on what
proportion of pies may weigh less than the marked weight.  If the control is
merely on the average weight, then given two pies, I'll have the 8oz pie and
you can have the empty carton!  In either case, it is in the manufacturer's
interest to reduce the standard deviation as much as possible, which is what
the computer allows.

In fact, the real problem is not weighing the pies, or whatever, but
accurately dispensing the filling.  In the EEC, all products are divided into
two categories, 'easy to pack' and 'difficult to pack', with the former
having tighter controls than the latter.  When I was weighing potato chips,
one of the things we did was make sure each and every packet had at least the
legal minimum content.  This goes part-way towards ensuring that every
consumer gets what he paid for.
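To make the set-weight arithmetic in points 4 and 5 concrete, here is a
minimal sketch in Python.  The standard deviations and the two-sigma safety
margin are hypothetical illustrations, not figures from any actual factory;
the point is only that a tighter weight distribution lets a producer lower
the target fill weight while still keeping nearly every item at or above the
marked weight.

    # Hypothetical figures, assuming normally distributed fill weights.
    # Assumed rule: aim for the marked weight plus two standard deviations,
    # so roughly 97.7% of items come out at or above the marked weight.

    MARKED_WEIGHT_OZ = 4.0

    def set_weight(marked_oz, std_dev_oz, safety_sigmas=2.0):
        """Target fill weight needed to keep most items above the marked weight."""
        return marked_oz + safety_sigmas * std_dev_oz

    manual_sigma = 0.25      # hypothetical spread of a manual/mechanical filler
    computer_sigma = 0.05    # hypothetical spread of a computer-controlled filler

    manual_target = set_weight(MARKED_WEIGHT_OZ, manual_sigma)      # 4.50 oz
    computer_target = set_weight(MARKED_WEIGHT_OZ, computer_sigma)  # 4.10 oz

    saving_per_item = manual_target - computer_target               # 0.40 oz
    print(f"Manual set-weight:   {manual_target:.2f} oz")
    print(f"Computer set-weight: {computer_target:.2f} oz")
    print(f"Saving over a million pies: {1e6 * saving_per_item / 16:,.0f} lb")

Under these made-up numbers the saving to the producer is enormous in
aggregate, while any single consumer still receives at least the marked
weight almost all of the time, which is exactly the trade-off described
above.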
Hugh Davies

------------------------------

Date: Fri, 14 Apr 89 13:38:50 -0400
From: ficc!peter@uunet.UU.NET
Subject: Computer maliciousness (Re: RISKS-8.56)

Having been roundly chastened for claiming that a computer cannot be
malicious, let me explain this point more fully.  A bank may have policies
that are malicious, and may embody these policies in a computer program.  I
would not deny that... the point I'm making, though, is that the computer
software can be assumed to embody the policies of the bank.  Subject to bugs
and design flaws, of course, but it's the bank's policies.

An agent of the bank, then, has a reason to stand by the computer: while the
software may have bugs, the agent can be reasonably certain it is not
intended to defraud the bank.  So long as the bank has reasonable policies,
the agent can also assume that there is nothing in the program intended to
deliberately defraud its customers.  The bank has no such certainty about the
customers themselves.

The problem comes when a customer has documentation to substantiate his or
her claim, or the bank knows there's a bug, and the bank still doesn't act.

Peter da Silva, Xenix Support, Ferranti International Controls Corporation.

------------------------------

End of RISKS-FORUM Digest 8.57
************************