Faulty hardware found on shuttle



 
 
#1 - March 28th 04, 07:20 AM
Kevin Willoughby

An earlier poster wrote:
The advantage of a limited memory for a computer is that it forces the
programmer to really sit down and work the problem on paper to simplify it
before starting to code.


Youch! In software engineering (heck, all forms of engineering), the
development cost is a strong, non-linear function of the number of
constraints on the system. A computer that is almost too small provides
a number of constraints that make it *much* harder to build a good
system.

One of the hardest lessons learned in the past half-century of
programming is that it is really, *really* hard to work the problem on
paper. So much so that the current flavor-of-the-month in programming
("extreme programming") flatly rejects that it is possible. The folks who
have learned to work the problem on paper (e.g., Dijkstra, who never
learned to use a word processor and eventually transcended the
typewriter, composing his later thoughts with pen and ink) are
exceedingly rare.


If they hire Microsoft weenies to code the next
shuttle, they'll just bloat the code up and not worry about limited
resources, and that is when you start to have problems, because you're not
careful with memory allocation, buffer sizes, etc.


On the other hand, if you have a bit of margin in cpu-speed, real-time
requirements, and memory, it is valid engineering to consider not
forcing the programmers to be careful with memory allocations. Let the
machine keep track of memory usage (keyword: "garbage collection").
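To make the idea concrete, here is a minimal sketch (Java, purely for illustration and nothing like actual flight code): the programmer allocates freely and never frees anything, and the runtime's collector reclaims whatever is no longer reachable.

    // Hypothetical illustration only: allocate millions of short-lived
    // buffers without ever freeing them; the garbage collector keeps the
    // heap bounded by reclaiming whatever is unreachable.
    public class GcSketch {
        public static void main(String[] args) {
            for (int i = 0; i < 10_000_000; i++) {
                byte[] scratch = new byte[1024]; // becomes garbage on the next iteration
                scratch[0] = (byte) i;           // touch it so it isn't optimized away
            }
            System.out.println("~10 GB allocated in total, and not a single explicit free");
        }
    }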

If you have significant margin, you often have the chance to consider
reusing some programs that already exist. Reuse can have a dramatic
effect on the cost, time, and reliability of a system.
--
Kevin Willoughby

Imagine that, a FROG ON-OFF switch, hardly the work
for test pilots. -- Mike Collins
#3 - March 30th 04, 02:18 AM
John Doe

Kevin Willoughby wrote:
forcing the programmers to be careful with memory allocations. Let the
machine keep track of memory usage (keyword: "garbage collection").


My immediate reaction to that is that it invites sloppy programming, where the
programmer HOPES the "system" (or, more precisely, the run-time library) will
fix the stuff the programmer forgot to do.

If you allocate some memory and the code that deallocates it doesn't always
get executed, it means that your logic is flawed and you haven't done the due
diligence to ensure your code reflects your intentions all the time.
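A hypothetical sketch of that failure mode (Java here just for illustration; in C the same flaw would be a malloc() without a matching free() on every path). The method names are invented for the example: the cleanup only runs on the happy path, so any early exit skips it, and the fix is to make the cleanup unconditional rather than hope the runtime covers for you.

    import java.io.FileInputStream;
    import java.io.IOException;

    public class CleanupSketch {
        // Flawed logic: if read() throws, close() is never reached --
        // the cleanup code "doesn't always get executed".
        static int firstByteLeaky(String path) throws IOException {
            FileInputStream in = new FileInputStream(path);
            int b = in.read();   // may throw; the close() below is then skipped
            in.close();
            return b;
        }

        // Due diligence: the cleanup runs on every path, success or failure.
        static int firstByteSafe(String path) throws IOException {
            FileInputStream in = new FileInputStream(path);
            try {
                return in.read();
            } finally {
                in.close();      // executes whether read() returned or threw
            }
        }

        public static void main(String[] args) throws IOException {
            System.out.println(firstByteSafe(args[0]));
        }
    }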
#4 - March 30th 04, 02:38 AM
John Doe

Marvin wrote:
The ideal is to have *just enough* computer to comfortably do the job,
without tempting excess growth. Unfortunately, with the tempo of computer
growth nowadays, striking this balance is a bit tough.


Or, worded a different way: justify the computer resources you really need. If
you have to do this justification in front of peers, it motivates you to remove
bloat that you know your peers will tell you is not necessary.
#5 - March 30th 04, 03:09 AM
Marvin

Kevin Willoughby wrote:

Youch! In software engineering (heck, all forms of engineering), the
development cost is a strong, non-linear function of the number of
constraints on the system. A computer that is almost too small
provides a number of constraints that make it *much* harder to build a
good system.


Quite correct, but a somewhat myopic view of a real-life problem.
Having a computer that is more than ample in abilities virtually forces the
developers (via management's imperative not to 'waste') to utilise it to
capacity, even when that capacity is not really needed. This causes code and
especially data bloat, which leads to an exponential increase in the cost of
quality control.

The ideal is to have *just enough* computer to comfortably do the job,
without tempting excess growth. Unfortunately, with the tempo of computer
growth nowadays, striking this balance is a bit tough.
#8 - March 31st 04, 01:19 AM
Derek Lyons

Kevin Willoughby wrote:

On the other hand, if you have a bit of margin in cpu-speed, real-time
requirements, and memory, it is valid engineering to consider not
forcing the programmers to be careful with memory allocations. Let the
machine keep track of memory usage (keyword: "garbage collection").


Of course, to use that margin, you have to ensure not only that the
garbage collector is called, but that it actually functions as
intended and is itself bug-free.
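For what it's worth, even asking for a collection is only a hint in most garbage-collected runtimes. A small Java sketch (illustrative only, not anyone's flight code) shows the verification problem: System.gc() is a request the JVM may ignore, and nothing at the language level guarantees when, or whether, memory actually comes back.

    public class GcHintSketch {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            // Generate some garbage.
            for (int i = 0; i < 100_000; i++) {
                byte[] junk = new byte[1024];
                junk[0] = 1;
            }
            long before = rt.freeMemory();
            System.gc();                 // a *request*, not a command
            long after = rt.freeMemory();
            System.out.printf("free before: %d bytes, free after: %d bytes%n",
                              before, after);
            // Nothing guarantees the two numbers differ -- which is exactly
            // the problem for code that must meet hard real-time deadlines.
        }
    }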

D.
--
Touch-twice life. Eat. Drink. Laugh.
#9 - March 31st 04, 04:25 PM
Ami Silberman


"Derek Lyons" wrote in message
...
Kevin Willoughby wrote:

On the other hand, if you have a bit of margin in cpu-speed, real-time
requirements, and memory, it is valid engineering to consider not
forcing the programmers to be careful with memory allocations. Let the
machine keep track of memory usage (keyword: "garbage collection").


Of course to use that margin, you have to ensure not only that the
garbage collector is called, but that it actually functions as
intended and is itself bug free.

Of course, you also need to trust your compiler's deallocation routine. I
remember that at one time Turbo Pascal essentially ignored deallocation.

Of course, you could then explicitly manage all your allocations in your own
code. (Which is much more efficient for things like OS code.)
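One common way to do that explicit management, sketched here in Java purely as an illustration (the class below is invented for the example; real flight or OS code would do this in its own language), is to allocate a fixed budget of buffers once at startup and then only acquire and release from that pool, so there is no run-time allocation or deallocation left to get wrong.

    import java.util.ArrayDeque;

    // Illustrative fixed-budget buffer pool: all memory is allocated up
    // front, so nothing depends on a collector or on a library's
    // deallocation routine behaving itself at run time.
    public class BufferPool {
        private final ArrayDeque<byte[]> free = new ArrayDeque<>();

        public BufferPool(int buffers, int bufferSize) {
            for (int i = 0; i < buffers; i++) {
                free.push(new byte[bufferSize]);  // one-time allocation at startup
            }
        }

        // Hand out a pre-allocated buffer; fail loudly if the budget is blown.
        public byte[] acquire() {
            byte[] buf = free.poll();
            if (buf == null) {
                throw new IllegalStateException("buffer budget exhausted");
            }
            return buf;
        }

        // Return a buffer to the pool; the caller must stop using it.
        public void release(byte[] buf) {
            free.push(buf);
        }

        public static void main(String[] args) {
            BufferPool pool = new BufferPool(4, 256);
            byte[] b = pool.acquire();
            b[0] = 42;
            pool.release(b);
            System.out.println("pool round-trip ok");
        }
    }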


 



