
More Than Coding Errors Behind Bad Software

ScuttleMonkey posted more than 5 years ago | from the bad-decisions-go-all-the-way-up dept.

Programming 726

An anonymous reader writes "SANS' just-released list of the Top 15 most dangerous programming errors obscures the real problem with software development today, argues InfoWeek's Alex Wolfe. In More Than Coding Mistakes At Fault In Bad Software, he lays the blame on PC developers (read: Microsoft) who kicked the time-honored waterfall model to the curb and replaced it not with object-oriented or agile development but with a 'modus operandi of cramming in as many features as possible, and then fixing problems in beta.' He argues that youthful programmers don't know about error-catching and lack a sense of history, suggesting they read Fred Brooks' 'The Mythical Man-Month,' and Gerald Weinberg's 'The Psychology of Computer Programming.'"

Perfection Has a Price (5, Insightful)

alain94040 (785132) | more than 5 years ago | (#26421159)

The most common errors: SQL injection, command injection, cleartext transmission of Sensitive Information, etc.
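
To make the first of those concrete, here is a minimal sketch in Python (assuming a DB-API style connection; the users table and its columns are made up purely for illustration) of the classic injectable query next to its parameterized fix:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")

    def login_unsafe(name, password):
        # Vulnerable: user input is pasted straight into the SQL text, so
        # an input like "' OR '1'='1" rewrites the query's logic.
        query = ("SELECT * FROM users WHERE name = '%s' AND password = '%s'"
                 % (name, password))
        return conn.execute(query).fetchall()

    def login_safe(name, password):
        # Parameterized: the driver treats the values strictly as data.
        query = "SELECT * FROM users WHERE name = ? AND password = ?"
        return conn.execute(query, (name, password)).fetchall()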

People make mistakes. Software needs to ship, preferably yesterday.

How much would it cost to have perfect software? I happen to have worked in an industry that requires perfect coding. So I can imagine what it would look like if Microsoft tried it.

The debugger would cost half a million dollars per seat (gdb is free). There would be an entire industry dedicated to analyzing your source code and doing all kinds of proofs, coverage, what-if analysis, and other stuff that requires Ph.D.s to understand the results.

The industry I'm referring to is the chip industry. Hardware designers code pretty much like software developers (except the languages they use are massively parallel, but apart from that, they use the same basic constructs). Hardware companies can't afford a single mistake because once the chip goes to fab, that's it. No patches like software, no version 1.0.1.

It's just not practical. Let the NSA order special versions of Office that cost 10 times the price and ship three years after the consumer version.

But for me, "good enough" is indeed good enough.

--
FairSoftware.net [fairsoftware.net] -- work where geeks are their own boss

Re:Perfection Has a Price (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26421363)

You forgot hot beef injection

Re:Perfection Has a Price (5, Insightful)

Opportunist (166417) | more than 5 years ago | (#26421443)

The problem is that software doesn't even ship as "good enough" anymore. It's more like "it compiles, ship it".

Your example of hardware, and how it's impossible to patch, used to be true to a point for software as well. Before it became easy to distribute software patches via the internet, companies actually invested a lot more time in testing. Why? Because yes, you could technically patch software, but doing so sometimes carried horrendous costs.

You can actually see a similar trend with the parts of hardware (i.e. BIOSes) that are patchable. Have you ever seen hardware shipped with all BIOS options fully enabled and working? I haven't in the past 2-3 years. More often than not you get a "new" board or controller with the predecessor's BIOS flashed in, and the promise of an update "really soon now".

The easier it is to patch something, the sloppier the original implementation is. You'd see exactly the same with hardware if it wasn't so terribly hard (read: impossible) to rewire that damn printed circuit. I dread the day they find some way to actually do it. Then hardware will be like some OSes are today: it's not done until it reads "SP1" on the cover.

Re:Perfection Has a Price (0)

Anonymous Coward | more than 5 years ago | (#26421633)

It's already possible to "patch" hardware: everything from updating microcode to work around hardware faults (done in processors all the time) to full-blown programmable hardware (FPGAs).

Re:Perfection Has a Price (4, Interesting)

Anonymous Coward | more than 5 years ago | (#26422005)

The same is true in a way of software development.

Back when I was in high school, I could write a program (on punch cards) and put them in the tray to be submitted. Every week the intra-school courier came around and picked up the tray, returning the tray and output from the previous week. When every typo adds 2 weeks to your development time, you check your code *very* carefully, and examine every error message or warning that comes back from the compiler, to try to fix as many errors as possible in each submission.

With interactive compilers/interpreters, it is not worth spending that much time verifying the mechanical coding. Just fix the first obvious problem and re-submit, because it is faster to let the compiler (1) confirm the parts you managed to type correctly, and (2) drop all of the messages that were merely cascades from the mistake you just fixed, than it is to waste your time scanning for typos or reading the subsequent error messages in case some of them are not cascades.

Re:Perfection Has a Price (4, Insightful)

Timothy Brownawell (627747) | more than 5 years ago | (#26421483)

I happen to have worked in an industry that requires perfect coding. [...] The industry I'm referring to is the chip industry. [...] Hardware companies can't afford a single mistake because once the chip goes to fab, that's it. No patches like software, no version 1.0.1.

What does "stepping: 9" in /proc/cpuinfo on my home computer mean? What is a f00f, and what happened with the TLB on the early Phenom processors?

Re:Perfection Has a Price (5, Informative)

networkBoy (774728) | more than 5 years ago | (#26421941)

They cost a shit ton of money, is what happened.

A project I was on in 2000ish went as follows:
Steppings A0, A1, A2, and A3 were halted in-fab because someone found a critical bug in simulations.
A4-A7 did not work.
B0-B4 did not work. B6 did not work.
C0-C4 did not work.
B5, B7, and C5 sorta worked.
The company folded.
That's what a software mentality working on hardware will get you.

Steppings in CPUs are a little different. Often an earlier stepping was functional enough to start the design cycle for Dell, HP, et al., but not ideal. The later steppings start by fixing the deficiencies; beyond that, they are likely cost cutting.
-nB

Re:Perfection Has a Price (1)

GraffitiKnight (724507) | more than 5 years ago | (#26422077)

Or what about firmware updates? I just updated my blu-ray player's firmware for the 5th time, while I've never updated any DVD player I owned.

Re:Perfection Has a Price (3, Insightful)

Jason1729 (561790) | more than 5 years ago | (#26421535)

People make mistakes. Software needs to ship, preferably yesterday.

This attitude is the number one problem with software development. When all you care about is getting it out the door, you send garbage out the door.

Software today is so damn buggy. I spend hours a week just doing the work of getting my computer to work. And even then it has random slowdowns and crashes.

I'm old enough to remember when it wasn't like that. You'd run your program and it was ready in a second; you'd exit and it left no trace. Crashes were virtually unheard of. We have people where I work who only do data entry, and they still use WordPerfect 4.2 on 386 hardware. I've seen their workflow and how fast it is for them, and I can see that if they "modernized," it would cripple their productivity.

And for the money at stake, what's so wrong with hiring a few Ph.D.s to analyze code? Amortized over a few million copies, a few six-figure salaries aren't so bad. And the losses software shops suffer in bad will when their products fail cost them more.

Re:Perfection Has a Price (5, Interesting)

bb5ch39t (786551) | more than 5 years ago | (#26421753)

I'm an old timer, still working mainly on IBM zSeries mainframes. We just got a new system, which runs on a combination of Linux and Windows servers, to replace an application that used to run on the mainframe. Nobody likes it. We are apparently a beta test site (though we were told it was production ready). It has a web administration interface. For some reason, for certain users, the only PC that can update them is my Linux PC running Firefox. Nobody can say why. Until early last week, it would take a person a full 5 minutes to log in to the product. They made some change to the software and now it takes about 10 seconds. This is production quality? No, I won't say which product. Sorry. Against policy.

Re:Perfection Has a Price (1)

internerdj (1319281) | more than 5 years ago | (#26421767)

And the losses software shops suffer in bad will when their products fail cost them more.
If everyone is releasing incomplete software, there is no cost for bad will. Worse than that, if there is no cost for bad will, management escalates the problem by compressing the next project's timetable even further.

Re:Perfection Has a Price (4, Insightful)

Schuthrax (682718) | more than 5 years ago | (#26421903)

You'd run your program and it was ready in a second, you'd exit and it left no trace. Crashes were virtually unheard of.

And all without managed memory, automatic garbage collection, etc., imagine that! Seriously, I see so many devs (and M$, who has a product to sell) insisting that all that junk is what will save us. What they're doing is attempting to create a Fisher Price dev environment where you don't have to think anymore because they've done it all for you. What's going to happen to this world when GenC# programmers replace the old guard and they don't have the least clue about what is going on inside the computer that makes the magic happen?

Re:Perfection Has a Price (1)

KovaaK (1347019) | more than 5 years ago | (#26422123)

What's going to happen to this world when GenC# programmers replace the old guard and they don't have the least clue about what is going on inside the computer that makes the magic happen?

It depends on where the programmer is educated. At my school, plenty of CS majors were in the classes where I learned (in some depth) about assembly, compilers, and general computer architecture... Note: my degree is in Computer Engineering, not Computer Science. I don't know if the CS majors were required to take the classes I saw a handful of them in, but they very likely were for most of the important ones.

Re:Perfection Has a Price (1, Insightful)

erroneus (253617) | more than 5 years ago | (#26422025)

I have to concur with the other "old timers." I am 40 years old and have been in this world since I was around 10 or so. It has been a rather long time since I did any serious programming, but I find myself hacking and tweaking from time to time, and I recall vividly the type of thinking I had to engage in to write software that worked: VALIDATE INPUT. VALIDATE INPUT. VALIDATE INPUT. There is little more to writing good code than that. Actually, there is plenty more, but where security is concerned, that should be task #1. The move to object-oriented code should not have changed this practice. In theory, validating input should now be handled by the object, but that isn't always the case, and good programmers should know better than to trust "black boxes" to do what they are supposed to do. So the other side is "VALIDATE OUTPUT" as well.
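
A minimal sketch of that fundamental, in Python (the account-id format, the backend object, and its get_balance method are all made up purely for illustration):

    import re

    # Hypothetical format: two uppercase letters followed by six digits.
    ACCOUNT_ID = re.compile(r"^[A-Z]{2}\d{6}$")

    def lookup_balance(raw_account_id, backend):
        # VALIDATE INPUT: reject anything that doesn't match the expected
        # shape before it goes anywhere near the backend.
        if not ACCOUNT_ID.match(raw_account_id):
            raise ValueError("malformed account id")

        balance = backend.get_balance(raw_account_id)

        # VALIDATE OUTPUT: don't blindly trust the black box, either.
        if not isinstance(balance, (int, float)):
            raise RuntimeError("backend returned a nonsensical balance")
        return balance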

I find this remarkable disregard for fundamentals a bit unsettling... it is as unsettling as doctors who prescribe drugs without first doing a diagnosis.

Re:Perfection Has a Price (1)

clone53421 (1310749) | more than 5 years ago | (#26422135)

I am 40 years old and have been in this world since I was around 10 or so. It has been a rather long time since I did any serious programming, but I find myself hacking and tweaking from time to time and I recall vividly the type of thinking I had to engage in to write software that worked. VALIDATE INPUT. VALIDATE INPUT. VALIDATE INPUT.

Good memories. I remember one of the first BASIC programs I wrote; it asked you to enter a number. One of the first things I did to the program was "break" it by entering a non-number, which caused INPUT to crash. Subsequently I looked up the way to treat the input as a string, convert it to a number, and have the program complain without crashing if it couldn't make a number from what the user entered.
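
The modern version of that fix is still worth spelling out; a minimal Python sketch (the prompt text is made up):

    def ask_for_number(prompt="Enter a number: "):
        # Read the input as a string, try to convert it, and complain
        # politely instead of crashing when the user types something
        # that isn't a number.
        while True:
            text = input(prompt)
            try:
                return float(text)
            except ValueError:
                print("'%s' is not a number, please try again." % text)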

Re:Perfection Has a Price (1)

sjames (1099) | more than 5 years ago | (#26421759)

And even with all the extra costs and care, errors manage to slip in all the time (errata). Usually there is a workaround, but occasionally it results in a costly recall/replacement. There are a lot fewer of them than in software, but they certainly happen, much more so in the area of device firmware and microcode.

Computer hardware and software both are at the edges of what we know how to do and it shows.

Re:Perfection Has a Price (4, Informative)

Enter the Shoggoth (1362079) | more than 5 years ago | (#26421973)

The most common errors: SQL injection, command injection, cleartext transmission of Sensitive Information, etc.

People make mistakes. Software needs to ship, preferably yesterday.

How much would it cost to have perfect software? I happen to have worked in an industry that requires perfect coding. So I can imagine what it would look like if Microsoft tried it.

The debugger would cost half a million dollars per seat (gdb is free). There would be an entire industry dedicated to analyzing your source code and doing all kinds of proofs, coverage, what-if analysis, and other stuff that requires Ph.D.s to understand the results.

The industry I'm referring to is the chip industry. Hardware designers code pretty much like software developers (except the languages they use are massively parallel, but apart from that, they use the same basic constructs). Hardware companies can't afford a single mistake because once the chip goes to fab, that's it. No patches like software, no version 1.0.1.

It's just not practical. Let the NSA order special versions of Office that cost 10 times the price and ship three years after the consumer version.

But for me, "good enough" is indeed good enough.

-- FairSoftware.net [fairsoftware.net] -- work where geeks are their own boss

I worked within the same space about 10 years ago -- I was a sysadmin for a group of ASIC design jockeys as well as the firmware and device driver guys, and I'm gonna call you on this...

The hardware designers were under the same sorts of pressures as the software guys, if not more, and I saw many bugs that ended up in the shipping silicon. The general attitude was always: "Oh, a bug? Well, the software guys will just have to work around it."

And as for "no patching", well that's also BS, you can patch silicon, it's just rather messy having to have the factory do it post-fab by cutting traces on die.

So much for perfection!

Re:Perfection Has a Price (1)

mcgrew (92797) | more than 5 years ago | (#26422079)

IMO the most common error is in design. They don't design for use, they design for "cool," trying to get us to say "wow" rather than simply creating easy-to-use tools. They cram everything but the kitchen sink in there, and it makes for lousy tools. Software designers should ask themselves why you need a screwdriver AND a hammer rather than a screwhammer.

Mechanical tools have become easier to use; we have backhoes where once men with spades dug. Manual tools gave way to power tools, which gave way to cordless electric tools. Unfortunately, though toolmakers for the construction industry study design, toolmakers for the digital worker are woefully ignorant of the very purposes their tools are used for.

A swiss army knife will open a bottle of wine or spread butter, but a corkscrew and butter knife work better. With software we only have Swiss army knives.

Re:Perfection Has a Price (1)

0100010001010011 (652467) | more than 5 years ago | (#26422133)

If you really want to see what's out there (and running your banks, hospitals, etc.) check out The Daily WTF [thedailywtf.com]. There's a whole section dedicated to WTF code [thedailywtf.com].

Defects have a cost. Who pays? Change that. (1)

Animats (122034) | more than 5 years ago | (#26422151)

Indeed.

The reason software is so bad is that the customers absorb the cost of defects. That's a political decision. Cars used to be that way; today, if a car even stalls unexpectedly, that's considered a manufacturing defect. In the US, the manufacturer has to pay for the recall to fix the problem. Cars are far more reliable, and much safer, as a result.

In a few industries, the software developer is financially liable for errors. The gambling industry works that way. The companies that run lotteries pay back a few percent of their revenue as penalties for failures and errors. (And they try very hard not to have expensive errors.)

Back before the Bush administration caved on the Microsoft antitrust case, I proposed the Full Warranty remedy [animats.com]. The FTC took a look at this issue in 2000 [ftc.gov], but the Bush Administration didn't do anything. It may be time to revisit this.

a book never written (5, Funny)

jollyreaper (513215) | more than 5 years ago | (#26421229)

Fred Brooks's 'The Mythical Man-Month',

I read that as "the Mythical Man-Moth." I bet that would be a great book.

Re:a book never written (1)

otis wildflower (4889) | more than 5 years ago | (#26421503)

I read that as "the Mythical Man-Moth." I bet that would be a great book.

Or a movie?

Where Richard Gere is drawn to a small West Virginia-based software consultancy, whose master hacker, living unseen in a locked, windowless office, demands a constant flow of wool sweaters be slipped under the door... All wonder how so much code can be written so quickly...

Re:a book never written (2, Funny)

LMacG (118321) | more than 5 years ago | (#26422047)

Not to mention a cartoon and live-action series [wikipedia.org].

Re:a book never written (1)

FurtiveGlancer (1274746) | more than 5 years ago | (#26422057)

Perhaps it's a reference to Arthur from "The Tick." How can one forget a "superhero" with the battle cry of "Not in the face! Not in the face!"?

Re:a book never written (1)

jd (1658) | more than 5 years ago | (#26422105)

The Man-Moth is not mythical. I saw him only this morning, chewing on people's jackets. He's not keen on the jackets of managers - he complains they're tasteless. Man-Moth is a moth that got bitten by a radioactive man and has acquired supermothian powers of wearing digital watches and burping in front of the television.

Don't forget peer review. (0)

Anonymous Coward | more than 5 years ago | (#26421231)

It would have avoided the embarrassing typo of 15, when in fact the article states 25. Oops!

The biggest software error (-1, Flamebait)

XPeter (1429763) | more than 5 years ago | (#26421249)

Vista.

Re:The biggest software bugfix (0)

Anonymous Coward | more than 5 years ago | (#26421471)

7.

Re:The biggest software bugfix (1)

McGiraf (196030) | more than 5 years ago | (#26421529)

of.

Re:The biggest software bugfix (1)

Erie Ed (1254426) | more than 5 years ago | (#26421735)

nine!

Re:The biggest software bugfix (1)

YrWrstNtmr (564987) | more than 5 years ago | (#26421775)

9.

Re:The biggest software bugfix (1)

Ethanol-fueled (1125189) | more than 5 years ago | (#26421777)

Fuck.

Re:The biggest software bugfix (0)

Anonymous Coward | more than 5 years ago | (#26421991)

You're all niggers in a troll thread

Re:The biggest software error (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26421787)

The biggest impact on software quality comes from putting the release schedule in the hands of businessmen. Speaking as a former (long ago) MS SDE, the coders I worked with there were at least as good as a random developer (frequently /much/ better). However, their job is to code things in triaged order, not to make release schedule decisions. When the execs tell everyone to stop typing and RTM, that's it. The state of the software is generally known prior to ship because of full-time, /real/ QA teams whose ad-hoc testing, automation, and metrics are all much better than on any other team I've been on before or since. Don't rag on the MS devs for their suits' decisions to release with known bugs.

Re:The biggest software error (1)

Mistshadow2k4 (748958) | more than 5 years ago | (#26421935)

ME. Vista is still in 2nd place compared to the Millennium Edition. I know, I've used both.

Top 15? (0)

Anonymous Coward | more than 5 years ago | (#26421253)

Someone must be counting in the wrong number base or something, because the article clearly states 25 in about a million (in base 400) places.....

When I was breaking in (5, Insightful)

PingXao (153057) | more than 5 years ago | (#26421261)

In the early '80s there were no "older" programmers unless you were talking mainframe data processing. On microprocessor CPU systems the average age was low, as I recall. Back then we didn't blame poor software on "youthful programmers". We blamed it on idiots who didn't know what they were doing. I think it's safe to say that much hasn't changed.

Re:When I was breaking in (5, Insightful)

Skapare (16644) | more than 5 years ago | (#26421489)

This is true of any group. There are geniuses and idiots in all groups. The problems exist because once the supply of geniuses has been exhausted, businesses tap into the idiots. And this is made worse when employers want to limit pay across the board based on what the idiots were accepting. Now they are going overseas to tap into cheaper geniuses, who are also running out, and in the meantime, lots of local geniuses have moved on to some other career path because they didn't want to live at the economic level of an idiot.

Re:When I was breaking in (5, Funny)

fishbowl (7759) | more than 5 years ago | (#26421725)

>There are geniuses and idiots in all groups.

Most of both groups are within two standard deviations of a norm. Your idiots are probably smarter than you think and your geniuses are probably not as smart as you'd like to believe.

Re:When I was breaking in (0)

Anonymous Coward | more than 5 years ago | (#26422115)

Agreed.
Last 'genius' I worked with made a complicated reporting program that worked automatically with some minor maintenance (flushing out the old data when Access hit its row limits... we didn't have a choice on the tools).

When he left, the tool stopped working, as there was zero documentation and there were no code comments.

He left to start his own business selling custom implementations of a design toolkit he'd produced, but he hadn't made any sales after a couple of months.

The program he left behind worked wonderfully, and it was incredibly complicated; once he was gone, the poor documentation and lack of comments made it impossible for anyone else to manage the tool.

Re:When I was breaking in (0)

Anonymous Coward | more than 5 years ago | (#26421995)

When they went overseas, they only went for cheap. Geniuses or idiots never came into it, so we got lots of idiots. I mean, if the code we had to fix every time it came back from overseas was the result of geniuses, I'm very scared of what the idiots would produce.

Re:When I was breaking in (5, Interesting)

77Punker (673758) | more than 5 years ago | (#26421495)

After working 3 months at my first programming job, the other two developers quit which left just me. I felt inadequate until I started interviewing other programmers to fill in the gap. Apparently lead developers with 10 years of experience can't solve the simplest programming problems, explain how databases work, or explain OOP. I'm convinced that most software sucks because most people writing it have no idea what they're doing and shouldn't be allowed to touch a computer. I'm currently in my 5th month at the same job and we've got someone good who will start soon, but it took a long time to find even one competent developer.

Re:When I was breaking in (2)

frosty_tsm (933163) | more than 5 years ago | (#26421731)

I'm convinced that most people presenting themselves as lead developers in interviews are far from it. There's a reason why thedailywtf.com [thedailywtf.com] has a "Tales from the Interview" section.

Re:When I was breaking in (0)

Anonymous Coward | more than 5 years ago | (#26421937)

Have you ever read "Blink"? If you had, maybe you'd understand why someone with 10 years of experience sometimes has trouble explaining how a database works or explaining OOP. Regarding "simplest programming problems," define simple.

Re:When I was breaking in (3, Interesting)

77Punker (673758) | more than 5 years ago | (#26422149)

"Write a function to sum all the numbers from 0 to 100"

Every code question I ask is about that simple. The solutions I get to EASY questions are almost always really stupid or incorrect, or they get answered with "I don't know how to do that."
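
For reference, one straightforward answer in Python (a simple loop; the closed form n*(n+1)/2 works just as well):

    def sum_to(n=100):
        # Sum all the integers from 0 to n, inclusive.
        total = 0
        for i in range(n + 1):
            total += i
        return total

    assert sum_to(100) == 5050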

Re:When I was breaking in (0)

Anonymous Coward | more than 5 years ago | (#26422099)

Keep in mind that only people looking for jobs are interviewing. Theoretically your best developers are employed, and their employers don't want to lose them.

That said, people get in a routine doing only what their company needs. If your software development job is working in Excel and Access every day, you could very easily get paid to do that for 10 years and have no idea about SQL or compilers, for instance.

Hiring often means finding someone who is a good fit for the company, works well with others, and is interested or motivated enough to learn. From there you can always teach them your company's arcane mixture of software.

Re:When I was breaking in (5, Interesting)

Opportunist (166417) | more than 5 years ago | (#26421671)

Quite dead on. But the difference is that today, with all the RAD tools around, you have a fair lot of "programmers" who don't even know what they're doing and get away with it. They got some course, maybe at their school (and those are already the better ones of the lot), maybe as part of some attempt by the unemployment office to get them back into a job, and of course tha intarwebz is where da money is, so everyone and their dog learned how to write a few lines of VB (or C#; the difference, for this kind of code molester, is insignificant), and now they're let loose on the market.

Then you get hired by some company that signed up those "impressively cheap programmers, see, programmers needn't be horribly expensive" after the project goes south because deadlines led to dead ends but no product, and you're greeted with code that makes you just yell out WTF? You get conditional branches that do exactly the same thing in every branch. You get loops that do nothing for 90% of their iterations, and when you ask about it you just get a blank stare and a "well, how do you think we could count from 80 to 90 besides counting from 0 to 90 and using 'continue;' for 0-79?", because that's how they learned it and they never for a nanosecond pondered just WHAT those numbers in the 'for' block meant. And so on.
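
In Python terms, the pattern being described looks roughly like this (a made-up illustration of the anti-pattern, not code from any real project):

    def process(i):
        print(i)  # hypothetical stand-in for whatever the loop body does

    # What the cheap hire wrote: loop over everything and skip most of it.
    for i in range(0, 91):
        if i < 80:
            continue
        process(i)

    # What the loop bounds are for: start where you actually mean to start.
    for i in range(80, 91):
        process(i)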

Granted, those are the extreme examples of people who learned programming like a poem. By heart. They have a hammer as a tool, so every problem has to be turned into a nail. But you'd be amazed at the blank stares and "what do I need that for?" when you ask some of those "programmers" about hash tables (include snide joke about Lady Mary Jane here...) or Big-O notation. And we're talking people who are supposedly able to write a database application here.

This is the problem here. It's not youthful programmers. It's simply people who know a minimum about programming and managed to trick some HR goon into believing they could actually do it.

Re:When I was breaking in (0, Troll)

rtechie (244489) | more than 5 years ago | (#26422159)

maybe at their school

Actually, I tend to have a dim view of those that took a few classes in computer programing and think they're a programmer.

If I was being real, I mean really real, my interview would consist of one question:

"When did you code your first C application?"

If it was any older than 12 (twelve), I'd reject them. *I* did this, and I don't even consider myself to be a programmer.

Experience has taught me that high-school dropouts with a passion for programming are generally LIGHT YEARS beyond people who aren't passionate that scraped through a BA in Computer Science. The dropout is far more likely to have real experience using software to solve real problems.

Self-taught programmers are almost always superior to those that have learned in a class. They're "doing it the hard way" and the extra effort shows.

Re:When I was breaking in (1)

Tom (822) | more than 5 years ago | (#26421897)

No, but there is a fairly high correlation between "young" and "idiot".

The older people have had more time for mistakes, and more opportunities to learn from them. Also, they have often learnt the most important lesson: You don't always know better, and sometimes what looks like sheer stupidity has a reason to it that you just don't know.

There is, however, another kind of stupidity that is more often present among older people. That of being stuck to the "we've always done it like that" way. It is just less common (they've also had time to thin their ranks out) and more obvious, so easier to avoid.

Honestly, after several companies and experiences from excellent to horrible, I'm fairly sure that you can show me the people in your IT department and I can tell you if your IT sucks. And age is one factor.

Waterfall (4, Insightful)

Brandybuck (704397) | more than 5 years ago | (#26421273)

The waterfall method is still the best development model. You have to analyze, then plan, then code, then test, then maintain. The steps need to be done in order and you can't skip any of them. Unfortunately, waterfall doesn't fit into the real world of software development because you can't freeze your requirements for that long. But cyclic models are a good second place, because they are essentially iterated waterfall models. When you boil all the trendy stuff out of Agile, you're basically left with a generic iterated waterfall, which is why it works. The trendy crap is just so you can sell the idea to management.

Re:Waterfall (5, Insightful)

Timothy Brownawell (627747) | more than 5 years ago | (#26421385)

The waterfall method is still the best development model. [...] Unfortunately waterfall doesn't fit into the real world

WTF? Not working in the real world makes it a crap model.

When you boil all the trendy stuff out of Agile, you're basically left with a generic iterated waterfall, which is[...]

...not a waterfall.

Re:Waterfall (0)

Anonymous Coward | more than 5 years ago | (#26421501)

That's like saying a rollercoaster lacks properties of a hill just because it goes back to the start at the end of the run.

Re:Waterfall (3, Insightful)

Brandybuck (704397) | more than 5 years ago | (#26421987)

You can't create quality software without planning before coding. Ditto for not testing after coding. This isn't rocket science, yet too many "professionals" think all they need to do is code.

The waterfall model isn't a management process, it's basic common sense. It's not about attending meetings and getting signatures, it's about knowing what to code before you code it, then verifying that that is what you coded. The classic waterfall took too long because you had to plan Z before you started coding A, but with an iterated waterfall (which is still a waterfall, duh) you only need to plan A before you code A.

Re:Waterfall (1)

pixelpusher220 (529617) | more than 5 years ago | (#26421997)

It works fine in the *real* world. The one where it's done when it's done and meets your requirements competently.

I'd call it unreal to expect a bug-free, optimized, intuitive application with anything *but* an intensive and robust requirements/design period. Most projects don't have the time, so everything is shortchanged in pursuit of the ship date. Complaining about the outcome after you subvert that is like complaining that your house fell down when they used balsa-wood twigs instead of 2x4s.

And to define a waterfall, just read the friggin' word: water 'falls'. It doesn't say how much, or whether it has to be x feet high. A 'rapids' is just a long series of very small waterfalls.

If you have to get from point a to point b along a line, do you want just one shot to get it right, or have multiple adjustable shots so that you end up closer to your actual goal?

Agile dev is good, but it's also damn hard to find people well versed in it to make it successful.

Re:Waterfall (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26421397)

The waterfall method is still the best development model.

I agree and that's why I painted over the windshield on my car and drive everywhere by dead reckoning. Pre-planning is the way to go for everything, I say!

(CAPTCHA: unaware. How appropriate.)

Waterfall was never valid (2)

wezeldog (982156) | more than 5 years ago | (#26421727)

Dr. Royce used it as an example of a methodology that doesn't work, but what he described was easy to understand so it gained traction with management types. It's like the joke where the guy says he's looking for a lost quarter under a streetlamp because the light is better than where he lost it.
http://www.cs.umd.edu/class/spring2003/cmsc838p/Process/waterfall.pdf [umd.edu]
I think his suggestion was to 'build it twice' via prototyping to discover what was missed in the requirements gathering and design phases.

Re:Waterfall (5, Insightful)

radish (98371) | more than 5 years ago | (#26421813)

The waterfall is broken, seriously. I'm paraphrasing from an excellent talk I attended a while back, but here goes.

For a typical waterfall you're doing roughly these steps: Requirements Analysis, Design, Implementation, Testing, Maintenance. So let's start at the beginning...requirements. So off you go notebook in hand to get some requirements. When are you done? How do you know you got them all? Hint: you will never have them all, and they will keep changing. But you have to stop at some point so you can move onto design, so when do we stop? Typically it's when we get to the end of the week/month/year allocated on the project plan. Awesome. Maybe we've got 50% of the reqs done, maybe not. It'll be a long time until we find out for sure...

Next up - Design! Woot, this bit is fun. So we crank up Rose or whatever and get to work. But when do we stop? Well again, that's tough. Because I don't know about you but I can design forever, getting better and better, more and more modular, more and more generic, until the whole thing has flipped itself inside out. So we stop when it's "good enough" - according to who? Or more likely, it's now 1 week to delivery and no-one's written any code yet so we better stop designing!

Implementation time. Well at least this time we know when we're done! We're up against it time-wise, though, because we did such a good job on Reqs & Design. Let's pull some all-nighters and get stuff churned out pronto, who cares how good it is, no time for that now. That lovely, expensive design gets pushed aside.

No time to test...gotta hit the release date.

Sure, this isn't the waterfall model as published in the textbooks, but it's how it works (fails) in real life. And the textbooks specifically don't say how to fix the problems inherent in the first two stages. What to do instead? Small, incremental, feature-based development. Gather requirements, assign costs to them, ask the sponsors to pick a few, implement & test the chosen features, repeat until out of time or money.

Re:Waterfall (1)

Brandybuck (704397) | more than 5 years ago | (#26422129)

You describe the classic waterfall, the one that does not work. Your solution, "small, incremental feature based development", is in essence an iterated waterfall. You can't gather all possible requirements, but you need to gather at least one or you can't even begin coding.

You are right, waterfall doesn't scale (1)

coryking (104614) | more than 5 years ago | (#26421943)

Waterfalling your entire project will doom it to failure. But that doesn't mean you don't use waterfall on a small level. Most modern techniques are iterative versions of the waterfall model. You take a small chunk, plan it, spec it, and ship it. If somebody thinks you scale waterfall to an entire project, well, get with the times, gramps.

Scaling waterfall to a large project is a huge waste of resources. It would require your usability dudes, marketing dudes, and so on to do the plan and then sit idle for the remainder of the project. Likewise, your programmers would only be useful in the middle part, and your testers in the final stages.

Lord help you if something changes. And guess what, it will change because this is reality and not the "good old days when I was a kid programming on punch cards uphill in the snow" that gray-beards wax poetic about. Unless you are designing an "about box", nobody knows the problem nor its solution initially. It takes many iterations to get to both a definition and a solution.

Waterfall died long before I ever entered the scene and it cracks me up when I hear people still promoting it as a good idea. These days we just sit down in front of the keyboard and click buttons in fancy "IDE's" using un-proven things like "Object Oriented Programming". Kids these days... we don't even manage our memory, the garbage collector does it for us. ;-)

Re:Waterfall (1)

Richard Steiner (1585) | more than 5 years ago | (#26422007)

The steps of analyzing, planning, coding, testing, and maintaining are used in all types of software development, not just those using formalized waterfall methodologies. You just do it in smaller steps if you're using a more agile development methodology in an in-house development environment.

Agile developers aren't idiots. They just tend to release finished software on the module level rather than as an entire cohesive product.

Re:Waterfall (1)

MrBigInThePants (624986) | more than 5 years ago | (#26422035)

I am sorry dude. Either you have never done agile or you just never got it.

Agile is not "iterative waterfall". It never was and never is. If you find yourself doing "iterative waterfall", then you are doing it all wrong.

You are not the first person I have heard say this, and I dare say you won't be the last...

Top 10 Most Dangerous Summary-Writing Errors (0)

Anonymous Coward | more than 5 years ago | (#26421275)

10. Writing '15' instead of '25'

Modus Operandi (1)

HockeyPuck (141947) | more than 5 years ago | (#26421295)

modus operandi of cramming in as many features as possible, and then fixing problems in beta.

Sure, his waterfall method works when you can guarantee that 100% of the features/functions you need in a product are determined during a Requirements phase. However, when company X will buy $10M of your product if you paint it red, you agree to it, let the date slip, and paint it red.

Re:Modus Operandi (2, Insightful)

John Hasler (414242) | more than 5 years ago | (#26421403)

Except that what you actually do is promise to paint it red even though you know that you do not have and cannot get any red paint. Then you deliver it green and try to tell the customer he is colorblind and besides the next model really will be red.

Re:Modus Operandi (1)

yttrstein (891553) | more than 5 years ago | (#26421627)

If you're the right kind of company that consistently sells a very high quality product, you have enough breathing room in every case to say "no" to a customer who's asking for things that will destabilize their product.

I do it all the time, and I've never once lost a customer doing so.

Grow some fucking balls.

Perfection Has a price (0)

Anonymous Coward | more than 5 years ago | (#26421335)

The industry I'm referring to is the chip industry. Hardware designers code pretty much like software developers (except the languages they use are massively parallel, but apart from that, they use the same basic constructs). Hardware companies can't afford a single mistake because once the chip goes to fab, that's it. No patches like software, no version 1.0.1.

Yeah...Right:

-There is never a revision AB/BB silicon

-Microcode updates don't exist

-No hardware designer has ever had to ECO something in

-The synthesis program NEVER makes mistakes

-The formal equivalence program is perfect

-The simulation environment is perfect

Keep dreaming...

Damn Lazy Programmers! (3, Interesting)

pete-wilko (628329) | more than 5 years ago | (#26421401)

Yeah, those good-for-nothing programmers, cramming in features all over the place and not adhering to time-honored development practices like waterfall!

And requirement changes? WTF are those? Using waterfall, you specify your requirements at the beginning, and these are then set in stone, IN STONE! Nothing will ever change in 6-12 months.

It's not like they're working 60-80 hour weeks, being forced to implement features, having new requirements added, and not being listened to! That would be like marketing driving engineering! Insanity!

As an aside -- why is he dragging OO into this? Pretty sure you can use waterfall with OO; you even get pretty diagrams.

typo in summary (1)

drquoz (1199407) | more than 5 years ago | (#26421411)

The list is actually 25, not 15.

Extensible Framework (2, Interesting)

should_be_linear (779431) | more than 5 years ago | (#26421433)

Most horrible projects I've seen were "extensible frameworks" that can do just about anything with the appropriate extensions (plugins or whatever). But currently, without any existing extensions, it is a bloated pile of crap. Also, there is nobody in sight willing to make even one extension for it (except the sample, done by the author himself, showing how easy it is to create an extension).

Re:Extensible Framework (0, Flamebait)

Anonymous Coward | more than 5 years ago | (#26421823)

See: Eclipse

Those who fail to learn the lessons of history (3, Insightful)

overshoot (39700) | more than 5 years ago | (#26421459)

... are destined for greatness, because their bullshit is not burdened by reality.

I've heard from several ex-Softies that the company inculcates its recruits with a serious dose of übermensch mentality: "those rumors about history and 'best practices' are for lesser beings who don't have the talent that we require of our programmers." "We don't need no steenking documentation," in witness whereof their network wireline protocols had to be reverse-engineered from code by what Brad Smith called 300 of their best people working for half a year.

However, I'll note that they were right: anyone who wants to say that they did it wrong should prove it by making more money.

Users are to blame (5, Insightful)

Chris_Jefferson (581445) | more than 5 years ago | (#26421499)

The sad truth is, given the choice between a well-written, stable and fast application with a tiny set of features and a giant slow buggy program with every feature under the sun, too many users choose the second.

If people refused to use and pay for buggy applications, they would either get fixed or die off.

Re:Users are to blame (4, Interesting)

curunir (98273) | more than 5 years ago | (#26422119)

The even sadder truth is that when faced with the choice of the two apps you describe and a third buggy application with a tiny set of features, users will choose the most visually appealing app, regardless of lack of features or the app being buggy.

The under-the-covers stuff is important, but finding a good designer to make it pretty is the single most important thing you can do to make people choose your product. If it's pretty, people will put up with a lot of hassle and give you the time necessary to make it work reliably and have all the necessary features.

Its all true (0, Troll)

yttrstein (891553) | more than 5 years ago | (#26421563)

Which is precisely why I would *never* in a million years hire a programmer under 30, and rising.

I interviewed someone who became proficient enough in computer programming to get a master's degree from what was, when I was in school, an amazingly advanced Comp Sci program -- who didn't know what a linker does.

Re:Its all true (5, Insightful)

Cornflake917 (515940) | more than 5 years ago | (#26422073)

I think refusing to hire someone solely because of their age is naive. Is there some magical event at the age of 30 that bestows knowledge of linkers on the aging programmer? Give me a break. You are making bad assumptions. Your first bad assumption is that, because of your anecdotal experience with one individual, all schools no longer teach anything about linkers. Your second bad assumption is that, even if that were true, no programmer would learn that material on their own, as if no one is interested in learning comp sci outside the classroom anymore.

Re:Its all true (2, Funny)

yttrstein (891553) | more than 5 years ago | (#26422091)

You must be under 30.

Re:Its all true (1)

joh6nn (554969) | more than 5 years ago | (#26422097)

comments like yours are precisely why people my own age embraced sayings like "never trust anyone over the age of 30". you're right, though; a sweeping generalization is way better than admitting that Sturgeon's Law is the norm in every generation, and you ALWAYS have to work hard to separate the wheat from the chaff.

This is clearly Microsoft's fault! (3, Insightful)

sheldon (2322) | more than 5 years ago | (#26421595)

Oh great a rant by someone who knows nothing, providing no insight into a problem.

Must be an Op-Ed from a media pundit.

And they wonder why blogs are replacing them?

Re:This is clearly Microsoft's fault! (1)

Taagehornet (984739) | more than 5 years ago | (#26421901)

To be fair, the cheap stab at Microsoft was added by the submitter or the editor, in an attempt to stir up the pot a bit. Other than that, yes, your assessment is spot on.

cheap shot (5, Informative)

Anonymous Coward | more than 5 years ago | (#26421607)

I work at Microsoft. We use agile development and almost everybody I know here has read the Mythical Man Month. Get your facts straight before taking cheap shots in story submissions. Thanks.

Re:cheap shot (1)

Richard Steiner (1585) | more than 5 years ago | (#26422103)

Microsoft's monolithic products (and its traditionally slow production cycles and bug response times) don't seem to reflect the results seen from "agile" practices in other commercial software operations and in various open source projects.

My own guess is that it isn't the fault of the developers, but rather lengthy processes at both ends which at least partially remove some of the advantages of agile development.

It's too bad that those with a clue in your company don't have more say into the design, marketing, and maintaining of the products bearing your company's label.

Re:cheap shot (0, Troll)

berend botje (1401731) | more than 5 years ago | (#26422141)

And see what it got you: the steaming turd called Vista. Nice jorb.

Software Engineering is Expensive (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26421645)

Most companies simply refuse to spend the money to get it right. The reason that early programmers didn't have as many bugs is that their development efforts had virtually unlimited funding to resolve errors, because a bug in the system was far more expensive relative to the cost of development (compared with today, when you can reboot the machine and try again in 5 minutes "for free").

It is pretty simple to see... (0)

Anonymous Coward | more than 5 years ago | (#26421651)

The reason that Microsoft or any other company finds itself cramming everything it can into a build and then BETA testing is that some marketing group told the world the product could be built in half the time it really takes. I don't see that business model changing anytime soon, but it will definitely continue to cause engineers grief, since it basically feels like we are always living a lie. This capitalistic world we live in will not allow this to change, I fear.

Waterfall versus OOD (2, Insightful)

BarryNorton (778694) | more than 5 years ago | (#26421659)

A waterfall process and object-oriented design and programming are orthogonal issues. The summary, at least, is nonsense.

Complete BS? (4, Insightful)

DoofusOfDeath (636671) | more than 5 years ago | (#26421687)

For the life of me, I can't figure out what the choice of {waterfall vs. cyclic} has to do with {writing code that checks for error return codes vs. not}.

Waterfall vs. cyclic development is mostly about how you discover requirements, including what features you want to include. It also lets you pipeline the writing of software tests, rather than waiting until the end and doing it all in one big-bang push. Whether or not you're sloppy about checking return codes, etc., is a completely separate issue.

Despite the author's protests to the contrary, he really is mostly complaining incoherently about the way whipper-snappers approach software development these days.

Re:Complete BS? (1)

dedazo (737510) | more than 5 years ago | (#26421895)

For the life I me, I can't figure out what the choice of {waterfall vs. cyclic} has to do with {writing code that checks for error return codes vs. not}.

Nothing whatsoever. You can produce quality software with both of them.

The first one just takes twice as long, costs twice as much, and is twice as aggravating to everyone involved. But companies (especially large ones) love it because it gives them fuzzy (but bogus) sensations of control and continuity.

Microsoft? (5, Informative)

dedazo (737510) | more than 5 years ago | (#26421705)

Most of the teams I've had contact with inside the tools group at MS (in the last four years or so) use SCRUM.

I don't know how widespread that is in other divisions (say the MSN/Live folks or the Microsoft.com teams) but that clever comment in the submission is nothing more than an ignorant cheap shot.

Don't be so twitterish and make up crap about Microsoft. Get your facts straight or you just come across as an idiot.

Re:Microsoft? (0)

Anonymous Coward | more than 5 years ago | (#26421979)

sorry guy, this is slashdot, facts never get in the way when it comes to the two minute hate session. especially when microsoft is involved.

Typical: blame the process (5, Interesting)

SparkleMotion88 (1013083) | more than 5 years ago | (#26421739)

It is very common for people to blame all of the problems of software engineering on some particular methodology. We've been shifting blame from one method to another for decades, and the result is that we just get new processes that aren't necessarily better than the old ones.

The fact is that software development is very difficult. I think there are several reasons why it is more difficult to develop robust software now than it was 20 years ago. Some of these reasons are:
  • The customer expects more from software: more features, flashier interfaces, bigger datasets -- all of these things make the software much more complicated. The mainframe systems of a few decades ago can't even compare to the level of complexity of today's systems (ok, maybe OS/360).
  • There is just more software out there. So any new software that we create is not only supposed to do its job, but also interconnect with various other systems, using the standards and interfaces of various governing bodies. This effect has been growing unchecked for years, but we are starting to counter it with things like SOA, etc.
  • The software engineering talent has been diluted. There, I said it. The programmers of 20-30 years ago were, on average, better than the programmers of today. The reason is we have more need for software developers, so more mediocre developers slip in there. Aggravating this issue is the fact that the skilled folks, those who develop successful architectures and programming languages, don't tend to consider the ability level of the people who will be working with the systems they develop. This leads to chronic "cargo cult programming" (i.e. code, test, repeat) in many organizations.
  • Software has become a part of business in nearly every industry. This means that people who make high-level decisions about software don't necessarily know anything about software. In the old days, only nerds at IBM, Cray, EDS, etc got involved with software development. These days, the VP of technology at Kleenex (who has an MBA) will decide how the new inventory tracking software will be built.

I'm sure there are more causes and other folks will chime in.

It's economics, too... (5, Insightful)

gillbates (106458) | more than 5 years ago | (#26421757)

As long as:

  • Consumers buy software based on flashy graphics and bullet lists of features, without regard for quality...
  • Companies insist on paying the lowest wages possible to programmers...
  • Programmers are rewarded for shipping code, rather than its quality...

You will have buggy, insecure software.

Fast. Cheap. Good. Pick any two.

The market has spoken, and it said it would rather have the familiar and flashy than the secure and stable. Microsoft fills this niche. There are other niches, such as the stable-and-secure computer market, and they're owned by the mainframe and UNIX vendors. But these aren't as visible as the PC market, because they need not advertise as much; their reputation precedes them. Yet they are just as important as, if not more important than, the consumer market.

Re:It's economics, too... (1)

Opportunist (166417) | more than 5 years ago | (#26421785)

Fast. Cheap. Good. Pick any two.

Judging from the market and how people choose, even one can be enough if it is 'cheap'.

Why they get away with it (0)

Anonymous Coward | more than 5 years ago | (#26421761)

The software industry gets away with it, because they can.

-Because software is 'licensed' with:

-NO WARRANTY

-NO 'FITNESS FOR A PARTICULAR PURPOSE'

-May Contain ERRORS/OMISSIONS

Taken from the license agreement of 99.999% of the software out there.

Programmers are responsible to nobody. Shoddy work, because users have no recourse, nobody to sue, no rights to complain, etc.

There (was) is no such thing as Waterfall Method (3, Informative)

hopopee (859193) | more than 5 years ago | (#26421789)

I find it more than a little amusing that the summary mentions the waterfall method as a time-honored, excellent example of how all software should be made. Here's what the history of the non-existent waterfall method has to say about it: Waterfall Method does not exist! [idinews.com]

Problem in a nutshell (1)

Ukab the Great (87152) | more than 5 years ago | (#26421861)

An experienced programmer can differentiate the concept of "can" from the concept of "should." For a younger, novice programmer, the concepts of "can" and "should" are one and the same.

Quality is Job 1.1 (1)

Prototerm (762512) | more than 5 years ago | (#26421873)

The real source of the problem isn't the development model itself, but in the way Management does its own job. This isn't anything new. Ten years ago, I was working for a small software company (less than 100 employees), and was told by the owner to meet the promised deadline "at any cost". He was quite happy with fixing bugs later "when the customer complains about them", telling me that this would allow them to promise the customer a new (later) deadline. The result was an endless stream of unhappy customers, and a rapidly deteriorating reputation in the field.

In my experience, delivery deadlines are set by management hacks to meet customer demands rather than to reflect the restrictions placed upon the project by budget (only so much money for resources) and physics (only so many hours in the day), and they are not set by the poor schleps tasked with meeting those impossible dates.

Good. Fast. Cheap. Pick two. Or, to quote Star Trek's Montgomery Scott: "I canna change the laws of Physics, Captain!"

grumpy old coder (3, Insightful)

girlintraining (1395911) | more than 5 years ago | (#26422011)

It's just another "I have 40 years of experience doing X... Damn kids these days. Get off my lawn." Hey, here's something to chew on -- I bet he screwed up his pointers and data structures just as much when he was at the same experience level. Move along, slashdot, nothing to see here. I will never understand the compulsion to compare people with five years experience to those with twenty and then try to use age as the relevant factor. Age is a number... Unless you're over the age of 65, or under the age of about 14, your experience level is going to mean more in any industry. This isn't about new technology versus old, or people knowing their history, or blah blah blah -- it's all frosting on the poison cake of age discrimination.

P.S. Old man -- reading a book won't make you an expert. Doubly so for programming books. I'd have thought you'd know that by now. Why not get off your high horse and side-saddle with the younger generation and try to impart some of that knowledge with a little face time instead?

models (1)

Tom (822) | more than 5 years ago | (#26422029)

The main point he's making, I think, is a little hidden: "At least we had a model."

The problem with almost every software development department I've seen so far is that they rely entirely on the abilities of their coders. The few guidelines they have for coding style and documentation can, if you are gracious, be called something like a "standard". But that's where it ends. There are few processes above the code level. Many heads of software development can't answer trivial questions that are perfectly normal in every other industry where stuff is built or developed, like "how many known problems do you have at the moment?" or "what is your margin of error before you pull the product?" or even just "what's the date of your last QA?".

So don't bother them with anything more complicated, like a development process. They think "process" is another word for "deadline", because that's all their process consists of: "ship on this date, or as close as you can manage." That's it.

Shockingly? (1)

internerdj (1319281) | more than 5 years ago | (#26422067)

From the article:
"Shockingly, most of these errors are not well understood by programmers; their avoidance is not widely taught by computer science programs; and their presence is frequently not tested by organizations developing software for sale."
Shockingly, programmers are not all-knowing. Programmers are also, oddly, doing very poorly at designing against and testing for things that have not been, and still are not being, identified as major issues by the academic community that produces programmers.

This is a field that has become vital to many businesses in the developed world. It is a field that is very young compared to other scientific and engineering disciplines. It is a field that is undergoing major changes because it isn't working with set rules of nature, but instead is built on top of the changing products of other disciplines.