Pretending OOP never happened

I ran across someone recently who says the way to move past object oriented programming (OOP) is to go back to simply telling the computer what to do, to clear OOP from your mind like it never happened. I don’t think that’s a good idea, but I also don’t think it’s possible.

Object oriented programming, for all its later excesses, was a big step forward in software engineering. It made it possible to develop much larger programs than before, maybe 10x larger. Someone might counter by saying that programs had to be 10x larger because of all the overhead of OOP, but that’s not the case. OOP does add some overhead, and the amount of overhead grew over time as tools and frameworks became more complicated, but OOP made it possible to write programs that could not have been written before.

OOP provides a way for programmers to organize their code. It may not be the best way, depending on the problem, but the way to move past OOP is to replace it with another discipline. And I imagine most people who have learned and then rejected OOP do just that, whether they realize it or not. Maybe they retain some organizational patterns that they learned in the context of OOP.

That has been my experience. I hardly ever write classes anymore; I write functions. But I don’t write functions quite the way I did before I spent years writing classes.

And while I don’t often write classes, I do often use classes that come from libraries. Sometimes these objects seem like they’d be better off as bare functions, but I imagine the same libraries would be harder to use if no functions were wrapped in objects.

There are many alternatives to OOP for organizing code. I generally like functional programming, but in my experience there’s a hockey stick effort curve as you try to push the purity to 100%. James Hague said it well:

100% pure functional programming doesn’t work. Even 98% pure functional programming doesn’t work. But if the slider between functional purity and 1980s BASIC-style imperative messiness is kicked down a few notches — say to 85% — then it really does work. You get all the advantages of functional programming, but without the extreme mental effort and unmaintainability that increases as you get closer and closer to perfectly pure.

It’s possible, and a good idea, to develop large parts of a system in purely functional code. But someone has to write the messy parts that interact with the outside world.
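
Here is a minimal sketch of what that split can look like in Python (the function and file names are made up for illustration): the core logic stays pure, and a thin imperative wrapper handles the messy I/O at the edge.

```python
# Pure core: no side effects, same inputs always give the same output.
def summarize(numbers):
    if not numbers:
        return {"count": 0, "mean": None}
    return {"count": len(numbers), "mean": sum(numbers) / len(numbers)}

# Messy edge: reads from the outside world and prints the result.
def main(path):
    with open(path) as f:
        numbers = [float(line) for line in f if line.strip()]
    print(summarize(numbers))

if __name__ == "__main__":
    import sys
    main(sys.argv[1])
```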


13 thoughts on “Pretending OOP never happened”

  1. This is exactly where I’ve ended up. Write code that’s as functional as possible without distorting your logic to support it. Use classes when they make sense, but again, don’t distort the structure of your code just for OOP’s sake. Ideological purity is less important than understandable, maintainable code.

  2. Exactly – OOP enabled writing larger programs, specifically by providing encapsulation (and messaging as a consequence).

    Functional programming also needs to provide encapsulation to scale to large programs.

    (here is a blog post of mine with more detailed argumentation: https://traversable.space/#essays/oop-2 )

  3. Smalltalk-80 sold me on OOP, but I’ve spent a good deal of time with Scala and Python using a functional style. Your observations are spot on. 100% pure OOP or functional programming produces unreadable and therefore unmaintainable code. I could quote some lines of Scala that are impenetrable. They are idiomatic and functional but IMO nearly useless and to be avoided; the same goes for extreme OOP. These paradigms are both good tools when applied to problems they naturally solve. Purists push them to their breaking point. The same can be said of almost any programming tool; I’m not sure why this isn’t obvious.

  4. Warren Henning

    OOP does not enable larger software to be developed. OOP does not solve any problems.

    Encapsulation destroys endless opportunities for global optimization and simplifications to data organization. It is an intellectually dishonest charade.

    Instead of doing something simple that solves the problem efficiently, you have to play the OOP game of designing Platonic ideal categorization systems and wrestling with existential quandaries about the nature of things that don’t actually exist.

    Things in computers don’t actually work the way they do in the real world, which means trying to model the real world in a computer just makes programs super confusing. Instead, the goal should be to make clear how the data is organized and how it is transformed.

    To get past OOP, we need to start by dropping old ideas that empirically don’t help and we need to stop taking direction from language designers who don’t work on real products/projects. Post-OOP languages need to be born out of the immediate experience of creating real systems, ideally systems that co-evolve with the language as each is being developed. We are still stuck, of course, with all the legacy C++, Java, Python, and Ruby code.

  5. Wow, Warren Henning’s comment is either fanatical or naive. As someone who has professionally developed software for 15 years with many different languages and patterns, I can say for sure that the only truth about the best programming patterns is that it always depends on the variables and the goal. Classes are sometimes the most accurate way to express and encapsulate a reusable concept. Anyone who blames the tools for bad work isn’t someone I would ever take advice from. Purists are the religious zealots of the programming community and shouldn’t be regarded as worthy of consideration. Try everything, pick the patterns apart, think about the pros and cons of everything, and, most importantly, write code that someone else can understand and use.

  6. Johnathan Corgan

    For me it has been a transition from class hierarchies to single-level classes, with a lot of those classes being essentially callable objects holding internal configuration state. On that dial I’m probably around 75% functional.
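
    A minimal sketch of that kind of flat, callable class in Python (the names are illustrative):

    ```python
    # A single-level class: it just holds configuration and is called like a function.
    class Scaler:
        def __init__(self, factor, offset=0.0):
            self.factor = factor      # internal configuration state
            self.offset = offset

        def __call__(self, x):
            return self.factor * x + self.offset

    scale = Scaler(2.5, offset=1.0)
    print(scale(4.0))  # prints 11.0
    ```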

  7. Many confuse OOP, the programming, with OOA&D, the analysis and design that typically precedes it. I’ve seen OOA&D taken to extremes that actively prevent good systems from being built, and that needlessly chew up huge amounts of the schedule and budget.

    My approach is to do a rough design to identify the primary actors, then extract the “obvious” patterns, leaving lots of grey and a few “Here be Dragons” areas.

    I next pick one of the hard parts and code it (in my usual spaghetti experimental style), then beat on the code until it becomes self-documenting. So much hidden structure becomes self-evident that rippling it back up into the design yields multiple benefits.

    The thing about this early code is that I have no intention of keeping it. It mainly serves to let me visit the implementation domain rather than build architectural Ivory Towers. Most often, this early code gets transformed into test cases.

    This is why I enjoy Python so much, as it allows me to make these perspective leaps without getting lost, even if the final project requires embedded C++. (Which, with C++20, is finally becoming a truly fantastic language.)

  8. David J. Littleboy

    What Jonathan C. said.

    In some sense, there’s no such thing as “non-OOP”, since grouping together data structure definitions and their associated functions is a reasonable, obvious, and useful thing to do.

    But the _hierarchical_ OOP thing is probably a bad idea, since there aren’t a lot of things in the real world that can be shoehorned into a hierarchy. Which is why pretty much all the examples, even in the better programming books, are seriously unconvincing. (As an ex-AI person, I found it amusing that the OOP folks often seemed to be thinking that they were doing AI, and in particular, a kind of AI we couldn’t make fly.)

    Amusingly, GUIs seem to be quite shoehornable, with, for example, MFC making extensive use of the hierarchical thing. Of course, I’ve never heard of anyone saying anything nice about MFC…

  9. Hm. I must admit I am somewhat sceptical about non-OOP. The evolution from assembler to procedural, and from procedural to OOP, is a somewhat biological pattern. I tend to view computing as a reflection of the human being, where our technological advances are simply a reflection of realities already present in man. The formation of the class is in some ways comparable to the concept of DNA (with the VMT representing RNA).

    Personally, I feel the next leap is autonomous processes, where clusters of class entities spawn on demand and can share workload, guided by a self-governing principle (and no, memory management doesn’t quite fit that pattern, although it comes close).

    I think the cloud is the place where the next leap will happen. But for truly global coding, with services that physically replicate to be closer to the user, we would need a new programming language, or, to be honest, an older one. AREXX would in many ways deliver the goods, but it would require a radical shift in thinking.

  10. We forget that one of the largest codebases, worked on by thousands of developers every day, is written in C. OOP has the issue of giving too many tools but not enough limitations; even though OOP promises reusability and modularity, it fails to deliver on that promise. The way newer languages like Java/C# use it as the de facto standard for code design is where OOP goes too far. I also think that functional programming is a superior alternative; saying that functional code is no good for writing UX is, quite frankly, the sign of a bad functional programmer.

  11. A hammer is a useful tool. Many tasks are made easier by using a hammer.

    If I declared that every tool needed to have a hammer attached to it, regardless of how much or little sense it made; that every job needed to be done using a hammer; and that anyone who complained about the inconvenience of all these extra hammers was simply not experienced enough in the use of hammers, people would rightly believe that I was crazy.

    So should it be with any tool.
