Efficiency of C# on Linux

This week I attended Mads Torgersen’s talk “Why you should take another look at C#.” Afterward I asked him about the efficiency of C# on Linux. When I last looked into it, it wasn’t good. A few years ago I asked someone on my team to try running some C# software on Linux using Mono. The code worked correctly, but it ran several times slower than on Windows.

Mads said that now with .NET Core, C# code runs about as fast on Linux as on Windows, maybe a few percent slower on Linux. Scott Hanselman joined the conversation and explained that with .NET Core, the same code runs on every platform. The Mono project duplicated much of the functionality of the .NET framework, but it was an entirely independent implementation and not as efficient.

I had assumed the difference was due to compiler optimizations, or the lack thereof, but Scott and Mads said that the difference was mostly in the implementation. There are some compiler optimizations that are better on the Windows side, so C# might run a little faster on Windows, but the performance is essentially the same on both platforms.

I could recommend C# to a client running Linux if there were a 5% performance penalty, but a 500% penalty was a show-stopper. Now I’d consider using C# on a project where I need more performance than Python or R but want something easier to work with than C++.

Years ago I developed software with the Microsoft stack, but I moved away from Microsoft tools when I quit doing the kind of software development the tools are geared for. So I don’t write C# much any more. It’s been about a year since I had a project where I needed to write C# code. But I like the C# language. You can tell that a lot of thought has gone into the design, and the tool support is great. Now that the performance is better on Linux I’d consider using it for numerical simulations.

13 thoughts on “Efficiency of C# on Linux”

  1. Depending on your needs, you may get decent performance from Python using Numba (if your task is mostly numerical). Of course dealing with deploying additional packages is another can of worms.

  2. Python is usually fast enough for me. When it’s not, I’ve used C++. Haven’t had much luck making Python faster.

    I tried Numba recently on a numerical project, but it didn’t help. Maybe I wasn’t using it well, or maybe my project didn’t have the right form even though it was numerical. I’ve heard great things about Numba, and I’m sure it can help. I’d try it again. But I’m not willing to invest much time optimizing Python when I could just move over to a semicolon language and know it’ll be faster.
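    For readers who haven’t tried it, a minimal sketch of the kind of tight numerical loop Numba tends to accelerate well. The function and numbers are purely illustrative, and the code falls back to plain Python if numba isn’t installed, so it runs either way:

```python
# Sketch: the loop-heavy numerical pattern Numba's JIT handles best.
# Assumption: numba may or may not be installed; fall back gracefully.
try:
    from numba import njit  # JIT-compiles the decorated function
except ImportError:
    def njit(func):  # no-op fallback: run as ordinary Python
        return func

@njit
def sum_of_squares(n):
    # A tight loop over scalars: this is where the speedup shows up.
    total = 0.0
    for i in range(n):
        total += i * i
    return total

print(sum_of_squares(4))  # 14.0
```

    Whether this helps depends heavily on the shape of the code: Numba shines on scalar loops like this one and does much less for code dominated by Python objects or library calls.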

  3. PyPy! The PyPy performance should be the standard of comparison before leaving Python to obtain faster performance. Even when I don’t plan to deploy via PyPy, it’s always part of my test suite, just in case.

    I build scientific instruments and devices. When I started 30 years ago, maximum performance required hand-crafted assembler on bare-metal 2 MHz 8-bit embedded systems. Now I run Linux+Python/PyPy on cheap multi-core 2 GHz 32/64-bit embedded systems with GPUs.

    The assembler work I did 30 years ago is beyond useless. My C/C++ code from that era is suitable only for archival reference. The Python code I wrote a decade ago still lives, evolves, and thrives.

    Swapping in a faster CPU/GPU typically costs less than the equivalent of a day’s pay for a programmer. Even doing a custom hardware respin to upgrade performance can cost less than a significant language porting effort.

    As a proof-of-concept, I recently upgraded an older embedded Linux instrument with a $35 RasPi3 wearing a couple HATs. Eyebrows were raised. Then I told them about RasPi clusters.

    Moore’s Law is a programmer’s BFF. Always look closely at the hardware options before changing languages, or worse, considering a multi-language implementation.

    I do some consulting with research scientists who need to write their own target-system code, generally to transition them away from Matlab (and the cost of deploying Matlab licenses). The simple fact that Python provides matplotlib and numpy is often enough to make them highly motivated, since it clearly shows the research and deployment environments can have great overlap without a 100% relearning curve.

    I’ve never (yet) had to reveal the existence of the Matlab Engine API for Python. But I always show them PyPy.

  4. I’m really chuffed about this.

    I’ve been working in Python for over a decade, and recently I’ve had to write C# for a project I’ve been working on. I found it to be a surprisingly easy transition (took me 3 weeks to achieve reasonably high proficiency in C#). Python taught me a lot about higher-level abstractions, so I was able to jump right into LINQ, collections, etc. Moving from a dynamically-typed language (Python) to a statically typed one (C#) required me to learn a couple of new tricks (generics, nullable types, dependency injection, etc.) because the problems that these solve in the latter did not exist in the former. Python doesn’t have the same guardrails, which makes it an unusually effective and productive language.
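    The LINQ-to-Python correspondence mentioned above can be sketched roughly as follows; the data is made up for illustration, and the C# equivalents in the comments are only approximate:

```python
# Illustrative data for a LINQ-style filter/map pipeline.
numbers = [1, 2, 3, 4, 5, 6]

# C# (LINQ, approximate): numbers.Where(n => n % 2 == 0).Select(n => n * n).ToList()
evens_squared = [n * n for n in numbers if n % 2 == 0]

# C# (LINQ, approximate): numbers.Sum(n => n * n)
total = sum(n * n for n in numbers)

print(evens_squared)  # [4, 16, 36]
print(total)          # 91
```

    Comprehensions and generator expressions map closely enough onto Where/Select/Sum that the conceptual model transfers directly; the main new thing on the C# side is that the pipeline is statically typed.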

    Statically typed languages are much less permissive, and a good IDE is almost a necessity to be productive (I can’t see myself writing C# in a plain text editor). But the payoff is great when refactoring: I can reliably rename functions and methods across multiple files, extract methods and interfaces, and have the IDE do a bunch of things that would have been laborious to do correctly in Python. C# is a wonderfully designed statically typed language, and IntelliSense is very impressive. I’ve known of Anders Hejlsberg since the Delphi days, and the man just has good design taste. C# is actually pleasant to write.

    The only downside of C# is the dearth of good free numerical/data manipulation libraries. I know about Math.NET, DataTables (+LINQ) and such, but they’re not really as mature as numpy and pandas.

  5. The post does indict the Mono .NET implementation. C# without .NET is a shiny car with no engine or transmission. Not to condemn the Mono folks: MS has put a huge effort into C# and .NET that obviously isn’t easy to duplicate. Kudos to them for publishing the .NET source, and for creating the C# language. C# and LINQ are the two best things to come out of MS. Ever.

  6. Hi John,

    Based on what you know, would Go be a viable alternative to the numerical simulation problem space that you wanted to use C# for?

  7. Interesting.

    I was just asked to look at Xplatform development for simple (educational) tablet apps (Apple, Android, Windows) and Mono and Xamarin popped up a lot in my search.

    I’ve never used C#, but it’s been so long since I used C++ for anything but toy projects that I’m seriously considering it: I suspect the up to speed learning time would be about the same.

  8. Hmm… It seems SymPy was recently ported to C#.

    I looked over some C# code I did several years ago, and my main concern is its Java-esque verbosity. Have more recent C# language updates improved the situation?

  9. It’s been a long time since I’ve written Java, but my impression is that C# is not as verbose as Java. And some newer C# features do let you write more compact code. But certainly the two languages are similar.

  10. Denny: I haven’t had a chance to use Go, but I’d like to. I like the philosophy of the language, and I’ve talked to people who enjoy working in it. The only complaints I hear are from people who say it lacks some feature they want, and that’s a good sign: I like the idea of a language that’s too simple for some projects rather than being all things to all people.
