juSensor is a lightweight IDE (Integrated Development Environment) for the Julia language. It is fast and easy to use.
juSensor configures the entire Julia development environment automatically, which makes it well suited for beginners. After installation, users can write and run Julia programs instantly. It is a zero-configuration IDE.
juSensor supports Julia syntax highlighting and has an easy-to-use code editor with a concise user interface.
Other features available:
--fast
--open to use, zero configuration
--Julia syntax highlighting
--no code completion
--Find/Replace
--built-in terminal
--code comment/uncomment
What initially attracted me to Julia was its elegant functional syntax and powerful abstraction capabilities. In particular, LinearAlgebra.jl can be considered one of the best linear algebra libraries available today. Its wrapping of BLAS and LAPACK is nothing short of exemplary — one might even call it an art form. If you’ve ever worked with the interfaces of BLAS and LAPACK, you might be amazed at how those standard interfaces defined decades ago seem as if they were just waiting for LinearAlgebra.jl to come along.
However, as I gradually started using Julia in my work, I discovered that it’s not well-suited for large-scale software development. The main reason for this is the lack of full-fledged object-oriented programming (OOP) features.
Indeed, Julia's multiple dispatch mechanism is, in some ways, more expressive than OOP. But OOP is a capability that large engineering projects really need.
Let me give you an example. Suppose I want to define an Isotope class, which has attributes for the number of protons (Z) and neutrons (N):
struct Isotope
    Z::Int  # number of protons
    N::Int  # number of neutrons
end
Now suppose we want to get the total number of nucleons. Writing iso.Z + iso.N feels too crude, so you'd naturally want to define a function for this. The problem arises when this isn't your own package — can you guess what name the package author might have chosen for this function? Possible names I can imagine include: getA, getNucleonNumber, getMassNumber, their underscored variants, or versions without the get.
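A minimal sketch of the situation (mass_number is a name I am inventing here; the real package could have picked any of the candidates above):

```julia
struct Isotope
    Z::Int  # number of protons
    N::Int  # number of neutrons
end

# One of many plausible spellings a package author might have chosen;
# "mass_number" is a hypothetical name used only for illustration.
mass_number(iso::Isotope) = iso.Z + iso.N

iso = Isotope(2, 2)          # helium-4
println(mass_number(iso))    # 4

# Discovering such functions after the fact: `methodswith` lists methods
# whose signatures mention Isotope, but only across modules already loaded,
# so it is no substitute for the method list an editor gives you after `iso.`.
using InteractiveUtils
println(methodswith(Isotope))
```

`methodswith` is the closest built-in tool to "show me everything defined on this type", and its limits are part of the discoverability problem described above.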
In C++, once you type iso., your editor would likely auto-complete the method for you. But in Julia, you must consult the package documentation. This becomes even worse in collaborative environments — it's hard to remember what functions others have written and what names they chose.
This issue isn’t very noticeable in domains like linear algebra, where there are well-established standards. In fact, Julia’s multiple dispatch can significantly enhance abstraction power in such cases. After all, when you see a matrix and want to compute eigenvalues, chances are there's always an eigen() function ready for you. However, did you know that you can't pass a complex-valued symmetric matrix (a Symmetric wrapper around a complex Matrix) into eigen()? You only discover this through trial and error; even reading all the documentation won't necessarily help, because the documentation is usually written at a high level of abstraction.
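The eigen point can be seen in a few lines. The complex-symmetric failure is the behavior reported above, so the attempt is wrapped in a try/catch rather than asserting a particular error type:

```julia
using LinearAlgebra

# The generic interface: eigen/eigvals "just work" for ordinary dense matrices.
A = [2.0 1.0; 1.0 2.0]
println(eigvals(A))   # eigenvalues of a real symmetric matrix: 1.0 and 3.0

# A complex *symmetric* matrix (not Hermitian), wrapped in Symmetric.
B = Symmetric([1.0 + 2.0im  3.0;
               3.0          4.0im])
try
    eigen(B)   # per the text above, this case is not supported
catch err
    println("eigen failed for Symmetric{ComplexF64}: ", typeof(err))
end
```

Nothing in the signature of eigen() tells you in advance which wrapped types are supported; you find out at runtime.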
This is essentially what many people mean when they complain that Julia lacks interfaces — it's closely related to the lack of OOP. Multiple dispatch allows polymorphism beyond just the first argument. For instance, Julia overloads output behavior via the show function. This makes designing formal interfaces difficult, so you often end up relying on runtime errors to determine whether a certain feature is supported.
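The show-based overloading mentioned above looks like this (a minimal sketch, reusing the Isotope struct from earlier):

```julia
struct Isotope
    Z::Int
    N::Int
end

# Customizing printing: you overload Base.show for your type. There is no
# formal interface declaration to implement; you simply define a method
# with the right signature and rely on convention.
function Base.show(io::IO, iso::Isotope)
    print(io, "Isotope(Z=", iso.Z, ", N=", iso.N, ")")
end

println(Isotope(8, 8))   # prints Isotope(Z=8, N=8)
```

Nothing checks that you implemented the "printable" contract correctly; if you pick the wrong signature, you only notice when output looks wrong at runtime.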
To sum up, Julia is perhaps best seen as a teaching language that feels very comfortable in a REPL environment — ideal for MATLAB-style development.
I mention all of this because I recently wanted to rewrite a Fortran codebase I'm maintaining in Julia. And this isn’t the first time I’ve tried — I’ve written quite a bit of code, but each time it just felt... off. It's a large-scale computational program with roughly the following structure:
1. Read configuration and input files, which contain various data formats. There is no industry-standard format, and we ourselves have several versions.
2. Preprocess the input files to prepare data for the computational model. These data types vary widely — integer arrays, floating-point arrays, enums, etc.
3. Run a multi-step computational model. For efficiency, we need to pre-allocate several buffers — a classic trade-off between time and memory.
4. Produce results and write them to output files.
For such a program, I find it hard not to wrap preprocessing data into something like an XXXConfig object, and the computational model into an XXXModel object. You wouldn’t want every one of the dozens of functions involved in the computation to take parameters like bufferA::Vector{Int}, bufferB::Matrix{Float64}, and so on. That kind of coding quickly devolves into a nightmare — like writing spaghetti code all over again.
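A sketch of the compromise being described, with all names (SimConfig, SimModel, run_step!) invented purely for illustration:

```julia
# Hypothetical names; this only sketches the "wrap state in structs" approach.
struct SimConfig
    nsteps::Int
    dt::Float64
end

# Pre-allocated buffers live inside the model object instead of being
# threaded through dozens of function signatures.
mutable struct SimModel
    config::SimConfig
    bufferA::Vector{Int}
    bufferB::Matrix{Float64}
end

SimModel(cfg::SimConfig, n::Int) =
    SimModel(cfg, zeros(Int, n), zeros(Float64, n, n))

# Each step takes only the model: manually simulated OOP, C-style.
function run_step!(m::SimModel, step::Int)
    m.bufferA .= step            # stand-in for real computation
    m.bufferB .+= m.config.dt
    return m
end

m = SimModel(SimConfig(10, 0.1), 4)
for s in 1:m.config.nsteps
    run_step!(m, s)
end
println(m.bufferA)   # [10, 10, 10, 10]
```

This works, but as the next paragraph notes, it amounts to hand-rolling what an OOP language gives you directly: the struct carries the state, and free functions play the role of methods.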
Of course, you might say, "Julia still has struct." True — that's a possible compromise. But then, aren't we just manually simulating OOP like in C? It's hard to believe a modern programming language should force us to write code this way. Hell, even Fortran added OOP syntax back in the 2003 standard.
In conclusion, I think Julia is not suitable for large programs. It excels in interactive development environments like Jupyter, but these days, Python with NumPy, SciPy, Pandas, etc., dominates that space. And when it comes to neural networks, the gap is even wider.
Still, I will continue using Julia for my computational physics column. I believe Julia is still an excellent language for teaching — far superior to MATLAB. And for writing simple algorithms, it’s more concise and efficient than Python.