I've been running into a lot of happy and excited scientists lately. "Running into" in the virtual sense, of course, as conferences and other opportunities to collide with scientists in meatspace have been all but eliminated; most scientists, after all, believe in the germ theory of disease.
Anyway, these scientists and mathematicians are excited about a new tool. It's neither a new particle accelerator nor a supercomputer. Instead, this exciting new tool for scientific research is... a computer language.
How can a computer language be exciting, you ask? Surely, some are better than others, depending on your purposes and priorities. Some run faster, while others are quicker and easier to develop in. Some have a larger ecosystem, allowing you to borrow battle-tested code from a library and do less of the work yourself. Some are well-suited to particular types of problems, while others are good at being general-purpose.
For scientists who compute, languages have always been important, along with the quality of compilers and libraries and, of course, the machines they run on. For those whose job it is to simulate the atmosphere or design nuclear weapons, Fortran was the traditional tool of choice (and still often is, although it has more competition now). That language has dominated the market because compilers are available that can take good advantage of the largest supercomputers. For the current breed of data scientists, Python is popular because of the momentum of its ecosystem, its interactivity, and its rapid development cycle.
Six years ago, I wrote in these pages about the enduring prominence of Fortran for scientific computing and compared it with several other languages. I ended that article with a prediction that, in 10 years, a new language called Julia stood a good chance of becoming the one that scientists would turn to when tackling large-scale numerical problems. My prediction was not very accurate, though.
It actually only took Julia about half that time.
From Ars Technica