What Makes a Scripting Language?

By Deane Barker on November 9, 2005

I have a friend who programs in .Net (I referred to him in a prior post). We’ll call him “Trevor.”

Trevor and I have epic arguments about the superiority of our chosen platforms. I use open-source, “scripting” languages — PHP and Ruby, mainly — while Trevor develops in .Net.

Trevor has the habit of calling me a “[blank] scripter.” Usually the blank is filled with “crappy” or something else unprintable — he’s entertaining if nothing else.

However, this running argument got me thinking about what makes something a “scripting” language, and not a “programming” language. I’m sure this is not an original question, but I nevertheless came up with what I think are the two key differences:

  1. Strongly vs. Weakly Typed
    “Scripting” languages are usually dynamically typed (often loosely called “weakly typed”), meaning any variable can hold a value of any type. “Programming” languages are usually statically typed, meaning you have to declare what a variable can hold before you put something in it.

  2. Compiled vs. Interpreted
    “Scripting” languages are distributed as source, which is interpreted (or compiled to bytecode) at runtime; the extra step makes them slower to execute. “Programming” languages are compiled to machine code ahead of time.
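To make the first distinction concrete, here is a minimal Ruby sketch (Ruby being one of the “scripting” languages named above); the variable and method names are illustrative only:

```ruby
# Dynamic typing: a variable is never declared with a type,
# and can be rebound to a value of any type.
x = 42
x = "forty-two"    # legal; the type travels with the value, not the variable

# Type errors still exist, but they surface at runtime, not compile time.
def shout(value)
  value.upcase     # works for anything that responds to #upcase
end

shout("hello")     # => "HELLO"
# shout(42)        # would raise NoMethodError only when this line runs
```

A statically typed language would reject the equivalent of `shout(42)` before the program ever ran; here the mistake is only caught when the offending line executes.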

Put another way, if PHP were strongly typed and pre-compiled, would it be considered a “programming” language, rather than a “scripting” language? Could I then tell Trevor to bite me?

Where is the line? I’m interested in opinions. I’m sure this question has been asked and answered before — can anyone point to some good resources on the subject?

In the process of looking at this, I found a great page on scripting at Wikipedia. Some quotes:

[…] it is usually faster to program in a scripting language, and script files are typically much smaller than, say, equivalent C program files. The flip side can be a performance penalty: scripting languages, often interpreted, may be significantly slower to execute and may consume more memory when running. In many relevant cases, however, e.g. with small scripts of some tens of lines, the write-time advantage far outweighs the run-time disadvantage. Also, this argument gets stronger with rising programmer salaries and falling hardware costs.

However, the boundary between scripting languages and regular programming languages tends to be vague, and is blurring ever more with the emergence of new languages and integrations in this fast-changing area. In some scripting languages, an experienced programmer can accomplish a good deal of optimization if they choose.


Comments

  1. Scripting languages don’t suck. I’ve programmed quite a bit in Ruby w/Rails lately (probably more than you, Deane) and I love it. PHP does suck, however, for two reasons: #1 because you’re in love with it, and #2 because I have to give you crap for crying about how much .NET sucks all the time. If you weren’t such a crappy scripter, you’d realize how wicked awesome .NET is.

  2. probably more than you, Deane

    Probably true. For all my talk about it, I think I’ve only built one app with it.

    Joe is the Rails ninja around here.

  3. It may be just me, but there’s something deeply odd about the above discussion.

    …it is usually faster to program in a scripting language…

    In my book, a scripting language should refer specifically to something that exists solely to instruct another high-level abstraction to do stuff. A programming language should instruct the OS layers at least, if not the machine itself, to do stuff. Period. And the former doesn’t include instructing a layer that exists specifically to enable access to the OS or lower instruction layers. Again, that’s programming.

    “Hunh?”, you ask?

    I’d think of a scripting language as something like, oh, the application layer of AppleScript, or Flash, assuming the flash environment is qualifying as a VM. If I use something to, for example, tell Photoshop to invert a color table, or I use Excel’s scripts to doohickey up a bunch of accounting data into sumthin’ pretty, I’m scripting.

    If I ruby together a wicked calculator that converts the NYSE ticker into ASCII art images of fluffy bunnies at play in a manner that TCLs my fancy, I’m programming.

    In short, if I’m simply talking to a shell, an app, or a widget, and asking it please, sir, may I have another, I’m scripting, or speaking, or rather, having a dialogue. If I’m creating something intended to independently accomplish some task, even though it may include asking the OS to help out a bit, I’d argue that I’m programming.

    Alternatively, one might consider that scripting is button-pushing, whereas programming is the implementation of that which the buttons do.

    I was gonna toss out something regarding algorithms, but that’s, in retrospect, silly. Scripting can include just as much creative arrangement of buttons, flow control, etcetera as programming. It’s just that scripting simply does so in order to tell something else to do something it already knows how to do.

    I recognize the deep pit over which I’m currently dangling. I’m drawing a relatively arbitrary boundary somewhere in the vicinity of the OS. Which I can do. Because… Um… I haven’t had much sleep.
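The boundary the comment above draws can be sketched in a few lines of Ruby. This is only an illustration of the commenter's distinction; the shell command and the method are assumptions for the example, not anything from the post:

```ruby
# "Scripting" in the commenter's sense: asking something that already
# exists (here the Unix `wc` utility) to do work it already knows how to do.
line_count = `printf 'a\\nb\\nc\\n' | wc -l`.to_i   # drives an external program

# "Programming" in the commenter's sense: implementing the behavior
# yourself, so the task is accomplished independently.
def count_lines(text)
  text.count("\n")
end

count_lines("a\nb\nc\n")   # => 3
```

Both snippets are written in the same language, which is the commenter's point: the scripting/programming line falls on what the code instructs, not on what it is written in.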

  4. …assuming the flash environment is qualifying as a VM…

    I mean, of course, …isn’t qualifying…

    Or something…

    And I’m desperately hoping I can reread this after a nap, and still know what I meant.

  5. I think you’re looking at this from an entirely different direction. I was considering syntax and runtime mechanics. You’re considering intended functionality. Different things.

  6. Actually, again, I’d argue that intended functionality should be the distinguishing characteristic. Structures, flow control elements, typing, compilation, etcetera, in my mind aren’t, or rather, in an ideal world in which I am king, wouldn’t be, relevant. (Although I’d have a hard time seeing compilation as likely for what I consider scripting.)

    That’s assuming, of course, that we’re using the concepts “programming” and “scripting”. Perhaps a different vocabulary would be more appropriate.

    Forgive me for requoting the same block in your post, but it very clearly illustrates the semantic issue I’m ranting about. A shift in vocabulary would simplify discussion and avoid the pits of despair we’re heading into with the current distinctions. Read the following from a functionality perspective, and it’s just, well, silly:

    it is usually faster to program in a scripting language, and script files are typically much smaller than, say, equivalent C program files.
    [Note the use of “…can be…,” “…often interpreted…,” “…may be…,” “…may consume…” balanced by “In many relevant cases, however…”]
    The flip side can be a performance penalty: scripting languages, often interpreted, may be significantly slower to execute and may consume more memory when running. In many relevant cases, however, e.g. with small scripts of some tens of lines, the write-time advantage far outweighs the run-time disadvantage. Also, this argument gets stronger with rising programmer salaries and falling hardware costs.

    However, the boundary between scripting languages and regular programming languages tends to be vague, and is blurring ever more with the emergence of new languages and integrations in this fast-changing area. In some scripting languages, an experienced programmer can accomplish a good deal of optimization if they choose.

    When I’m working in PHP, Ruby, Python, etcetera, in my mind I’m programming. When I’m creating, say, Word or Excel macros, I’m scripting. And, ideally, we should, I feel, focus our use of vocabulary on a distinction based on intended use.

    Of course, I still get annoyed when someone mentions organic tomatoes. It’s just wrong. So wrong. Irrespective of common usage.

    … … …

    That’s not to mention that my approach lets you tell Trevor to “bite me.” Which, as I understood you, was the whole point of the discussion. Work with me, here.

  7. Okay — then you rename the two “levels” of language. What would you name the category of languages like PHP and Ruby, and what would you name the category like .Net and Java?

    Then tell me where the line between those two categories is.

  8. It’s tempting to play the interpreted vs compiled card, but that’s getting blurry, as well.

    The problem, for me at least, lies in what is considered programming, versus that of scripting. As I read it, Trevor’s argument is that you’re not actually programming. [I did say Trevor’s argument] Which is, I’m assuming based on the level of discussion you offer on Gadgetopia, crap.

    This all goes back to the PHP: The Camaro of Programming Languages post.

    Trevor’s sticking hard to this stance:

    So when you say “I’m a .Net programmer,” you’re implying that you’re an advanced programmer who understands and implements all the good programming architecture that .Net enforces.

    As I see it, the scripting versus programming nomenclature might better be served up as, hmm, perhaps enforced versus open, or institutional versus open, or strict versus flexible. Man, this is a good question. I’m gonna obsess on this for a bit. Thanks a whole heckuva lot, Deane. Just what I need.

    It also brings up an interesting peripheral question:

    What is it that makes some names, phrases, terms in general “catchy,” such that folks immediately adopt them, that “of course” response, versus those that sort of clunk and turn people away? In short, what makes an effective meme label? Because the meme here is fairly clear. What to label the thing is the problem.

    Seems like that’s the root of the issue. Scripting and programming are simply easier words to hold, “cooler,” as it were, whether they apply to the actual question or not. Regardless of what Trevor says, there’s no question that PHP is something within which one programs.
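The “interpreted vs. compiled is getting blurry” point raised in comment 8 can be made concrete. A minimal sketch, assuming a modern CRuby (1.9 or later), whose `RubyVM::InstructionSequence` API exposes the bytecode this “scripting” language compiles source to before executing it:

```ruby
# CRuby does not walk source text at runtime: it first compiles
# each chunk of Ruby to YARV bytecode, then executes that.
iseq = RubyVM::InstructionSequence.compile("1 + 1")
puts iseq.disasm   # human-readable bytecode listing
puts iseq.eval     # => 2
```

So even an archetypal “scripting” language has a compile step; it just happens invisibly at startup rather than as a separate build phase, which is why the compiled/interpreted line makes a poor category boundary.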

Comments are closed. If you have something you really want to say, email editors@gadgetopia.com and we’ll get it added for you.