How to convert a string to a boolean (the right way)

There are a couple of ways to convert a string variable to a boolean variable in JavaScript. However, you have to be careful when doing this: it's easy to mess up this sort of logic, and getting it wrong can result in some nasty bugs. So, to save you a headache or two, I've detailed how to do it properly in this article. Read on if you're interested.

1) You might be tempted to say if(myString)…

…and this would appear to work, for a while. However, it’s really the wrong way to go about this. Observe:

var myString = "true";
if (myString) {
    // this evaluates to true because myString is a non-empty string. Good!
}

var myOtherString = "false";
if (myOtherString) {
    // uh oh! This evaluates to true as well. Why? Because if (myOtherString)
    // is checking whether myOtherString *exists*, not whether it's *true*.
}

As I mentioned, if(myOtherString) evaluates to true because that statement only checks whether the string is truthy (i.e. not undefined, null, or an empty string), not whether its contents are "true".
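To make the distinction concrete, here's a quick sketch (the variable names are my own) of which values JavaScript actually treats as falsy. Note that every non-empty string, including "false", is truthy:

```javascript
// Every value in this list is falsy -- an if statement treats them all as false:
var falsyValues = [false, 0, "", null, undefined, NaN];

// Any non-empty string is truthy, even one that spells out "false":
var myString = "false";
if (myString) {
    console.log("still truthy!"); // this line runs
}
```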

2) What about creating a boolean object from the string?

Why not try to create a boolean from the string with Boolean()? Well, you'd run into an issue similar to the previous problem. Let's let some example code do the talking; have a look:

var myString = "true";
if (Boolean(myString)) {
    // very good, this if statement evaluates to true correctly!
}
if (!Boolean(myString)) {
    // and this one evaluates to false! Our problems are solved!
}

var myOtherString = "false";
if (Boolean(myOtherString)) {
    // ...or are they? This evaluates to true, although we clearly set it to "false"!
}

As you can see, if(Boolean("false")) evaluates to true. Why? When you pass a string to Boolean(), it doesn't check whether the string equals "true" or "false". Instead, rather misleadingly, it checks whether the value is truthy, i.e. not one of the falsy values (0, undefined, an empty string, null, NaN, or false itself). Because myOtherString is not an empty string, Boolean() returns true, even though the string we stored is "false".
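A side note worth knowing: Boolean(x) called as a function returns a boolean primitive, while new Boolean(x) creates a wrapper object, which is even more treacherous, because objects are always truthy. A short sketch:

```javascript
// Boolean() as a function converts to a primitive:
console.log(Boolean("false"));        // true -- non-empty string
console.log(typeof Boolean("false")); // "boolean"

// new Boolean() creates a wrapper *object*, which is always truthy,
// even when it wraps false:
var wrapped = new Boolean(false);
if (wrapped) {
    console.log("objects are always truthy"); // this line runs!
}
console.log(wrapped.valueOf()); // false -- the primitive inside
```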

3) Right, let’s try comparing our string against the string “true”

Really, the correct way to go about this is to check whether our string equals "true": if it does, our string is obviously "true"; otherwise, it must be false. We can do it this way:

var myString = "true";
if (myString == "true") {
    // this evaluates to true correctly; myString is "true"
}
if (myString == "false") {
    // this evaluates to false, also correct, since myString doesn't equal "false"
}

Wonderful, it looks like our problem is solved! But, you know something? The above code is kind of messy and a bit long just to check if our string is “true” or not. Let’s see if we can’t clean it up a bit:

myString = (myString == "true");

Ahh, nice and clean. Just the way I like it! (Note: the parentheses are just there for clarity; if you don't like them, or you're especially concerned about line length, removing them won't cause any errors.)
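If your input might arrive as "True", " true ", or even an actual boolean, the one-liner can be made a bit more forgiving. This is just a sketch; the name stringToBoolean and the normalization steps are my own additions, not something the approach above requires:

```javascript
// A hypothetical helper that normalizes the input before comparing.
// String() handles actual booleans, trim() strips stray whitespace,
// and toLowerCase() accepts "True"/"TRUE" as well.
function stringToBoolean(value) {
    return String(value).trim().toLowerCase() === "true";
}

stringToBoolean("true");   // true
stringToBoolean(" True "); // true
stringToBoolean(true);     // true  (String(true) is "true")
stringToBoolean("false");  // false
stringToBoolean("yes");    // false
```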

15 thoughts on “How to convert a string to a boolean (the right way)”

    1. Why not remove a few of those non-string case conditions by just saying “value = value.toString()” on the first line of the function?

  1. do you see value in casting myString to lower so it is a little more robust?

    myString = (myString.toLowerCase() == "true");

    1. Well, yes, that’s technically an improvement, although very slight. Realistically, though, it’s not worth the time you spend typing it. Plus, it encourages typing “True” instead of “true”, which could become a source of confusion.

  2. Something is wrong with your implementation. You say:
    var myString = "true";
    would be evaluated right, no matter what, but how does this work when I initialize myString = true ??? It won't!

    The right implementation that would work no matter if you initialize your variable with the Boolean true|false or with the String true|false would be:

    var myString = X; // where X could be any "true"|"false"|true|false
    will always evaluate right. Right?

  3. var myString = "true";
    if (!myString) {
    // uh oh! This evaluates to true as well. Why?

    That's wrong. !'true' evaluates to FALSE because the string 'true' exists.

    The correct example should be: !'false'

    !'false' evaluates to FALSE, because it evaluates whether the string 'false' exists and negates the value

  4. Normally I would just ignore this post but it is well ranked on Google for me so I will post a correction for the sake of others who come here. This makes absolutely no sense and your code will confuse the hell out of anybody who needs to maintain it (including probably yourself).

    Just to illustrate what is wrong here, javascript has a very simple set of rules to define how variables are cast when doing loose equality checking (==) [1]. This method disobeys the language’s standard rules:

    function castToBool (inp) {
    var bool = (inp == "true");
    return bool;
    }

    castToBool("true"); // returns true. weird, but OK...
    castToBool(true); // returns false. WTF?
    // javascript is a loosely typed language, so this
    // should work, but does not.

    // stupid corner case, but for demonstration:
    var obj = {};
    obj.toString = function () { return 'true'; };
    castToBool(obj); // returns true

    What you are actually looking for here is strict equality checking [2]. You are overriding the language’s behavior with your own string-based logic. You are not casting to bool, you are trying to see if a given variable is exactly the string “true”. This is fine (in fact, lots of JS gurus say you should never use the loose checking operator), but you need to be explicit about it. This is much simpler and easier to understand:

    myString === "true"



  5. Cool, thanks for this info. I was wondering why it wasn’t working, but this makes perfect sense. Thanks again and good luck.
