Here’s a simple TypeScript function:

function f() {
  const a = 2
  const b = 'a string'
  return a + b
}

What I expected:
TypeScript can infer types. It will recognize that a is of type number and b is of type string.

I expected a type error, because I'm trying to add a string to a number.

You can also be more explicit and tell TypeScript the types, like so:

function f() {
  const a: number = 2
  const b: string = 'a string'
  return a + b
}

What happened instead:
TypeScript (like JavaScript) coerces the types. So it coerces a to a string and then concatenates a and b. The output is '2a string'.

function f() {
  const a: number = 2
  const b: string = 'a string'
  return a + b
}
> '2a string'

Or rather: JavaScript’s (and thus TypeScript’s) addition operator (+) “produces the sum of numeric operands or string concatenation.” [1] Using + with a number and a string will therefore do string concatenation.
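
A few plain JavaScript expressions show the rule (results as comments):

2 + 3          // 5, both operands are numbers, so we get a sum
2 + 'a string' // '2a string', one operand is a string, so we get concatenation
'2' + 3        // '23', same rule, the number is coerced to a string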

WTF!? Where is my type safety?

If you want the compiler to raise a type error, you have to state the return type of the function.

function f(): number {
  const a: number = 2
  const b: string = 'a string'
  return a + b
}
> Type 'string' is not assignable to type 'number'.

Now the function expects a number as output, but TypeScript still concatenates the strings inside the function and returns a string. The compiler finally complains that something doesn’t add up.
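
The annotation doesn’t have to be on the function, either. Anywhere you tell TypeScript that a number is expected, the compiler catches the mismatch; here’s a minimal sketch with a helper variable (the name c is just for illustration):

const a: number = 2
const b: string = 'a string'
const c: number = a + b
> Type 'string' is not assignable to type 'number'.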

By the way, OCaml and ReasonML are much stricter: the plus operator (+) only works on integers and never converts types.

let f = () => {
  let a = 2;
  let b = "a string";
  a + b;
};

f();

> This has type:
>  string
> But somewhere wanted:
>  int
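
If the concatenation is what you actually want, ReasonML makes you ask for it explicitly; a minimal sketch using string_of_int for the conversion and ++ for string concatenation:

let f = () => {
  let a = 2;
  let b = "a string";
  string_of_int(a) ++ b;
};

f();

> "2a string"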