- cross-posted to:
- [email protected]
If you’re consciously and intentionally using JavaScript like that, I don’t want to be friends with you.
This is my favorite language: GHC Haskell
GHC Haskell:
```haskell
ghci> length (2, "foo")
1
```
Wait, now I need to know why.
* some time later *
I went to check why the hell this happened. It looks like the pair (`(,)`) is defined as an instance of `Foldable`, for some reason, which is the class used by functions like `foldl` and `foldr`. Meanwhile, tuples of higher order (triples, quadruples, …) are not instances of `Foldable`.

The weirdest part is that, if you try to use a pair as a `Foldable`, you only get the second value, for some reason… Here is an example:

```haskell
ghci> foldl (\acc x -> x:acc) [] (1,2)
[2]
```
This makes it so that the returned length is 1.
Oddly enough, in Haskell (as defined by the report), length is monomorphic, so it just doesn’t work on tuples (type error).
Due to the way kinds (types of types) work in Haskell, Foldable instances can only operate over (i.e. length only counts) elements of the last/final type argument. So, for (,) it only counts the second part, which is always there exactly once. If you provided a Foldable for (,) it would also have length of 1.
I don’t even know Haskell but it seems like (" ( , ) ") would be an instance of boob.
`(.)` is a valid expression in Haskell. It is the prefix form of the infix operator that does function composition:

```haskell
(.) (2*) (1+) 3
= ((2*) . (1+)) 3
= 2 * (1 + 3)
= 8
```

But the most common use of the word “boob” in my experience in Haskell is the “boobs operator”: `(.)(.)`. Its usage in Haskell is limited (though valid), but its appearance in racy ASCII art predates even the first versions of Haskell.

The pioneers of ASCII art in the 70s and 80s are the unsung heroes of porn.
It looks like two worms split running from another tinier worm. Makes you wonder what it has done to be so feared
[object Object][object Object]
The fun strings to enter in web forms once in a while.
If you mix types like that, it’s your own fault
BS. A language shouldn’t have operators that allow nonsensical operations like string concatenation when one operand is not a string.
Especially since + and - act differently. If + does string concatenation, - should also do some string action, or throw an error in this situation.
That’s the case in many languages, pretty much in all that don’t have a separate string concatenation operator.
Yeah, and almost all languages I know would then throw an exception when you try to use `-` with a string. And if they offer multiple operators that take a string and a number, they only ever perform string operations with them and never cast to a number type to do math operations. (E.g. some languages have `+` for string concatenation and `*` to repeat a string, so e.g. `"ab" * 2 => "abab"`. It’s a terrible idea to have `+` perform a string operation while `-` performs a math operation.)

Sure, but then your issue is with type coercion, not operator overloading.
Because there’s in fact no operator overloading happening, true, but that’s mostly an under-the-hood topic.
It should not happen no matter why it does happen under the hood.
Operator overloading for `string - string` is wrong, and type coercion that implicitly casts it to `int(string) - int(string)` is just as wrong.

There is operator overloading happening: the `+` operator has a different meaning depending on the types involved. Your issue, however, seems to be with the type coercion, not the operator overloading.

It should not happen no matter why it does happen under the hood.
If you don’t want it to happen either use a different language, or ensure you don’t run into this case (e.g. by using Typescript). It’s an unfortunate fact that this does happen, and it will never be removed due to backwards compatibility.
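For what it’s worth, the asymmetry being argued about here is easy to reproduce in any JS console; a minimal sketch:

```javascript
// "+" concatenates when either operand is a string;
// "-" always coerces both operands to numbers.
console.log("11" + 1);    // "111" (concatenation)
console.log("11" - 1);    // 10 (numeric coercion succeeds)
console.log("hello" - 1); // NaN (numeric coercion fails, silently)
```

Note that the failure case doesn’t throw; it just propagates `NaN` through every later calculation.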
- should also do some string action
Like what kind of string action?
`"Hello" + " world"` is what everyone can understand. Switch to `-` and it becomes pointless.
this is the “or throw an error” part
If you try what I wrote it will throw a NaN. I was asking about the first part of the proposal.
The NaN isn’t thrown. It’s just silently put into the result. And in this case it’s completely unintelligible: why would an operation between two strings result in a number?
`"Hello" - "world"` is an obvious programmer mistake. The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.

The main problem here is downward coercion. Coercion should only go towards the more permissive type, never towards the more restrictive type.

Coercing a number to a string makes sense, because each number has a representation as a string, so `"hello" + 1` makes intuitive sense.

Coercing a string to a number makes no sense, because not every string has a representation as a number (in fact, most strings don’t). `"hello" - 1` makes no sense at all. So converting a string to a number should be done by an explicit cast or a conversion function. Using `-` with a string should always result in a thrown error/exception.

The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.
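The explicit-conversion style being asked for can be sketched in plain JS (`toNumber` is a hypothetical helper name; `Number()` and `Number.isNaN()` are the standard built-ins):

```javascript
// Explicit conversion keeps failures visible instead of silently producing NaN.
function toNumber(s) {
  const n = Number(s);
  if (Number.isNaN(n)) throw new TypeError(`"${s}" is not a number`);
  return n;
}

console.log(toNumber("5") - 1); // 4
// toNumber("hello") - 1;       // throws TypeError instead of yielding NaN
```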
You basically defied the whole NaN thing. I may even agree that it should always throw an error instead, but… Found a good explanation by someone:
NaN is the number which results from math operations which make no sense
And the above example fits that.
`"hello" - 1` makes no sense at all.

Yeah, but actually there can be many interpretations of what someone could mean by that: increase the character code of the last symbol, or search for “1” and wipe it from the string. The important thing is that it’s not obvious what the person who wrote it really wants, without additional input.
Anyway, your original suggestion was about discrepancy between + and - functionality. I only pointed out that it’s natural when dealing with various data types.
Maybe it is one of the reasons why some languages use . instead of + for strings.
It’s not nonsensical, implicit type coercion is a feature of JavaScript, it’s perfectly logical and predictable.
JavaScript is a filthy beast, it’s not the right tool for every job, but it’s not nonsensical.
When you follow a string with a `+`, it concatenates it with the next value (converted to a string if needed). This makes sense, and it’s a very standard convention in most languages.

Applying arithmetic to a string would be nonsensical, which they don’t do.
You are entitled to your opinion. Implicit conversion to string is not a feature in most languages, for good reasons.
Sure. And you’re entitled to yours. But words have meaning and this isn’t MY OPINION, it’s objective reality. It follows strict rules for predictable output, it is not nonsensical.
You’re entitled to think it’s nonsense, and you’d be wrong. You don’t have to like implicit type coercion, but it’s popular and in many languages for good reason…
| Language | Implicit Coercion Example |
| --- | --- |
| JavaScript | `'5' - 1` → `4` |
| PHP | `'5' + 1` → `6` |
| Perl | `'5' + 1` → `6` |
| Bash | `$(( '5' + 1 ))` → `6` |
| Lua | `"5" + 1` → `6` |
| R | `"5" + 1` → `6` |
| MATLAB | `'5' + 1` → `54` (ASCII math) |
| SQL (MySQL) | `'5' + 1` → `6` |
| Visual Basic | `'5' + 1` → `6` |
| TypeScript | `'5' - 1` → `4` |
| Tcl | `"5" + 1` → `6` |
| Awk | `'5' + 1` → `6` |
| PowerShell | `'5' + 1` → `6` |
| ColdFusion | `'5' + 1` → `6` |
| VBScript | `'5' + 1` → `6` |
| ActionScript | `'5' - 1` → `4` |
| Objective-J | `'5' - 1` → `4` |
| Excel Formula | `"5" + 1` → `6` |
| PostScript | `(5) 1 add` → `6` |
I think JavaScript is filthy, I’m at home with C#, but I understand and don’t fear ITC.
C# is filthy. But it explains where you got your warped idea of righteousness.
Also, you contradicted yourself just there: not a single one of your examples does string concatenation for these types. It’s only JS.
- In https://lemm.ee/comment/20947041 they claimed “implicit type coercion” and showed many examples; they did NOT claim “string concatenation”.
- However, that was in reply to https://lemmy.world/comment/17473361 which was talking about “implicit conversion to string” which is a specific type of “implicit type coercion”; NONE of the examples given involved a conversion to string.
- But also, that was in reply to https://lemm.ee/comment/20939144 which only mentions “implicit type coercion” in general.
So, I think probably everyone in the thread is “correct”, but you are actually talking past one another.
I think the JS behavior is a bad design choice, but it is well documented and consistent across implementations.
Read the thread again, it seems you slipped somewhere. This was all about the claim that implicit conversion to string somehow could make sense.
Lol. In a dynamically typed language? I will do this always, that’s why I am using it
You can have a dynamic language that is strongly typed to disallow stuff like this. Like Python for example
Aand what is your point?
It’s not that hard to understand what it is doing, and why the decision was made to make it do that. JavaScript has a particular purpose, and its mission is not consistency.
It’s not like TypeScript doesn’t exist if you just get lightheaded at the idea of learning JavaScript’s quirks and mastering using it despite them.
Scanned the article: neither mission, nor purpose, nor type coercion unga-bunga explained. Or was I expected to see the greatness of the language and be humbled by its glory and might?
Well then, rage against the machine for the next 30 years and see if they kill it in favor of a nice, strict language that everybody loves. Maybe you could suggest one here for consideration.
So, all you’ve mustered is some lame-ass whataboutism? Have a good day
So you don’t have a suggestion. Got it.
Of course. Nothing beats JS, oh guru mighty guru
So all you’ve mustered is some lame-ass ad-hominem? Have a good day
No, it just so happens I have a minute to talk about our lord and saviour JS. What is His holy and sacred mission?
Let’s fix it. I think that since we are removing the ones, then “11” - 1 should be equal to “”.
Should it, or should it be “1”? (just removing one, one)
Which “1” did it remove? And did it search the string to find a “1” to remove, or did it remove whichever character happened to be at array index 1?
The one at the end. Subtraction is the opposite of addition. If addition adds a character to the end of the string, it must follow that subtraction would remove a character from the end of the string.
This is how we end up with an endian schism
maybe we removed the last n characters
It should just randomly pick any “1”. Add a bit of spice, you know
Hear me out: “11” - 1 = “11” - (-1) = “11” (did not find “-1” in “11”)
Or
“11” - 1 = “11” - (-1) = “1” (removed first “1”)
What no type safety does to an MF…
It’s because `+` is two different operators and overloads based on the type to the left, while `-` is only a numeric operator and coerces left and right operands to numeric. But frankly, if you’re still using `+` for math or string concatenation in 2025, you’re doing it wrong.

I know nothing about JavaScript; what is wrong with using + for math? Perhaps naively, I’d say it looks suited for the job.
The correct way to do it is to load a 500mb library that has an add function in it.
Point taken but the one I use is only ~200k for the whole package, ~11k for the actual file that gets loaded
The native arithmetic operators are prone to floating point rounding errors
It’s much better to make your own function that uses bitwise operations to do addition.
```javascript
function add(a, b) {
  while (b !== 0) {
    // Calculate carry
    let carry = a & b;
    // Sum without carry
    a = a ^ b;
    // Shift carry to the left
    b = carry << 1;
  }
  return a;
}
```
(For certain definitions of better.)
It’s my favorite language too, but I also find this hilarious.
Heck, I need to learn some new languages apparently. Here I was expecting an angry “CS0029: cannot implicitly convert type ‘string’ to ‘int’”!
It makes sense though
… It does?
This here is my absolute favorite way to diss someone. Send them a Wikipedia link and bam!
It does to some degree.
- “11” is a string, 1 is an int; because strings can be added (+), convert the int to a string and combine: “11” + “1” = “111”
- “11” is a string, 1 is an int; because strings can’t be subtracted (-), convert the string to an int and combine: 11 - 1 = 10

I’m not into JS, so I don’t know how it decides priority. Ints can be added too, so I guess it’s basing it on the first variable that is compatible with the operator: in the first case the string, in the second case the int.

If this is how it works, it makes sense. But imo it’s a case of the designers being so preoccupied with whether or not they could that they didn’t stop to think if they should.
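A quick console check suggests the deciding factor is the operator, not which operand comes first: `+` concatenates whenever either side is a string, while `-` always coerces both sides to numbers. A sketch:

```javascript
console.log("11" + 1); // "111" (string on the left)
console.log(1 + "11"); // "111" (string on the right: still concatenation)
console.log("11" - 1); // 10
console.log(1 - "11"); // -10
```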
People that try to do mathematical operations with strings, blaming the programming language that had a stated design goal to do its best and try to keep running scripts that make no sense, because its designers realized it would be used by people that have no idea what they are doing. Clearly they were right.
the programming language that had a stated design goal to do its best and try to keep running scripts that make no sense…
…itself makes no sense. It is wrong and bad that Javascript was ever designed that way in the first place.
It was never intended to run full applications, only the small business scripts and hobbyist homepage stuff that were the thing in the 90s, across inconsistent browsers that were a jungle of hit-and-miss behaviour, where it was preferred that menus keep working even if a mouse effect did not. Anything of scale was expected to be done in Java. Dynamic web pages did not exist; anything not static was generated server-side into a static HTML file to be rendered on the client.
Anyway, back then it wasn’t considered the job of the programming language to hold the hand of the aspiring developer, as is common today. It’s not a bad thing that IDEs and even compilers and preprocessors try to help you write better code today, but back then that simply didn’t exist.
JavaScript is from a different time, and because it has the hard requirement of backwards compatibility there is no changing it, and there has not been for thirty years, except to add stuff to it.
I think it’s just silly to ask the past to keep up with the present. Bad code is not the fault of the language regardless, even though junior devs and even seasoned ones like to think so to protect their ego. I think it is better to accept it, learn from it and roll with it because every single platform and language has their weird quirks anyway.
Signed, old dude that learned programming in 8 bit BASIC and 6502 machine code without an assembler, where code bad enough would freeze your machine that required a cold boot and starting over from your last save that you didn’t do.
Executing after undefined behavior is arguably worse than terminating with an exception. A terminated script can’t leak data or wreak havoc in other ways.
Anyway, back then it wasn’t considered the job of the programming language to hold the hand of the aspiring developer as it is common today.
But that’s exactly what it’s doing by trying to figure out what the developer meant. `"11" + 1` should cause the compiler to tell the developer to fuck themselves.
it would be used by people that have no idea what they are doing.
And so let’s enable these people?
Let’s add AI to the mix while we’re at it.

Now that you mention it, it is a bit funny how Lemmy hates LLMs as a code generation tool while also hating on the interpreter when their own hand-typed code doesn’t run.
I mean, in both cases it’s because the LLM and interpreter do things you wouldn’t expect.
I seldom use an interpreter.
Then you do not do Javascript, because it is an interpreted language.
Edit: or Python, or a command line shell, or any CORS, or databases, or… Well idk really what you do use honestly.
Then you do not do Javascript, because it is an interpreted language.
No shit?! Wow… who would’ve known…
javascript is to web developers what powerpoint is to sales people
This is a really good interview, and does a good job highlighting JavaScript’s biggest strength: its flexibility.
“It was also an incredible rush job, so there were mistakes in it. Something that I think is important about it is that I knew there would be mistakes, and there would be gaps, so I made it very malleable as a language.”
He cites the “discovery” of asm.js inside of JavaScript, calling it “another thing I’m particularly proud of in the last 10 years.” It uses the bitwise operators that were included in the original JavaScript which are now the basis for a statically-typed language with machine types for high-speed performance. “If it hadn’t been in there from 1995, it would’ve been hard to add later. And the fact that it was there all along meant we could do incredibly fast JavaScript.”
He tells InfoWorld it’s “this very potent seed that was in the original JavaScript from the 10 days of May in 1995.” JavaScript’s 32-bit math operators (known as bitwise operators) trace their lineage all the way back to the C programming language — and to Java. This eventually led to WebAssembly — a way to convert instructions into a quickly-executable binary format for virtual machines — and the realization that with a JavaScript engine, “you can have two languages — the old language I did with the curly braces and the functions and the shift operators, and this new language which is a binary language, not meant for reading by humans or writing. But it can be generated by compilers and tools, and can be read by tools…”
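The asm.js idea he describes rests on exactly those bitwise operators: `x | 0` coerces any value to a 32-bit signed integer, which is the annotation asm.js uses to let engines run integer machine math. A minimal illustrative sketch (the function name here is made up):

```javascript
// asm.js-style annotation: "| 0" forces 32-bit signed integer arithmetic
function int32Add(a, b) {
  a = a | 0;
  b = b | 0;
  return (a + b) | 0;
}

console.log(int32Add(2, 3));          // 5
console.log(int32Add(2147483647, 1)); // -2147483648 (32-bit wraparound)
```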
Obligatory link to wat? video