Haskell why not

I also have extensive experience in both, and I'd argue the reverse. I think it does, as the OP was asking what Haskell is great at. In fact this is the first time I've ever heard someone say that Haskell is a great choice for that. But positioning it as such doesn't necessarily make it true. Just because Haskell isn't advertised as the "data transformation" language doesn't mean it does better or worse at it than any other language. In my personal experience of using Clojure, Haskell and Python for data transformation and parsers, Haskell does the best job at both tasks.

So this is the second time you've heard it. Anyway, we are just throwing anecdata at each other. Personal stories are still useful for programming experiences, since "which is better" isn't that easy to measure. I would be happy to hear more about your experiences with Clojure.

Haskell is great for screwing with undergraduates' minds. Box proofs, anyone? I did it more than two decades ago and for some reason haven't decided to pick it up again. I used to use it when I was reaching for a native binary with near-C performance but wanted an easier-to-maintain, terser, garbage-collected language.

But if the problem would benefit from a lot of higher-level abstraction, Haskell's still a good choice, given Go's lack of generics. D is my go-to when I have similar requirements. I do like the language, though. Actually, in-place quicksort in Haskell is as verbose as in other languages; the 5-line quicksort uses a linear amount of additional space. The thing with Haskell is that one often doesn't use in-place updates, and the language is designed such that this can be more efficient than in many other languages, though OCaml probably does this better with its incremental garbage collection strategy.
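For reference, the 5-line quicksort the thread keeps returning to is presumably the canonical list version:

```haskell
-- The famous "5-line" Haskell quicksort. It is elegant, but it
-- builds fresh lists at every level of recursion instead of
-- sorting in place, which is the space overhead being discussed.
quicksort :: Ord a => [a] -> [a]
quicksort []     = []
quicksort (p:xs) = quicksort [x | x <- xs, x < p]
                ++ [p]
                ++ quicksort [x | x <- xs, x >= p]
```

For example, `quicksort "haskell"` yields `"aehklls"`.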

AzzieElbab 10 months ago: Well, that 5-line quicksort is fake. Haskell is good for writing other languages, like Elm for example.

IshKebab 10 months ago: Haskell is good for transformation-type programs. It sucks for pretty much everything else. As a mobile dev, I don't see much effort in the Haskell community going into a Haskell toolchain for mobile apps.

They seem to be more interested in web frameworks. Compilers and logic programming. Making other languages feel like they are missing out :). Building DSLs in finance. In re: Haskell's goodness or badness, compare and contrast with, say, PHP (crap language with wild success).

Or Prolog (a stately Elven language with deep but obscure success). Haskell is what it is. In re: types and data, FP is good for that. I've been using Elm recently and it gets the job done. It's weird though: on the one hand, as an experienced professional, it feels like a toy. The error messages feel almost insulting, like I'm being patronized. On the other hand, once I got over that silly reaction, they're awesome. Changing code is a breeze, because Elm leverages the crap out of the type system, and the structure of the code and runtime prevent whole vast categories of errors.

Combine that with the sort of Domain-Driven development that Wlaschin is talking about and "baby, you got a stew going!"

KurtMueller 10 months ago: I was introduced to OCaml through Elm, a deeply opinionated language with strict guard rails.

In Elm, there is either a happy path or there is no path. As a newcomer, you're not overwhelmed or paralyzed by a plethora of choices on how to get things done, simply because Elm limits your choices. Each tool in your toolbox is documented with simple examples of how to use it.

Instead of explaining Monads like all the other design patterns out there, they insist on using some obscure definition from category theory to explain them.

The author is bashing people for explaining a concept from Category Theory with Category Theory?! Yes, there is value in pragmatism. Rant posts, on the other hand, have little value IMHO. Haskell has shortcomings like any other programming language. That absolutely does not make it a bad programming language.

Ericson 10 months ago: Well, to be fair, Functor and Monad in base are pretty far removed from the category theory originals. Maybe we should just save the math explanation for the latter, and just call the former something else?

Snarwin 10 months ago: Explaining monads in terms of category theory is like explaining regular expressions in terms of finite automata.

It's a good idea if you're writing a textbook, but maybe not so much if you're writing documentation for users of a programming language. Category theory is the underlying theory, but do you need to know about mechanics and gears to drive a car? There are explanations of Monads which are much, much easier for most people to understand, not having much to do with category theory.

And then add the additional abstraction layers used in the context of Monads, like abstracting over the "external world state" (IO) and "combining computation descriptions". Monads are part of the underlying category theory, so in asking for a full explanation of Monads you are asking for an explanation of part of the "mechanics and gears" in your car analogy.

I think I agree with your main point here that teaching the internal details first is often not the optimal method. You do not need to know how Monads work to use them to great effect in Haskell and other languages. Depends on what you are trying to do. You don't need that knowledge in order to drive a car, but there are drivers that absolutely should know about mechanics and gears.

I enjoyed the courses I did in category theory very much, but the benefit to my code has been zero. You either think the astronomical number of bugs in delivered software is a problem, or you don't, and good luck with that. The use of Haskell is a huge win on this metric, and demonstrably so. You don't have bugs in any of the Haskell-programmed OSes you actually use, nor in your editor written in Haskell, your Haskell mp3 player, your Haskell time machine, or your Haskell en-truthenator.

Pandoc is good. Some like the xmonad window manager; git-annex has some fans. There are probably 4 or 5 more too! Mostly centred around parsing. And for all that, you absolutely should learn Haskell.

You'll enjoy it and it will enable you to think about programming in new and powerful ways. Just don't fall so deep that you expect to actually ship anything you write. The only Haskell software I regularly interact with is Hasura, and it has plenty of bugs, even ones around nulls and other things Haskell is supposed to magic away. There are many types of bugs, and Haskell may help prevent some of them.

I know there is some research on development practices, but it's independent of the language used. See Ray et al., "A large-scale study of programming languages and code quality in GitHub", Empirical Software Engineering.

Ray et al. found that programmers in dynamic languages are slightly slower and appear to produce more defects. There is a measurable benefit of static typing, but it's small.

UncleMeat 10 months ago: Interestingly, in the paper by Ray et al., Haskell displays a negative correlation with defect rate.

The Economics of Software Quality has function-point-to-language conversion data and function-point-to-quality charts, so you could possibly infer something from that.

We don't need luck. Competition has already given us the answer: most bugs are OK. The vast majority of the software that creates literally trillions of dollars of economic activity, and pays most of our bills, is not mission critical. Software fails all the time, with the only significant consequence being that a developer has to spend some time fixing it. Sure, sometimes money is lost. So is money lost when factory equipment needs repair. Some software does deserve to be bug-free when it might put lives at risk, like flight software or medical software.

Perhaps even operating systems. But the vast majority of the software I use on a day to day basis does not fit that category. If bug free software gave a significant economic competitive advantage, smart folks would start writing it and win big in the marketplace. Considering this has had decades to happen, and has not, it's very unlikely that bug free is the winning competitive advantage when it comes to software. I'd guess the winning advantage is that the software is useful. Much like a car with many small problems is still useful.

The truth is that much of the software that exists today simply would not be worth building bug-free and would never be profitable. You can extend this outside of software development to see that it's true in a more general sense. Most of the manufactured products we buy are not perfect and do not last forever. Some even have flaws from the day you buy them, but flaws that can be worked around. Once, on vacation, I bought a screwdriver at a dollar store.

It was poorly manufactured, and it does have "bugs" compared to something I would have paid 10x the price for. But years later I still have it, and it's good enough for some jobs. You could potentially build a car that doesn't fail for any reason for a few hundred years. Only Bezos and friends could afford it. With that said, I'd like to see the negative externalities of pollution and waste included in the true cost of the things we buy, so that we don't produce so many disposable things that society pays for in the long run.

But that's a different discussion. Please don't take this to mean that I don't take great pride in writing quality software that is as bug-free as possible. I also take great pride in meeting budget goals and deadlines. All successful businesses understand that competing goals must be balanced against each other. A well-thought-out reply, thank you. Haskellers would typically agree with you. Unfortunately, harry8, who you were replying to, isn't one.

He was being sarcastic. Your respondents don't seem to have realised that you are being sarcastic. It's always amusing to read critiques on Haskell written by people that don't know Haskell. This entire post is borderline trolling and it's sad so many people are falling for it.

I definitely agree with the documentation side of things, particularly the lack of concrete examples. I found overall, however, that learning Haskell made me a better programmer, even if it's more useful as an academic than a practical language.

Dealing with pure functions, with no access to loops so recursion is paramount, and the tail recursion and pattern matching stuff is rather beautiful and useful. Yes, it contains the core idea of quicksort (partition the list, divide and conquer), but it completely fails on the quick part, because Haskell lists are a leaky abstraction of real computer memory. A real quicksort in Haskell is much more convoluted. What's a real quicksort anyway? You could argue that a reasonably fast implementation of quicksort is much more convoluted, which it most certainly is, but that doesn't make this implementation any less real.

A key aspect of quicksort is that it sorts the list in place. If you don't sort in place, you don't have quicksort, and if you don't need an in-place sort, then quicksort is the wrong choice anyway. Of course, canonical Haskell does not have a concept of in-place, which makes showing quicksort in Haskell a questionable idea. This implementation will have issues with pathological cases, but that's a problem of the quicksort algorithm, not of the implementation, whereas the Haskell one shown above has problems in the implementation.
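For comparison, an in-place quicksort over a mutable array is possible in Haskell via the ST monad, but it is considerably more convoluted than the 5-line list version; a sketch (the function names here are mine, not from the thread):

```haskell
import Control.Monad (when)
import Control.Monad.ST (ST, runST)
import Data.Array.ST (STUArray, newListArray, readArray, writeArray, getElems)

-- Sort a list by copying it into a mutable unboxed array,
-- quicksorting the array in place, and reading the result back out.
sortInPlace :: [Int] -> [Int]
sortInPlace xs = runST $ do
  let n = length xs
  arr <- toArr xs
  when (n > 1) $ qsort arr 0 (n - 1)
  getElems arr

toArr :: [Int] -> ST s (STUArray s Int Int)
toArr xs = newListArray (0, length xs - 1) xs

-- Classic recursive structure: partition, then recurse on both halves.
qsort :: STUArray s Int Int -> Int -> Int -> ST s ()
qsort arr lo hi = when (lo < hi) $ do
  p <- qpartition arr lo hi
  qsort arr lo (p - 1)
  qsort arr (p + 1) hi

-- Lomuto partition scheme: the last element is the pivot.
qpartition :: STUArray s Int Int -> Int -> Int -> ST s Int
qpartition arr lo hi = do
  pivot <- readArray arr hi
  let swap i j = do
        a <- readArray arr i
        b <- readArray arr j
        writeArray arr i b >> writeArray arr j a
      go i j
        | j >= hi   = swap i hi >> pure i
        | otherwise = do
            x <- readArray arr j
            if x < pivot
              then swap i j >> go (i + 1) (j + 1)
              else go i (j + 1)
  go lo lo
```

All mutation is confined inside `runST`, so `sortInPlace` is still a pure function from the caller's point of view.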

Sounds like a hardware issue. Which hardware does not have this issue? True, although you forgot the recursion. The Haskell filter expression is much nicer as well. In particular, because you defined more and less before using them, vs after in the Haskell example.

I don't know if there's a way that could be achieved in Haskell too though. Edit: you forgot the recursive calls though. I used to think so as well, until I realised at some point that defining things like this means you focus on the actual "business logic" up front, but the applicable definitions are never far visually, spatially and logically.

In my opinion it lets you get to grips with the overall logic before hassling you with certain specifics. That is quite elegant for that variant, which is a great fit for Haskell's features. How easy is it to write a different variant, like using a different pivot, or sorting in place? It almost feels like Haskell with the annoying parts removed. Weird parentheses, and yet it took the world by storm. Am I on another planet? I'd rate Clojure slightly above Haskell in terms of market share.

Is my radar broken? That raised my eyebrow as well. Semi-recently I was interested in learning Datalog, so I wanted to kick the tires on Datomic (written in Clojure).

I ended up getting stuck on some things; posted questions to StackOverflow and didn't get a response. Then someone on Twitter told me that I'd be better off asking questions in the Clojurians Slack channel.

So I go to sign up and… I had to flag down someone on Twitter or IRC to fix it. As an outsider I felt a whiff of decay from the Clojure community (no offense). Do people actually use UML in industry? I've never actually seen it used (or mentioned, for that matter) in the few years I've been a software engineer. I helped a European project on UML graph versioning between various industrial UML applications, but I'd consider these niche. I think heavily regulated sectors are the most prevalent users of UML.

I guess Haskell is just rare enough on HN for an FP post to get a boost just for that. The content isn't worth spending much time on, IMO. To each his own; if the guy really suffers with Haskell then so be it, may he have a lot of fun with Perl or JS. If you set a Google alert for "Clojure jobs" and "Haskell jobs", or just go through the "HN: Who's hiring" threads of recent years and compare search results for Clojure and Haskell, you'd see that it's not "just slightly above".

Clojure currently is the most widely used FP language. Doesn't Scala beat Clojure in industry? That would be my experience, and it has a higher number of jobs on Who's Hiring. F# doesn't make much of a showing on Who's Hiring, but I think it has stronger adoption in industry than Clojure.

It used to be that way. Note that I'm not bashing or defending any of the PLs mentioned. It is merely a fact: today, Clojure is the most popular FP language being utilized in the industry.

It doesn't mean that this makes the language better or worse. Also, the overall share of languages with strong FP semantics is still way too small compared to the use of imperative PLs. This lets me safely ignore the rest of the article. This is based on the more general notion of a functor in mathematics, which is a mapping between categories. And that sounds like the description being used here.

If the author was using the description to explain Functors to someone who only knew OOP, it's a reasonable start. I got the impression the author was implying that is basically all you need to understand Functors, and that is not the case. Care to say why? Or are you just going to hit and run? Functor is a typeclass, which is the equivalent of an interface in Java. It's very basic, providing the ability to lift a function and execute it in the context of the functor, whatever that is (this is the interface, remember): a generalization of map.
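For concreteness, the interface being described is tiny. A sketch of the Functor typeclass and two instances (renamed `MyFunctor`/`fmap'` here only to avoid clashing with the Prelude):

```haskell
-- The Functor interface: lift a plain function into some context f.
class MyFunctor f where
  fmap' :: (a -> b) -> f a -> f b

-- Instances are just pure value-transforming functions; nothing is
-- mutated and there is no hidden state.
instance MyFunctor Maybe where
  fmap' _ Nothing  = Nothing
  fmap' g (Just x) = Just (g x)

instance MyFunctor [] where
  fmap' = map
```

So `fmap' (+1) (Just 2)` produces a new value `Just 3`; the original `Just 2` is untouched.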

So a lot of types have an implementation of Functor. In theory one of those implementations could be guilty of using hidden state and all that, but in practice all of them are just straightforward functions transforming values into new values, not mutating them. In short, not a single word of the description is correct. Now I'm more confused. If there is no hidden internal state, what is the difference between a function and a Functor?

If it's just taking input and giving output without any internal state, that's just a function, isn't it? Edit: OK, I did some more refreshing of my memory. So Functor is an interface with some properties, like identity and distributive morphism (I think I'm wording that right). That's just an interface. I can implement that in Java or F# if I want. How is Haskell helping here?

You can't define the interface in either language. Implementations of Functor consist, in part, of type-level functions. In Haskell terms, these are "higher-kinded types". The standard example is the list type "[]" which, as a type-level function, takes an element type and gives back the type of lists whose elements are drawn from that element type. In Java and F#, the only way to talk about the List type is in its fully applied context, where you've attached the element type.

What you don't have is the type-level function that's not been applied to anything. What makes this useful in Haskell is the typeclass overloading, which makes it effortless to write functions that abstract over arbitrary Functors, and use "fmap" multiple times locally for different Functor instances, letting the type system figure out what implementation is needed to map over the particular type you're working with.

And in such abstract code, where you may know very little about the Functor instance you're working with, it's extremely important that they all be absolutely law-abiding: in many cases, the laws are all you have to work with. These two features, higher-kinding and typeclass polymorphism, make it worth talking about Functors, and I don't think you can appreciate Functors in Haskell without seeing the interaction of these features and just how much it impacts the code style of the average Haskeller.
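As a small illustration of that payoff, a single definition can map over any Functor whatsoever, with the right `fmap` implementation chosen by the type system at each use site:

```haskell
-- One function, abstracted over an arbitrary Functor f.
-- It knows nothing about f except that it is law-abiding.
doubleAll :: (Functor f, Num a) => f a -> f a
doubleAll = fmap (* 2)
```

The same two lines work on `Maybe`, lists, `Either e`, trees, parsers, and so on: `doubleAll (Just 3)` gives `Just 6`, and `doubleAll [1, 2, 3]` gives `[2, 4, 6]`.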

Man, nothing against your dedication in explaining this to me, but every time I talk about Haskell it feels like a jargon salad. I very, very humbly ask you: so what? You wrote a short essay on this, and I still can't grok even in the slightest why this matters. Every other language I talk about can at least tell me why a certain feature is helpful, even if I don't get it.

What I got from this is this allows abstracting mapping over types. But what does that give you? Think about it this way. If the primary importance of something is only apparent from the big picture how is a user supposed to decide whether to use it or not? The big picture is rarely available to most programmers. Maybe a user isn't supposed to. I believe that's the premise of Paul Graham's Blub Paradox [1]. I certainly didn't learn, say, Common Lisp, because I was doing a feature comparison.

I had no idea what a lexical closure was at the time, and a couple of toy examples wouldn't have convinced me of their worth. I'd been programming quite happily without them for some time by then.

That is an excellent point. Haskell is such a fundamental shift in thinking that perhaps the only way to learn its real application is to make something with it. But I would still maintain that every language at least has a highly simplistic example of why certain features work well. In fact, my curiosity reignited by this discussion, I found a video series[1] on YouTube which somehow made it much clearer where Functors, Applicatives and Monads are to be used.

The tree example is pretty abstract, but it helped me connect these features to my work. Still have the question about how haskell is helping here, because I can write a hidden state changing function in Julia if I want.
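The tree example referred to is typically something like the following minimal sketch (the `Tree` type here is my own illustration, not taken from the video):

```haskell
-- A binary tree carrying values of type a at each node.
data Tree a = Leaf | Node (Tree a) a (Tree a)
  deriving (Eq, Show)

-- fmap applies a function at every node, preserving the tree's
-- shape. It returns a new tree; nothing is mutated.
instance Functor Tree where
  fmap _ Leaf         = Leaf
  fmap g (Node l x r) = Node (fmap g l) (g x) (fmap g r)
```

With this instance, generic Functor code (like any function polymorphic over `Functor f`) works on trees with no extra effort.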

Haskell's functions are pure, which makes the typeclass laws more meaningful. I will take a stab at it. If I were writing a functional language in an OOP language, I could implement functors at least partially with an object with an internal-state-changing method. The main issue with the author's statement is that it makes a claim of approximate equivalence and does not back it up with additional evidence or examples. I don't think anyone claimed that. Now we are getting somewhere. You said partially; what features are being left out?

I was not trying to refute the opposite claim. I was giving info on the differences between functors and the referenced OOP feature. My points run somewhat counter to the author's claim that "Functors are basically an Object with an internal state changing method in typical OOP terms." Hard to say what is in the author's head from the provided text. I did not mean to imply features would be left out, but rather that I would use more than the one OOP feature ("an Object with an internal state changing method") to implement functors.

What other OOP feature would you be using? I'm glad you think it was productive so far. I find the "I'm more productive with Python" claim in the post to be specious.

Maybe I've been abused by, and abusive with, bad programming practices in Python in the past, but it seems you really need to lint and exercise Python code to have any kind of confidence that something you think is correct won't blow up at runtime. False confidence that something is "ready to go" is the worst, and it can cost a lot of money, and sometimes human life. Erlang has this problem too. It's so late-binding that you can make spelling mistakes and they won't be caught until runtime.

It's actually the basis of some very powerful features, but you really have to know that coding in Erlang is not like coding in Rust, Haskell, or something else strongly typed. So while I think the author has a point that concretions can get you into an inflexible, hard to refactor mess over time, I think sometimes those concretions don't have to be as bad as they seem. Consider Go. Interfaces are a form of concretion too, but the advice is to keep them small. An interface of exactly one function can be a beautiful thing.

I think it's better to have a type implement many interfaces than to have a type implement one huge interface. I like Pyright. You can use protocols, which are similar to Go interfaces. The benefit of Go is really in performance and packaging. Frameworks like Spring in Java use types to direct dependency injection, and it works well.

Types can be made abstract and blackboxy, and sometimes that's what you want. But doesn't a record type give info about its fields? Doesn't a sum type give info about possible alternatives in the values? Suddenly they are left reeling as they find out that the real world is, in fact, dynamic. The trick is to know what we are really modelling. If we are deserializing domain objects from JSON, it makes sense to have types for the domain objects.

Even if you want to design bottom-up, the moment you want to add anything to your program, you need a little top-down thinking, if only at the micro-level.

You want to create something new that isn't there, and then think how to accomplish that with the tools you have. I discovered Haskell in a comparative programming languages course I took at university last semester, and it completely changed how I think about programming. I can't speak for an industrial use case, but for a hobbyist writing open source software and personal projects, programming in Haskell has been an absolute joy and has reinvigorated my love for programming.

If you are pragmatically minded, you can get stuff done in Haskell. In my experience there's a lack of online resources for this kind of work in Haskell, but that's what I and others especially in the IHP world are working on.

If you just want to experiment, Haskell is great for that too. As soon as you need to be defining lots of complex or interesting computations, you start needing languages with good support for composability to manage that complexity. Here Go fails, for all the reasons that people have criticized it. An example is the Unison programming language, and the unison…

By removing friction and non-uniformity when programming multi-node software systems, such systems can be assembled in a compositional fashion. The better composability of typed, pure FP once again becomes a significant lever, because process boundaries no longer destroy composition.

I wonder what comes after that? Like where? Time for some vague speculation… See this Conal Elliott talk on this. This is a problem, and it can be solved. My insertion sort algorithm runs just as fast on this 10-element list. Imagine traveling back in time to the days before C, and trying to convince an assembly language programmer that C was a massive step forward for programming.

In principle, you could build arbitrary programs by gluing together hand-written fragments of x86 assembly language. But you might have a hard time convincing the assembly language programmer of this, because toy examples of the sort that are easy to discuss would not reveal any major differences. What WAS likely convincing to assembly language programmers was the idea of not having to write the same program five times, once for each different hardware architecture.

Haskell does not have unary prefix operators, so you cannot define "!" as a library function like all the other operators. The only exception to this is unary minus, which is a specific lexical hack to get around the fact that people expect to be able to write negative numbers like -3.

Stack Overflow question by AlohaWorld: What will be the impacts if Haskell uses "!"… What do you mean by "change"?

To design Haskell like that from the beginning? Like all symbolic operators, "!" is a binary infix operator: arr ! i. These are library functions. In Haskell, it was decided that all symbolic operators are binary infix ones, except unary minus, which does cause some issues. Having more exceptions to this rule seems unhelpful.
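For instance, the `!` exported by Data.Array is an ordinary library-defined binary infix operator, while unary minus is the lone prefix exception and forces workarounds like `subtract`:

```haskell
import Data.Array (listArray, (!))

-- (!) is a plain binary infix operator from the Data.Array library;
-- nothing about it is built into the language itself.
second :: Char
second = listArray (0, 2 :: Int) "abc" ! 1

-- Because unary minus is a lexical special case, the section (- 3)
-- parses as negation rather than "subtract 3"; the Prelude provides
-- `subtract` precisely as the workaround.
minusThree :: Int
minusThree = subtract 3 0
```

Data.Map exports its own `!` for map lookup in exactly the same library-defined way, which is only possible because `!` is an ordinary infix operator.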


