50 comments

  • CobrastanJorji 841 days ago
    If you regularly do command line JSON requests, I'm a big fan of HTTPie. It's so much easier to use correctly. https://httpie.io/docs/cli/examples

    For example, here's a JSONy POST request with cURL:

    curl -s -H "Content-Type: application/json" -X POST https://api.ctl.io/v2/authentication/login --data '{"username":"YOUR.USERNAME","password":"YOUR.PASSWORD"}'

    Here's that same request with HTTPie:

    http POST https://api.ctl.io/v2/authentication/login username=YOUR.USERNAME password=YOUR.PASSWORD

    • parhamn 841 days ago
      After this change the equivalent would be:

          curl -XPOST --jp username=YOUR.USERNAME password=YOUR.PASSWORD https://api.ctl.io/v2/authentication/login
      
      Which isn't too far from your desired outcome (notably, without relying on argument position for meaning). Although I guess the argument that "curl is already installed on almost every server" sorta becomes moot, because I imagine it will take a while for most distros to move to the latest curl with --json/--jp support.
      • jedberg 841 days ago
        It's a lot easier for a distro to update a default utility than to add a new one though. I suspect having updated curl would happen faster than adding httpie to the list of default utilities.
        • derefr 841 days ago
          Especially since HTTPie is a Python package. curl exists even in plenty of embedded systems, in many Docker base images, etc.; but scripting runtimes like Python generally don't.
        • sneak 841 days ago
          I wish this were the case. Unfortunately for us grownups that run stable/LTS distros, non-security updates like these don't usually make it to us for like 5 years or something.

          There's a happy medium, and we're not in it.

        • fragmede 840 days ago
          It depends. How long did it take for bash4-isms to get adopted? Sometimes updates to even basic utilities take a long time to reach all systems. (Apple switching to zsh doesn't help here.)
        • bonkabonka 841 days ago
          Except if it's an enterprisey distro, it won't get updated. Specifically RHEL 7 ships 7.29.0 and cherry-picks features (TLS handshakes) to backport to their nine-year-old version.
        • thwarted 841 days ago
          The exact opposite is more likely true, depending on the nature of the change. A new utility has no current usage, so there are no in-the-wild backwards-compatibility concerns. The system Python on Red Hat-based distros was stuck at 2 for a long time because some of the sysadmin utilities relied on it, and all those deps needed to be vetted or converted to either a newer Python version or their own shipped runtime that didn't use the general-purpose Python install.
          • jedberg 841 days ago
            Python is a huge outlier in this case. Very rarely do you have a default tool that can't be upgraded, and this case was only because Python is an interpreted language.

            Curl is a command line tool. As long as they only add new functionality, there is very little that prevents an upgrade.

          • jsight 841 days ago
            There's a big difference between a bump to the latest curl version and a jump between two major versions of a language runtime. The Python 2->3 transition took years and many distros kept both versions.

            Most dependencies get updated much more quickly. It wouldn't even shock me if this change got picked up mid-cycle.

            • bonkabonka 841 days ago
              Unhappily RedHat won't. Heck, RedHat has gone well out of their way to backport updated TLS handshakes (1.2 at least) to their 2013 version (7.29.0) of curl.
              • jsight 837 days ago
                I'm assuming that you mean for RHEL? That's a little bit of a special case.
          • ericbarrett 841 days ago
            This is only a valid comparison if curl deprecates or moves options, as Python 3 did with language features and standard library functions.
      • jlundberg 841 days ago
        And that -X POST is actually not needed. Adding any payload will make curl do a POST request.
      • floatingatoll 841 days ago
        nit for understanding's sake: is the second --jp missing?
    • jicea 841 days ago
      Shameless plug, we've built hurl [1] to add some syntactic sugar over curl, with plain text. For example, you can write a text file post.hurl:

          POST https://api.ctl.io/v2/authentication/login
          {
            "username": "YOUR.USERNAME",
            "password": "YOUR.PASSWORD"
          }
      
      and it will send the same POST request with a json body:

         hurl post.hurl
      
      You can add asserts on the response too:

          POST https://api.ctl.io/v2/authentication/login
          {
            "username": "YOUR.USERNAME",
            "password": "YOUR.PASSWORD"
          }
          HTTP/1.0 200
          [Asserts]
          jsonpath "$.status" == "LOGGED"
      
      Under the hood, we use libcurl and a Rust binding. The HTTP engine is curl because curl is awesome!

      [1] https://github.com/Orange-OpenSource/hurl

    • db65edfc7996 841 days ago
      I have recently been trying a port of HTTPie in Rust, xh [0] (which is needlessly hard to find in a websearch). I am a Python guy, but love having single executable tools.

      [0] https://github.com/ducaale/xh

      • kbd 841 days ago
        Happy to hear about xh, thanks for mentioning it. I recently ran into a problem where HTTPie didn't support HTTP/2, so I had to fall back to curl. The ticket for HTTP/2 in HTTPie is still open https://github.com/httpie/httpie/issues/692

        Happy to see that xh supports HTTP/2 out of the box.

      • petepete 841 days ago
        xh is great, just like HTTPie but incredibly fast.

        I alias it to http so it's memorable and the commands make more sense though.

    • lambdaba 841 days ago
      Since HTTPie can't do nested JSON assignments, I like to use jo, like this:

        $ jo foo=$(jo bar=qux) | http example.com
      
      Piped JSON automatically uses POST
      • treesciencebot 841 days ago
        (HTTPie maintainer here)

        You might be in luck, since nested JSON support is going to be the star feature of our upcoming release. Here is a sneak peek:

            $ http --offline --print=B pie.dev/post \
                search[type]=client \
                search[stars]:=50000 \
                search[platforms][]=Web \
                search[platforms][]=Desktop \
                search[platforms][]=Mobile \
                search[platforms][]=CLI

            {
                "search": {
                    "platforms": ["Web", "Desktop", "Mobile", "CLI"],
                    "stars": 50000,
                    "type": "client"
                }
            }

        We are rolling out a brand-new mini-language that integrates really well with the existing request-building syntax, but also features things like JSON type safety and great error messages for basic syntax errors.

        • CobrastanJorji 841 days ago
          That sounds great! Any chance of HTTP/2 or HTTP/3 support?
          • treesciencebot 841 days ago
            It will probably be post 3.0, since the underlying HTTP interface we use (https://pypi.org/project/requests) does not support HTTP/2. We are currently discussing how to migrate from that to something like httpx, without causing any change of user-visible behavior.
      • CGamesPlay 841 days ago
        This is amazing! Definitely adding to my toolbox.
    • darrenf 841 days ago
      For longer than I can recall I've had a `post_json` alias like this:

          curl -H'Content-Type: application/json' -d @- << 'JSON'
      
      Invoke with

          post_json <url>
      
      Then just paste or write JSON without caring about quotes etc, finish the heredoc, done.
      • paledot 841 days ago
        TIL you could open (and not close) a heredoc string in an alias.
      • tomsmeding 841 days ago
        Don't you need `<<'JSON'` instead of `<<JSON`? Otherwise any $ in the json body would be interpreted as a shell variable reference.
        • darrenf 841 days ago
          Yes, I mistyped on my phone. Edited, thanks!
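For reference, the difference the two commenters are discussing is easy to demonstrate locally, with no curl involved: quoting the heredoc delimiter controls whether the shell expands variables inside the body.

```shell
# Quoting the heredoc delimiter controls variable expansion.
# With <<JSON (unquoted), $foo is expanded by the shell;
# with <<'JSON' (quoted), the body is passed through literally.
foo=EXPANDED

unquoted=$(cat <<JSON
{"value": "$foo"}
JSON
)

quoted=$(cat <<'JSON'
{"value": "$foo"}
JSON
)

echo "$unquoted"   # {"value": "EXPANDED"}
echo "$quoted"     # {"value": "$foo"}
```

So for pasting literal JSON (which routinely contains no `$`, but may someday), the quoted form `<<'JSON'` is the safer default.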
    • Jaepa 841 days ago
      curl's primary advantage is that it's pretty much universal.

      I can hand another dev a curl statement without having to worry if they have the requisite software to reproduce the call.

      Also, Postman, Insomnia, Sentry, & Swagger all support exporting to curl.

    • adolph 841 days ago
      This example isn't fair/steelmanning cURL.

        -s: if HTTPie disables the progress bar by default, that's just a different design choice; the advantage is in the eye of the beholder
        --data '{"user...: likewise, HTTPie's default to JSON is a design choice. I wouldn't say the default is superior
        -H "Content-Type...: likewise, if HTTPie adds this header by default, that's just getting in my way when I don't want that header
        -X POST: you don't need to specify that in cURL if using --data
    • aembleton 840 days ago
      HTTPie can be even more compact by using the https command, which implies that protocol. This makes the command look like:

      https POST api.ctl.io/v2/authentication/login username=YOUR.USERNAME password=YOUR.PASSWORD

    • mimimi31 841 days ago
      I think --data implies -X POST, so that part at least would be unnecessary.
      • CobrastanJorji 841 days ago
        Ahh, I believe you're right. "PUT" would have been a better example, in that case. But I suppose getting the cURL syntax wrong helps my point that I find the cURL syntax confusing.
    • Diti 841 days ago
      HTTPie does not support multi-valued keys in query strings (sending several field[]= for example). It is very annoying and, for consistency, I would rather use cURL (instead of mixing syntax with HTTPie).
      • jkbr 840 days ago
        If I’m understanding your point correctly, HTTPie has support for multi-value keys in query strings.

        You can use the `name==value` query parameter syntax [0]:

          $ http --offline pie.dev/get AAA==AAA AAA==BBB
          GET /get?AAA=AAA&AAA=BBB HTTP/1.1
        
        Or simply:

          $ http --offline 'pie.dev/get?AAA=AAA&AAA=BBB'
          GET /get?AAA=AAA&AAA=BBB HTTP/1.1
        
        [0] https://httpie.io/docs/cli/querystring-parameters
    • mmcnl 841 days ago
      I appreciate HTTPie, but it can never beat the fact that curl is almost always installed by default.
    • 1vuio0pswjnm7 841 days ago
      There is a different way, without curl or other curl-like programs, without Python, Go, or the like. It is smaller and faster, IME as a non-developer.

         post api.ctl.io/v2/authentication/login < 1.txt|openssl s_client -connect api.ctl.io:443 -ign_eof
      
      post is a 702-character shell script. printf is a built-in.

            #!/bin/sh
            (  
            y=Connection;n=0;while read x;do
            x1=${1#*//};x2=${x1%%/*};x3=${x1#*/};
            x=$(printf "%s" "${x%%:*}:";echo "${x#*:}");
            if test x"${x3}" = x"${x2}";then x3="";fi;
            printf "%s\r\n%s\r\n%s\r\n%s\r\n" \
            "POST /${x3} HTTP/1.1" \
            "Host: ${x2}" \
            "Content-Type: application/json" \
            "Content-Length: ${#x}";
            if [ $n -gt 1 ];then
            printf "%s\r\n\r\n%s\r\n" "$y: keep-alive" "$x";else
            printf "%s\r\n\r\n%s\r\n" "$y: close" "$x";fi;
            export n=$((n+1));
            done;
            if [ $n -gt 1 ];then
            printf "%s\r\n%s\r\n%s\r\n" \
            "GET /robots.txt HTTP/1.0" \
            "Host: ${x2}" \
            "$y: close";fi;
            )
      
      Data to be posted may be stored in a one-line file named "1.txt"

          cat > 1.txt
          { "username": "YOUR.USERNAME", "password": "YOUR.PASSWORD" }
          ^D
      
      Examine the request

          post api.ctl.io/v2/authentication/login < 1.txt
      
      POST the request

          post api.ctl.io/v2/authentication/login < 1.txt|nc -vvn 127.1 80
      
      Alternatively, data to be posted can be read from stdin

          echo '{ "username": "YOUR.USERNAME", "password": "YOUR.PASSWORD" }' \
          |post api.ctl.io/v2/authentication/login \
          |nc -vvn 127.1 80
      
      If the data to be posted is JSON formatted as multiple lines such as

          { 
            "username": "YOUR.USERNAME", 
            "password": "YOUR.PASSWORD" 
          }
          
      then

          (tr -d '\12' < 1.txt;echo) \
          |post api.ctl.io/v2/authentication/login \
          |nc -vvn 127.1 80
      
      For TLS we use a proxy listening on 127.0.0.1, e.g., stunnel, sslsplit, haproxy, etc. This way we only ever have to type a single address and port, i.e., 127.1 80, or a short alias for the hostname, e.g., echo 127.0.0.1 p >> /etc/hosts.

          cat > 1.cfg
          pid=/tmp/1.pid
          [ x ]
          accept=127.0.0.1:80
          client=yes
          connect=64.15.182.200:443
          options=NO_TICKET
          options=NO_RENEGOTIATION
          renegotiation=no
          sni=
          sslVersion=TLSv1.3
          ^D
      
          stunnel 1.cfg
  • hn_throwaway_99 841 days ago
    The --jp (json part) command line option, described at https://github.com/curl/curl/wiki/JSON, has "anti-pattern" written all over it to me. Why introduce some specific, curl-only wonky-ish version of JSON? Is this any easier to remember than normal JSON? I mean, right now, I use cURL all the time with JSON posts, just doing something like

    -d '{ "foo": "bar", "zed": "yow" }'

    The proposed --jp flag seems worse to me in every way.

    (Note I do like the --json as just syntactic sugar for -H "Accept: application/json" -d <jsonBody>)

    • floatingatoll 841 days ago
      --jp is a perfect fit for shell scripting integration:

          bash$ curl ... --jp "foo=$foo" ...
      
      This has zero shell-quoting, expansion, escaping, or separator issues. Whatever's in environment variable `foo` will be sent to the server as a single value associated with key `foo`, whether it's a zero-length empty string, or full of backslashes or spaces or newlines or whatever.

      There are fancier ways to accept multi-arg, but they all have weaknesses, and this matches the way curl handles -H arguments already today (one per header, stack them if you want many), so I think it's a sound way to handle CLI arguments.

      (I don't have any specific views on whether this is how curl should do JSON or not, but I recognized the CLI safety mechanism immediately.)
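To make the hazard concrete, here is a small local sketch (no network; the values are illustrative) of what goes wrong when a shell variable is interpolated straight into a JSON literal, which is exactly the case `--jp foo=$foo` is meant to handle safely:

```shell
# Naive interpolation: works for tame values...
foo='bar'
printf '%s\n' "{\"foo\": \"$foo\"}"   # {"foo": "bar"}

# ...but silently produces invalid JSON when the value contains
# a double quote, because no escaping happens anywhere.
foo='has "quotes" inside'
printf '%s\n' "{\"foo\": \"$foo\"}"   # {"foo": "has "quotes" inside"}
```

A per-value flag sidesteps this entirely, since the tool, not the shell, is responsible for JSON-escaping the value.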

      • liquidify 841 days ago
        Does this handle nesting? I tend to do a lot of nesting in json. Almost always.
        • jcul 840 days ago
          Yes, if you click through to the wiki page there are examples of nested lists, maps.
          • Siira 840 days ago
            Then it will need quoting of the metacharacters used for nesting.
        • floatingatoll 840 days ago
          I certainly hope not!
    • kortex 841 days ago
      > Is this any easier to remember than normal JSON?

      Yeah, actually it is. It's immediately intuitive to me. It makes string interpolation way easier. Quick, how do you do -d {object} and pass in environment variables with correct string escaping? Do you start with single quote or double quote? Where do I put backslashes? Bash vs zsh compatibility? Plus you have to make sure all the slashes, quotes, brackets and braces match.

      Vs

      --jp foo=$FOO --jp bar="baz-${BAR:-default-bar}" --jp date="\"$(date)\""

      (I'm iffy on the last one, but it's WAY easier than trying to build that into an object)

      • zimpenfish 841 days ago
        > --jp date="\"$(date)\""

        If it's going to properly crib `jo` syntax, you can just do `date="$(date)"` - no need for the second set of quotes (and, indeed, they'll mess it up.)

            > jo date="$(date)"
            {"date":"Thu 20 Jan 2022 20:59:16 GMT"}
            > jo date="\"$(date)\""
            {"date":"\"Thu 20 Jan 2022 21:00:28 GMT\""}
    • mdaniel 841 days ago
      Realizing that I'm screaming into the wrong textbox, but "-H 'Accept: application/json'" is the wrong header for curl to set in that circumstance, since all that curl is able to say with authority is that the content type it emitted is application/json, not what the user wants/accepts back. Maybe this feature is an example of the 80/20 rule, and more advanced usages can't use --json and must still craft the explicit Content-Type and Accept headers.

      It is so weird that command-f on that wiki doesn't show a single content-type header

      • hn_throwaway_99 841 days ago
        Actually, you are correct, I had a copy/paste error from pulling a different curl command and merging it, the curl page correctly shows it as `-d [whatever] -H "Content-Type: application/json"`
    • jbverschoor 841 days ago
      --jp doesn't even make sense as an abbreviation; it'd rather be --json-part.

      The notation seems to be similar to HAML<>HTML. It's not JSON, and it doesn't make any sense other than for really short ad hoc queries. It's just confusing and only solves a problem that a tiny number of people have (regular ad hoc JSON queries, where people are too lazy to actually write out JSON).

      Otherwise, why not do the same for XML or CSS? Or heck, why not simply support HAML as well?

      It's better to have such functionality extracted into some other tool and just pipe it in.

    • Karellen 841 days ago
      Yeah, it cries out to me for a separate `jp` tool instead, so you'd do:

          $ jp foo=bar zed=yow
          { "foo": "bar", "zed": "yow" }
          $ jp foo=bar zed=yow | curl --json - https://example.com/destination
          ...result here...
    • sicariusnoctis 840 days ago
      Why not use the UNIX idea of composability and relegate all the JSON construction to another tool, e.g. `jo`:

          $ jo query="six times nine" answer=42 | curl --json -
          Sending JSON...
          {"query":"six times nine","answer":42}
  • petters 841 days ago
    > --jp a=b --jp c=d --jp e=2 --jp f=false

    Uh oh, this looks like it would have the problems of yaml. The data type changes based on the provided string.

    • zeroimpl 841 days ago
      Exactly. At least make it something like --jps for strings and --jpn for numbers.
  • leifg 841 days ago
    To everyone saying "just use tool x for this": the advantage of curl is that it is so widely available.

    For your development laptop you can install anything you want, but more often than not you need to log into an EC2 instance, a Docker container, you name it.

    Curl is often pre-installed or very easy to install. I know it's usually not an up-to-date version, but as time goes by you will be able to rely on this feature on pretty much any machine.

    • faeriechangling 841 days ago
      There's no feature you cannot justify adding to curl on the basis that it would be convenient to have it on any system.
  • kodah 841 days ago
    I was wondering, "Why not pipe output to JQ" up until I read this:

    > A not insignificant amount of people on stackoverflow etc have problems to send correct JSON with curl and to get the quoting done right, as json uses double-quotes by itself and shells don't expand variables within single quotes etc.

    It's about sanitized inputs.

    • masklinn 841 days ago
      > It's about sanitized inputs.

      Less sanitized and more correctly formatted. Writing literal JSON by hand at the CLI is not fun.

      • _wolfie_ 841 days ago
        That is why you use jq --arg for the input instead of crafting it by string concatenation.

            data=$(jq -n \
                    --arg title 'what"ever' \
                    --arg endpoint 'foo"bar' \
                    '{
                    "title": $title,
                    "endpoint": $endpoint,
                    "enabled": true
            }')
        • aesyondu 841 days ago
          I use jq exclusively for parsing output. Didn't know you could craft JSON with jq as well. Thanks for the tip!
      • oconnor663 841 days ago
        If I really had to get some JSON into a Bash script, and I couldn't just stick it in a file, I'd probably use a "heredoc" something like this:

            $ my_json="$(cat <<EOF
            > {
            >   "foo": "bar",
            >   "baz": 42
            > }
            > EOF
            > )"
            $ echo "$my_json"
            {
              "foo": "bar",
              "baz": 42
            }
        
        But I definitely didn't remember how to handle the closing paren and closing quote correctly, and I had to google for an example just now. So I'm not allowed to say this is easy to remember :)
        • aidenn0 841 days ago
          Random bit of trivia: on ksh you can do

            foo="$(cat <<EOF)"
            whatever
            EOF
          
          This is left undefined in the POSIX standard and bourne shells don't allow it.
    • aidenn0 841 days ago
      Fwiw, here docs solve both of these; they expand variables and allow double quotes and braces.
      • account42 840 days ago
        They don't solve escaping quotes inside the variables you paste into your JSON.
  • greenn 841 days ago
    Sounds like this idea is limited to the curl tool and wouldn't add anything to libcurl, which is great. I'd prefer libcurl leaving JSON to other libraries.

    I use bash variables inside JSON with curl all the time, which leads to string escape screw ups. I know there are alternatives that make testing REST + JSON easier, but since our software uses libcurl in production I prefer to test with curl to keep things consistent.

  • drewda 841 days ago
    I like using the httpie CLI, in part because it has a nice interface for sending JSON and receiving JSON: https://httpie.io/docs/cli/json
    • aesyondu 841 days ago
      I wish I knew about this feature sooner. I've been using:

          echo '{ "my": "json" }' | http post localhost/endpoint

  • justin_oaks 841 days ago
    I can see this being useful, but I'm not looking forward to the list of command line options being even longer. The output of "curl --help" on my system is already 212 lines long.

    I wish the curl command was split such that different protocols had different commands. I REALLY don't want to see a list of FTP specific command line options whenever I'm just trying to look up a lesser-used HTTP option.

    That said, this is really a minor gripe compared to just how useful curl has been for me over the years.

    • kortex 841 days ago
      I'm already well past the stage of using moar/micro/[rip]grep to scan through curl's manpage and find what I need.

      OTOH, if you ignore using curl to GET resources to download, >90% of my curl usage is slinging json, and often involves interpolating strings and hence copy-pasting, so this feature would be immediately useful to me.

      Curl is kind of the swiss army knife of the web so I don't think a long manpage is out of line.

    • prpl 841 days ago
      I always think of a perm every time I do “man curl”, which is more often than I’d like.
      • jcul 840 days ago
        I don't think I'll be able to forget this going forward any time I check the curl manpage!
    • makapuf 841 days ago
      Yes, a multitool like busybox with separate subcommand names would be nice.
  • hnarn 841 days ago
    In the linked github wiki there's an example of the syntax of the suggested --jp flag used to pass key-value pairs and put them together as a JSON object:[1]

    --jp a=b --jp c=d --jp e=2 --jp f=false

    Gives:

    { "a": "b", "c": "d", "e": 2, "f": false }

    --jp map=europe --jp prime[]=13 --jp prime[]=17 --jp target[x]=-10 --jp target[y]=32

    Gives:

    { "map": "europe", "prime": [ 13, 17 ], "target": { "x": -10, "y": 32 } }

    While this is neat, I suppose, it seems like such a waste that the first one isn't given as:

    --jp a=b,c=d,e=2,f=false

    And the second as:

    --jp map=europe --jp prime[]=13,17 --jp target[]=x:-10,y:32

    ...or similar. The repetition kind of bothers me.

    [1]: https://github.com/curl/curl/wiki/JSON

    • jffry 841 days ago
      Then you wouldn't be able to reliably use shell substitution to pass in values, which is a pity. The issue being if $var contained commas, like

        var="foo,y=bar"
        curl --jp "x=$var"
      
      Then allowing comma-separated field=value pairs within a single --jp argument would cause non-obvious changing behavior
      • hnarn 841 days ago
        You could possibly solve it by supporting single quotes for value containment:

            curl --jp "x='$var'"
        
        Gives:

            curl --jp "x='foo,y=bar'"
        
        Assuming that is what you meant. Isn’t this normally how these things are handled in unix shells? That and escaping which probably doesn’t apply here.

        I don’t know what the best way to implement this would be, but the current proposal looks so weird to me that I’m either completely missing something or it’s wildly unnecessary.

        • jffry 841 days ago
          If you do something like single quotes then you can no longer use the shell's abilities to properly pass in escaped values like this:

            bash$ foo=this\"test\"
            bash$ echo "value of \$foo is: $foo"
            value of $foo is: this"test"
          
          If we're throwing away the ability to let the shell handle escaping properly then there's really no point to --jp at all versus just manually attempting to cobble together JSON directly
  • ainar-g 841 days ago
    I feel like if you only want to make a single JSON request, a simple curl invocation with the JSON data in single quotes or in a file should be enough. And if you make many different JSON requests, you're probably much better off with one of the alternative tools.

    Related to the second point, I really wish more people put more time into creating tools for their testers: shell/Ruby/Python/Perl scripts that are custom-made for the specific service they're testing and provide a better UI. So that instead of a sequence of curl invocations, logins, and error-prone copy-pasting, people could just:

      test-my-service --user j.doe:hunter2 --api comments/create --param body="hello world"
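A minimal sketch of such a wrapper in POSIX shell. Everything here is illustrative (the function name, the simplified positional argument layout, and the base URL my-service.example), and it prints the curl command as a dry run instead of executing it, since the service is hypothetical:

```shell
# test_my_service: toy dry-run wrapper around curl --json.
# Usage: test_my_service USER:PASS API_PATH key=value...
test_my_service() {
  creds=$1; api=$2; shift 2
  # Build a flat JSON body from the remaining key=value args
  # (string values only, no escaping: a sketch, not production).
  body='{'; sep=''
  for kv in "$@"; do
    k=${kv%%=*}; v=${kv#*=}
    body="$body$sep\"$k\":\"$v\""
    sep=','
  done
  body="$body}"
  # Print the command instead of running it, so it's easy to inspect.
  printf "curl -u %s --json '%s' https://my-service.example/%s\n" \
    "$creds" "$body" "$api"
}

test_my_service j.doe:hunter2 comments/create body="hello world"
# prints: curl -u j.doe:hunter2 --json '{"body":"hello world"}' https://my-service.example/comments/create
```

Dropping the printf wrapper and calling curl directly turns this into a real tool; the point is that the error-prone quoting lives in one reviewed script rather than in every tester's shell history.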
  • advisedwang 841 days ago
    Some of the replies say this is a layer violation: HTTP doesn't care about JSON so curl shouldn't either. But you have to add Content-type and Accept headers when working in JSON, which I personally often forget, so I think this does make sense.
    • matt_kantor 841 days ago
      There's also plenty of stuff like this already in curl: --aws-sigv4, --data-urlencode, --form, --metalink, --oauth2-bearer, etc.
  • Macha 841 days ago
    I'm indifferent as to whether they do this or not (I can always use pipes and jq), but if they do, I hope the json-part option uses some syntax that's a subset of JSONPath and/or jq, so I don't have to learn a third syntax when people start using this.
  • softwarebeware 841 days ago
    I think this idea violates the Unix Philosophy. What should happen is that a separate utility could be used to pipe in the request body to cURL similar to https://stackoverflow.com/questions/12583930/use-pipe-for-cu...
    • throwawayboise 841 days ago
      Absolutely. Exactly my sentiments. Formatting JSON is a separate concern from what curl does.
    • 1vuio0pswjnm7 841 days ago
      Here is how one might adhere to the so-called UNIX philosophy.

      Utility #1: 580-character shell script to generate HTTP (NB. printf is a built-in)

      Utility #2: TCP client to send HTTP, e.g., netcat

      Utility #3: (Optional) TLS proxy if encryption desired, e.g., stunnel^1

      1. For more convenience use a proxy that performs DNS name lookups. Alternatively, use TLS-enabled client, e.g., openssl s_client, etc.

      Advantages over curl and similar programs: HTTP/1.1 pipelining

      For the purpose of an example, the shell script will be called "post". To demonstrate pipelining POST requests, we can send multiple requests to DuckDuckGo over a single TCP connection. TLS proxy is listening on 127.0.0.1:80.

           #! /bin/sh
           (   
           y=Connection;n=0;while read x;do
           x1=${1#*//};x2=${x1%%/*};x3=${x1#*/};
           if test x$x3 = x$x2;then x3="";fi;
           x=$(printf "%s" "${x%%=*}=";echo "${x#*=}");
           printf "%s\r\n%s\r\n%s\r\n%s\r\n" \
           "POST /${x3} HTTP/1.1" \
           "Host: ${x2}" \
           "Content-Type: application/x-www-form-urlencoded" \
           "Content-Length: ${#x}";
           if [ $n -gt 1 ];then 
           printf "%s\r\n\r\n%s\r\n" "$y: keep-alive" "$x";else
           printf "%s\r\n\r\n%s\r\n" "$y: close" "$x";fi;
           export n=$((n+1));
           done;
           if [ $n -gt 1 ];then
           printf "%s\r\n%s\r\n%s\r\n" \
           "GET /robots.txt HTTP/1.0" \
           "Host: ${x2}" \
           "$y: close";fi;
           )
      
      Put the queries in a file

          cat > 1.txt
          q=one
          q=two
          q=three
          ^D
      
      Send the queries

          post https://lite.duckduckgo.com/lite < 1.txt|nc -vvn 127.1 80 
      
      Send the queries, save the result, then read the result

          echo "<base href=https://lite.duckduckgo.com />" > 1.htm
          post https://lite.duckduckgo.com/lite < 1.txt|nc -vvn 127.1 80 >> 1.htm
          firefox ./1.htm
          links -no-connect ./1.htm
      
      Based on personal experience as an end user, I find that using separate utilities is faster and more flexible than curl or the similar programs mentioned in this thread. For me, 1. storage space for programs, e.g., large scripting-language interpreters and/or other large binaries, is in short supply and 2. HTTP/1.1 pipelining is a must-have. Using separate, small utilities 1. conserves space and 2. lets me do pipelining easily. I write many single-purpose utilities for my own use, including one that replaces the "post" shell script in this comment.
    • pixl97 841 days ago
      Unix philosophy is dead. Its body was crucified on the systemd cross.
      • pphysch 841 days ago
        I know systemdphobia is cool, but AWK predates systemd by 3 decades (and the Unix philosophy by 1 year!).

        Needless to say, the industry found powerful tools like AWK (and SystemD) more useful than rigid dogmas.

        • softwarebeware 841 days ago
          AWK was designed by Kernighan, who is on record as subscribing to the Unix Philosophy. For all I know, Aho and Weinberger also subscribe to it. I think it's safe to say that AWK and the Unix Philosophy are compatible. I have never seen anything that says otherwise.

          Was "rigid dogmas" in reference to the Unix Philosophy? I haven't ever seen it described that way.

          • pphysch 840 days ago
            My most readily available `man awk` spit out 1235 lines of text. Its namesake physical book is even longer. It is a Turing-complete language. It can execute arbitrary system commands with `system()`.

            It is the antithesis of the Unix Philosophy. Always has been. And that's okay.

  • er0k 841 days ago
    Check out curlie[0] which is really great and already does this. It's essentially a wrapper for curl with JSON support.

    [0] https://github.com/rs/curlie

  • tester756 841 days ago
    Great to see that author is open minded and pragmatic
  • wmanley 841 days ago
    It seems the goal is to make it easier to craft JSON by having curl perform escaping, while the proposal would seem to require some sort of in-memory tree representation of the data.

    One alternative would be to provide escaping more directly like this:

        curl --json '{
          "map": %s,
          "prime": [
            %i,
            %i
          ],
          "target": {
            "x": %i,
            "y": %i
          }
        }' "$continent" "$p1" "$p2" "$x" "$y" https://example.com
    
    And then curl would do the substitution with the appropriate type-specific escaping for each variable. This has a few nice properties:

    1. What's on the command line resembles what's actually going to be sent.

    2. Curl doesn't actually need to parse (nor validate) the JSON, or to create a tree representation of the data within itself. %s is invalid JSON anyway, so you can do a string substitution - all you need to keep track of are matching quotes (including escape sequences).

    I've used a printf style format string here, which could be expanded for extra convenience. For example the Python-style `%(env_var)s` sequences could be used which could expand environment variables directly. Or something could be added for convenient handling of bash arrays.
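    Something close to this substitution is already possible with plain shell printf, though with no JSON-aware escaping, which is exactly the gap the proposal would fill. A rough sketch (the variable values here are made up for illustration):

```shell
# printf substitutes the values but does NOT escape them; a stray
# double quote inside $continent would silently corrupt the document.
continent=Europe p1=2 p2=3 x=10 y=20
json=$(printf '{"map": "%s", "prime": [%d, %d], "target": {"x": %d, "y": %d}}' \
  "$continent" "$p1" "$p2" "$x" "$y")
echo "$json"
# curl -H "Content-Type: application/json" --data "$json" https://example.com
```

    With type-specific escaping built into curl, the same template would stay safe even when the substituted strings contain quotes or backslashes.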

  • gumby 841 days ago
    JSON is underspecified, leading to various incompatibilities between implementations.

    Because cURL is so ubiquitous, whatever Daniel implements may become the de facto standard.

    • Diggsey 841 days ago
      JSON isn't under-specified: you can tell whether something is valid JSON just from the rules here: http://www.json.org/json-en.html. What's ambiguous is the mapping of the JSON data model to the data models found in various languages. And that is not impacted by curl supporting JSON in the slightest:

      - The `--json` option only adds a content-type header, it doesn't alter the transmitted data at all.

      - The `--jp` option has a bespoke format that's not part of the JSON spec, and which doesn't actually depend on a specific data model, it's just string manipulation.

      • gumby 841 days ago
        See a detailed reply to a parallel comment.

        Also, --jp is actually generating JSON.

        • Diggsey 840 days ago
          Numbers are the main thing I assumed you were talking about; the JSON spec is clear though: numbers can be arbitrarily long.

          The problem is not with the JSON spec, the problem is when you are converting from one data model to another. Any program which claims to perfectly round-trip the JSON data-model should support arbitrarily long numbers, there's no ambiguity in the spec about that.

          If you are only parsing JSON as a means to encode your own data-model, then there's no obligation to support arbitrary precision, but users should not expect to be able to round-trip arbitrary JSON data.

          AFAICT, `--jp` doesn't do anything which would affect the length of supported numbers, even though it's generating JSON.

    • ch4s3 841 days ago
      I think that's the most interesting implication here. I really hope he lands on something suitable.
    • doliveira 841 days ago
      How is JSON underspecified? What kinds of incompatibilities?
      • latk 841 days ago
        JSON lets you write numbers. They can have a sign, decimal part, and an exponent. The standard euphemistically describes this as:

        > JSON is agnostic about the semantics of numbers. […] JSON instead offers only the representation of numbers that humans use: a sequence of digits. […] That is enough to allow interchange.

        But can you encode/decode an arbitrary integer or a float? Probably not!

        * Float values like Infinity or NaN cannot be represented.

        * JSON doesn't have separate representations for ints and floats. If an implementation decodes an integer value as a float, this might lose precision.

        * JSON doesn't impose any size limits. A JSON number could validly describe a 1000-bit integer, but no reasonable implementation would be able to decode this.

        The result is that sane programs – that don't want to be at the mercy of whatever JSON implementation processes the document – might encode large integers as strings. In particular, integers beyond JavaScript's Number.MAX_SAFE_INTEGER (2^53 - 1) should be considered unsafe in a JSON document.

        Another result is that real-world JSON implementations can't round-trip “correctly”: instead of treating numbers as “a sequence of digits” they might convert them to a float64, in which case a JSON → data model → JSON round trip might produce a different document. I would consider that a problem due to underspecification.

        • gumby 841 days ago
          The numbers were what I was mainly thinking of, so thanks for your exhaustive enumeration of those problems.

          json.org requires white space for empty arrays and objects while RFC 8259 does not (and I often see [] and {} in the wild).

          A lot of packages de facto break the spec in other ways, such as people blatting Python maps out rather than converting them to JSON, so that the keys are quoted as 'foo' rather than "foo". I've complained about this when trying to parse the stuff, only to receive the response "it works for me so you must have a bug" from the pythonistas. This has happened in multiple projects.

  • jdc0589 841 days ago
    This would provide some additional utility, but honestly I don't see the point. Anyone sending JSON via curl CLI a lot is probably having to manipulate JSON via CLI for purposes other than sending requests with curl as well. It makes more sense for most people to just learn one json manipulation tool and pipe input in and out of things that need it.
  • stackedinserter 841 days ago
    Also, setting "content-type: application/json" in POST/PUT/PATCH requests would be helpful.
    • makapuf 841 days ago
      Yes, I don't understand why the Accept header is set but not Content-Type, which seems even more obvious. Any reason?
  • abotsis 841 days ago
    What happened to “Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".”?

    If it’s too tough to integrate with other tools like jq, maybe that could provide for a better outcome.

    • pphysch 841 days ago
      Pragmatism happened. Pragmatism always beats dogma in the long run.
    • softwarebeware 841 days ago
      100%! People keep forgetting about the Unix Philosophy.
  • adolph 841 days ago
    I use cURL a lot. I can see how this would maybe be somewhat useful for working very quickly, but the wiki-given use cases of k-v pairs and lists are simple enough in raw JSON.

    Something that would be helpful is for cURL, HTTPie, Postman, Fiddler, etc to standardize on a request/response pair format such as Chrome's HAR. There are some tools in NPM and the below HAR to cURL too, so I think native HAR support would be more helpful than a JSON builder.

    https://mattcg.github.io/har-to-curl/

  • mg 841 days ago
    I would rather write a new tool - say jcurl - which uses curl under the hood.

    As a user I would not expect curl to have json functionality.

    And as a developer I would prefer to have one codebase deal with http and another one with json.

    • jrockway 841 days ago
      Curl has LDAP support, email support, etc. JSON is not a stretch of the imagination by any means.
    • pixl97 841 days ago
      When dealing with support on someone's docker image, for me it's far better to have this in one utility. Yeah, you can write a command with the current version, but cutting it down to --jp will be much easier.
  • JRGC1 841 days ago
    Perhaps use an existing JSON command line tool and pipe it into curl?
  • throwaddzuzxd 840 days ago
    I've included --json in a custom redefinition for years, glad to see something like that coming to the official binary!

        curl() {
          args=()
          for arg in "$@"; do
            case $arg in
            --json) args+=("-H" "Content-Type: application/json") ;;
            *) args+=("$arg") ;;
            esac
          done
          command curl "${args[@]}"
        }
  • fra 841 days ago
    If this means I can just use libcurl to GET a web endpoint and parse the JSON in a C program rather than have to manage multiple dependencies, I'm all for it!
  • smrtinsert 841 days ago
    Please no, I'm thinking of me having to deal with someones 800 line curl script in the future. Agreed this doesn't feel very unixy.
    • fhd2 841 days ago
      Them adding some conditional logic next doesn't seem like that much of a stretch now!

      If it wasn't for this kind of stuff, there probably wouldn't be as many jobs in IT as there are.

  • otar 841 days ago
    Hmm… any actual use cases for this? I don't find curl --jp a=b to be better than directly sending a payload to an HTTP resource.
    • mkdirp 841 days ago
      The issue is probably related to quoting, among other shell-related things. Observe:

          NAME=taterman
          EMAIL=sweettaterhater@taterman.com
          curl --jp "user=$NAME" --jp "email=$EMAIL" http://getdemtaters.com
      
      vs

          curl -d "{\"user\":\"$NAME\",\"email\":\"$EMAIL\"}" http://getdemtaters.com
      
      Even adding jq to requirements doesn't make it that much better:

          jq -n --arg name "$NAME" --arg email "$EMAIL" '{ "user": $name, "email": $email }' | curl -d @- http://getdemtaters.com
      • overtomanu 841 days ago
        it might be easier if you use a heredoc
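        For example, a sketch reusing the hypothetical host and variables from the parent comment; the heredoc body still expands $NAME and $EMAIL but needs no backslash escaping of the inner double quotes:

```shell
NAME=taterman
EMAIL=sweettaterhater@taterman.com
# cat <<EOF reads the body verbatim apart from $-expansion.
body=$(cat <<EOF
{"user": "$NAME", "email": "$EMAIL"}
EOF
)
echo "$body"
# curl -H "Content-Type: application/json" -d "$body" http://getdemtaters.com
```

        With `curl -d @- <<EOF` the heredoc can also be fed straight into curl's stdin.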
    • masklinn 841 days ago
      > Hmm… any actual use cases of this?

      ... interacting with APIs using cURL?

      > I don’t find curl —jp a=b to be better than directly sending a payload on a HTTP resource

      Getting JSON syntax right, error free, by hand, in a terminal, is not easy. The current equivalent of an eventual `curl --jp a=b` is

          curl -s -H "Content-Type: application/json" --data '{"a":"b"}'
      
      that's a lot of opportunities for getting it wrong.
      • ChrisOstler 841 days ago
        And even more when the JSON is dynamic rather than static: curl --jp a="$B"
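        A tiny illustration of that failure mode (the value of $B is hypothetical; per the proposal, --jp is meant to handle the escaping for you):

```shell
# If $B happens to contain a double quote, hand-built JSON
# silently becomes invalid.
B='say "hi"'
printf '{"a":"%s"}\n' "$B"   # emits {"a":"say "hi""}, which is not valid JSON
```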
      • naikrovek 841 days ago
        > Getting JSON syntax right, error free, by hand, in a terminal, is not easy.

        you are right, and I wonder why shells haven't done anything to address this. Fish might, actually. Colorization isn't really useful in aiding comprehension, but it is good at signaling that there is a parse error somewhere.

      • 8n4vidtmkvmk 841 days ago
        it's also pretty trivial to alias/wrap that tho
  • pbiggar 841 days ago
    This is great. When a new user uses Darklang, we want them to be able to make JSON API requests quickly and easily, and there aren't great client-side tools for that that you can expect users to have installed. Giving them a big long curl command is no fun, but `curl --json 'the-body'` would be amazing.
  • syspec 841 days ago
    Doesn't really look like it's adding anything, and the `jp` part looks like it will just leave the people referenced on Stack Overflow more confused.

    Often the JSON being sent is complex; I can't imagine anyone wanting to rewrite it into something else for anything other than two-field JSON objects.

  • softwarebeware 841 days ago
    I feel like jq already cornered the market on this. I'm unlikely to go back and update my scripts to use curl
  • jeremyjh 841 days ago
    I know I've done the quoting dance before; while exploring an API in one project I resorted to using zsh heredocs to build the payload argument to avoid all quoting issues. I'm sure there is a better way already, but it sounds nice to have this built into curl since it's so common.
  • bonkabonka 841 days ago
    I would prefer the --json flag to provide syntactic sugar for setting the Content-Type and Accept headers and leave the marshaling of data to a separate tool. Or, if it has to be baked in, refactor `jo` into `libjo` and a CLI wrapper so that the two tools behave the same way.
  • svnpenn 841 days ago
    Shouldn't this be a GitHub issue or GitHub discussion? Wiki is a weird format to use for a proposal.
    • cpach 841 days ago
      Honest question: Why does it matter…?
      • svnpenn 840 days ago
        How are you supposed to have a discussion on a wiki page? By editing in a comment?
  • Zamicol 841 days ago
    I use cURL for local development with JSON cookies and I think it's perfectly adequate for that purpose.

    curl --insecure --cookie test_cookie='{"test":"bob"}' https://localhost:8081/

    • mdoms 841 days ago
      Good news, you can keep doing it that way.
    • pbiggar 841 days ago
      You forgot `-H "Content-type: application/json; charset=utf-8"`
      • Zamicol 841 days ago
        --cookie puts the value of the cookie in the header:

            Host: localhost:8081
            Accept: */*
            Cookie: test_cookie={"test":"bob"}
            User-Agent: curl/7.74.0


      • edoceo 841 days ago
        No charset on this mime type.
        • pbiggar 841 days ago
          TIL! It seems to be quite a confused topic, as many things (some Java servers, for example) even require it, but you do seem to be right, as JSON must always be UTF-8.
    • svnpenn 841 days ago
      I think it's bad practice to only quote part of the argument like that.
      • Zamicol 841 days ago
        I agree. Quoting from the beginning is better for multiple cookies:

        curl --insecure --cookie 'test_cookie={"test":"bob"};test_cookie2={"test2":"bob"}' https://localhost:8081/

  • svnpenn 841 days ago
    Shouldn't this be a GitHub issue or GitHub discussion:

    https://github.com/curl/curl/wiki/JSON

    Wiki is a weird format to use for a proposal.

    • naikrovek 841 days ago
      this page started before discussions were a thing, and issues were never a good fit for this kind of thing.
  • pkrumins 841 days ago
    So great! This has been one of the most requested curl features for years. Without this feature, to send JSON, you had to craft a valid JSON string yourself or shell out to another utility that creates a valid JSON string.
  • datavirtue 841 days ago
    I just want to pass it a filename that contains the JSON. Never been a fan of heaving around POST bodies that dangle from a curl command... and I hate Postman.
  • tastroder 841 days ago
    • diogenesjunior 841 days ago
      It's not a dupe. They each link to pages talking about a similar topic, but the content is entirely different.
      • aequitas 841 days ago
        The other post's page literally links to this post's page. So we're discussing the same thing here, just with less context.
  • pixel_tracing 841 days ago
    How about opening a Vim- or Nano-like editor with an option flag --editor, where I can just paste the request body instead of passing flags, etc.?
  • intrasight 841 days ago
    I now use a C# REPL like csharp-script rather than task-specific tools such as curl. It's more flexible, powerful, and consistent.
  • 999900000999 841 days ago
    On one hand, this is awesome.

    But aren't there also several command line utilities which already support JSON?

    Why cram new stuff into such an industry standard tool?

    • soheil 841 days ago
      But curl has so many features already that it's odd to say now is the time to stop adding more. There is even SOCKS4 proxy support; does anyone even use that, now or ever?
    • ollien 841 days ago
      You kind of said it yourself: it's an industry standard tool, so it will (almost) always be available.
      • 999900000999 841 days ago
        My concern is they'll accidentally break something.
        • bryanlarsen 841 days ago
          I trust Daniel Stenberg to not break something more than I trust any other tool that currently does JSON.
    • masklinn 841 days ago
      > But aren't there also several command line utilities which already support JSON.

      There are command line utilities which consume, query, or format json.

      But aside from e.g. httpie (which is essentially a competitor to Curl), which "several command-line utilities" make authoring JSON easy and convenient?

      Because if you check point (3), the link, and the paragraph before it, this is entirely about sending valid JSON (ideally with the correct headers).

      In fact the second section of the link in question literally states:

      > # JSON response

      > No particular handling. Pipe output to jq or similar.

      • bonkabonka 841 days ago
        I mostly use `jo` to format JSON from the CLI (except when I have to use `jq` to do so and then I suffer).
      • Hjfrf 841 days ago
        ConvertTo-Json is a second example, if you're in that ecosystem.
      • 999900000999 841 days ago
        You can author your JSON in Postman and then export to curl.

        In fact, you can run Postman collections from the command line:

        https://learning.postman.com/docs/running-collections/using-...

        Or you can write your own script in Python to do this.

        I really don't like adding new functionality to standardized tools. It's just risky.

        Make an extension, call it CURLson, but don't cram it into curl.

    • edoceo 841 days ago
      JSON is used a lot, really a lot, with cURL. It's ubiquitous on the web. cURL's intention is to support all this common URL stuff; adding JSON seems a natural fit to me.
  • willcipriano 841 days ago
    I imagine this will be part of libcurl? If so, that makes it a one-stop shop for JSON REST programming in C.
  • bborud 841 days ago
    I wish curl had CoAP support
  • axiosgunnar 841 days ago
    Btw, did you notice how quickly the page loaded?
  • a45a33s 841 days ago
    will libcurl have built in json support or just the command line?
  • chrismeller 841 days ago
    Thank $deity. Jq suuuucks.
    • naikrovek 841 days ago
      it requires learning the query language, sure, but I would not say that it "suuuucks" by any means. maybe learning the query syntax sucks.
      • chrismeller 841 days ago
        Ok, I will further explain my complaints.

        There are two standards for selecting an element in a document: XPath and, ugh, jQuery selectors. jQuery is easy for newbies. XPath is the "real solution".

        jq uses neither of these. Why? Who the hell knows.

  • blibble 841 days ago
    hopefully we'll be getting command line arguments to build up XML documents soon too.....?

    (not serious)

    • fhd2 841 days ago
      Maybe you're being downvoted because you're just not thinking big enough? How about first class SOAP support?