A Simple API

I always like to spend some time reviewing and learning new stuff before starting a project or task. The amount of time depends on the urgency of the task. In this case I went over chapter 2 of RESTful Web APIs by Leonard Richardson and Mike Amundsen.

As I have mentioned several times, I like to research and think about the task, implement a Minimum Viable Product (MVP), and with the remaining time enhance the software until the scheduled time runs out. Based on my experience, this approach is welcomed by most customers and development teams. When I say research and think, depending on the level and type of the task (e.g., architecting, designing, implementing), such activities may take from an hour or so to several days or weeks. The time is greatly reduced when you constantly read and experiment, and when the task is implementation (i.e., generating and testing code).

To start, I wanted to get at the API of the http://www.youtypeitwepostit.com sample web site. A simple way is to use the wget command. By the way, I ran my experiments for this chapter on a Windows 10 machine. If you are not familiar with, or do not recall, the tons of options for wget, here they are:

C:\>wget --help
GNU Wget 1.20, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version                   display the version of Wget and exit
  -h,  --help                      print this help
  -b,  --background                go to background after startup
  -e,  --execute=COMMAND           execute a `.wgetrc'-style command

Logging and input file:
  -o,  --output-file=FILE          log messages to FILE
  -a,  --append-output=FILE        append messages to FILE
  -d,  --debug                     print lots of debugging information
  -q,  --quiet                     quiet (no output)
  -v,  --verbose                   be verbose (this is the default)
  -nv, --no-verbose                turn off verboseness, without being quiet
       --report-speed=TYPE         output bandwidth as TYPE.  TYPE can be bits
  -i,  --input-file=FILE           download URLs found in local or external FILE
       --input-metalink=FILE       download files covered in local Metalink FILE
  -F,  --force-html                treat input file as HTML
  -B,  --base=URL                  resolves HTML input-file links (-i -F)
                                     relative to URL
       --config=FILE               specify config file to use
       --no-config                 do not read any config file
       --rejected-log=FILE         log reasons for URL rejection to FILE

Download:
  -t,  --tries=NUMBER              set number of retries to NUMBER (0 unlimits)
       --retry-connrefused         retry even if connection is refused
       --retry-on-http-error=ERRORS    comma-separated list of HTTP errors to retry
  -O,  --output-document=FILE      write documents to FILE
  -nc, --no-clobber                skip downloads that would download to
                                     existing files (overwriting them)
       --no-netrc                  don't try to obtain credentials from .netrc
  -c,  --continue                  resume getting a partially-downloaded file
       --start-pos=OFFSET          start downloading from zero-based position OFFSET
       --progress=TYPE             select progress gauge type
       --show-progress             display the progress bar in any verbosity mode
  -N,  --timestamping              don't re-retrieve files unless newer than
                                     local
       --no-if-modified-since      don't use conditional if-modified-since get
                                     requests in timestamping mode
       --no-use-server-timestamps  don't set the local file's timestamp by
                                     the one on the server
  -S,  --server-response           print server response
       --spider                    don't download anything
  -T,  --timeout=SECONDS           set all timeout values to SECONDS
       --dns-timeout=SECS          set the DNS lookup timeout to SECS
       --connect-timeout=SECS      set the connect timeout to SECS
       --read-timeout=SECS         set the read timeout to SECS
  -w,  --wait=SECONDS              wait SECONDS between retrievals
       --waitretry=SECONDS         wait 1..SECONDS between retries of a retrieval
       --random-wait               wait from 0.5*WAIT...1.5*WAIT secs between retrievals
       --no-proxy                  explicitly turn off proxy
  -Q,  --quota=NUMBER              set retrieval quota to NUMBER
       --bind-address=ADDRESS      bind to ADDRESS (hostname or IP) on local host
       --limit-rate=RATE           limit download rate to RATE
       --no-dns-cache              disable caching DNS lookups
       --restrict-file-names=OS    restrict chars in file names to ones OS allows
       --ignore-case               ignore case when matching files/directories
  -4,  --inet4-only                connect only to IPv4 addresses
  -6,  --inet6-only                connect only to IPv6 addresses
       --prefer-family=FAMILY      connect first to addresses of specified family,
                                     one of IPv6, IPv4, or none
       --user=USER                 set both ftp and http user to USER
       --password=PASS             set both ftp and http password to PASS
       --ask-password              prompt for passwords
       --use-askpass=COMMAND       specify credential handler for requesting
                                     username and password.  If no COMMAND is
                                     specified the WGET_ASKPASS or the SSH_ASKPASS
                                     environment variable is used.
       --no-iri                    turn off IRI support
       --local-encoding=ENC        use ENC as the local encoding for IRIs
       --remote-encoding=ENC       use ENC as the default remote encoding
       --unlink                    remove file before clobber
       --keep-badhash              keep files with checksum mismatch (append .badhash)
       --metalink-index=NUMBER     Metalink application/metalink4+xml metaurl ordinal NUMBER
       --metalink-over-http        use Metalink metadata from HTTP response headers
       --preferred-location        preferred location for Metalink resources

Directories:
  -nd, --no-directories            don't create directories
  -x,  --force-directories         force creation of directories
  -nH, --no-host-directories       don't create host directories
       --protocol-directories      use protocol name in directories
  -P,  --directory-prefix=PREFIX   save files to PREFIX/..
       --cut-dirs=NUMBER           ignore NUMBER remote directory components

HTTP options:
       --http-user=USER            set http user to USER
       --http-password=PASS        set http password to PASS
       --no-cache                  disallow server-cached data
       --default-page=NAME         change the default page name (normally
                                     this is 'index.html'.)
  -E,  --adjust-extension          save HTML/CSS documents with proper extensions
       --ignore-length             ignore 'Content-Length' header field
       --header=STRING             insert STRING among the headers
       --compression=TYPE          choose compression, one of auto, gzip and none. (default: none)
       --max-redirect              maximum redirections allowed per page
       --proxy-user=USER           set USER as proxy username
       --proxy-password=PASS       set PASS as proxy password
       --referer=URL               include 'Referer: URL' header in HTTP request
       --save-headers              save the HTTP headers to file
  -U,  --user-agent=AGENT          identify as AGENT instead of Wget/VERSION
       --no-http-keep-alive        disable HTTP keep-alive (persistent connections)
       --no-cookies                don't use cookies
       --load-cookies=FILE         load cookies from FILE before session
       --save-cookies=FILE         save cookies to FILE after session
       --keep-session-cookies      load and save session (non-permanent) cookies
       --post-data=STRING          use the POST method; send STRING as the data
       --post-file=FILE            use the POST method; send contents of FILE
       --method=HTTPMethod         use method "HTTPMethod" in the request
       --body-data=STRING          send STRING as data. --method MUST be set
       --body-file=FILE            send contents of FILE. --method MUST be set
       --content-disposition       honor the Content-Disposition header when
                                     choosing local file names (EXPERIMENTAL)
       --content-on-error          output the received content on server errors
       --auth-no-challenge         send Basic HTTP authentication information
                                     without first waiting for the server's
                                     challenge

HTTPS (SSL/TLS) options:
       --secure-protocol=PR        choose secure protocol, one of auto, SSLv2,
                                     SSLv3, TLSv1, TLSv1_1, TLSv1_2 and PFS
       --https-only                only follow secure HTTPS links
       --no-check-certificate      don't validate the server's certificate
       --certificate=FILE          client certificate file
       --certificate-type=TYPE     client certificate type, PEM or DER
       --private-key=FILE          private key file
       --private-key-type=TYPE     private key type, PEM or DER
       --ca-certificate=FILE       file with the bundle of CAs
       --ca-directory=DIR          directory where hash list of CAs is stored
       --crl-file=FILE             file with bundle of CRLs
       --pinnedpubkey=FILE/HASHES  Public key (PEM/DER) file, or any number
                                   of base64 encoded sha256 hashes preceded by
                                   'sha256//' and separated by ';', to verify
                                   peer against
       --random-file=FILE          file with random data for seeding the SSL PRNG

       --ciphers=STR           Set the priority string (GnuTLS) or cipher list string (OpenSSL) directly.
                                   Use with care. This option overrides --secure-protocol.
                                   The format and syntax of this string depend on the specific SSL/TLS engine.
HSTS options:
       --no-hsts                   disable HSTS
       --hsts-file                 path of HSTS database (will override default)

FTP options:
       --ftp-user=USER             set ftp user to USER
       --ftp-password=PASS         set ftp password to PASS
       --no-remove-listing         don't remove '.listing' files
       --no-glob                   turn off FTP file name globbing
       --no-passive-ftp            disable the "passive" transfer mode
       --preserve-permissions      preserve remote file permissions
       --retr-symlinks             when recursing, get linked-to files (not dir)

FTPS options:
       --ftps-implicit                 use implicit FTPS (default port is 990)
       --ftps-resume-ssl               resume the SSL/TLS session started in the control connection when
                                         opening a data connection
       --ftps-clear-data-connection    cipher the control channel only; all the data will be in plaintext
       --ftps-fallback-to-ftp          fall back to FTP if FTPS is not supported in the target server
WARC options:
       --warc-file=FILENAME        save request/response data to a .warc.gz file
       --warc-header=STRING        insert STRING into the warcinfo record
       --warc-max-size=NUMBER      set maximum size of WARC files to NUMBER
       --warc-cdx                  write CDX index files
       --warc-dedup=FILENAME       do not store records listed in this CDX file
       --no-warc-compression       do not compress WARC files with GZIP
       --no-warc-digests           do not calculate SHA1 digests
       --no-warc-keep-log          do not store the log file in a WARC record
       --warc-tempdir=DIRECTORY    location for temporary files created by the
                                     WARC writer

Recursive download:
  -r,  --recursive                 specify recursive download
  -l,  --level=NUMBER              maximum recursion depth (inf or 0 for infinite)
       --delete-after              delete files locally after downloading them
  -k,  --convert-links             make links in downloaded HTML or CSS point to
                                     local files
       --convert-file-only         convert the file part of the URLs only (usually known as the basename)
       --backups=N                 before writing file X, rotate up to N backup files
  -K,  --backup-converted          before converting file X, back up as X.orig
  -m,  --mirror                    shortcut for -N -r -l inf --no-remove-listing
  -p,  --page-requisites           get all images, etc. needed to display HTML page
       --strict-comments           turn on strict (SGML) handling of HTML comments

Recursive accept/reject:
  -A,  --accept=LIST               comma-separated list of accepted extensions
  -R,  --reject=LIST               comma-separated list of rejected extensions
       --accept-regex=REGEX        regex matching accepted URLs
       --reject-regex=REGEX        regex matching rejected URLs
       --regex-type=TYPE           regex type (posix|pcre)
  -D,  --domains=LIST              comma-separated list of accepted domains
       --exclude-domains=LIST      comma-separated list of rejected domains
       --follow-ftp                follow FTP links from HTML documents
       --follow-tags=LIST          comma-separated list of followed HTML tags
       --ignore-tags=LIST          comma-separated list of ignored HTML tags
  -H,  --span-hosts                go to foreign hosts when recursive
  -L,  --relative                  follow relative links only
  -I,  --include-directories=LIST  list of allowed directories
       --trust-server-names        use the name specified by the redirection
                                     URL's last component
  -X,  --exclude-directories=LIST  list of excluded directories
  -np, --no-parent                 don't ascend to the parent directory

Email bug reports, questions, discussions to <bug-wget@gnu.org>
and/or open issues at https://savannah.gnu.org/bugs/?func=additem&group=wget.

Based on that, I issued the following wget command and wrote the returned API document to a file:

C:\Temp> wget -S --output-document=api.txt http://www.youtypeitwepostit.com/api/
--2019-03-27 10:10:49--  http://www.youtypeitwepostit.com/api/
Resolving www.youtypeitwepostit.com (www.youtypeitwepostit.com)... 35.170.227.83, 34.225.219.245, 34.206.37.239, ...
Connecting to www.youtypeitwepostit.com (www.youtypeitwepostit.com)|35.170.227.83|:80... connected.
HTTP request sent, awaiting response...
  HTTP/1.1 200 OK
  Server: Cowboy
  Connection: keep-alive
  Content-Type: application/json    <====
  Etag: "0af97d8172d203b6a83bbb64ff616b81"
  Last-Modified: Wed, 27 Mar 2019 14:13:43 GMT
  Date: Wed, 27 Mar 2019 15:10:49 GMT
  Transfer-Encoding: chunked
  Via: 1.1 vegur
Length: unspecified [application/json]
Saving to: 'api.txt'

api.txt                                                                        [ <=>

2019-03-27 10:10:50 (17.0 MB/s) - 'api.txt' saved [3590]

The book emphasizes the use of the Content-Type application/vnd.collection+json, but as you can see in the last console capture (note the arrow), the Content-Type returned is application/json. I did spend some time looking for the specification of this media type on the web. I found the IANA registration (https://www.iana.org/assignments/media-types/application/vnd.collection+json), which led me to RFC 4627 (https://www.ietf.org/rfc/rfc4627.txt). After reading both, I was not able to determine why the test site does not use the content type in question, as I will demonstrate shortly. Perhaps it was a good idea at the time, but it appears the media type did not gain traction with web browsers and servers.
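Because the server may label a Collection+JSON document as plain application/json, a tolerant client should inspect the media type itself rather than demand an exact match. A minimal sketch in Python using only the standard library (the function names are mine, not from the book or the site):

```python
from email.message import Message  # stdlib parser for MIME-style header values

def media_type(content_type_header):
    """Extract the bare media type, dropping parameters such as charset."""
    msg = Message()
    msg["Content-Type"] = content_type_header
    return msg.get_content_type()

def looks_like_collection_json(content_type_header):
    # The site returns application/json even though the payload follows
    # the Collection+JSON conventions, so a lenient client accepts both.
    return media_type(content_type_header) in (
        "application/vnd.collection+json",
        "application/json",
    )
```

A client would call `looks_like_collection_json(response_headers["Content-Type"])` before attempting to parse the body as a collection document.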

Following are the contents of the file generated by the last command:

C:\Temp>type api.txt
{
    "collection" :
    {
        "version" : "1.0",
        "href" : "http://www.youtypeitwepostit.com/api/",

        "items" :
        [
    {
        "href": "http://www.youtypeitwepostit.com/api/8642134473193437",
        "data": [
            {
                "name": "text",
                "value": "Simple"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T14:13:43.226Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/035530898720026016",
        "data": [
            {
                "name": "text",
                "value": "Hello "
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T13:59:41.664Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/24315901892259717",
        "data": [
            {
                "name": "text",
                "value": "cv"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T11:18:44.151Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/21129854791797698",
        "data": [
            {
                "name": "text",
                "value": "test"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T11:33:11.294Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/613856331910938",
        "data": [
            {
                "name": "text",
                "value": "Squidly!"
            },
            {
                "name": "date_posted",
                "value": "2013-03-28T21:51:08.406Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/9628056429792196",
        "data": [
            {
                "name": "text",
                "value": "ju"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T10:41:52.501Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/6797472003381699",
        "data": [
            {
                "name": "text",
                "value": "John"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T14:01:03.392Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/0663905143737793",
        "data": [
            {
                "name": "text",
                "value": "123456"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T14:01:20.029Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/5335018462501466",
        "data": [
            {
                "name": "text",
                "value": "Hello_"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T14:00:32.111Z"
            }
        ]
    },
    {
        "href": "http://www.youtypeitwepostit.com/api/9095603178720921",
        "data": [
            {
                "name": "text",
                "value": "hjjhj"
            },
            {
                "name": "date_posted",
                "value": "2019-03-27T10:37:43.881Z"
            }
        ]
    }
]
        ,

        "template" : {
            "data" : [
                {"prompt" : "Text of message", "name" : "text", "value" : ""}
            ]
        }
    }
}

The format is JSON. The document holds a collection, and each resource (item) in it is described in detail. The template section at the end of the JSON document illustrates how to issue a POST: the only thing a user of the API needs to do is fill in the “value” field. We will give this a try.
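Parsing the collection document is straightforward: walk the items array and flatten each item's name/value data pairs. A small sketch (the helper name `extract_messages` is mine), fed a one-item excerpt of the document above:

```python
import json

def extract_messages(collection_doc):
    """Flatten each item's name/value data pairs into a plain dict."""
    messages = []
    for item in collection_doc["collection"]["items"]:
        fields = {pair["name"]: pair["value"] for pair in item["data"]}
        fields["href"] = item["href"]  # keep the per-item resource URL
        messages.append(fields)
    return messages

sample = json.loads("""
{
  "collection": {
    "version": "1.0",
    "href": "http://www.youtypeitwepostit.com/api/",
    "items": [
      {
        "href": "http://www.youtypeitwepostit.com/api/8642134473193437",
        "data": [
          {"name": "text", "value": "Simple"},
          {"name": "date_posted", "value": "2019-03-27T14:13:43.226Z"}
        ]
      }
    ]
  }
}
""")

for message in extract_messages(sample):
    print(message["text"], message["date_posted"])
```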

Let’s quickly take a look at the options for curl, which we will use to send a POST request. It might be a good idea to first experiment with the site in your web browser using the URL provided in the API document. The site is quite trivial by design: you can only post a text message of up to 6 characters.

C:\Temp>curl --help
Usage: curl [options...] <url>
     --abstract-unix-socket <path> Connect via abstract Unix domain socket
     --anyauth       Pick any authentication method
 -a, --append        Append to target file when uploading
     --basic         Use HTTP Basic Authentication
     --cacert <CA certificate> CA certificate to verify peer against
     --capath <dir>  CA directory to verify peer against
 -E, --cert <certificate[:password]> Client certificate file and password
     --cert-status   Verify the status of the server certificate
     --cert-type <type> Certificate file type (DER/PEM/ENG)
     --ciphers <list of ciphers> SSL ciphers to use
     --compressed    Request compressed response
 -K, --config <file> Read config from a file
     --connect-timeout <seconds> Maximum time allowed for connection
     --connect-to <HOST1:PORT1:HOST2:PORT2> Connect to host
 -C, --continue-at <offset> Resumed transfer offset
 -b, --cookie <data> Send cookies from string/file
 -c, --cookie-jar <filename> Write cookies to <filename> after operation
     --create-dirs   Create necessary local directory hierarchy
     --crlf          Convert LF to CRLF in upload
     --crlfile <file> Get a CRL list in PEM format from the given file
 -d, --data <data>   HTTP POST data
     --data-ascii <data> HTTP POST ASCII data
     --data-binary <data> HTTP POST binary data
     --data-raw <data> HTTP POST data, '@' allowed
     --data-urlencode <data> HTTP POST data url encoded
     --delegation <LEVEL> GSS-API delegation permission
     --digest        Use HTTP Digest Authentication
 -q, --disable       Disable .curlrc
     --disable-eprt  Inhibit using EPRT or LPRT
     --disable-epsv  Inhibit using EPSV
     --dns-interface <interface> Interface to use for DNS requests
     --dns-ipv4-addr <address> IPv4 address to use for DNS requests
     --dns-ipv6-addr <address> IPv6 address to use for DNS requests
     --dns-servers <addresses> DNS server addrs to use
 -D, --dump-header <filename> Write the received headers to <filename>
     --egd-file <file> EGD socket path for random data
     --engine <name> Crypto engine to use
     --expect100-timeout <seconds> How long to wait for 100-continue
 -f, --fail          Fail silently (no output at all) on HTTP errors
     --fail-early    Fail on first transfer error, do not continue
     --false-start   Enable TLS False Start
 -F, --form <name=content> Specify HTTP multipart POST data
     --form-string <name=string> Specify HTTP multipart POST data
     --ftp-account <data> Account data string
     --ftp-alternative-to-user <command> String to replace USER [name]
     --ftp-create-dirs Create the remote dirs if not present
     --ftp-method <method> Control CWD usage
     --ftp-pasv      Use PASV/EPSV instead of PORT
 -P, --ftp-port <address> Use PORT instead of PASV
     --ftp-pret      Send PRET before PASV
     --ftp-skip-pasv-ip Skip the IP address for PASV
     --ftp-ssl-ccc   Send CCC after authenticating
     --ftp-ssl-ccc-mode <active/passive> Set CCC mode
     --ftp-ssl-control Require SSL/TLS for FTP login, clear for transfer
 -G, --get           Put the post data in the URL and use GET
 -g, --globoff       Disable URL sequences and ranges using {} and []
 -I, --head          Show document info only
 -H, --header <header/@file> Pass custom header(s) to server
 -h, --help          This help text
     --hostpubmd5 <md5> Acceptable MD5 hash of the host public key
 -0, --http1.0       Use HTTP 1.0
     --http1.1       Use HTTP 1.1
     --http2         Use HTTP 2
     --http2-prior-knowledge Use HTTP 2 without HTTP/1.1 Upgrade
     --ignore-content-length Ignore the size of the remote resource
 -i, --include       Include protocol response headers in the output
 -k, --insecure      Allow insecure server connections when using SSL
     --interface <name> Use network INTERFACE (or address)
 -4, --ipv4          Resolve names to IPv4 addresses
 -6, --ipv6          Resolve names to IPv6 addresses
 -j, --junk-session-cookies Ignore session cookies read from file
     --keepalive-time <seconds> Interval time for keepalive probes
     --key <key>     Private key file name
     --key-type <type> Private key file type (DER/PEM/ENG)
     --krb <level>   Enable Kerberos with security <level>
     --libcurl <file> Dump libcurl equivalent code of this command line
     --limit-rate <speed> Limit transfer speed to RATE
 -l, --list-only     List only mode
     --local-port <num/range> Force use of RANGE for local port numbers
 -L, --location      Follow redirects
     --location-trusted Like --location, and send auth to other hosts
     --login-options <options> Server login options
     --mail-auth <address> Originator address of the original email
     --mail-from <address> Mail from this address
     --mail-rcpt <address> Mail to this address
 -M, --manual        Display the full manual
     --max-filesize <bytes> Maximum file size to download
     --max-redirs <num> Maximum number of redirects allowed
 -m, --max-time <time> Maximum time allowed for the transfer
     --metalink      Process given URLs as metalink XML file
     --negotiate     Use HTTP Negotiate (SPNEGO) authentication
 -n, --netrc         Must read .netrc for user name and password
     --netrc-file <filename> Specify FILE for netrc
     --netrc-optional Use either .netrc or URL
 -:, --next          Make next URL use its separate set of options
     --no-alpn       Disable the ALPN TLS extension
 -N, --no-buffer     Disable buffering of the output stream
     --no-keepalive  Disable TCP keepalive on the connection
     --no-npn        Disable the NPN TLS extension
     --no-sessionid  Disable SSL session-ID reusing
     --noproxy <no-proxy-list> List of hosts which do not use proxy
     --ntlm          Use HTTP NTLM authentication
     --ntlm-wb       Use HTTP NTLM authentication with winbind
     --oauth2-bearer <token> OAuth 2 Bearer Token
 -o, --output <file> Write to file instead of stdout
     --pass <phrase> Pass phrase for the private key
     --path-as-is    Do not squash .. sequences in URL path
     --pinnedpubkey <hashes> FILE/HASHES Public key to verify peer against
     --post301       Do not switch to GET after following a 301
     --post302       Do not switch to GET after following a 302
     --post303       Do not switch to GET after following a 303
     --preproxy [protocol://]host[:port] Use this proxy first
 -#, --progress-bar  Display transfer progress as a bar
     --proto <protocols> Enable/disable PROTOCOLS
     --proto-default <protocol> Use PROTOCOL for any URL missing a scheme
     --proto-redir <protocols> Enable/disable PROTOCOLS on redirect
 -x, --proxy [protocol://]host[:port] Use this proxy
     --proxy-anyauth Pick any proxy authentication method
     --proxy-basic   Use Basic authentication on the proxy
     --proxy-cacert <file> CA certificate to verify peer against for proxy
     --proxy-capath <dir> CA directory to verify peer against for proxy
     --proxy-cert <cert[:passwd]> Set client certificate for proxy
     --proxy-cert-type <type> Client certificate type for HTTPS proxy
     --proxy-ciphers <list> SSL ciphers to use for proxy
     --proxy-crlfile <file> Set a CRL list for proxy
     --proxy-digest  Use Digest authentication on the proxy
     --proxy-header <header/@file> Pass custom header(s) to proxy
     --proxy-insecure Do HTTPS proxy connections without verifying the proxy
     --proxy-key <key> Private key for HTTPS proxy
     --proxy-key-type <type> Private key file type for proxy
     --proxy-negotiate Use HTTP Negotiate (SPNEGO) authentication on the proxy
     --proxy-ntlm    Use NTLM authentication on the proxy
     --proxy-pass <phrase> Pass phrase for the private key for HTTPS proxy
     --proxy-service-name <name> SPNEGO proxy service name
     --proxy-ssl-allow-beast Allow security flaw for interop for HTTPS proxy
     --proxy-tlsauthtype <type> TLS authentication type for HTTPS proxy
     --proxy-tlspassword <string> TLS password for HTTPS proxy
     --proxy-tlsuser <name> TLS username for HTTPS proxy
     --proxy-tlsv1   Use TLSv1 for HTTPS proxy
 -U, --proxy-user <user:password> Proxy user and password
     --proxy1.0 <host[:port]> Use HTTP/1.0 proxy on given port
 -p, --proxytunnel   Operate through a HTTP proxy tunnel (using CONNECT)
     --pubkey <key>  SSH Public key file name
 -Q, --quote         Send command(s) to server before transfer
     --random-file <file> File for reading random data from
 -r, --range <range> Retrieve only the bytes within RANGE
     --raw           Do HTTP "raw"; no transfer decoding
 -e, --referer <URL> Referrer URL
 -J, --remote-header-name Use the header-provided filename
 -O, --remote-name   Write output to a file named as the remote file
     --remote-name-all Use the remote file name for all URLs
 -R, --remote-time   Set the remote file's time on the local output
 -X, --request <command> Specify request command to use
     --request-target Specify the target for this request
     --resolve <host:port:address> Resolve the host+port to this address
     --retry <num>   Retry request if transient problems occur
     --retry-connrefused Retry on connection refused (use with --retry)
     --retry-delay <seconds> Wait time between retries
     --retry-max-time <seconds> Retry only within this period
     --sasl-ir       Enable initial response in SASL authentication
     --service-name <name> SPNEGO service name
 -S, --show-error    Show error even when -s is used
 -s, --silent        Silent mode
     --socks4 <host[:port]> SOCKS4 proxy on given host + port
     --socks4a <host[:port]> SOCKS4a proxy on given host + port
     --socks5 <host[:port]> SOCKS5 proxy on given host + port
     --socks5-basic  Enable username/password auth for SOCKS5 proxies
     --socks5-gssapi Enable GSS-API auth for SOCKS5 proxies
     --socks5-gssapi-nec Compatibility with NEC SOCKS5 server
     --socks5-gssapi-service <name> SOCKS5 proxy service name for GSS-API
     --socks5-hostname <host[:port]> SOCKS5 proxy, pass host name to proxy
 -Y, --speed-limit <speed> Stop transfers slower than this
 -y, --speed-time <seconds> Trigger 'speed-limit' abort after this time
     --ssl           Try SSL/TLS
     --ssl-allow-beast Allow security flaw to improve interop
     --ssl-no-revoke Disable cert revocation checks (WinSSL)
     --ssl-reqd      Require SSL/TLS
 -2, --sslv2         Use SSLv2
 -3, --sslv3         Use SSLv3
     --stderr        Where to redirect stderr
     --suppress-connect-headers Suppress proxy CONNECT response headers
     --tcp-fastopen  Use TCP Fast Open
     --tcp-nodelay   Use the TCP_NODELAY option
 -t, --telnet-option <opt=val> Set telnet option
     --tftp-blksize <value> Set TFTP BLKSIZE option
     --tftp-no-options Do not send any TFTP options
 -z, --time-cond <time> Transfer based on a time condition
     --tls-max <VERSION> Use TLSv1.0 or greater
     --tlsauthtype <type> TLS authentication type
     --tlspassword   TLS password
     --tlsuser <name> TLS user name
 -1, --tlsv1         Use TLSv1.0 or greater
     --tlsv1.0       Use TLSv1.0
     --tlsv1.1       Use TLSv1.1
     --tlsv1.2       Use TLSv1.2
     --tlsv1.3       Use TLSv1.3
     --tr-encoding   Request compressed transfer encoding
     --trace <file>  Write a debug trace to FILE
     --trace-ascii <file> Like --trace, but without hex output
     --trace-time    Add time stamps to trace/verbose output
     --unix-socket <path> Connect through this Unix domain socket
 -T, --upload-file <file> Transfer local FILE to destination
     --url <url>     URL to work with
 -B, --use-ascii     Use ASCII/text transfer
 -u, --user <user:password> Server user and password
 -A, --user-agent <name> Send User-Agent <name> to server
 -v, --verbose       Make the operation more talkative
 -V, --version       Show version number and quit
 -w, --write-out <format> Use output FORMAT after completion
     --xattr         Store metadata in extended file attributes

As previously described, we need to populate a template with our extremely small post. The JSON document for our first POST request follows:

{
   "template" : {
        "data" :    [
                        {"prompt" : "Text of message", "name" : "text", "value" : "JC_000"}
                    ]
    }
}
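The same body can be generated programmatically instead of typed by hand. Here is a minimal Python sketch (the api_post.txt filename and the JC_000 message are the ones from the example above):

```python
import json

# Build the Collection+JSON template body shown above.
template = {
    "template": {
        "data": [
            {"prompt": "Text of message", "name": "text", "value": "JC_000"}
        ]
    }
}

# Write it to api_post.txt so curl can read it with -d @api_post.txt.
with open("api_post.txt", "w") as f:
    json.dump(template, f, indent=4)

print(json.dumps(template))
```

Generating the file this way also sidesteps the quote-escaping issues we run into later when writing the JSON directly on the command line.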

Now we can use curl to POST our entry.

# **** failure ****
C:\Temp> curl http://www.youtypeitwepostit.com -H "Content-Type: application/vnd.collection+json" -X POST -d @api_post.txt


<h1>Method Not Allowed</h1>

# **** failure ****
C:\Temp> curl http://www.youtypeitwepostit.com -H "Content-Type: application/json" -X POST -d @api_post.txt


<h1>Method Not Allowed</h1>

# **** success [1] ****
C:\Temp> curl http://www.youtypeitwepostit.com/api/ -H "Content-Type: application/vnd.collection+json" -X POST -d @api_post.txt

# **** success [2] ****
C:\Temp> curl http://www.youtypeitwepostit.com/api/ ^
-H "Content-Type: application/vnd.collection+json" ^
-X POST ^
-d @api_post.txt

# **** success [3] ****
C:\Temp>curl http://www.youtypeitwepostit.com/api/ ^
More? -H "Content-Type: application/vnd.collection+json" ^
More? -X POST ^
More? -d  "{ \"template\" : { \"data\" : [ {\"prompt\" : \"Text of message\", \"name\" : \"text\", \"value\" : \"JC_004\"} ] } }" ^
More? -vvv
Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying 54.152.111.238...
* TCP_NODELAY set
* Connected to www.youtypeitwepostit.com (54.152.111.238) port 80 (#0)
> POST /api/ HTTP/1.1
> Host: www.youtypeitwepostit.com
> User-Agent: curl/7.55.1
> Accept: */*
> Content-Type: application/vnd.collection+json
> Content-Length: 101
>
* upload completely sent off: 101 out of 101 bytes
< HTTP/1.1 201 Created
< Server: Cowboy
< Connection: keep-alive
< Location: http://www.youtypeitwepostit.com/api/6395999125670642   <====
< Date: Thu, 28 Mar 2019 10:56:07 GMT
< Transfer-Encoding: chunked
< Via: 1.1 vegur
<
* Connection #0 to host www.youtypeitwepostit.com left intact

The first two attempts fail because we POSTed to the site's main URL. The POST request must be made against the API endpoint (http://www.youtypeitwepostit.com/api/), as the remaining attempts illustrate. On the third attempt we use the api_post.txt file with the proper URL, and the request succeeds.

We then split the same command across four consecutive lines. Note that on Windows you use the ‘^’ (caret) to continue a line, while on Linux you use a ‘\’ (backslash). It would be nice if Windows standardized on the Linux convention, which comes from UNIX and was on the market a decade or so earlier. As another unrelated example, on Windows you use “cls” to clear the screen; on Linux you use “clear”.

On the next attempt we eliminate the file and write the body of the request directly on the command line, again split across multiple lines. Note that this time we need to enclose the JSON in double quotes and escape every ‘"’ (double quote) character inside it.

I tried several different messages by using different numbers in the “JC_xxx” string. Note that in the last attempt the resource is located at the URL flagged with an arrow. Let’s see if we can retrieve that last post.

C:\Temp> curl http://www.youtypeitwepostit.com/api/6395999125670642 -vvv
*   Trying 52.71.195.70...
* TCP_NODELAY set
* Connected to www.youtypeitwepostit.com (52.71.195.70) port 80 (#0)
> GET /api/6395999125670642 HTTP/1.1
> Host: www.youtypeitwepostit.com
> User-Agent: curl/7.55.1
> Accept: */*
>
< HTTP/1.1 200 OK
< Server: Cowboy
< Connection: keep-alive
< Content-Type: application/json
< Etag: "f2e2f7f016542392aef14105108693e1"
< Last-Modified: Thu, 28 Mar 2019 10:56:07 GMT
< Date: Thu, 28 Mar 2019 10:59:02 GMT
< Transfer-Encoding: chunked
< Via: 1.1 vegur
<
{
    "collection" :
    {
        "version" : "1.0",
        "href" : "http://www.youtypeitwepostit.com/api/",

        "items" :
        [{
    "href": "http://www.youtypeitwepostit.com/api/6395999125670642",
    "data": [
        {
            "name": "text",
            "value": "JC_004"
        },
        {
            "name": "date_posted",
            "value": "2019-03-28T10:56:07.280Z"
        }
    ]
}]
        ,

        "template" : {
            "data" : [
                {"prompt" : "Text of message", "name" : "text", "value" : ""}
            ]
        }
    }
}
* Connection #0 to host www.youtypeitwepostit.com left intact

We used curl to issue a GET request (the default method) and retrieved a JSON document containing the text “JC_004”.
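A short Python sketch shows how the returned collection could be parsed to pull out the message text. The sample document below is trimmed from the response above; the item_values helper is my own, not part of the book:

```python
import json

# Trimmed copy of the Collection+JSON document returned by the GET above.
response_body = """
{
    "collection": {
        "version": "1.0",
        "href": "http://www.youtypeitwepostit.com/api/",
        "items": [{
            "href": "http://www.youtypeitwepostit.com/api/6395999125670642",
            "data": [
                {"name": "text", "value": "JC_004"},
                {"name": "date_posted", "value": "2019-03-28T10:56:07.280Z"}
            ]
        }]
    }
}
"""

def item_values(doc, name):
    """Collect the value of the named data field from every item."""
    collection = doc["collection"]
    return [d["value"]
            for item in collection.get("items", [])
            for d in item.get("data", [])
            if d["name"] == name]

doc = json.loads(response_body)
print(item_values(doc, "text"))  # ['JC_004']
```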

In a nutshell, we were able to retrieve, as a JSON document, the URLs of the different resources on the site; we were able to get individual posts; and, most importantly, we learned how to make POST and GET requests using the provided API. It does not take much to realize that we could write a simple program to fetch the template and make POST and GET requests.
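Such a program might look like the following sketch, using only the Python standard library. The endpoint, headers, and template match the curl examples above; the JC_005 message number and the RUN_LIVE environment flag (which keeps the live calls from firing unless you ask for them) are my own assumptions:

```python
import json
import os
import urllib.request

API = "http://www.youtypeitwepostit.com/api/"

def build_body(message):
    """Fill the Collection+JSON template with our message text."""
    return json.dumps({
        "template": {"data": [
            {"prompt": "Text of message", "name": "text", "value": message}
        ]}
    }).encode("utf-8")

def post_message(message):
    """POST a new entry; return the Location header of the created resource."""
    req = urllib.request.Request(
        API, data=build_body(message), method="POST",
        headers={"Content-Type": "application/vnd.collection+json"})
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]

def get_message(url):
    """GET a resource and return the parsed JSON document."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

if os.environ.get("RUN_LIVE"):
    location = post_message("JC_005")  # hypothetical next message number
    print("Created:", location)
    print(get_message(location))
```

With RUN_LIVE set, this should reproduce the 201 Created / Location round trip we saw in the curl transcripts, assuming the site is still up.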

I will not put the examples from this post on GitHub. I believe it is quite easy to copy, edit, and paste the text if you wish to experiment.

Hope you enjoyed this post. If you have comments or questions regarding this or any other post in this blog, or if you need help with a software development project, please leave me a note below. All requests for help will be kept confidential.

Keep on reading and experimenting. It is the best way to learn.

John

Follow me on Twitter:  @john_canessa
