Hi all,
I want to grab all the files belonging to a website. I think it's possible with the help of the URL and URI classes.
I can download a file from a site using the URL class, but I can't find out which files are under the site.
If anyone knows how to extract the file names under a web directory, please share it with me.
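For the "download a file with the URL class" part, here is a minimal sketch. The URL (example.com) and the output file name are placeholders for illustration, not anything from the thread:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;

public class Downloader {

    // Copies every byte from the input stream to the output stream.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical page; replace with the file you actually want to save.
        URL url = new URL("https://example.com/index.html");
        try (InputStream in = url.openStream();
             OutputStream out = new FileOutputStream("index.html")) {
            copy(in, out);
        }
    }
}
```

This only fetches one known file; it doesn't solve the "list all files on the server" problem, which is what the rest of the thread is about.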
Every page is a file... xx.html
I know the site name, but I don't know what files are on that site. If I knew all the file paths, I could download the entire site.
For security reasons, the files on a webserver/website aren't accessible to the general public unless a webmaster/admin makes them so. An easy way of exposing all the files on a webserver is simply not having an index page.
Thanks a lot, tommosimmo.
Earlier I tried to copy the site by accessing the file names on the server, but from your comment I accept that it isn't possible.
Now my plan is to get all the file names via index.html.
Do you know how to parse HTML to find the links available on a site?
Search on regular expressions.
You can look for "http://" or "www" to find the links.
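A quick sketch of that suggestion, using java.util.regex. The pattern here is a naive one I chose for illustration (it grabs absolute http/https URLs out of raw HTML); real-world HTML would be better handled by a proper parser:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkFinder {

    // Naive pattern: an absolute URL runs from "http://" or "https://"
    // until whitespace, a quote, or an angle bracket.
    static final Pattern ABSOLUTE = Pattern.compile("https?://[^\\s\"'<>]+");

    // Returns every absolute link found in the given chunk of HTML.
    static List<String> findAbsoluteLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = ABSOLUTE.matcher(html);
        while (m.find()) {
            links.add(m.group());
        }
        return links;
    }
}
```

For example, findAbsoluteLinks("<a href=\"http://example.com/a.html\">x</a>") yields a list containing http://example.com/a.html.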
One thing, though: index files contain links as relative paths, so it's difficult to find them with a regular expression for "www" or "http:".
You can also search for <a href=" instead; that will work.
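Combining that with the relative-path concern above: match on href="..." and then resolve each value against the page's base URL with java.net.URI.resolve. The regex assumes double-quoted href attributes, which is a simplification on my part:

```java
import java.net.URI;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class HrefResolver {

    // Matches <a ... href="..."> and captures the href value.
    // Assumes double quotes; a real HTML parser would be more robust.
    static final Pattern HREF =
            Pattern.compile("<a\\s+[^>]*href=\"([^\"]*)\"", Pattern.CASE_INSENSITIVE);

    // Extracts href values and resolves relative ones against the base URL,
    // so "pics/a.jpg" on http://example.com/index.html becomes absolute.
    static List<String> resolveLinks(String html, String baseUrl) {
        URI base = URI.create(baseUrl);
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(base.resolve(m.group(1)).toString());
        }
        return links;
    }
}
```

This way both absolute and relative links come out as full URLs you can feed back into the downloader.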
Now I'm trying the way you described here...