Results 1 to 4 of 4
  1. #1
    Juuno is offline Member
    Join Date
    Feb 2009
    Posts
    18
    Rep Power
    0

    Default Fetch files over web server

    Hi, I am just a newbie and I need to write some programs.

    I have a directory on an Apache web server, and under that directory I have 35 XML files. The path for the directory is //localhost:8081/dirxml/

    The paths for the XML files are like:
    //localhost:8081/dirxml/mycontacts.xml
    //localhost:8081/dirxml/favourtieactor.xml
    //localhost:8081/dirxml/favouritesinger.xml
    ....
    ....
    .... and so on....


    I would like to fetch them from my Java program and then produce a single XML file that combines them.

    So, how can I fetch those XML files from the web server? Is it OK to use the java.net.URL class? What I found so far is that it needs the exact URL path, like //localhost:8081/dirxml/mycontacts.xml, but I can't do that because I don't want to hard-code all the file names. What I want is to fetch every file under that directory that ends in .xml, without knowing the file names in advance.

    How can I do this? And is there any reference for it?

    Thanks in advance.

  2. #2
    fishtoprecords's Avatar
    fishtoprecords is offline Senior Member
    Join Date
    Jun 2008
    Posts
    571
    Rep Power
    7

    Default

    You really want to use a Java library that lets your application emulate a web browser. While you can do it all by hand using the URL classes, it's fairly hard to get all the special cases to work.

    Get the Apache Java HTTP client library and use it.
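
    For the simple case, the "by hand" approach with the core URL classes looks roughly like this. This is only a sketch: the URL is the one from the question (it works only if that server is running), and it deliberately skips the redirect, cookie, and error handling that the HTTP client library would give you.

    ```java
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.net.URLConnection;

    public class SingleFileFetch {

        // Read an entire stream into a String (UTF-8 assumed).
        static String readAll(InputStream in) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString("UTF-8");
        }

        // The "by hand" case: fetch one known URL with java.net.URL.
        // A real HTTP client library handles redirects, cookies, etc.
        static String fetch(String address) throws IOException {
            URLConnection conn = new URL(address).openConnection();
            InputStream in = conn.getInputStream();
            try {
                return readAll(in);
            } finally {
                in.close();
            }
        }

        public static void main(String[] args) {
            try {
                // URL taken from the question; adjust to your own server
                System.out.println(fetch("http://localhost:8081/dirxml/mycontacts.xml"));
            } catch (IOException e) {
                System.out.println("could not fetch: " + e.getMessage());
            }
        }
    }
    ```

    Note this fetches one file whose name you already know; it does not solve the "list the directory" part of the question, which is exactly where the library helps.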

  3. #3
    Steve11235's Avatar
    Steve11235 is offline Senior Member
    Join Date
    Dec 2008
    Posts
    1,046
    Rep Power
    7

    Default

    FTR is pointing you in the right direction. Java will do a lot of the work for you, but getting directory listings from an HTTP server tends to be server dependent.

    Also, make sure the server allows directory listing against that directory. I'm guessing the default is to disallow that function, as a security measure.
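
    When listing is allowed, the server returns an HTML index page that you then have to parse for links, which is why this is server dependent. A rough sketch of that parsing step; the sample page below is made up to look like Apache mod_autoindex output, and the regex is deliberately naive, since real listings vary by server:

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class ListingParser {

        // Pull every href value ending in ".xml" out of a listing page.
        static List<String> extractXmlLinks(String html) {
            List<String> links = new ArrayList<String>();
            Matcher m = Pattern.compile("href=\"([^\"]+\\.xml)\"").matcher(html);
            while (m.find()) {
                links.add(m.group(1));
            }
            return links;
        }

        public static void main(String[] args) {
            // Made-up sample of an Apache-style directory index page
            String html = "<html><body><ul>"
                    + "<li><a href=\"mycontacts.xml\">mycontacts.xml</a></li>"
                    + "<li><a href=\"favouritesinger.xml\">favouritesinger.xml</a></li>"
                    + "<li><a href=\"notes.txt\">notes.txt</a></li>"
                    + "</ul></body></html>";
            // prints [mycontacts.xml, favouritesinger.xml]
            System.out.println(extractXmlLinks(html));
        }
    }
    ```

    The .txt link is skipped, which matches what the original poster asked for (only files ending in .xml).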

  4. #4
    graveyard220 is offline Member
    Join Date
    Feb 2009
    Posts
    1
    Rep Power
    0

    Default this is what you want!?

    there is no simple way to do that, because a URL is not a real file-system path. Doing it yourself takes quite a bit of code, because you have to parse the HTML directory listing returned for that URL and pull out the linked files.
    Fortunately, Apache has already done that work in a package called Ivy, which includes utilities for exactly this.
    You can download the binary from this location:
    http://ant.apache.org/ivy/download.cgi

    and then use it as simply as:
    Java Code:
    package networkanddatabase;
    
    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.net.URL;
    import java.net.URLConnection;
    import java.util.Iterator;
    import java.util.List;
    
    import org.apache.ivy.util.url.ApacheURLLister;
    
    public class FetchFilesFromHttpURL {
    	public static void main(String[] args) {
    		URL url;
    		try {
    			// put your own directory URL here, e.g. the one from the question
    			url = new URL("http://localhost:8081/dirxml/");
    			File destFolder = new File("c:\\test");
    			ApacheURLLister lister = new ApacheURLLister();
    			// listAll returns a raw List of java.net.URL objects,
    			// one for each file linked from the directory listing
    			List files = lister.listAll(url);
    			System.out.println("listing complete: " + files);
    			for (Iterator iter = files.iterator(); iter.hasNext();) {
    				URL fileUrl = (URL) iter.next();
    				httpFileDownload(fileUrl, destFolder);
    			}
    			System.out.println("download is complete..");
    		} catch (Exception e) {
    			e.printStackTrace();
    		}
    
    	}
    
    	public static void httpFileDownload(URL url, File destFolder) throws Exception {
    		// url.getFile() includes the server path, e.g. /dirxml/mycontacts.xml
    		File destination = new File(destFolder, url.getFile());
    		destination.getParentFile().mkdirs();
    		BufferedInputStream bis = null;
    		BufferedOutputStream bos = null;
    		try {
    			URLConnection urlc = url.openConnection();
    
    			bis = new BufferedInputStream(urlc.getInputStream());
    			bos = new BufferedOutputStream(new FileOutputStream(destination.getPath()));
    
    			int i;
    			while ((i = bis.read()) != -1) {
    				bos.write(i);
    			}
    		} finally {
    			if (bis != null)
    				try {
    					bis.close();
    				} catch (IOException ioe) {
    					ioe.printStackTrace();
    				}
    			if (bos != null)
    				try {
    					bos.close();
    				} catch (IOException ioe) {
    					ioe.printStackTrace();
    				}
    		}
    	}
    }
    That's all I can do.

    salam :cool:
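
    None of the replies covered the last step the original post asked about: combining the fetched files into one XML document. A hedged sketch using the JDK's built-in DOM and transform classes; the root element name and the two inline sample documents are made up for illustration, and it assumes each fetched file is well-formed on its own:

    ```java
    import java.io.StringReader;
    import java.io.StringWriter;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.xml.sax.InputSource;

    public class XmlCombiner {

        // Wrap the root elements of several XML documents under one new root.
        static String combine(String rootName, String... xmlDocs) throws Exception {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document merged = builder.newDocument();
            Element root = merged.createElement(rootName);
            merged.appendChild(root);
            for (String xml : xmlDocs) {
                Document doc = builder.parse(new InputSource(new StringReader(xml)));
                // importNode copies the element tree into the merged document
                Node copy = merged.importNode(doc.getDocumentElement(), true);
                root.appendChild(copy);
            }
            // Serialize the merged DOM back to a String
            StringWriter out = new StringWriter();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty("omit-xml-declaration", "yes");
            t.transform(new DOMSource(merged), new StreamResult(out));
            return out.toString();
        }

        public static void main(String[] args) throws Exception {
            System.out.println(combine("combined",
                    "<contacts><name>Ann</name></contacts>",
                    "<singers><name>Bob</name></singers>"));
        }
    }
    ```

    In a real run you would parse each downloaded file (or each URL's input stream) instead of inline strings; wrapping under a new root is the simplest way to merge documents that each have their own root element.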

