I am using the following code:
Java Code:
for (int r = 0; r < joe.size(); r++)
{
    // skip null/blank entries and anything that doesn't look like an http URL
    // (note: comparing strings with != checks references, not contents, so use isEmpty())
    if (joe.get(r).url != null && !joe.get(r).url.trim().isEmpty() && joe.get(r).url.trim().contains("http"))
    {
        java.awt.Image img = java.awt.Toolkit.getDefaultToolkit().createImage(new URL(joe.get(r).url));
        comeonwork.add(new workdamnit(img, r));
    }
}
for (workdamnit imgin : comeonwork)
{
    Image img = imgin.joe;
    if (img == null)
        continue;
    System.out.println(img.getWidth(null) + "   " + img.getHeight(null));

    // busy-wait until the asynchronously loaded image reports a width
    while (img.getWidth(null) == -1)
    {
    }

    BufferedImage bi = new BufferedImage(img.getWidth(null), img.getHeight(null), java.awt.image.BufferedImage.TYPE_INT_RGB);
    Graphics2D g2 = bi.createGraphics();
    // Draw img into bi so we can write it to file.
    g2.drawImage(img, 0, 0, null);
    g2.dispose();
    // Now bi contains the img.
    ImageIO.write(bi, "jpg", new File("Images\\" + joe.get(imgin.r).pair + ".jpg"));
}
Some of the code in there isn't very clear, but I don't believe it needs explanation, as it's irrelevant to the solution. If it would help, I can upload the full code. For some reason, this code sometimes writes good images and other times writes all-black images of the correct width and height. I am reading a URL and want a hard copy of the image at that URL. If you know a better way of doing this, that would be great. Ideally it would work like: "Take a URL in; if it's valid, make an image out of it and save it; otherwise skip it."
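To illustrate, here is a rough sketch of the kind of helper I have in mind (the name `saveImage` and the validity checks are just placeholders; it uses `ImageIO.read`, which as far as I know blocks until the image is fully decoded, so there would be no waiting on an asynchronous load):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.net.URL;
import javax.imageio.ImageIO;

public class ImageSaver
{
    // Tries to read the image at urlString and save it to dest as a JPEG.
    // Returns true on success; returns false (i.e. "skip it") on any failure.
    public static boolean saveImage(String urlString, File dest)
    {
        if (urlString == null || urlString.trim().isEmpty() || !urlString.trim().contains("http"))
        {
            return false; // not a usable URL, skip it
        }
        try
        {
            // ImageIO.read blocks until the whole image is decoded, so there
            // is no asynchronous-loading race like with Toolkit.createImage
            BufferedImage img = ImageIO.read(new URL(urlString.trim()));
            if (img == null)
            {
                return false; // reachable, but not a format ImageIO recognizes
            }
            // Re-draw onto an RGB image so JPEG encoding doesn't have to deal
            // with an alpha channel
            BufferedImage rgb = new BufferedImage(img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_RGB);
            java.awt.Graphics2D g2 = rgb.createGraphics();
            g2.drawImage(img, 0, 0, null);
            g2.dispose();
            return ImageIO.write(rgb, "jpg", dest);
        }
        catch (Exception e)
        {
            return false; // malformed URL, network error, etc. -> skip
        }
    }
}
```

Is something along these lines the right approach, or is there a better one?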

The workdamnit class is as follows:
Java Code:
static class workdamnit
{
    public Image joe;
    public int r;

    public workdamnit(Image joe2, int r2)
    {
        r = r2;
        joe = joe2;
    }
}
Thanks for the help! Sorry if it's a little unclear; I can clarify anything you have questions about.