
Arctic Region Supercomputing Center

Common Errors

OGSA-DAI: The backend is having trouble due to a communication error with ...?WSDL
Issue: The OGSA-DAI backend cannot find the WSDL file. (An example of the error is the following: uk.org.ogsadai.client.toolkit.exception.ServiceCommsException: A problem arose during communication with service http://snowy.arsc.alaska.edu:8080/axis/services/ogsadai/Gov2264?WSDL.)

Solution: Go to the URL given in the error message, with ?WSDL attached. If the WSDL file displays as XML, reload the servlet and try again. More likely, however, there is an error somewhere in Axis that prevents the WSDL from being found, and the URL should give you further information about why. (I had this same issue because of a missing class file: a service that was no longer used had not been undeployed properly, and Axis could not find its jar files.)
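
A quick way to run this check outside the browser is a small Java program that fetches the ?WSDL URL and prints the start of the response; a healthy service returns an XML document with a wsdl:definitions root. This is only a sketch: the URL is the one from the error message above (substitute your own), and the class name is arbitrary.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Quick check that the service's WSDL is actually being served.
    public class WsdlCheck {
        public static void main(String[] args) throws Exception {
            URL url = new URL(
                "http://snowy.arsc.alaska.edu:8080/axis/services/ogsadai/Gov2264?WSDL");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            System.out.println("HTTP status: " + conn.getResponseCode());

            // Print the first few lines of the response; they should look like
            // an XML document beginning with <?xml ...> and <wsdl:definitions>.
            BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
            for (int i = 0; i < 5; i++) {
                String line = in.readLine();
                if (line == null) break;
                System.out.println(line);
            }
            in.close();
        }
    }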

Lemur: No .key file being generated with BuildIndex
Issue: When a correct parameter file is submitted with the command ./BuildIndex parameterFile, the corresponding .key file is not generated, making it impossible to search the Lemur index.

Solution: Still in progress.

Axis: faultString: org.xml.sax.SAXParseException: XML document structures must start and end within the same entity.
Issue: Assuming that this error is being generated by a client whose objects are serialized via beans, it means that the serialized XML wraps too many objects and is not completed before being sent over the wire.

Solution: Reduce the number of results the service sends over the wire.
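
One way to do this is to cap the size of the result list on the service side before Axis serializes the response. The sketch below is a hypothetical helper, not part of any existing API: the MAX_RESULTS value and the capResults() name are made up and should be adapted to your own service code.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical service-side helper: cap the number of result beans
    // returned so the serialized SOAP response stays a manageable size.
    public class ResultLimiter {
        private static final int MAX_RESULTS = 100; // tune to your deployment

        public static <T> List<T> capResults(List<T> results) {
            if (results.size() <= MAX_RESULTS) {
                return results;
            }
            // Copy into a new list so the full result set is not retained
            // by the response object.
            return new ArrayList<T>(results.subList(0, MAX_RESULTS));
        }
    }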

Hadoop: Exceptions are not being thrown/no output is being generated.
Issue: Hadoop does not throw exceptions, even when an error is deliberately planted. Alternatively, no output is generated by Hadoop and the run time is very short.

Solution: Try adding more services (even if they are identical) to the XML input file. I hit this error when I reduced my Hadoop input to a single service: the job did not actually generate any map objects, so it ran quickly and never surfaced the planted errors.
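
For illustration, a padded input file might look something like the sketch below. Only the <service> element comes from the input format described on this page; the surrounding element names and values are placeholders for whatever your schema actually uses.

    <!-- Hypothetical input file: the same service listed twice so that the
         job produces more than one map record. -->
    <services>
      <service>
        <name>Gov2264</name>
        <url>http://snowy.arsc.alaska.edu:8080/axis/services/ogsadai/Gov2264?WSDL</url>
      </service>
      <service>
        <name>Gov2264</name>
        <url>http://snowy.arsc.alaska.edu:8080/axis/services/ogsadai/Gov2264?WSDL</url>
      </service>
    </services>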

Hadoop: The Reduce phase of the program is full of null objects.
Issue: The map phase of Hadoop receives objects and sends them to the output collector, but when the Reduce phase begins, all the objects are null and/or empty instances.

Solution: Between map() and reduce(), all information is written to a SequenceFile. The readFields() and write() methods of your Writable object write the data to the file and read it back in, so if the reducer sees only nulls, those methods are malfunctioning in some way.
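
A minimal sketch of a custom Writable is shown below; the class and field names are made up for illustration. The point is that write() and readFields() must handle every field, in the same order and with matching types, or the reducer will see null or empty objects.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // Sketch of a custom Writable. readFields() must mirror write()
    // exactly: same fields, same order, same types.
    public class ServiceResultWritable implements Writable {
        private String serviceName = "";
        private int hitCount = 0;

        public void write(DataOutput out) throws IOException {
            out.writeUTF(serviceName);
            out.writeInt(hitCount);
        }

        public void readFields(DataInput in) throws IOException {
            serviceName = in.readUTF();
            hitCount = in.readInt();
        }

        public String getServiceName() { return serviceName; }
        public int getHitCount() { return hitCount; }
        public void setServiceName(String name) { this.serviceName = name; }
        public void setHitCount(int count) { this.hitCount = count; }
    }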

Hadoop: The first <service> XML is not being read by ServiceRecordReader.
Issue: A file containing only one XML service is not read, or the service must be declared twice in the XML to be used.

Solution: On closer inspection, it is actually the second service's inputs that are not read, not the first. As a workaround, I submit the XML files one at a time.
