Question on running the tests in the test suite


Question on running the tests in the test suite

Ed Day

We are trying to run tests in the test suite with our XBinder data
binding tool and are struggling to determine what constitutes a
successful test.  Note that our tool is primarily an XML schema binding
tool and does not have the capability to handle the WSDL files as
defined in the suite at this time.  However, it appears that you have
provided XSD files and XML instances for all tests which is what we are
trying to use.

Where we are running into difficulty is comparing the XML instances we
generate with the original input instances.  Our process is to create a
read/write driver with our data binding tool for each test schema.  This
driver reads in an instance, decodes it into data variables, and then
re-encodes to form a new instance.  We then compare the input to the
output.  The problem is that what comes out does not match what goes in
for a variety of reasons - mostly because the input instances contain
extra namespace declarations that are not needed and are therefore
dropped when the data is re-encoded.  As a result, the output instances
are equivalent to the input instances but do not match exactly.
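The behaviour described above can be sketched with Python's ElementTree standing in for a generated read/write driver (a hypothetical stand-in for illustration only, not XBinder itself): parsing resolves namespaces into the data model, so a declaration that is never used does not survive re-encoding.

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for a generated read/write driver:
# decode an instance into in-memory data, then re-encode it.
original = '<root xmlns:unused="urn:example"><a>1</a></root>'

tree = ET.fromstring(original)                      # decode
reencoded = ET.tostring(tree, encoding="unicode")   # re-encode

# The unused namespace declaration is dropped on re-encoding, so the
# equivalent output no longer matches the input character-for-character.
print(reencoded == original)   # → False
print("unused" in reencoded)   # → False
```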

So the question is: how should we compare input with output?  Do the
patterns you have specified help with this in any way?  Is there an XML
differencing tool you can recommend that copes with differences like
this (we have not been able to find one)?  Or are we totally off base as
to how to run the tests?

Regards,

--
Ed Day
Objective Systems, Inc.
http://www.obj-sys.com




RE: Question on running the tests in the test suite

paul.downey

Hi Ed,

We use a Java tool to compare the received XML infoset with the one
sent, working at the infoset level rather than byte-for-byte.  The tool
is aware of types such as floating point.  The source is available from
the report directory:

http://www.w3.org/2002/ws/databinding/edcopy/report/
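As a rough illustration of what an infoset-level comparison does (a minimal Python sketch, not the W3C Java tool itself): parse both documents so that namespace declarations are resolved into qualified names, then compare element names, attributes, and text recursively.

```python
import xml.etree.ElementTree as ET

def infoset_equal(a, b):
    # Tags are compared in {namespace-uri}local-name form, so prefixes
    # and unused xmlns declarations in the serialized form play no
    # role in the comparison.
    if a.tag != b.tag or a.attrib != b.attrib:
        return False
    if (a.text or "").strip() != (b.text or "").strip():
        return False
    if len(a) != len(b):
        return False
    return all(infoset_equal(x, y) for x, y in zip(a, b))

sent = '<r xmlns="urn:x" xmlns:unused="urn:y"><a>1</a></r>'
received = '<r xmlns="urn:x"><a>1</a></r>'
print(infoset_equal(ET.fromstring(sent), ET.fromstring(received)))  # → True
```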

Obviously we'd love to add the results of running your tool to our
interoperability reports.  Yves is a good direct contact for questions
about the comparison tool.

Regards,
Paul


-----Original Message-----
From: [hidden email] on behalf of Ed Day
Sent: Tue 7/3/2007 4:21 PM
To: [hidden email]
Subject: Question on running the tests in the test suite
 




RE: Question on running the tests in the test suite

Yves Lafon

On Tue, 3 Jul 2007, [hidden email] wrote:

>
> Hi Ed,
>
> We use a Java tool to compare the received XML infoset with
> the one sent at the infoset level, the tool is aware of types
> such as floating point - the source is available
> from the report directory:

Just to clarify: we are not using Java floats to compare floats, but
doubles, to avoid Java-specific rounding errors.  So the tool knows the
type it needs to compare, but tries to avoid the limitations of
Java-specific type definitions.
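The type-aware part of the comparison can be sketched like this (Python floats are IEEE 754 doubles, matching the double-based comparison described above; the function name is illustrative, not from the actual tool):

```python
def same_float_value(lex_a: str, lex_b: str) -> bool:
    # Compare two floating-point lexical forms by value rather than
    # by string: parse both as IEEE 754 doubles (Python floats).
    return float(lex_a) == float(lex_b)

# Lexically different, but the same floating-point value:
print("1.0" == "1.00E0")                   # → False
print(same_float_value("1.0", "1.00E0"))   # → True
```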
Cheers,


--
Baroula que barouleras, au tiéu toujou t'entourneras.

         ~~Yves