[EmotionML] implementation release and feedbacks


[EmotionML] implementation release and feedbacks

Alexandre Denis
Hello all,
I'm happy to announce that we have released the very first version of our EmotionML Java implementation. It is hosted on Google Code and released under the MIT license: https://code.google.com/p/loria-synalp-emotionml/
It is still considered an alpha version; we need some users to validate it. There is also still some work to do on the documentation, but the core of the code is there.

It would be nice if we could be listed as an implementation in the next round of the implementation report. Here is the description:

Alexandre Denis, LORIA laboratory, SYNALP team, France
The LORIA/SYNALP implementation of EmotionML is a standalone Java library developed in the context of the ITEA Empathic Products project by the LORIA/SYNALP team. It makes it possible to import Java objects from EmotionML XML files and to export them back to EmotionML. It guarantees standards compliance by performing a two-step validation after every export operation and before every import operation: first the document is checked against the EmotionML schema, then all EmotionML assertions are tested. If either step fails, an error message is produced and the document cannot be imported or exported. The library contains a corpus of badly formatted EmotionML files that makes it possible to double-check that both the schema and the assertions correctly invalidate them. The API is hosted on Google Code (https://code.google.com/p/loria-synalp-emotionml/) and is released under the MIT License.
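As a rough illustration of the two-step idea described above (a Python sketch only, not the library's actual Java API; the stdlib parser has no XSD support, so the schema step is reduced here to a well-formedness check):

```python
import xml.etree.ElementTree as ET

def validate(document, assertion_checks):
    """Two-step validation sketch: a structural check first, then
    every assertion check; all failures are collected as messages."""
    errors = []
    try:
        # Step 1: stand-in for schema validation (xml.etree only checks
        # well-formedness; real XSD validation needs an external tool).
        root = ET.fromstring(document)
    except ET.ParseError as exc:
        return ["schema step failed: %s" % exc]
    # Step 2: run every EmotionML assertion check on the parsed document.
    for name, check in assertion_checks:
        if not check(root):
            errors.append("assertion failed: " + name)
    return errors

# Hypothetical check for assertion 172: "version", if present, must be "1.0".
checks = [("172: version must be 1.0",
           lambda root: root.get("version") in (None, "1.0"))]

print(validate('<emotion version="2.0"/>', checks))
```

If either step reports errors, the document is rejected, mirroring the import/export behaviour described above.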


Moreover, I don't come to you empty-handed: I have a number of remarks related to the EmotionML specification. Sorry to give you more work!
best regards,
Alexandre Denis


*** Comments about EmotionML specification

In what follows:
- "specification" refers to the document at http://www.w3.org/TR/2013/PR-emotionml-20130416/ (version of 16 April 2013)
- "assertions" refers to the list of assertions at http://www.w3.org/2002/mmi/2013/emotionml-ir/#test_class
- "schema" refers to the schemas http://www.w3.org/TR/emotionml/emotionml.xsd and http://www.w3.org/TR/emotionml/emotionml-fragments.xsd


** Specification clarification questions
- About relative and absolute timing:
- Is it possible to mix relative and absolute timing? Intuitively this would seem odd, but nothing in the specification prevents it.

- About the consistency of start/end/duration:
- The specification does not seem to enforce the consistency of start, end and duration, which may all be present together. Hence it is possible to have inconsistent triplets (start=0, end=5, duration=10).
- About text nodes:
- The <emotion> element can have text node children, but it is not specified how many. Is it possible to intersperse text nodes throughout an <emotion> element? The fact that an <emotion> element can have text children is not mentioned in its children list.
- About combinations of emotion children:
- The specification states "There are no constraints on the combinations of children that are allowed.", which may be confusing, since an emotion cannot contain two categories that belong to different category sets, or two categories with the same name.
- About default values:
- Some attributes have default values (reference role, time ref anchor point, duration, etc.). Is it desirable to have a default value for other attributes as well, especially the "value" attribute? For instance, how would you compare <category name="surprise"/> and <category name="surprise" value="1.0"/>? Are they semantically equivalent? A similar question arises for the "confidence" attribute: how would you compare <category name="surprise"/> and <category name="surprise" confidence="1.0"/>?
- About the number of <trace> elements:
- The specification does not state clearly whether it is possible to have several <trace> elements inside a descriptor; it says "a <trace> element". Maybe it should say "If present, the following child element can occur one or more times: <trace>". The schema allows that. If this comment is accepted, assertions 215, 224, 235 and 245 should also be clarified.
- About conformance:
- In section 4.3, it is stated that "It is the responsibility of an EmotionML processor to verify that the use of descriptor names and values is consistent with the vocabulary definition", which is true but incomplete with regard to the assertions. Maybe it would be beneficial to list all the assertions that are the responsibility of the EmotionML processor rather than of the schema (see below), or at least to warn that there are many assertions not checked by the schema.
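A processor-side consistency check for the timing questions above could look like the following hypothetical sketch (not part of any existing API; it enforces end >= start and, when all three attributes are present, duration == end - start):

```python
def timing_is_consistent(start=None, end=None, duration=None):
    """Hypothetical processor-side check; all values in the same time unit."""
    if start is not None and end is not None:
        if end < start:  # the spec's "end MUST be >= start"
            return False
        if duration is not None and duration != end - start:
            return False  # rejects triplets such as start=0, end=5, duration=10
    return True

print(timing_is_consistent(start=0, end=5, duration=10))  # False
```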


** Discrepancies between schema/assertions/specification
- Assertions not tested by the schema
- I found that the following assertions are not tested by the schema: 114, 117, 120, 123, 161, 164, 167, 170, 172, 210, 212, 216, 220, 222, 224, 230, 232, 236, 240, 242, 246, 410, 417.
Some assertions are, I think, impossible to test with an XSD schema:
114, 117, 120, 123, 161, 164, 167, 170: vocabulary set id and type checking
212, 222, 232, 242: vocabulary name membership
417: media type (unless enumerating them)
Some may be possible with some tweaking:
210, 220, 230, 240: vocabulary set presence
216, 224, 236, 246: <trace> and "value"
There are two "true" errors, I think:
172: The "version" attribute of <emotion>, if present, MUST have the value "1.0"
I think it should not be "optional with default value 1.0" but rather "optional with fixed value 1.0"
410: The <reference> element MUST contain a "uri" attribute
The "uri" attribute is optional by default in the schema

- 2.4.1, "The end value MUST be greater than or equal to the start value":
- The schema does not check it, and there is no assertion enforcing it.

- 2.1.2, "a typical use case is expected to be embedding an <emotion> into some other markup":
- There is no assertion describing that <emotion> may be embedded in other markup; does this imply we could embed other elements?
- Is a document containing a sole <emotion> a valid document (not in the sense of an <emotionml> document)? If yes, maybe an assertion clarifying the use of <emotion> would be useful.

- Assertions 105, 155, 601, 606, status "Req=N":
- These assertions mix the presence of <info> and the number of <info> elements. While the presence is not restricted, the number MUST be 0 or 1, hence the required status with regard to this part of the assertions should be "Req=Y".

- 2.1.2, "There are no constraints on the order in which children occur":
- The schema does actually restrict the order of elements: <info> needs to come first, then the descriptors, then the references.
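The <info> cardinality point above (assertions 105/155: presence optional, repetition forbidden) can be checked in a few lines; a hypothetical sketch, assuming the EmotionML 1.0 namespace http://www.w3.org/2009/10/emotionml:

```python
import xml.etree.ElementTree as ET

EMO_NS = "http://www.w3.org/2009/10/emotionml"

def info_count_ok(element):
    # The number of <info> children MUST be 0 or 1.
    return len(element.findall("{%s}info" % EMO_NS)) <= 1

one = ET.fromstring('<emotion xmlns="%s"><info/></emotion>' % EMO_NS)
two = ET.fromstring('<emotion xmlns="%s"><info/><info/></emotion>' % EMO_NS)
print(info_count_ok(one), info_count_ok(two))  # True False
```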
 
** Invalid documents
(I have not systematically tested examples with non-valid vocabulary URIs such as http://www.example....)

- http://www.w3.org/TR/emotion-voc/xml does not comply with assertion 110 (hence all examples that refer to vocabularies there also fail)

- 2.3.3 The <info> element
- The last example of this section does not comply with assertion 212, since the name "neutral" does not belong to the every-day categories.

- 5.1.1 Annotation of Text, "Annotation of text" Lewis Carroll example:
- In the <meta:doc> element, a raw & character appears, which does not pass XML validation; it should be &amp; (the same applies to the example below).
- It also does not comply with assertion 212, since Disgust and Anger are not part of the every-day categories.
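The well-formedness problem with the raw & can be reproduced with any XML parser; for instance, with Python's stdlib:

```python
import xml.etree.ElementTree as ET

escaped = "<doc>Disgust &amp; Anger</doc>"  # '&' written as &amp;: well-formed
raw = "<doc>Disgust & Anger</doc>"          # raw '&': rejected by any XML parser

print(ET.fromstring(escaped).text)  # Disgust & Anger
try:
    ET.fromstring(raw)
except ET.ParseError:
    print("raw '&' must be escaped as &amp;")
```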


AW: [EmotionML] implementation release and feedbacks

Felix Burkhardt

Congratulations, Alexandre

 

>Sorry to give you more work!

Not at all; I'm very happy that you are working with EmotionML, and grateful that you have done such a thorough job of reviewing it!

It's just that it will take me/us some time to react to this, sorry about that.

 

Kind regards,

Felix

 

From: Alexandre Denis [mailto:[hidden email]]
Sent: Thursday, 2 May 2013 11:43
To: [hidden email]; Samuel CRUZ-LARA
Subject: [EmotionML] implementation release and feedbacks

 


Re: AW: [EmotionML] implementation release and feedbacks

Kazuyuki Ashimura
Dear Alexandre and EmotionML implementers,

Thank you very much for implementing EmotionML, Alexandre!
Also, your thorough review of the EmotionML specification [1] and
the Implementation Report [2] is really appreciated.

We are very sorry it took much longer than expected to reach consensus
about how to respond to you and to wrap up the procedure [3] to publish
EmotionML as a W3C Recommendation.

We, the W3C Multimodal Interaction Working Group, have already fixed
typos in the spec and added the necessary clarifications to it.  In
addition, we have generated an updated version of the schema [5, 6].

Now the remaining question is how to deal with your comments on the
Implementation Report, which would not change the spec itself.

I talked with the W3C Team about what we should do from the
W3C Process viewpoint, and it seems we need to make sure that there
is enough implementation experience for the following two features,
which were not explicitly described in the published Implementation
Report [2].

Feature1:
    In Section 2.4.1 of the spec [1], there is a feature "The end value
    MUST be greater than or equal to the start value", which is not
    checked in the Implementation Report.

Feature2:
    In Section 2.1.2 of the spec [1], there is a feature "a typical use
    case is expected to be embedding an <emotion> into some other
    markup", which is not checked in the Implementation Report.

We have already checked with EmotionML implementers (including you)
and it seems we can get several implementations for the above two
features as well.

Now we would like to ask all the EmotionML implementers to respond
to this message and state whether the above features are implemented,
so that we can finalize the procedure and publish EmotionML as a
W3C Recommendation.

[1] http://www.w3.org/TR/2013/PR-emotionml-20130416/
[2] http://www.w3.org/2002/mmi/2013/emotionml-ir/
[3] http://www.w3.org/2004/02/Process-20040205/tr.html#maturity-levels
[4] http://lists.w3.org/Archives/Public/www-multimodal/2013May/0000.html
[5] http://www.w3.org/TR/2013/PR-emotionml-20130416/emotionml.xsd
[6] http://www.w3.org/TR/2013/PR-emotionml-20130416/emotionml-fragments.xsd

Sincerely,

Kazuyuki Ashimura;
for the W3C Multimodal Interaction Working Group




--
Kaz Ashimura, W3C Staff Contact for Web&TV, MMI and Voice
Tel: +81 466 49 1170


Re: AW: [EmotionML] implementation release and feedbacks

Alexandre Denis
Hello everyone,
Here is what is done in the LORIA/SYNALP EmotionML implementation with regard to these two features:

Feature1:

   In Section 2.4.1 of the spec [1], there is a feature "The end value
   MUST be greater than or equal to the start value", which is not
   checked in the Implementation Report.

-> This is implemented; I will just need an assertion id for this check. For now I consider it a variant of assertion 420, but it may need to be a new one.


Feature2:
   In Section 2.1.2 of the spec [1], there is a feature "a typical use
   case is expected to be embedding an <emotion> into some other
   markup", which is not checked in the Implementation Report.

-> The question is what exactly it means to implement this feature. In the LORIA/SYNALP EmotionML implementation, it is possible to validate both an EmotionML document and a standalone emotion. I also added a way to validate all the emotions found in a non-EmotionML document, but I'm not 100% sure that is what you mean by implementing this feature.
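For illustration, finding the <emotion> elements embedded in a non-EmotionML host document can be sketched as follows (a hypothetical Python snippet, not the actual LORIA/SYNALP API; it assumes the EmotionML 1.0 namespace http://www.w3.org/2009/10/emotionml):

```python
import xml.etree.ElementTree as ET

EMO_NS = "http://www.w3.org/2009/10/emotionml"

def find_embedded_emotions(document):
    # Collect every <emotion> element, wherever it is nested
    # inside the (possibly non-EmotionML) host markup.
    root = ET.fromstring(document)
    return list(root.iter("{%s}emotion" % EMO_NS))

host = """<article xmlns:emo="http://www.w3.org/2009/10/emotionml">
  <p>Host markup with an embedded annotation:
    <emo:emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
      <emo:category name="fear"/>
    </emo:emotion>
  </p>
</article>"""

print(len(find_embedded_emotions(host)))  # 1
```

Each extracted element could then be fed to the same schema-plus-assertions validation as a standalone emotion.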

best regards,
Alexandre Denis




On Mon, Oct 28, 2013 at 7:56 AM, Kazuyuki Ashimura <[hidden email]> wrote:
Dear Alexandre and EmotionML implementers,

Thank you very much for implementing EmotionML, Alexandre!
Also your thorough review on the EmotionML [1] specification and
the Implementation Report [2] is really appreciated.

We are very sorry it took much longer to get consensus about how to
respond to you and wrap-up the procedure [3] to publish EmotionML
as a W3C Recommendation.

We the W3C Multimodal Interaction Working Group have already fixed
typos in the spec and added necessary clarifications to it.  In
addition, we have generated an updated version of the schema [5, 6].

Now the remaining question is how to deal with your comments on the
Implementation Report which wouldn't change the spec itself.

I talked within the W3C Team about what we should have done from the
W3C Process viewpoint, and it seems we need to make sure that there
are enough implementation experience for the following two features
which were not explicitly described in the published Implementation
Report [2].

Feature1:

   In Section 2.4.1 of the sepc [1], there is a feature "The end value
   MUST be greater than or equal to the start value", which is not
   checked in the Implementation Report.

Feature2:
   In Section 2.1.2 of the spec [1], there is a feature "a typical use

   case is expected to be embedding an <emotion> into some other
   markup", which is not checked in the Implementation Report.

We have already checked with EmotionML implementers (including you)
and it seems we can get several implementations for the above two
features as well.

Now we would like to ask all the EmotionML implementers to respond
to this message and express if the aobve features are implmented
so that we can finalize the procedure and publish EmotionML as a
W3C Recommendation.
On 05/02/2013 07:00 PM, [hidden email] wrote:
Congratulations, Alexandre

 >Sorry to give you more work!

Not at all, I’m indeed very happy you work with EmotionML and grateful
you do such a thorough job in revising it!

It’s just it’ll take me/us some time to react on this, sorry about this.

Kind regards,

Felix

*Von:*Alexandre Denis [mailto:[hidden email]]
*Gesendet:* Donnerstag, 2. Mai 2013 11:43
*An:* [hidden email]; Samuel CRUZ-LARA
*Betreff:* [EmotionML] implementation release and feedbacks


Hello all,

I'm happy to announce that we released the very first version of our
EmotionML Java implementation. It is hosted on google code and released
under the MIT license: https://code.google.com/p/loria-synalp-emotionml/

It is still considered as an alpha version, we would need some users to
validate its use. And there is still some work on the documentation but
the core of the code is there.

If we could be listed as an implementation in the next round of the
implementation report it would be nice. Here is the description:

Alexandre Denis, LORIA laboratory, SYNALP team, France

The LORIA/SYNALP implementation of EmotionML is a standalone Java
library developed in the context of the ITEA Empathic Products project
by the LORIA/SYNALP team. It makes it possible to import Java objects from
EmotionML XML files and to export them back to EmotionML. It guarantees
standard compliance by performing a two-step validation after all
export operations and before all import operations: first the document is
tested against the EmotionML schema, then all EmotionML assertions are
tested. If either fails, an error message is produced and the document cannot be
imported or exported. The library contains a corpus of badly formatted
EmotionML files that makes it possible to double-check that both the schema
and the assertions correctly invalidate them. The API is hosted on
Google Code (https://code.google.com/p/loria-synalp-emotionml/) and is
released under the MIT License.
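The two-step pipeline described above can be sketched roughly as follows (a hypothetical illustration, not the library's actual API: the class and method names are invented, a toy inline XSD stands in for emotionml.xsd, and only one assertion, the version check, is shown):

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;

public class TwoStepValidation {

    static final String NS = "http://www.w3.org/2009/10/emotionml";

    // A toy stand-in for emotionml.xsd: real XSD validation, tiny grammar.
    static final String TOY_XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema' "
      + "targetNamespace='" + NS + "' elementFormDefault='qualified'>"
      + "<xs:element name='emotion'><xs:complexType>"
      + "<xs:attribute name='version' type='xs:string'/>"
      + "</xs:complexType></xs:element></xs:schema>";

    static boolean isValid(String xml) throws Exception {
        // Step 1: validate the document against the XSD schema.
        SchemaFactory sf = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = sf.newSchema(new StreamSource(new StringReader(TOY_XSD)));
        try {
            schema.newValidator().validate(new StreamSource(new StringReader(xml)));
        } catch (SAXException e) {
            return false; // the schema step rejects the document
        }
        // Step 2: check assertions the schema cannot express,
        // e.g. assertion 172: "version", if present, must be "1.0".
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
        Element root = doc.getDocumentElement();
        String version = root.getAttribute("version"); // "" when absent
        return version.isEmpty() || version.equals("1.0");
    }

    public static void main(String[] args) throws Exception {
        System.out.println(isValid("<emotion xmlns='" + NS + "' version='1.0'/>"));
        System.out.println(isValid("<emotion xmlns='" + NS + "' version='2.0'/>"));
    }
}
```

The second document passes the (toy) schema step but is rejected by the assertion step, which is exactly the division of labour discussed further below.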

Moreover, I don't come to you empty-handed: I have a number of
remarks related to the EmotionML specification. Sorry to give you more work!

best regards,

Alexandre Denis

*** Comments about EmotionML specification

In what follows:

- "specification" refers to the document at
http://www.w3.org/TR/2013/PR-emotionml-20130416/ (version of 16 April 2013)

- "assertions" refers to the list of assertions at
http://www.w3.org/2002/mmi/2013/emotionml-ir/#test_class

- "schema" refers to the schemas
http://www.w3.org/TR/emotionml/emotionml.xsd and
http://www.w3.org/TR/emotionml/emotionml-fragments.xsd

** Specification clarification questions

- About relative and absolute timing?

             - Is it possible to mix relative and absolute timing?
Intuitively this would seem odd, but nothing in the specification
prevents it.

- About consistency of start/end/duration?

             - I think the specification does not enforce the
consistency of start, end and duration, which may all be present
together. Hence it is possible to have inconsistent triplets
(start=0, end=5, duration=10).
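A processor that wished to enforce consistency could apply a check along these lines (an illustrative sketch under conventions of our own; the specification itself imposes no such constraint, and the encoding of "attribute absent" as a negative value is invented):

```java
public class TimingCheck {

    /** Consistency check for timing attributes.
     *  In this sketch a negative value stands for "attribute absent". */
    static boolean consistent(long start, long end, long duration) {
        // Section 2.4.1: end must be greater than or equal to start.
        if (start >= 0 && end >= 0 && end < start) return false;
        // If all three are present, they must agree: start + duration == end.
        if (start >= 0 && end >= 0 && duration >= 0
                && start + duration != end) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(consistent(0, 5, 5));   // a consistent triplet
        System.out.println(consistent(0, 5, 10));  // the inconsistent triplet above
        System.out.println(consistent(5, 0, -1));  // end < start
    }
}
```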

- About text nodes?

             - The <emotion> element can have text node children, but it
is not specified how many. Is it possible to intersperse text nodes all over
an <emotion> element? The fact that an <emotion> element can
have text children is not mentioned in its children list.

- About emotion children combinations?

             - The specification states "There are no constraints on the
combinations of children that are allowed." This is somewhat confusing, since
an emotion cannot contain two categories that belong to
different category sets, or two categories with the same name.

- About default values?

             - Some attributes have default values (reference role, time
ref anchor point, duration, etc.). Is it desirable to have a default
value for other attributes as well, especially for the "value"
attribute? For instance, how would you compare <category name="surprise"/>
and <category name="surprise" value="1.0"/>? Are they
semantically equivalent? A similar question arises for the "confidence"
attribute: how would you compare <category
name="surprise"/> and <category name="surprise" confidence="1.0"/>?

- About the number of <trace> elements?

             - The specification does not state clearly whether it is
possible to have several <trace> elements inside a descriptor; it only says
"a <trace> element". Maybe it should say "If present,
the following child element can occur one or more times: <trace>".
The schema allows that. If this comment is accepted, the
assertions 215, 224, 235, 245 should also be clarified.

- About conformance?

             - In section 4.3, it is stated "It is the responsibility of
an EmotionML processor to verify that the use of descriptor names and
values is consistent with the vocabulary definition", which is
true but incomplete with regard to the assertions.
Maybe it would be beneficial to list all the assertions
that are not the schema's responsibility but rather the EmotionML
processor's (see below), or at least to warn that many assertions
are not checked by the schema.

** Discrepancies between schema/assertions/specification

- Assertions not tested by the schema

             - I found that the following assertions are not tested by
the schema: 114, 117, 120, 123, 161, 164, 167, 170, 172, 210, 212,
216, 220, 222, 224, 230, 232, 236, 240, 242, 246, 410, 417.

             Some assertions are, I think, impossible to test with an
XSD schema:

                         114, 117, 120, 123, 161, 164, 167, 170:
vocabulary set id and type checking

                         212, 222, 232, 242: vocabulary name membership

                         417: media type (unless enumerating them)

             Some may be possible with some tweaking:

                         210, 220, 230, 240: vocabulary set presence

                         216, 224, 236, 246: <trace> and "value"

             There are two "true" errors, I think:

                         172: The "version" attribute of <emotion>, if
present, MUST have the value "1.0"

                                     I think it should not be "optional
with default value 1.0" but rather "optional with fixed value 1.0"

                         410: The <reference> element MUST contain a
"uri" attribute

                                     the "uri" attribute is optional by
default in the schema
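If these two comments are accepted, the fixes could be expressed in the schema roughly as follows (a sketch of the relevant attribute declarations only, not the actual emotionml.xsd wording):

```xml
<!-- Assertion 172: "version", if present, must be exactly "1.0" -->
<xs:attribute name="version" type="xs:string" use="optional" fixed="1.0"/>

<!-- Assertion 410: <reference> must carry a "uri" attribute -->
<xs:attribute name="uri" type="xs:anyURI" use="required"/>
```

In XSD, `fixed` constrains the value whenever the attribute appears, whereas `default` merely supplies a value when it is absent, which is why the distinction matters here.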

- 2.4.1, "The end value MUST be greater than or equal to the start value",

             - the schema does not check it and there is no assertion
enforcing it

- 2.1.2, "a typical use case is expected to be embedding an <emotion>
into some other markup",

             - there is no assertion stating that <emotion> may be
embedded in other markup; does it imply we could embed other elements?

             - is a document containing a sole <emotion> a valid
document (not in the sense of an <emotionml> document)? If yes, maybe an
assertion clarifying the use of <emotion> would be useful.

- assertions 105, 155, 601, 606, status "Req=N"

             - these assertions mix the presence of <info> with the number
of <info> elements: while the presence is not restricted, the number
MUST be 0 or 1, hence the required status with respect to this part of the
assertions should be "Req=Y"

- 2.1.2, "There are no constraints on the order in which children occur"

             - the schema does actually restrict the order of elements:
<info> must come first, then the descriptors, then the references

** Invalid documents

(I have not systematically tested examples with non-valid vocabulary
URIs such as http://www.example....)

- http://www.w3.org/TR/emotion-voc/xml does not comply with assertion
110 (hence all examples that refer to vocabularies there also fail)

- 2.3.3 The <info> element

             - The last example of this section does not comply with
assertion 212, since the name "neutral" does not belong to the everyday
categories

- 5.1.1 Annotation of Text, "Annotation of text" Lewis Carroll example:

             - In the <meta:doc> element, the character & is found,
which does not pass XML validation; it should be &amp; (as does the
example below)

             - It also does not comply with assertion 212, since Disgust
and Anger are not part of the everyday categories
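The well-formedness failure is easy to reproduce with any XML parser (a minimal sketch; the helper name is ours):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.xml.sax.InputSource;

public class EscapeDemo {

    /** Returns true if the string parses as well-formed XML. */
    static boolean wellFormed(String xml) {
        try {
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new InputSource(new StringReader(xml)));
            return true;
        } catch (Exception e) {
            return false; // a raw '&' triggers a parse error
        }
    }

    public static void main(String[] args) {
        System.out.println(wellFormed("<doc>Alice & Bob</doc>"));     // not well-formed
        System.out.println(wellFormed("<doc>Alice &amp; Bob</doc>")); // well-formed
    }
}
```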



--
Kaz Ashimura, W3C Staff Contact for Web&TV, MMI and Voice
Tel: +81 466 49 1170


Re: AW: [EmotionML] implementation release and feedbacks

Gerhard Fobe
In reply to this post by Kazuyuki Ashimura
Hello Kaz and other friends of EmotionML,

As an implementer of an EmotionML library for C# [1], I want to respond
to the request about the implementation of the two new features.

> Feature1:
>    In Section 2.4.1 of the spec [1], there is a feature "The end value
>    MUST be greater than or equal to the start value", which is not
>    checked in the Implementation Report.

I implemented my library from the specification, so I simply do not
allow a start value greater than an end value. Start value <=
end value works fine. Please have a look at my code [2].

> Feature2:
>    In Section 2.1.2 of the spec [1], there is a feature "a typical use
>    case is expected to be embedding an <emotion> into some other
>    markup", which is not checked in the Implementation Report.

I don't interact with surrounding markup in my EmotionML library (it
doesn't make sense for me), but within my master's thesis I integrate RDF
in EmotionML in XMPP (a combination of XMPP, EmotionML, the Smiley Ontology
and the Emotion Ontology). You can read about it in German / see some XML at
page 77 [3]. Furthermore, I wrote a short blog post [4] about it with some
example XML. So I can also confirm that there is an implementation of
this feature.

I look forward to the EmotionML Recommendation.

[1] http://lists.w3.org/Archives/Public/www-multimodal/2012Sep/0002.html
[2]  
https://github.com/gfobe/EmotionML-Lib-CSharp/blob/7f90f6de1667ac7425a0ab2f12ca8e85ef1636c4/Emotion.cs#L227
[3]  
http://www.gerhard-fobe.de/public/publications/Serialisierung_von_Emotionen_in_der_textuellen_Kommunikation/Masterarbeit%20Gerhard%20Fobe%20-%20Serialisierung%20von%20Emotionen%20in%20der%20textuellen%20Kommunikation.pdf
[4] http://www.gerhard-fobe.de/embedded-emotionml-in-xmpp-with-rdf/

Kind regards

Gerhard Fobe
http://www.gerhard-fobe.de/



AW: AW: [EmotionML] implementation release and feedbacks

Felix.Burkhardt
Thanks, Gerhard,
Most helpful.
Regards,
Felix


AW: AW: [EmotionML] implementation release and feedbacks

Felix.Burkhardt
In reply to this post by Kazuyuki Ashimura
Hi all,
As an EmotionML implementer (Speechalyzer [1, 2]), I'd have to report the current state as
"Not-impl"
for both features:
Feature 1: start and end are not implemented because we always consider the whole audio file.
Feature 2: only whole EmotionML documents are used for input and output currently.

Kind regards,
Felix




[1] https://github.com/dtag-dbu/speechalyzer
[2] http://felix.syntheticspeech.de/publications/speechalyzer.pdf


RE: AW: [EmotionML] implementation release and feedbacks

Deborah Dahl
In reply to this post by Kazuyuki Ashimura
I can report on an implementation of Feature 2, embedded EmotionML.
I didn't send an implementation report during the CR period because I didn't
have an implementation at that time. However, since then, I've implemented
EmotionML applied to emotion recognition from text input. It includes an
implementation of Feature 2: embedding of the "emotion" element
in other markup, specifically EMMA [1] and MMI Architecture Life-Cycle
events [2].

Feature2:
    In Section 2.1.2 of the spec [1], there is a feature "a typical use
    case is expected to be embedding an <emotion> into some other
    markup", which is not checked in the Implementation Report.

The implementation is at http://nlportal.elasticbeanstalk.com. Please use
Firefox or Chrome. Click on "show XML event results" and enable popups to
see the EmotionML markup embedded in EMMA and MMI Architecture Life-Cycle
Events.
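For readers unfamiliar with this kind of embedding, it might look roughly like the following (an invented illustrative fragment, not taken from the implementation; the interpretation id and category values are made up, while the namespace and vocabulary URIs are the standard ones):

```xml
<emma:emma version="1.0"
           xmlns:emma="http://www.w3.org/2003/04/emma"
           xmlns:emo="http://www.w3.org/2009/10/emotionml">
  <emma:interpretation id="int1">
    <emo:emotion category-set="http://www.w3.org/TR/emotion-voc/xml#everyday-categories">
      <emo:category name="happy" confidence="0.8"/>
    </emo:emotion>
  </emma:interpretation>
</emma:emma>
```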

Feature 1 wasn't implemented because the primary use case was expected to be
cut-and-paste of text content. The original composition of the text could
have occurred at any time in the past, so the timing of the cut-and-paste wasn't
considered particularly useful.

Best regards,
Debbie Dahl

[1] EMMA: http://www.w3.org/TR/emma/
[2] MMI Architecture: http://www.w3.org/TR/mmi-arch/ 

-----Original Message-----
From: Kazuyuki Ashimura [mailto:[hidden email]]
Sent: Monday, October 28, 2013 2:57 AM
To: [hidden email]; [hidden email]
Cc: [hidden email]; [hidden email]
Subject: Re: AW: [EmotionML] implementation release and feedbacks

Dear Alexandre and EmotionML implementers,

Thank you very much for implementing EmotionML, Alexandre!
Also your thorough review on the EmotionML [1] specification and the
Implementation Report [2] is really appreciated.

We are very sorry it took much longer to get consensus about how to respond
to you and wrap-up the procedure [3] to publish EmotionML as a W3C
Recommendation.

We the W3C Multimodal Interaction Working Group have already fixed typos in
the spec and added necessary clarifications to it.  In addition, we have
generated an updated version of the schema [5, 6].

Now the remaining question is how to deal with your comments on the
Implementation Report which wouldn't change the spec itself.

I talked within the W3C Team about what we should have done from the W3C
Process viewpoint, and it seems we need to make sure that there are enough
implementation experience for the following two features which were not
explicitly described in the published Implementation Report [2].

Feature1:
    In Section 2.4.1 of the sepc [1], there is a feature "The end value
    MUST be greater than or equal to the start value", which is not
    checked in the Implementation Report.

Feature2:
    In Section 2.1.2 of the spec [1], there is a feature "a typical use
    case is expected to be embedding an <emotion> into some other
    markup", which is not checked in the Implementation Report.

We have already checked with EmotionML implementers (including you) and it
seems we can get several implementations for the above two features as well.

Now we would like to ask all the EmotionML implementers to respond to this
message and express if the aobve features are implmented so that we can
finalize the procedure and publish EmotionML as a W3C Recommendation.

[1] http://www.w3.org/TR/2013/PR-emotionml-20130416/
[2] http://www.w3.org/2002/mmi/2013/emotionml-ir/
[3] http://www.w3.org/2004/02/Process-20040205/tr.html#maturity-levels
[4] http://lists.w3.org/Archives/Public/www-multimodal/2013May/0000.html
[5] http://www.w3.org/TR/2013/PR-emotionml-20130416/emotionml.xsd
[6] http://www.w3.org/TR/2013/PR-emotionml-20130416/emotionml-fragments.xsd

Sincerely,

Kazuyuki Ashimura;
for the W3C Multimodal Interaction Working Group



On 05/02/2013 07:00 PM, [hidden email] wrote:

> Congratulations, Alexandre
>
>  >Sorry to give you more work!
>
> Not at all, I'm indeed very happy you work with EmotionML and grateful
> you do such a thorough job in revising it!
>
> It's just it'll take me/us some time to react on this, sorry about this.
>
> Kind regards,
>
> Felix
>
> *Von:*Alexandre Denis [mailto:[hidden email]]
> *Gesendet:* Donnerstag, 2. Mai 2013 11:43
> *An:* [hidden email]; Samuel CRUZ-LARA
> *Betreff:* [EmotionML] implementation release and feedbacks
>
> Hello all,
>
> I'm happy to announce that we released the very first version of our
> EmotionML Java implementation. It is hosted on google code and
> released under the MIT license:
> https://code.google.com/p/loria-synalp-emotionml/
>
> It is still considered as an alpha version, we would need some users
> to validate its use. And there is still some work on the documentation
> but the core of the code is there.
>
> If we could be listed as an implementation in the next round of the
> implementation report it would be nice. Here is the description:
>
> Alexandre Denis, LORIA laboratory, SYNALP team, France
>
> The LORIA/SYNALP implementation of EmotionML is a Java standalone
> library developed in the context of the ITEA Empathic Products project
> by the LORIA/SYNALP team. It enables to import Java objects from
> EmotionML XML files and export them to EmotionML as well. It
> guarantees standard compliance by performing a two steps validation
> after all export operations and before all import operations: first
> the EmotionML schema is tested, then all EmotionML assertions are
> tested. If one or the other fails, an error message is produced and
> the document cannot be imported or exported. The library contains a
> corpus of badly formatted EmotionML files that enables to double check
> if both the schema and the assertions manage to correctly invalidate
> them. The API is hosted on google code
> (https://code.google.com/p/loria-synalp-emotionml/) and is released under
the MIT License.
>
> Moreover I don't come to you with empty hands, and I have a bunch of
> remarks related to the EmotionML specification. Sorry to give you more
work!

>
> best regards,
>
> Alexandre Denis
>
> *** Comments about EmotionML specification
>
> In what follows:
>
> - "specification" refers to the document at
> http://www.w3.org/TR/2013/PR-emotionml-20130416/ (version of 16 April
> 2013)
>
> - "assertions" refers to the list of assertions at
> http://www.w3.org/2002/mmi/2013/emotionml-ir/#test_class
>
> - "schema" refers to the schemas
> http://www.w3.org/TR/emotionml/emotionml.xsd and
> http://www.w3.org/TR/emotionml/emotionml-fragments.xsd
>
> ** Specification clarification questions
>
> - About relative and absolute timing ?
>
>              - Is that possible to mix relative and absolute timing ?
> Intuitively this would seem weird but nothing in the
>
>              specification prevents it.
>
> - About consistency of start/end/duration ?
>
>              - I think the specification does not enforce the
> consistency of start, end and duration which are
>
>              possible alltogether. Hence it is possible to have
> inconsistent triplets (start=0, end=5, duration=10).
>
> - About text nodes ?
>
>              - the emotion element can have text nodes children, it is
> not specified how many. Is it possible to intersperse text nodes all
> over
>
>              an emotion element ? The fact that an emotion element can
> have text children is not specified in its children list.
>
> - About emotion children combinations ?
>
>              - the specification states "There are no constraints on
> the combinations of children that are allowed.", it is maybe confusing
> since
>
>              an emotion cannot contain two categories that belong to
> different category-sets or two categories with the same name.
>
> - About default values?
>
>              - some attributes have default values (reference role,
> time ref anchor point, duration, etc.); would it be desirable to have
> a default value for other attributes too, especially for the "value"
> attribute? For instance, how would you compare <category
> name="surprise"/> and <category name="surprise" value="1.0"/>? Are
> they semantically equivalent? A similar question arises for the
> "confidence" attribute: how would you compare <category
> name="surprise"/> and <category name="surprise" confidence="1.0"/>?
>
> - About the number of <trace>?
>
>              - the specification does not state clearly whether it is
> possible to have several <trace> elements inside a descriptor; it
> only says "a <trace> element". Maybe it should state "If present, the
> following child element can occur one or more times: <trace>".
>
>              The schema allows that. If this comment is accepted,
> assertions 215, 224, 235, 245 should also be clarified.
>
> - About conformance?
>
>              - In section 4.3, it is stated "It is the responsibility
> of an EmotionML processor to verify that the use of descriptor names
> and values is consistent with the vocabulary definition", which is
> true but incomplete with regard to the assertions. Maybe it would be
> beneficial to list all the assertions that are not under the schema's
> responsibility but rather the EmotionML processor's (see below), or
> at least to warn that there are many assertions not checked by the
> schema.
>
> ** Discrepancies between schema/assertions/specification
>
> - Assertions not tested by the schema
>
>              - I found that the following assertions are not tested by
> the schema: 114, 117, 120, 123, 161, 164, 167, 170, 172, 210, 212,
>
>              216, 220, 222, 224, 230, 232, 236, 240, 242, 246, 410, 417.
>
>              Some assertions are, I think, impossible to test with an
> XSD schema:
>
>                          114, 117, 120, 123, 161, 164, 167, 170 :
> vocabulary set id and type checking
>
>                          212, 222, 232, 242 : vocabulary name
> membership
>
>                          417 : media type (unless enumerating them)
>
>              Some may be possible with some tweaking:
>
>                          210, 220, 230, 240 : vocabulary set presence
>
>                          216, 224, 236, 246 : <trace> and "value"
>
>              There are two "true" errors, I think:
>
>                          172 : The "version" attribute of <emotion>,
> if present, MUST have the value "1.0"
>
>                                      I think it should not be
> "optional with default value 1.0" but rather "optional with fixed
> value 1.0".
>
>                          410 : The <reference> element MUST contain a
> "uri" attribute
>
>                                      the "uri" attribute is optional
> by default in the schema
>
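For what it is worth, both fixes could be expressed directly in the schema. The attribute declarations might then read roughly as follows (a sketch only, not the published emotionml.xsd text):

```xml
<!-- Sketch only, not the published emotionml.xsd text. -->

<!-- 172: an optional "version" with a fixed (rather than default) value
     still permits omitting the attribute, but forbids any value other
     than "1.0" when it is present. -->
<xsd:attribute name="version" type="xsd:string" fixed="1.0"/>

<!-- 410: marking "uri" as required on <reference> enforces the MUST. -->
<xsd:attribute name="uri" type="xsd:anyURI" use="required"/>
```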
> - 2.4.1, "The end value MUST be greater than or equal to the start
> value",
>
>              - the schema does not check it and there is no assertion
> enforcing it
>
> - 2.1.2, "a typical use case is expected to be embedding an <emotion>
> into some other markup",
>
>              - there is no assertion that describes that <emotion> may
> be embedded in another markup; does it imply we could embed other
> elements?
>
>              - is a document containing a sole <emotion> a valid
> document (not in the sense of an <emotionml> document)? If yes, maybe
> an assertion clarifying the use of <emotion> would be useful.
>
> - assertions 105, 155, 601, 606, status "Req=N"
>
>              - the assertions mix the presence of <info> and the
> number of <info> elements; while the presence is not restricted, the
> number MUST be 0 or 1, hence the required status of this part of the
> assertions should be "Req=Y"
>
> - 2.1.2, "There are no constraints on the order in which children occur"
>
>              - the schema does actually restrict the order of
> elements: <info> needs to come first, then the descriptors, then the
> references
>
> ** Invalid documents
>
> (I have not systematically tested examples with non-valid vocabulary
> URIs such as http://www.example....)
>
> - http://www.w3.org/TR/emotion-voc/xml does not comply with assertion
> 110 (hence all examples that refer to vocabularies there also fail)
>
> - 2.3.3 The <info> element
>
>              - The last example of this section does not comply with
> assertion 212, since the name "neutral" does not belong to the
> everyday categories vocabulary
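Assertion 212 checks of this kind have to live in the EmotionML processor rather than in the schema. A minimal sketch (the class `CategoryNameCheck` is hypothetical; for simplicity the vocabulary items are passed in as a set, whereas a real processor would resolve and parse the document named by the category-set URI):

```java
import java.io.StringReader;
import java.util.Set;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

// Hypothetical sketch of the processor-level check behind assertion 212,
// which an XSD schema cannot express: every <category> name inside an
// <emotion> must be declared in the referenced category vocabulary.
public class CategoryNameCheck {

    private static final String EMOTIONML_NS = "http://www.w3.org/2009/10/emotionml";

    public static boolean categoriesInVocabulary(String emotionXml,
                                                 Set<String> vocabularyItems) {
        try {
            DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
            dbf.setNamespaceAware(true);
            Document doc = dbf.newDocumentBuilder()
                    .parse(new InputSource(new StringReader(emotionXml)));
            NodeList categories =
                    doc.getElementsByTagNameNS(EMOTIONML_NS, "category");
            for (int i = 0; i < categories.getLength(); i++) {
                String name = ((Element) categories.item(i)).getAttribute("name");
                if (!vocabularyItems.contains(name)) {
                    return false; // e.g. "neutral" against the everyday categories
                }
            }
            return true;
        } catch (Exception e) {
            throw new RuntimeException("document is not well-formed XML", e);
        }
    }
}
```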
>
> - 5.1.1 Annotation of Text, "Annotation of text" Lewis Carroll example:
>
>              - In the <meta:doc> element, the character & is found,
> which does not pass XML validation; it should be &amp; (the same
> applies to the example below)
>
>              - It also does not comply with assertion 212, since
> Disgust and Anger are not part of the everyday categories vocabulary
>


--
Kaz Ashimura, W3C Staff Contact for Web&TV, MMI and Voice
Tel: +81 466 49 1170