Abstract: Humans convey their thoughts through their (conscious or unconscious) choice of words. Some words possess emotive meaning in addition to their descriptive meaning. We develop a prototype of a synthetic 3D face that automatically displays emotions associated with text-based speech. As a first step, we studied how humans express emotions in face-to-face communication. Based on this study, we developed a 2D affective lexicon database and a set of rules that describe dependencies between linguistic content and emotions. The results described in this paper represent an initial step toward developing knowledge for affective-based multimodal fission.