
TTA Standards Status


Standard No.: TTAK.KO-10.1253
Previous Standard No.: (none)
Date of Establishment/Revision: 2020-12-10
Total Pages: 23
Korean Title (translated): Multi-modal Adaptive Emotion Fusion Framework for UX Evaluation
English Title: Multi-modal Adaptive Emotion Fusion Framework for UX Evaluation
Korean Abstract (translated): This standard defines a procedure for recognizing the emotions that users feel and express during the user-experience evaluation of interactive systems, using input modalities based on diverse media such as visual, auditory, tactile, neural, biometric, and physiological signals. It defines a procedure that adaptively fuses these input modalities, their associated features, and the resulting intermediate decisions to provide information for UX evaluation. The goal is to fuse emotions expressed through diverse modalities in order to gain deep insight into the user's emotions, situation, or higher-level activities. To this end, the framework processes data from the input modalities listed above.
English Abstract: This standard introduces the adaptive integration of modalities to infer human emotion from facial expressions, body movements, audio, and cognition sensors. It extracts the associated features and processes them to obtain the intermediate decisions required to perform an analysis task. The analysis is used to produce UX evaluations for interactive systems. The proposed MAEF (Multi-modal Adaptive Emotion Fusion) framework involves processing multi-modal data to obtain valuable insights about human emotions, a situation, or a higher-level activity.
International Standard: (none)
Related File: TTAK.KO-10.1253.pdf
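The abstracts describe a pipeline in which each modality yields features and an intermediate decision, and the decisions are then adaptively fused. The following is a minimal Python sketch of one common realization of that idea, confidence-weighted decision-level fusion; all emotion labels, modality names, and numbers here are illustrative assumptions, not the normative procedure defined in the standard's PDF.

```python
# Hedged sketch of adaptive decision-level fusion (illustrative only;
# see TTAK.KO-10.1253 for the normative MAEF procedure).
# Each modality emits an intermediate decision: a probability vector
# over emotions plus a confidence score. The fusion step weights each
# modality by its confidence before combining, so more reliable
# modalities adaptively contribute more to the final decision.

EMOTIONS = ["joy", "anger", "sadness", "surprise", "neutral"]

def fuse_decisions(decisions):
    """Combine per-modality (probabilities, confidence) pairs.

    decisions: dict mapping modality name -> (list[float], float)
    Returns a fused probability vector over EMOTIONS.
    """
    total_conf = sum(conf for _, conf in decisions.values())
    if total_conf == 0:
        # No modality is confident; fall back to a uniform distribution.
        return [1.0 / len(EMOTIONS)] * len(EMOTIONS)
    fused = [0.0] * len(EMOTIONS)
    for probs, conf in decisions.values():
        weight = conf / total_conf  # adaptive, confidence-based weight
        for i, p in enumerate(probs):
            fused[i] += weight * p
    return fused

# Example: the visual modality is confident about "joy", audio less so.
decisions = {
    "visual":        ([0.70, 0.05, 0.05, 0.10, 0.10], 0.9),
    "audio":         ([0.30, 0.20, 0.20, 0.10, 0.20], 0.4),
    "physiological": ([0.50, 0.10, 0.10, 0.20, 0.10], 0.6),
}
fused = fuse_decisions(decisions)
top_emotion = EMOTIONS[max(range(len(EMOTIONS)), key=fused.__getitem__)]
print(top_emotion)  # highest-probability emotion after fusion
```

Decision-level (late) fusion is only one option; the standard's framing also covers fusing features themselves, which trades robustness to a failed modality for richer cross-modal information.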
