Resource Search - Standard


Standard Search Result

Standard type: ICT Association Standard (TTAS)
Standard number: TTAK.KO-10.1253    Old standard number: (none)
Enactment/revision date: 2020-12-10    Total pages: 23
Korean title (translated): Multi-modal Adaptive Emotion Fusion Framework for User Experience Evaluation
English title: Multi-modal Adaptive Emotion Fusion Framework for UX Evaluation
Korean abstract (translated): To define a procedure for recognizing the emotions that users feel and express during the user experience evaluation of interactive systems, this standard employs input modalities based on a variety of media, including visual, auditory, tactile, neural, biometric, and physiological signals. It defines a procedure that adaptively fuses these diverse input modalities, their associated features, and intermediate decisions to provide information for UX evaluation. The goal is to fuse emotions expressed through the various modalities to obtain deep insight into the user's emotions, situation, or higher-level activities. To this end, the data from these input modalities are processed.
English abstract: This standard introduces the adaptive integration of modalities to infer human emotion from facial expressions, body movements, audio, and cognition sensors. It extracts the associated features and processes them to obtain the intermediate decisions required to perform an analysis task. The analysis is used to produce UX evaluations for interactive systems. The proposed MAEF involves processing multi-modal data to obtain valuable insights about human emotions, a situation, or a higher-level activity.
Related IPR declarations: No IPR declarations received
Related file: TTAK.KO-10.1253.pdf
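The abstract describes adaptively fusing per-modality intermediate decisions into a single emotion estimate. The normative procedure of TTAK.KO-10.1253 is not reproduced on this page, so the following is only a minimal decision-level (late) fusion sketch: each modality contributes a probability distribution over a hypothetical emotion label set, and distributions are combined with per-modality reliability weights. All names, labels, and weights here are illustrative assumptions, not taken from the standard.

```python
from typing import Dict, List

# Hypothetical emotion label set; the actual taxonomy used by the
# standard is not given on this catalog page.
EMOTIONS: List[str] = ["happy", "sad", "angry", "neutral"]

def fuse_decisions(
    modality_probs: Dict[str, List[float]],
    reliability: Dict[str, float],
) -> List[float]:
    """Fuse per-modality emotion probability distributions.

    Each modality (e.g. 'visual', 'audio') supplies a distribution over
    EMOTIONS; distributions are weighted by a per-modality reliability
    score and the weights are renormalized. This is a generic late-fusion
    sketch, not the standard's normative procedure.
    """
    total_weight = sum(reliability[m] for m in modality_probs)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in modality_probs.items():
        w = reliability[modality] / total_weight
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Example: the audio channel is noisy, so it gets a lower reliability
# weight than the visual channel ("adaptive" weighting by confidence).
probs = {
    "visual": [0.7, 0.1, 0.1, 0.1],
    "audio": [0.3, 0.4, 0.2, 0.1],
}
weights = {"visual": 0.8, "audio": 0.2}
fused = fuse_decisions(probs, weights)
top = EMOTIONS[fused.index(max(fused))]
print(top)  # prints "happy" (fused distribution [0.62, 0.16, 0.12, 0.10])
```

In practice, reliability weights could themselves be predicted per sample (e.g. from signal quality), which is one way the "adaptive" aspect of such a framework is often realized.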
Standard History

Title | Standard number | Enactment/revision date | Type | Validity | IPR declaration | File
Multi-modal Adaptive Emotion Fusion Framework for User Experience Evaluation | TTAK.KO-10.1253 | 2020-12-10 | Enacted | Valid | None | TTAK.KO-10.1253.pdf