{"id":117,"date":"2025-03-17T15:27:07","date_gmt":"2025-03-17T19:27:07","guid":{"rendered":"https:\/\/businessphysics.ai\/?p=117"},"modified":"2025-03-17T15:33:47","modified_gmt":"2025-03-17T19:33:47","slug":"understanding-transformer-architecture-in-simple-terms","status":"publish","type":"post","link":"https:\/\/businessphysics.ai\/fr\/understanding-transformer-architecture-in-simple-terms\/","title":{"rendered":"Comprendre l'architecture des transformateurs en termes simples"},"content":{"rendered":"<h2 class=\"wp-block-heading\">Qu'est-ce qu'un transformateur ?<\/h2>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Les transformateurs sont un type d'architecture de r\u00e9seau neuronal qui doit son nom \u00e0 sa capacit\u00e9 \u00e0 \"transformer\" la mani\u00e8re dont l'intelligence artificielle (IA) traite les s\u00e9quences de donn\u00e9es, en particulier les textes. <\/p>\n<\/blockquote>\n\n\n\n<p>Introduit par les chercheurs de Google dans leur article de 2017, <em>L'attention est tout ce dont vous avez besoin<\/em>Les Transformers ont consid\u00e9rablement am\u00e9lior\u00e9 les t\u00e2ches de traitement du langage naturel (NLP) en utilisant un m\u00e9canisme appel\u00e9 <strong>Attention \u00e0 soi<\/strong> (Golroudbari).<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Pourquoi le nom \"Transformer\" ?<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Transformateurs<\/strong> ont gagn\u00e9 leur nom parce qu'ils modifient la fa\u00e7on dont l'IA comprend les s\u00e9quences de texte.<\/li>\n\n\n\n<li>Les mod\u00e8les d'IA traditionnels traitent les textes de mani\u00e8re s\u00e9quentielle (mot par mot), ce qui entra\u00eene un traitement plus lent et moins pr\u00e9cis.<\/li>\n\n\n\n<li>Les transformateurs, au contraire, analysent l'ensemble du texte simultan\u00e9ment, en identifiant les relations entre les mots, quelle que soit leur position.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Innovation cl\u00e9 : M\u00e9canisme d'auto-attention<\/h2>\n\n\n\n<p>L'auto-attention permet \u00e0 l'IA d'identifier et de hi\u00e9rarchiser les mots les plus importants d'une phrase, quelle que soit leur position (Golroudbari).<\/p>\n\n\n\n<p><strong>Exemple :<\/strong><\/p>\n\n\n\n<p><em>Phrase<\/em>: \"Le chat s'est assis sur le tapis\".<\/p>\n\n\n\n<p>Le mod\u00e8le comprend que \"chat\" et \"tapis\" sont \u00e9troitement li\u00e9s, m\u00eame s'ils sont s\u00e9par\u00e9s par d'autres mots. Cette capacit\u00e9 rend la compr\u00e9hension du contexte et des relations plus pr\u00e9cise et plus efficace.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"566\" height=\"498\" src=\"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif\" alt=\"\" class=\"wp-image-118\" \/><\/figure>\n\n\n\n<p>Cr\u00e9dit :&nbsp;<a href=\"https:\/\/github.com\/jessevig\/bertviz\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/jessevig\/bertviz<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Comment fonctionnent les transformateurs<\/h2>\n\n\n\n<p>Les transformateurs fonctionnent en plusieurs \u00e9tapes :<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><strong>Int\u00e9gration des donn\u00e9es d'entr\u00e9e<\/strong>: Les mots sont convertis en repr\u00e9sentations num\u00e9riques.<\/li>\n\n\n\n<li><strong>Attention \u00e0 soi<\/strong>: Identifie et hi\u00e9rarchise simultan\u00e9ment les mots pertinents.<\/li>\n\n\n\n<li><strong>Couches d'alimentation<\/strong>: Traite et affine ces informations.<\/li>\n\n\n\n<li><strong>G\u00e9n\u00e9ration de sorties<\/strong>: produit des r\u00e9sultats significatifs (tels que des r\u00e9ponses ou des traductions).<\/li>\n<\/ol>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Pourquoi les transformateurs sont-ils importants ?<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Vitesse<\/strong>: Ils traitent tous les mots en m\u00eame temps plut\u00f4t que de mani\u00e8re s\u00e9quentielle.<\/li>\n\n\n\n<li><strong>Efficacit\u00e9<\/strong>: R\u00e9duit le temps de calcul et la complexit\u00e9.<\/li>\n\n\n\n<li><strong>Pr\u00e9cision<\/strong>: Am\u00e9liore la compr\u00e9hension en saisissant mieux le contexte et les relations entre les mots.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Applications dans le monde r\u00e9el<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Chatbots (par exemple, ChatGPT)<\/li>\n\n\n\n<li>Outils de traduction<\/li>\n\n\n\n<li>Outils de g\u00e9n\u00e9ration de contenu par l'IA<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">R\u00e9sum\u00e9<\/h2>\n\n\n\n<p>Les transformateurs modifient fondamentalement la fa\u00e7on dont l'IA comprend et traite le langage en utilisant l'auto-attention pour capturer efficacement les relations entre les mots, ce qui rend l'IA plus rapide et plus pr\u00e9cise dans des t\u00e2ches telles que la traduction, la cr\u00e9ation de contenu et les chatbots.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Ouvrages cit\u00e9s<\/h2>\n\n\n\n<p>Golroudbari, Arman Asgharpoor. \"Comprendre l'attention \u00e0 soi - un guide \u00e9tape par \u00e9tape. <a href=\"https:\/\/armanasq.github.io\/nlp\/self-attention\/\"><em>armanasq.github.io<\/em>, armanasq.github.io\/nlp\/self-attention\/<\/a>. Consult\u00e9 le 17 mars 2025.<\/p>","protected":false},"excerpt":{"rendered":"<p>What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[],"class_list":["post-117","post","type-post","status-publish","format-standard","hentry","category-ai-article"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/businessphysics.ai\/fr\/understanding-transformer-architecture-in-simple-terms\/\" \/>\n<meta property=\"og:locale\" content=\"fr_CA\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab\" \/>\n<meta property=\"og:description\" content=\"What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/businessphysics.ai\/fr\/understanding-transformer-architecture-in-simple-terms\/\" \/>\n<meta property=\"og:site_name\" content=\"Business Physics AI Lab\" \/>\n<meta property=\"article:published_time\" content=\"2025-03-17T19:27:07+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-03-17T19:33:47+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif\" \/>\n\t<meta property=\"og:image:width\" content=\"566\" \/>\n\t<meta property=\"og:image:height\" content=\"498\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/gif\" \/>\n<meta name=\"author\" content=\"Hichem Benzair\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"\u00c9crit par\" \/>\n\t<meta name=\"twitter:data1\" content=\"Hichem Benzair\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimation du temps de lecture\" \/>\n\t<meta name=\"twitter:data2\" content=\"2 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"},\"author\":{\"name\":\"Hichem Benzair\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/person\\\/3f462f79c4b4d3b3d4fda03f03263bbc\"},\"headline\":\"Understanding Transformer Architecture in Simple Terms\",\"datePublished\":\"2025-03-17T19:27:07+00:00\",\"dateModified\":\"2025-03-17T19:33:47+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"},\"wordCount\":311,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"articleSection\":[\"AI Article\"],\"inLanguage\":\"fr-CA\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\",\"name\":\"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"datePublished\":\"2025-03-17T19:27:07+00:00\",\"dateModified\":\"2025-03-17T19:33:47+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#breadcrumb\"},\"inLanguage\":\"fr-CA\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-CA\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"contentUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/businessphysics.ai\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Understanding Transformer Architecture in Simple Terms\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#website\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/\",\"name\":\"Business Physics AI Lab\",\"description\":\"About the Founder: Professor Thomas Hormaza Dow fosters an environment where research meets application, and where interns, researchers, and industry leaders collaborate to unlock new frontiers.\",\"publisher\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/businessphysics.ai\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"fr-CA\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\",\"name\":\"Business Physics AI Lab\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-CA\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/business-physics-logo-large-720.jpg\",\"contentUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/business-physics-logo-large-720.jpg\",\"width\":720,\"height\":720,\"caption\":\"Business Physics AI Lab\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/person\\\/3f462f79c4b4d3b3d4fda03f03263bbc\",\"name\":\"Hichem Benzair\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-CA\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"caption\":\"Hichem Benzair\"},\"url\":\"https:\\\/\\\/businessphysics.ai\\\/fr\\\/author\\\/hichem-benzair\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/businessphysics.ai\/fr\/understanding-transformer-architecture-in-simple-terms\/","og_locale":"fr_CA","og_type":"article","og_title":"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab","og_description":"What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). [&hellip;]","og_url":"https:\/\/businessphysics.ai\/fr\/understanding-transformer-architecture-in-simple-terms\/","og_site_name":"Business Physics AI Lab","article_published_time":"2025-03-17T19:27:07+00:00","article_modified_time":"2025-03-17T19:33:47+00:00","og_image":[{"url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","width":566,"height":498,"type":"image\/gif"}],"author":"Hichem Benzair","twitter_card":"summary_large_image","twitter_misc":{"\u00c9crit par":"Hichem Benzair","Estimation du temps de lecture":"2 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#article","isPartOf":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"},"author":{"name":"Hichem Benzair","@id":"https:\/\/businessphysics.ai\/#\/schema\/person\/3f462f79c4b4d3b3d4fda03f03263bbc"},"headline":"Understanding Transformer Architecture in Simple Terms","datePublished":"2025-03-17T19:27:07+00:00","dateModified":"2025-03-17T19:33:47+00:00","mainEntityOfPage":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"},"wordCount":311,"commentCount":0,"publisher":{"@id":"https:\/\/businessphysics.ai\/#organization"},"image":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"thumbnailUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","articleSection":["AI Article"],"inLanguage":"fr-CA","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/","url":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/","name":"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab","isPartOf":{"@id":"https:\/\/businessphysics.ai\/#website"},"primaryImageOfPage":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"image":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"thumbnailUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","datePublished":"2025-03-17T19:27:07+00:00","dateModified":"2025-03-17T19:33:47+00:00","breadcrumb":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#breadcrumb"},"inLanguage":"fr-CA","potentialAction":[{"@type":"ReadAction","target":["https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"]}]},{"@type":"ImageObject","inLanguage":"fr-CA","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage","url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","contentUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif"},{"@type":"BreadcrumbList","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/businessphysics.ai\/"},{"@type":"ListItem","position":2,"name":"Understanding Transformer Architecture in Simple Terms"}]},{"@type":"WebSite","@id":"https:\/\/businessphysics.ai\/#website","url":"https:\/\/businessphysics.ai\/","name":"Business Physics AI Lab","description":"\u00c0 propos du fondateur : Le professeur Thomas Hormaza Dow favorise un environnement o\u00f9 la recherche rencontre l'application, et o\u00f9 les stagiaires, les chercheurs et les leaders de l'industrie collaborent pour ouvrir de nouvelles fronti\u00e8res.","publisher":{"@id":"https:\/\/businessphysics.ai\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/businessphysics.ai\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"fr-CA"},{"@type":"Organization","@id":"https:\/\/businessphysics.ai\/#organization","name":"Business Physics AI Lab","url":"https:\/\/businessphysics.ai\/","logo":{"@type":"ImageObject","inLanguage":"fr-CA","@id":"https:\/\/businessphysics.ai\/#\/schema\/logo\/image\/","url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/01\/business-physics-logo-large-720.jpg","contentUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/01\/business-physics-logo-large-720.jpg","width":720,"height":720,"caption":"Business Physics AI Lab"},"image":{"@id":"https:\/\/businessphysics.ai\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/businessphysics.ai\/#\/schema\/person\/3f462f79c4b4d3b3d4fda03f03263bbc","name":"Hichem Benzair","image":{"@type":"ImageObject","inLanguage":"fr-CA","@id":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","caption":"Hichem Benzair"},"url":"https:\/\/businessphysics.ai\/fr\/author\/hichem-benzair\/"}]}},"_links":{"self":[{"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/posts\/117","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/comments?post=117"}],"version-history":[{"count":2,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/posts\/117\/revisions"}],"predecessor-version":[{"id":120,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/posts\/117\/revisions\/120"}],"wp:attachment":[{"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/media?parent=117"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/categories?post=117"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/businessphysics.ai\/fr\/wp-json\/wp\/v2\/tags?post=117"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}