{"id":117,"date":"2025-03-17T15:27:07","date_gmt":"2025-03-17T19:27:07","guid":{"rendered":"https:\/\/businessphysics.ai\/?p=117"},"modified":"2025-03-17T15:33:47","modified_gmt":"2025-03-17T19:33:47","slug":"understanding-transformer-architecture-in-simple-terms","status":"publish","type":"post","link":"https:\/\/businessphysics.ai\/pt\/understanding-transformer-architecture-in-simple-terms\/","title":{"rendered":"Understanding Transformer Architecture in Simple Terms"},"content":{"rendered":"<h2 class=\"wp-block-heading\">What Are Transformers?<\/h2>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Transformers are a type of neural network architecture named for their ability to \"transform\" how artificial intelligence (AI) processes sequences of data, especially text. <\/p>\n<\/blockquote>\n\n\n\n<p>Introduced by Google researchers in their 2017 paper, <em>Attention Is All You Need<\/em>, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called <strong>Self-Attention<\/strong> (Golroudbari).<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Why the Name \"Transformer\"?<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Transformers<\/strong> earned their name because they change how AI understands sequences of text.<\/li>\n\n\n\n<li>Traditional AI models handled text sequentially (word by word), which made processing slower and less accurate.<\/li>\n\n\n\n<li>Transformers instead analyze the entire text simultaneously, identifying relationships between words regardless of their position.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" 
\/>\n\n\n\n<h2 class=\"wp-block-heading\">Key Innovation: The Self-Attention Mechanism<\/h2>\n\n\n\n<p>Self-Attention lets the AI identify and prioritize the most important words in a sentence, regardless of their position (Golroudbari).<\/p>\n\n\n\n<p><strong>Example:<\/strong><\/p>\n\n\n\n<p><em>Sentence<\/em>: \"The cat sat on the mat.\"<\/p>\n\n\n\n<p>The model understands that \"cat\" and \"mat\" are closely related, even though other words separate them. This ability makes understanding context and relationships more accurate and efficient.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"566\" height=\"498\" src=\"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif\" alt=\"\" class=\"wp-image-118\" \/><\/figure>\n\n\n\n<p>Credit:&nbsp;<a href=\"https:\/\/github.com\/jessevig\/bertviz\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/jessevig\/bertviz<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">How Transformers Work<\/h2>\n\n\n\n<p>Transformers operate in several steps:<\/p>\n\n\n\n<ol start=\"1\" class=\"wp-block-list\">\n<li><strong>Input embedding<\/strong>: Words are converted into numerical representations.<\/li>\n\n\n\n<li><strong>Self-Attention<\/strong>: Relevant words are identified and prioritized simultaneously.<\/li>\n\n\n\n<li><strong>Feed-forward layers<\/strong>: This information is processed and refined.<\/li>\n\n\n\n<li><strong>Output generation<\/strong>: Meaningful results (such as answers or translations) are produced.<\/li>\n<\/ol>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Why Do Transformers Matter?<\/h2>\n\n\n\n<ul 
class=\"wp-block-list\">\n<li><strong>Speed<\/strong>: They process all words at once rather than sequentially.<\/li>\n\n\n\n<li><strong>Efficiency<\/strong>: They reduce computation time and complexity.<\/li>\n\n\n\n<li><strong>Accuracy<\/strong>: They improve understanding by better capturing context and the relationships between words.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Real-World Applications<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Chatbots (e.g., ChatGPT)<\/li>\n\n\n\n<li>Translation tools<\/li>\n\n\n\n<li>AI content-generation tools<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Summary<\/h2>\n\n\n\n<p>Transformers fundamentally change how AI understands and processes language, using Self-Attention to efficiently capture the relationships between words, making AI faster and more accurate at tasks such as translation, content creation, and chatbots.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Works Cited<\/h2>\n\n\n\n<p>Golroudbari, Arman Asgharpoor. \"Understanding Self-Attention - A Step-by-Step Guide.\" <a href=\"https:\/\/armanasq.github.io\/nlp\/self-attention\/\"><em>armanasq.github.io<\/em>, armanasq.github.io\/nlp\/self-attention\/<\/a>. Accessed 17 March 2025.<\/p>","protected":false},"excerpt":{"rendered":"<p>What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. 
Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[],"class_list":["post-117","post","type-post","status-publish","format-standard","hentry","category-ai-article"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.4 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/businessphysics.ai\/pt\/understanding-transformer-architecture-in-simple-terms\/\" \/>\n<meta property=\"og:locale\" content=\"pt_BR\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab\" \/>\n<meta property=\"og:description\" content=\"What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). 
[&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/businessphysics.ai\/pt\/understanding-transformer-architecture-in-simple-terms\/\" \/>\n<meta property=\"og:site_name\" content=\"Business Physics AI Lab\" \/>\n<meta property=\"article:published_time\" content=\"2025-03-17T19:27:07+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-03-17T19:33:47+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif\" \/>\n\t<meta property=\"og:image:width\" content=\"566\" \/>\n\t<meta property=\"og:image:height\" content=\"498\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/gif\" \/>\n<meta name=\"author\" content=\"Hichem Benzair\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Escrito por\" \/>\n\t<meta name=\"twitter:data1\" content=\"Hichem Benzair\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. tempo de leitura\" \/>\n\t<meta name=\"twitter:data2\" content=\"2 minutos\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"},\"author\":{\"name\":\"Hichem Benzair\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/person\\\/3f462f79c4b4d3b3d4fda03f03263bbc\"},\"headline\":\"Understanding Transformer Architecture in Simple 
Terms\",\"datePublished\":\"2025-03-17T19:27:07+00:00\",\"dateModified\":\"2025-03-17T19:33:47+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"},\"wordCount\":311,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"articleSection\":[\"AI Article\"],\"inLanguage\":\"pt-BR\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\",\"name\":\"Understanding Transformer Architecture in Simple Terms - Business Physics AI 
Lab\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"datePublished\":\"2025-03-17T19:27:07+00:00\",\"dateModified\":\"2025-03-17T19:33:47+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#breadcrumb\"},\"inLanguage\":\"pt-BR\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"pt-BR\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#primaryimage\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\",\"contentUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/03\\\/head-view.gif\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/understanding-transformer-architecture-in-simple-terms\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/businessphysics.ai\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Understanding Transformer Architecture in Simple Terms\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#website\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/\",\"name\":\"Business Physics AI Lab\",\"description\":\"About the Founder: Professor Thomas Hormaza Dow fosters an environment where research meets application, and where interns, researchers, and industry leaders 
collaborate to unlock new frontiers.\",\"publisher\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/businessphysics.ai\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"pt-BR\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#organization\",\"name\":\"Business Physics AI Lab\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"pt-BR\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/business-physics-logo-large-720.jpg\",\"contentUrl\":\"https:\\\/\\\/businessphysics.ai\\\/wp-content\\\/uploads\\\/2025\\\/01\\\/business-physics-logo-large-720.jpg\",\"width\":720,\"height\":720,\"caption\":\"Business Physics AI Lab\"},\"image\":{\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/businessphysics.ai\\\/#\\\/schema\\\/person\\\/3f462f79c4b4d3b3d4fda03f03263bbc\",\"name\":\"Hichem Benzair\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"pt-BR\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g\",\"caption\":\"Hichem Benzair\"},\"url\":\"https:\\\/\\\/businessphysics.ai\\\/pt\\\/author\\\/hichem-benzair\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/businessphysics.ai\/pt\/understanding-transformer-architecture-in-simple-terms\/","og_locale":"pt_BR","og_type":"article","og_title":"Understanding Transformer Architecture in Simple Terms - Business Physics AI Lab","og_description":"What Are Transformers? Transformers are a type of neural network architecture named for their ability to &#8220;transform&#8221; how artificial intelligence (AI) processes sequences of data, especially text. Introduced by Google researchers in their 2017 paper, Attention Is All You Need, Transformers significantly improved Natural Language Processing (NLP) tasks by using a mechanism called Self-Attention (Golroudbari). [&hellip;]","og_url":"https:\/\/businessphysics.ai\/pt\/understanding-transformer-architecture-in-simple-terms\/","og_site_name":"Business Physics AI Lab","article_published_time":"2025-03-17T19:27:07+00:00","article_modified_time":"2025-03-17T19:33:47+00:00","og_image":[{"url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","width":566,"height":498,"type":"image\/gif"}],"author":"Hichem Benzair","twitter_card":"summary_large_image","twitter_misc":{"Escrito por":"Hichem Benzair","Est. 
tempo de leitura":"2 minutos"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#article","isPartOf":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"},"author":{"name":"Hichem Benzair","@id":"https:\/\/businessphysics.ai\/#\/schema\/person\/3f462f79c4b4d3b3d4fda03f03263bbc"},"headline":"Understanding Transformer Architecture in Simple Terms","datePublished":"2025-03-17T19:27:07+00:00","dateModified":"2025-03-17T19:33:47+00:00","mainEntityOfPage":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"},"wordCount":311,"commentCount":0,"publisher":{"@id":"https:\/\/businessphysics.ai\/#organization"},"image":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"thumbnailUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","articleSection":["AI Article"],"inLanguage":"pt-BR","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/","url":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/","name":"Understanding Transformer Architecture in Simple Terms - Business Physics AI 
Lab","isPartOf":{"@id":"https:\/\/businessphysics.ai\/#website"},"primaryImageOfPage":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"image":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage"},"thumbnailUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","datePublished":"2025-03-17T19:27:07+00:00","dateModified":"2025-03-17T19:33:47+00:00","breadcrumb":{"@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#breadcrumb"},"inLanguage":"pt-BR","potentialAction":[{"@type":"ReadAction","target":["https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/"]}]},{"@type":"ImageObject","inLanguage":"pt-BR","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#primaryimage","url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif","contentUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/03\/head-view.gif"},{"@type":"BreadcrumbList","@id":"https:\/\/businessphysics.ai\/understanding-transformer-architecture-in-simple-terms\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/businessphysics.ai\/"},{"@type":"ListItem","position":2,"name":"Understanding Transformer Architecture in Simple Terms"}]},{"@type":"WebSite","@id":"https:\/\/businessphysics.ai\/#website","url":"https:\/\/businessphysics.ai\/","name":"Laborat\u00f3rio de IA para F\u00edsica Empresarial","description":"Sobre o fundador: O professor Thomas Hormaza Dow promove um ambiente onde a pesquisa encontra a aplica\u00e7\u00e3o e onde estagi\u00e1rios, pesquisadores e l\u00edderes do setor colaboram para desvendar novas 
fronteiras.","publisher":{"@id":"https:\/\/businessphysics.ai\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/businessphysics.ai\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"pt-BR"},{"@type":"Organization","@id":"https:\/\/businessphysics.ai\/#organization","name":"Laborat\u00f3rio de IA para F\u00edsica Empresarial","url":"https:\/\/businessphysics.ai\/","logo":{"@type":"ImageObject","inLanguage":"pt-BR","@id":"https:\/\/businessphysics.ai\/#\/schema\/logo\/image\/","url":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/01\/business-physics-logo-large-720.jpg","contentUrl":"https:\/\/businessphysics.ai\/wp-content\/uploads\/2025\/01\/business-physics-logo-large-720.jpg","width":720,"height":720,"caption":"Business Physics AI Lab"},"image":{"@id":"https:\/\/businessphysics.ai\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/businessphysics.ai\/#\/schema\/person\/3f462f79c4b4d3b3d4fda03f03263bbc","name":"Hichem Benzair","image":{"@type":"ImageObject","inLanguage":"pt-BR","@id":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/05dd03546ad6af9acfa51f5450a2e74f167c3259d223e556d2d438634927bf26?s=96&d=mm&r=g","caption":"Hichem 
Benzair"},"url":"https:\/\/businessphysics.ai\/pt\/author\/hichem-benzair\/"}]}},"_links":{"self":[{"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/posts\/117","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/comments?post=117"}],"version-history":[{"count":2,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/posts\/117\/revisions"}],"predecessor-version":[{"id":120,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/posts\/117\/revisions\/120"}],"wp:attachment":[{"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/media?parent=117"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/categories?post=117"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/businessphysics.ai\/pt\/wp-json\/wp\/v2\/tags?post=117"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}