{"id":1288,"date":"2026-05-07T09:05:00","date_gmt":"2026-05-07T16:05:00","guid":{"rendered":"https:\/\/www.kenwalger.com\/blog\/?p=1288"},"modified":"2026-03-17T11:57:45","modified_gmt":"2026-03-17T18:57:45","slug":"the-local-eye-sovereign-vision","status":"publish","type":"post","link":"https:\/\/www.kenwalger.com\/blog\/ai\/the-local-eye-sovereign-vision\/","title":{"rendered":"The Local Eye (Sovereign Vision)"},"content":{"rendered":"<p>We\u2019ve built a system that is <a href=\"https:\/\/www.kenwalger.com\/blog\/ai\/ai-agent-reliability-llm-as-a-judge\">Reliable<\/a>, <a href=\"https:\/\/www.kenwalger.com\/blog\/ai\/the-accountant-optimizing-ai-costs-with-semantic-routing\/\">Affordable<\/a>, and <a href=\"https:\/\/www.kenwalger.com\/blog\/ai\/ai-agent-governance-human-in-the-loop-hitl\/\">Governed<\/a>. But until now, our Forensic Team has been &#8220;blind.&#8221; It could only reconcile text-based metadata.<\/p>\n<p>In the world of rare book forensics, the text is only half the story. The typography, paper grain, and binding texture are the true &#8220;fingerprints.&#8221; However, sending high-resolution, proprietary scans of a $50,000 asset to a cloud-based LLM is a Data Sovereignty nightmare.<\/p>\n<p>Today, we introduce <strong>The Local Eye<\/strong>: Edge-based Multimodal Vision that processes pixels without letting them leak into the cloud.<\/p>\n<h2>The Sovereignty Gap in Multimodal AI<\/h2>\n<p>Most multimodal implementations send raw images directly to frontier models (like GPT-4o). 
For an enterprise, this is a liability.<\/p>\n<ol>\n<li><strong>Intellectual Property:<\/strong> Who owns the training data rights to the scan?<\/li>\n<li><strong>Privacy:<\/strong> Does the image contain metadata or background information that violates NDAs?<\/li>\n<li><strong>Cost:<\/strong> Sending 10MB 4K images for every query is an &#8220;Accountant&#8217;s&#8221; nightmare.<\/li>\n<\/ol>\n<h2>Implementing &#8220;Feature Extraction&#8221; at the Edge<\/h2>\n<p>Instead of sending the image to the cloud, we use <a href=\"https:\/\/ollama.com\/library\/llama3.2-vision\">Llama 3.2 Vision<\/a> running locally via <a href=\"https:\/\/ollama.com\/\">Ollama<\/a>. Our MCP server acts as an &#8220;Airlock.&#8221;<\/p>\n<p><strong>The Handshake:<\/strong><br \/>\n&#8211; <strong>Normalization:<\/strong> The <code>sharp<\/code> library resizes and standardizes the forensic scan locally.<br \/>\n&#8211; <strong>Local Inference:<\/strong> The Vision SLM analyzes the image and generates a text-based &#8220;Feature Map.&#8221;<br \/>\n&#8211; <strong>Metadata Egress:<\/strong> Only the textual description is passed to the reasoning agents. 
Even if The Accountant routes the task to a Cloud model for deep analysis, the cloud only sees our description, never the pixels.<\/p>\n<figure id=\"attachment_1291\" aria-describedby=\"caption-attachment-1291\" style=\"width: 840px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"1291\" data-permalink=\"https:\/\/www.kenwalger.com\/blog\/ai\/the-local-eye-sovereign-vision\/attachment\/sovereign-ai-local-vision-mcp-architecture\/\" data-orig-file=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-scaled.png\" data-orig-size=\"2560,308\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"sovereign-ai-local-vision-mcp-architecture\" data-image-description=\"\" data-image-caption=\"&lt;p&gt;The Sovereign Vision Workflow\u2014Extracting intelligence at the edge to prevent data leakage.&lt;\/p&gt;\n\" data-large-file=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-1024x123.png\" class=\"size-large wp-image-1291\" src=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-1024x123.png\" alt=\"Architectural diagram of the 'Local Eye' workflow. An artifact image is processed locally using the Sharp library and Llama 3.2 Vision. 
Only the resulting text metadata is allowed to pass through the security airlock to cloud-based reasoning models, ensuring the original pixels never leave the local environment.\" width=\"840\" height=\"101\" srcset=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-1024x123.png 1024w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-300x36.png 300w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-768x92.png 768w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-1536x185.png 1536w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-2048x246.png 2048w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/sovereign-ai-local-vision-mcp-architecture-1200x144.png 1200w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 1362px) 62vw, 840px\" \/><figcaption id=\"caption-attachment-1291\" class=\"wp-caption-text\">The Sovereign Vision Workflow\u2014Extracting intelligence at the edge to prevent data leakage.<\/figcaption><\/figure>\n<p>In code, the Airlock looks something like this:<\/p>\n<pre><code class=\"language-typescript\">\/\/ From src\/index.ts: The Vision Airlock\nimport sharp from 'sharp';\nimport ollama from 'ollama';\n\nasync function analyzeArtifactVision(imagePath: string, focus: string) {\n  \/\/ Normalization: standardize the scan locally before inference\n  const processedImage = await sharp(imagePath).resize(512, 512).toBuffer();\n\n  \/\/ Local-only call to Ollama\n  const description = await ollama.generate({\n    model: 'llama3.2-vision',\n    prompt: `Analyze the ${focus} of this artifact.`,\n    images: [processedImage.toString('base64')]\n  });\n\n  return description.response; \/\/ Pixels stay here. Only text leaves.\n}\n<\/code><\/pre>\n<h2>The &#8220;Zero-Pixel&#8221; Policy<\/h2>\n<p>The goal is to maximize <strong>Intelligence<\/strong> while minimizing <strong>Exposure<\/strong>. By implementing Local Vision, we treat the cloud as a &#8220;Reasoning Utility,&#8221; not a &#8220;Data Store.&#8221; We send it the logic puzzle, but we never give it the raw forensic evidence. We gain the power of frontier-model reasoning without the risk of data harvesting.<\/p>\n<h3>Developer Lessons: The &#8220;Latency of Locality&#8221;<\/h3>\n<p>In building the Sovereign Vault, we learned that &#8216;Data Sovereignty&#8217; has a physical cost: <strong>Time<\/strong>.<\/p>\n<p>While a cloud-based API might analyze a 4K image in seconds, running a deep-dive OCR and visual analysis on local consumer hardware using Llama 3.2 Vision takes significantly longer. We had to tune our &#8220;Airlock&#8221; timeouts\u2014raising the ceiling from <strong>120 seconds<\/strong> to <strong>300 seconds<\/strong>\u2014to give the local &#8220;Eye&#8221; enough time to process complex handwriting on a standard CPU.<\/p>\n<p>Additionally, we realized that our error logs were a potential privacy leak. 
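<\/p>\n<p>A failed vision call could dump an entire transcription, names and all, into a stack trace. A minimal sketch of the guard (illustrative only; the helper name and the 200-character cap are ours, not the project&#8217;s actual code):<\/p>\n<pre><code class=\"language-typescript\">\/\/ Illustrative only: hard-cap anything an error handler is allowed to log,\n\/\/ so a failed vision call cannot leak a full transcription into the logs.\nconst MAX_LOG_CHARS = 200;\n\nfunction truncateForLog(payload: string): string {\n  if (payload.length &lt;= MAX_LOG_CHARS) {\n    return payload;\n  }\n  return payload.slice(0, MAX_LOG_CHARS) + '... [truncated]';\n}\n<\/code><\/pre>\n<p>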
We implemented <em>Log Truncation<\/em> to ensure that even our failures respect the Sovereign Vault&#8217;s privacy mandate.<\/p>\n<h2>The &#8220;Zero-Glue&#8221; Discovery<\/h2>\n<p>In a traditional setup, adding vision would require rewriting the orchestrator&#8217;s core logic. Because we use the <strong>Model Context Protocol<\/strong>, the orchestrator simply asked the server: &#8220;What can you do?&#8221;. The server replied with the <code>analyze_artifact_vision<\/code> manifest. The agent then dynamically decided to use this new &#8220;Eye&#8221; to investigate the Gatsby image. No new glue code was written to connect the vision model to the reasoning brain.<\/p>\n<h2>Case Study: The Gatsby Inscription<\/h2>\n<p>To test our <em>Sovereign Vault<\/em>, we ran a forensic audit on a high-value first edition of <em>The Great Gatsby<\/em>. Our local Vision Agent detected something anomalous on the title page: a cursive, multi-line inscription.<\/p>\n<figure id=\"attachment_1292\" aria-describedby=\"caption-attachment-1292\" style=\"width: 700px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"1292\" data-permalink=\"https:\/\/www.kenwalger.com\/blog\/ai\/the-local-eye-sovereign-vision\/attachment\/great_gatsby\/\" data-orig-file=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/great_gatsby.jpg\" data-orig-size=\"700,504\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}\" data-image-title=\"great_gatsby\" data-image-description=\"\" data-image-caption=\"&lt;p&gt;Image credit: [University of Southern Mississippi 
Special Collections](https:\/\/lib.usm.edu\/spcol\/exhibitions\/item_of_the_month\/iotm_june_2021.html) (June 2021 Item of the Month)&lt;\/p&gt;\n\" data-large-file=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/great_gatsby.jpg\" class=\"size-full wp-image-1292\" src=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/great_gatsby.jpg\" alt=\"An image of The Great Gatsby copyright page\" width=\"700\" height=\"504\" srcset=\"https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/great_gatsby.jpg 700w, https:\/\/www.kenwalger.com\/blog\/wp-content\/uploads\/2026\/03\/great_gatsby-300x216.jpg 300w\" sizes=\"auto, (max-width: 709px) 85vw, (max-width: 909px) 67vw, (max-width: 984px) 61vw, (max-width: 1362px) 45vw, 600px\" \/><figcaption id=\"caption-attachment-1292\" class=\"wp-caption-text\">Image credit: [University of Southern Mississippi Special Collections](https:\/\/lib.usm.edu\/spcol\/exhibitions\/item_of_the_month\/iotm_june_2021.html) (June 2021 Item of the Month)<\/figcaption><\/figure>\n<h3>The Sovereign Trace<\/h3>\n<p>When we ran the <code>analyze_artifact_vision<\/code> tool, the local Llama 3.2 Vision model performed a deep scan and returned a fascinating finding:<\/p>\n<pre><code class=\"language-plaintext\">**Visual Findings: Handwritten Inscription**\n* Location: Right-hand margin of title page\n* Medium: Faint pencil, cursive script\n* Transcribed Content: \"Then we are not alone at all when we remember that we have in our hearts that something so precious...\"\n<\/code><\/pre>\n<p><strong>Why this matters:<\/strong> Notice that the model didn&#8217;t just see &#8220;scribbles.&#8221; It attempted to transcribe a 40-word passage. Crucially, the <strong>Forensic Analyst<\/strong> (Claude) recognized that this text does not exist in any canonical version of <em>The Great Gatsby<\/em>.<\/p>\n<p>This is a massive forensic win. 
The &#8220;Eye&#8221; identified a potential <strong>fabricated provenance<\/strong> or a non-standard owner intervention. Because this happened inside our &#8220;<strong>Airlock<\/strong>,&#8221; the specific handwriting and the non-canonical text were captured without ever touching a cloud API.<\/p>\n<p><strong>The Architect\u2019s Trade-off: The Reasoning Gap<\/strong><br \/>\nWhile our local Llama 3.2-Vision is an incredible &#8220;Eye,&#8221; it occasionally faces a <strong>Reasoning Gap<\/strong>. In certain runs, it may identify a note as &#8220;illegible&#8221; or produce repetitive output due to CPU thermal throttling or model constraints.<\/p>\n<p>Instead of hallucinating a &#8220;clean&#8221; signature, our system is designed to <strong>Safe-Fail<\/strong>. It flags the finding as <strong>&#8220;Indeterminate&#8221;<\/strong> and triggers a <strong>High-Severity Human Authorization<\/strong> request.<\/p>\n<p><strong>The Governance Challenge:<\/strong> We now have a transcribed inscription that might contain a previous owner&#8217;s private thoughts or names. If we simply passed this output to an LLM for summarization, we would have leaked a private message to a third-party server. 
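<\/p>\n<p>A sketch of that safe-fail routing (the type name, keyword check, and thresholds below are illustrative, not the production logic):<\/p>\n<pre><code class=\"language-typescript\">\/\/ Illustrative only: indeterminate local-vision output is escalated to a\n\/\/ human reviewer instead of being forwarded to any cloud model.\ntype Routing = 'forward-description' | 'human-authorization';\n\nfunction routeVisionFinding(description: string): Routing {\n  const lower = description.toLowerCase();\n  if (lower.includes('illegible')) {\n    return 'human-authorization';\n  }\n  \/\/ Crude repetition check: long output with few unique words suggests\n  \/\/ the model is looping (e.g., under thermal throttling).\n  const words = lower.split(' ');\n  if (words.length &gt; 20) {\n    if (new Set(words).size &lt; words.length \/ 4) {\n      return 'human-authorization';\n    }\n  }\n  return 'forward-description';\n}\n<\/code><\/pre>\n<p>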
This discovery sets the stage for our next architectural layer: <strong>The Redactor<\/strong>.<\/p>","protected":false},"excerpt":{"rendered":"<p>We\u2019ve built a system that is Reliable, Affordable, and Governed. But until now, our Forensic Team has been &#8220;blind.&#8221; It could only reconcile text-based metadata. In the world of rare book forensics, the text is only half the story. 
The typography, paper grain, and binding texture are the true &#8220;fingerprints.&#8221; However, sending high-resolution, proprietary scans &hellip; <a href=\"https:\/\/www.kenwalger.com\/blog\/ai\/the-local-eye-sovereign-vision\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;The Local Eye (Sovereign Vision)&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"pmpro_default_level":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_post_was_ever_published":false},"categories":[1669,1670],"tags":[1749,1731,1763,1760,1761,1680,1671,1762],"yst_prominent_words":[],"class_list":["post-1288","post","type-post","status-publish","format-standard","hentry","category-ai","category-mcp","tag-computer-vision","tag-data-sovereignty","tag-edge-computing","tag-llama-3-2-vision","tag-local-ai","tag-mcp","tag-model-context-protocol","tag-ollama","pmpro-has-access"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p8lx70-kM","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/posts\/1288","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/comments
?post=1288"}],"version-history":[{"count":3,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/posts\/1288\/revisions"}],"predecessor-version":[{"id":1293,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/posts\/1288\/revisions\/1293"}],"wp:attachment":[{"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/media?parent=1288"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/categories?post=1288"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/tags?post=1288"},{"taxonomy":"yst_prominent_words","embeddable":true,"href":"https:\/\/www.kenwalger.com\/blog\/wp-json\/wp\/v2\/yst_prominent_words?post=1288"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}