<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>AI - Tag - Dimas Maulana</title>
    <link>https://dimasmaulana.pages.dev/tags/ai/</link>
    <description>Dimas Maulana Website</description>
    <generator>Hugo 0.150.0 &amp; FixIt v0.4.3-20260130042349-e23a50d7</generator>
    <language>en</language>
    <lastBuildDate>Fri, 21 Jun 2024 20:29:49 +0700</lastBuildDate>
    <atom:link href="https://dimasmaulana.pages.dev/tags/ai/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Setting up Local AI Chat on VSCode on Mac</title>
      <link>https://dimasmaulana.pages.dev/posts/software/setting-up-local-ai-chat-on-vscode-on-mac/</link>
      <pubDate>Fri, 21 Jun 2024 20:29:49 +0700</pubDate>
      <guid>https://dimasmaulana.pages.dev/posts/software/setting-up-local-ai-chat-on-vscode-on-mac/</guid>
      <category domain="https://dimasmaulana.pages.dev/categories/software/">Software</category>
      <description>&lt;p&gt;In this tutorial, we&amp;rsquo;ll guide you through installing and setting up Ollama, a local AI chat service, in VSCode on your Mac. We&amp;rsquo;ll also explore how to integrate it with the Continue extension for a seamless AI-powered coding experience.&lt;/p&gt;&#xA;&lt;h2 class=&#34;heading-element&#34; id=&#34;step-1-install-ollama&#34;&gt;&lt;span&gt;Step 1: Install Ollama&lt;/span&gt;&#xA;  &lt;a href=&#34;#step-1-install-ollama&#34; class=&#34;heading-mark&#34;&gt;&#xA;    &lt;svg class=&#34;octicon octicon-link&#34; viewBox=&#34;0 0 16 16&#34; version=&#34;1.1&#34; width=&#34;16&#34; height=&#34;16&#34; aria-hidden=&#34;true&#34;&gt;&lt;path d=&#34;m7.775 3.275 1.25-1.25a3.5 3.5 0 1 1 4.95 4.95l-2.5 2.5a3.5 3.5 0 0 1-4.95 0 .751.751 0 0 1 .018-1.042.751.751 0 0 1 1.042-.018 1.998 1.998 0 0 0 2.83 0l2.5-2.5a2.002 2.002 0 0 0-2.83-2.83l-1.25 1.25a.751.751 0 0 1-1.042-.018.751.751 0 0 1-.018-1.042Zm-4.69 9.64a1.998 1.998 0 0 0 2.83 0l1.25-1.25a.751.751 0 0 1 1.042.018.751.751 0 0 1 .018 1.042l-1.25 1.25a3.5 3.5 0 1 1-4.95-4.95l2.5-2.5a3.5 3.5 0 0 1 4.95 0 .751.751 0 0 1-.018 1.042.751.751 0 0 1-1.042.018 1.998 1.998 0 0 0-2.83 0l-2.5 2.5a1.998 1.998 0 0 0 0 2.83Z&#34;&gt;&lt;/path&gt;&lt;/svg&gt;&#xA;  &lt;/a&gt;&#xA;&lt;/h2&gt;&lt;p&gt;To get started, open your terminal and run the following command:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Local AI Chat on Web-Based Client on Mac</title>
      <link>https://dimasmaulana.pages.dev/posts/software/local-ai-chat-on-web-based-client-on-mac/</link>
      <pubDate>Thu, 20 Jun 2024 14:26:18 +0700</pubDate>
      <guid>https://dimasmaulana.pages.dev/posts/software/local-ai-chat-on-web-based-client-on-mac/</guid>
      <category domain="https://dimasmaulana.pages.dev/categories/software/">Software</category>
      <description>&lt;p&gt;In this tutorial, we will set up a local AI chat client using Ollama and Open WebUI. This lets us interact with our AI model locally, without relying on any cloud services.&lt;/p&gt;&#xA;&lt;h2 class=&#34;heading-element&#34; id=&#34;install-ollama&#34;&gt;&lt;span&gt;Install Ollama&lt;/span&gt;&#xA;  &lt;a href=&#34;#install-ollama&#34; class=&#34;heading-mark&#34;&gt;&#xA;    &lt;svg class=&#34;octicon octicon-link&#34; viewBox=&#34;0 0 16 16&#34; version=&#34;1.1&#34; width=&#34;16&#34; height=&#34;16&#34; aria-hidden=&#34;true&#34;&gt;&lt;path d=&#34;m7.775 3.275 1.25-1.25a3.5 3.5 0 1 1 4.95 4.95l-2.5 2.5a3.5 3.5 0 0 1-4.95 0 .751.751 0 0 1 .018-1.042.751.751 0 0 1 1.042-.018 1.998 1.998 0 0 0 2.83 0l2.5-2.5a2.002 2.002 0 0 0-2.83-2.83l-1.25 1.25a.751.751 0 0 1-1.042-.018.751.751 0 0 1-.018-1.042Zm-4.69 9.64a1.998 1.998 0 0 0 2.83 0l1.25-1.25a.751.751 0 0 1 1.042.018.751.751 0 0 1 .018 1.042l-1.25 1.25a3.5 3.5 0 1 1-4.95-4.95l2.5-2.5a3.5 3.5 0 0 1 4.95 0 .751.751 0 0 1-.018 1.042.751.751 0 0 1-1.042.018 1.998 1.998 0 0 0-2.83 0l-2.5 2.5a1.998 1.998 0 0 0 0 2.83Z&#34;&gt;&lt;/path&gt;&lt;/svg&gt;&#xA;  &lt;/a&gt;&#xA;&lt;/h2&gt;&lt;p&gt;First, install Ollama using Homebrew:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Local Chat AI on Mac</title>
      <link>https://dimasmaulana.pages.dev/posts/software/local-chat-ai-on-mac/</link>
      <pubDate>Wed, 19 Jun 2024 13:32:37 +0700</pubDate>
      <guid>https://dimasmaulana.pages.dev/posts/software/local-chat-ai-on-mac/</guid>
      <category domain="https://dimasmaulana.pages.dev/categories/software/">Software</category>
      <description>&lt;p&gt;In this post, we&amp;rsquo;ll dive into the installation and usage of Ollama, a local chat AI that runs on your Mac.&lt;/p&gt;&#xA;&lt;h2 class=&#34;heading-element&#34; id=&#34;installing-ollama&#34;&gt;&lt;span&gt;Installing Ollama&lt;/span&gt;&#xA;  &lt;a href=&#34;#installing-ollama&#34; class=&#34;heading-mark&#34;&gt;&#xA;    &lt;svg class=&#34;octicon octicon-link&#34; viewBox=&#34;0 0 16 16&#34; version=&#34;1.1&#34; width=&#34;16&#34; height=&#34;16&#34; aria-hidden=&#34;true&#34;&gt;&lt;path d=&#34;m7.775 3.275 1.25-1.25a3.5 3.5 0 1 1 4.95 4.95l-2.5 2.5a3.5 3.5 0 0 1-4.95 0 .751.751 0 0 1 .018-1.042.751.751 0 0 1 1.042-.018 1.998 1.998 0 0 0 2.83 0l2.5-2.5a2.002 2.002 0 0 0-2.83-2.83l-1.25 1.25a.751.751 0 0 1-1.042-.018.751.751 0 0 1-.018-1.042Zm-4.69 9.64a1.998 1.998 0 0 0 2.83 0l1.25-1.25a.751.751 0 0 1 1.042.018.751.751 0 0 1 .018 1.042l-1.25 1.25a3.5 3.5 0 1 1-4.95-4.95l2.5-2.5a3.5 3.5 0 0 1 4.95 0 .751.751 0 0 1-.018 1.042.751.751 0 0 1-1.042.018 1.998 1.998 0 0 0-2.83 0l-2.5 2.5a1.998 1.998 0 0 0 0 2.83Z&#34;&gt;&lt;/path&gt;&lt;/svg&gt;&#xA;  &lt;/a&gt;&#xA;&lt;/h2&gt;&lt;pre&gt;&lt;code&gt;brew install ollama&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Once installed, you can pull down a pre-trained model (in this case, we&amp;rsquo;ll use the &amp;ldquo;llama3&amp;rdquo; model):&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
