Thursday, February 5, 2026

Building Websites in 2026: From $2000 Custom Jobs to AI-Generated Apps in Minutes

Remember the early 2000s? Building a website for a company or organization would cost around $2,000 and require weeks of coding with HTML, JavaScript, and PHP/MySQL. Fast forward to today, and the landscape has completely transformed. Thanks to automation and AI, anyone can build a functional website within minutes.

The Evolution of Web Development

The core challenge of building websites hasn't changed—it's still about control, flexibility, and maintenance. What has changed dramatically are the tools and approaches available to us. Let me break down your options:

1. Code It Yourself

For developers who want maximum control, powerful frameworks like Bootstrap combined with HTML5 and JavaScript remain excellent choices. This approach gives you complete flexibility but requires coding knowledge.

2. Use a CMS Platform

If you don't want to write code, Content Management Systems (CMS) are your friend. Popular options include:

  • WordPress - The most widely used CMS
  • Drupal - Great for complex sites
  • Joomla - A solid middle-ground option
  • Wix - A hosted website builder, the most beginner-friendly option

3. Let AI Build It For You

This is where things get really interesting. AI-powered tools like Google's AI Studio can now generate complete websites based on simple prompts.

My Experiment with Google AI Studio

I decided to put AI website generation to the test. Here's what happened when I asked AI to revamp my existing site; the result now lives at https://www.yanaihome.com/aiStudio/.

The Prompt

"I have an old website at https://www.yanaihome.com/. Can you revamp it with a mobile-friendly version?"


The Results

Within minutes, AI Studio generated a fully functional site. But it didn't stop there. I kept requesting additional features:

  • Blog section with admin login
  • Google AdSense integration
  • Various improvements during testing

The AI updated the code accordingly with each prompt. The default technology stack it chose was React, which is a modern and powerful choice.

Getting It Running Locally

Once I was satisfied with the initial build, I downloaded the project as a zip file. To run it locally, I needed to set up the development environment:

Prerequisites

  • Node.js (version 18 or higher recommended)
  • NPM (Node Package Manager)
  • Vite (comes with the project—this is the build tool that makes everything lightning fast)

You can verify your setup by checking your Node.js and NPM versions:

node -v
npm -v

If needed, download the latest version at https://nodejs.org/en/download.

Running the Project

npm install        # Install dependencies
npm run dev        # Start the development server
npm run build      # Create a production build in the dist folder

Since I use VS Code for development, I asked the AI to generate the appropriate .vscode configuration files. Now I can simply hit F5 to run and preview changes instantly.


Deployment: The Old Way vs. The Modern Way

The Old Way (What I'm Currently Doing)

I'm using HostPapa, a traditional hosting service with cPanel and FTP access. This means:

  1. Build the project locally (npm run build)
  2. Upload the contents of the dist folder via FTP
  3. Repeat every time I make changes

It works, but it's tedious and outdated.
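For the curious, here's a minimal sketch of what scripting that upload could look like with Python's built-in ftplib. The host, credentials, and paths below are placeholders, not my real setup:

import os
from ftplib import FTP, error_perm

HOST = "ftp.example-host.com"   # placeholder hosting details
USER = "username"
PASSWORD = "password"
LOCAL_DIST = "dist"             # Vite's production build output
REMOTE_ROOT = "/public_html"    # typical cPanel web root

def upload_dir(ftp, local_dir, remote_dir):
    """Recursively mirror local_dir onto the server at remote_dir."""
    try:
        ftp.mkd(remote_dir)     # create the directory if it doesn't exist
    except error_perm:
        pass                    # already exists
    ftp.cwd(remote_dir)
    for name in os.listdir(local_dir):
        local_path = os.path.join(local_dir, name)
        if os.path.isdir(local_path):
            upload_dir(ftp, local_path, name)
            ftp.cwd("..")       # step back out after the recursive call
        else:
            with open(local_path, "rb") as fh:
                ftp.storbinary(f"STOR {name}", fh)

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    upload_dir(ftp, LOCAL_DIST, REMOTE_ROOT)

Even scripted, though, this is still push-from-my-laptop deployment, which is exactly the step the modern workflow below eliminates.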


The Modern Way (CI/CD)

Modern frontend engineers use Continuous Integration/Continuous Deployment (CI/CD) platforms like:

  • Vercel
  • Netlify
  • Cloudflare Pages

Here's the streamlined workflow:

  1. Make changes in VS Code
  2. Commit and push code to GitHub
  3. The hosting platform automatically detects the push, runs npm run build on their servers, and updates your website in seconds

No manual uploads. No FTP. Just pure automation.

My Next Steps

Check out the live site at https://www.yanaihome.com/aiStudio/ to see the AI-generated result!


Here's what I'm planning next:

  1. Stop manual uploads - Create a free account on Vercel.com
  2. Connect GitHub - Put this code in a GitHub repository
  3. Deploy automatically - Link the repo to Vercel so that whenever I save and push code in VS Code, my site updates automatically
  4. Optimize and iterate - Continue refining the site with AI assistance

Understanding Vite

You might be wondering, "What's Vite?" Think of it as the super-fast engine behind your modern website: during development it serves your code with near-instant hot reloading, and for production it bundles everything (using Rollup under the hood) into the optimized dist folder you deploy. It's one of the reasons why modern web development feels so much faster than it did in the early 2000s.

The Bottom Line

We've come a long way from spending weeks and thousands of dollars on basic websites. AI-powered tools have democratized web development, making it accessible to everyone regardless of technical skill. Whether you choose to code from scratch, use a CMS, or leverage AI generation, the barriers to getting online have never been lower.

The future of web development isn't just about writing code—it's about knowing how to direct AI tools, understanding deployment workflows, and choosing the right approach for your needs. Welcome to 2026, where your biggest challenge isn't building a website—it's deciding which of the many excellent options to use.

Afterthought: AI is still an infant, still immature... it has a long way to go, and I don't think it will replace software developers within my lifetime. For now, I just play with it for fun.

Tuesday, February 3, 2026

Cloud Migration Opportunity at University

From Database Developer to Data Engineer

After more than two decades as a database and application developer at my university, I find myself at an exciting crossroads. Our institution is embarking on a significant digital transformation, gradually migrating from PeopleSoft to a modern Microsoft Azure-based data warehouse and data lake solution. For someone like me, with over 20 years of database experience, this isn't just an institutional change—it's a golden opportunity to evolve my career into data engineering.

Why This Transition Makes Sense

The shift from traditional database development to data engineering might seem daunting, but the reality is quite different. My two decades of experience aren't becoming obsolete; they're becoming the foundation for something bigger. Data engineering is essentially database work at scale, dealing with the same fundamental concepts I've worked with for years—data modeling, SQL optimization, ETL processes, data integrity, and application integration—just applied in new, cloud-native ways.

What makes this particularly exciting is the timing. Being present during an active migration to Azure means I can learn by doing, gaining hands-on experience with cutting-edge technology while applying my deep institutional knowledge. I understand our institution's data landscape, the quirks of our systems, and how different departments interact with data. This domain knowledge is incredibly valuable and nearly impossible for an outsider to replicate quickly.

Understanding the Technology Shift

The move from PeopleSoft to Microsoft's cloud ecosystem represents a fundamental architectural change. Instead of monolithic on-premises systems, we're moving toward a distributed, cloud-native architecture built on Azure's data platform services.

At the heart of this transition are two key components: the data lake and the data warehouse. The data lake (likely Azure Data Lake Storage) will serve as our repository for raw data extracted from PeopleSoft and other sources, stored in its native format. Meanwhile, the data warehouse (probably Azure Synapse Analytics) will contain structured, cleaned, and organized data ready for reporting and analytics. Together, these form the "single source of truth" that will gradually replace PeopleSoft's role.

The strategy is deliberate and gradual. Rather than a risky "big bang" migration, we'll keep PeopleSoft running while slowly moving functionality to modern cloud applications. Departments across campus will access their data through Power BI dashboards, built on top of this new infrastructure. It's a smart approach that minimizes disruption while modernizing our technology stack.

What's Different in Data Engineering

While my database background provides an excellent foundation, data engineering does involve some new concepts and skills. The scale is different—we're talking about massive datasets across distributed systems rather than single databases. The variety of data types expands beyond structured database tables to include semi-structured formats like JSON and XML, and even unstructured data like documents and logs.

The toolset is also evolving. Instead of on-premises servers, I'll be working with cloud-native services like Azure Data Factory for building data pipelines, Azure Databricks for large-scale data processing, and Azure Synapse Analytics for our data warehouse. While SQL remains crucial, Python becomes increasingly important, particularly with libraries like pandas and PySpark for data processing. Pipeline orchestration—building automated data workflows that run on schedules—becomes a core part of the job.
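To make that concrete, here's a tiny pandas sketch of the kind of cleanup step a pipeline activity might run. The file name and columns (emplid, enrl_date) are invented for illustration, not our actual schema:

import pandas as pd

# Hypothetical extract: a CSV dump of a PeopleSoft enrollment table.
raw = pd.read_csv("ps_stdnt_enrl.csv")

# Typical pipeline cleanup: normalize column names, parse dates,
# drop rows missing the key, and de-duplicate.
raw.columns = [c.strip().lower() for c in raw.columns]
raw["enrl_date"] = pd.to_datetime(raw["enrl_date"], errors="coerce")
clean = raw.dropna(subset=["emplid"]).drop_duplicates()

# Land the result as Parquet (needs pyarrow), ready for the warehouse load.
clean.to_parquet("ps_stdnt_enrl.parquet", index=False)

In production, a tool like Azure Data Factory would schedule and orchestrate steps like this rather than someone running them by hand.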

My Learning Roadmap

I've developed a practical learning path that balances formal education with hands-on experience. 

In the immediate term (next 1-3 months), I plan to earn the Microsoft Azure Fundamentals (AZ-900) certification. With my background, this should be straightforward and will give me the big picture of Azure's ecosystem. I'll also start experimenting with Azure Data Factory through Microsoft's free learning platform and, if possible, get some hands-on time in Azure's portal.

For the short term (3-6 months), my focus shifts to the Azure Data Engineer Associate (DP-203) certification—this is the key credential for my target role. I'll deepen my Python skills, particularly focusing on pandas and PySpark, and study data lake architecture concepts like the medallion architecture (raw/bronze, curated/silver, and processed/gold zones). These patterns are becoming industry standards for organizing data lakes.
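As a rough illustration of those zones, here's a PySpark sketch promoting a hypothetical PeopleSoft extract from bronze to silver. The storage paths and columns are made up, and the gold zone (reporting aggregates) would follow the same pattern:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Hypothetical lake layout: one container per zone.
BRONZE = "abfss://bronze@mydatalake.dfs.core.windows.net/peoplesoft/ps_students/"
SILVER = "abfss://silver@mydatalake.dfs.core.windows.net/students/"

# Bronze: raw extracts land as-is, tagged with an ingestion timestamp.
bronze = (spark.read.option("header", True).csv(BRONZE)
          .withColumn("_ingested_at", F.current_timestamp()))

# Silver: validated and conformed -- trimmed keys, no nulls, no duplicates.
silver = (bronze
          .withColumn("emplid", F.trim(F.col("emplid")))
          .filter(F.col("emplid").isNotNull())
          .dropDuplicates(["emplid"]))

silver.write.mode("overwrite").parquet(SILVER)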

Over the medium term (6-12 months), I'll work on real projects in the PeopleSoft migration, learn Azure Databricks and Apache Spark for distributed data processing, and study modern data warehouse modeling techniques like star schemas and slowly changing dimensions. The goal is to become proficient enough to take on significant responsibilities in our migration project.
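Slowly changing dimensions are easier to see in code than in prose. Here's a toy pandas sketch of Type 2 logic (expire the old row, append a new version) with an invented customer table; a real warehouse would do this with a MERGE statement, but the idea is the same:

import pandas as pd

# Toy dimension table: one row per customer *version*, with validity dates.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Edmonton", "Calgary"],
    "valid_from": pd.to_datetime(["2020-01-01", "2020-01-01"]),
    "valid_to": pd.to_datetime(["9999-12-31", "9999-12-31"]),
    "is_current": [True, True],
})

# Incoming snapshot from the source system: customer 1 has moved.
src = pd.DataFrame({"customer_id": [1], "city": ["Toronto"]})
today = pd.Timestamp("2026-02-03")

# 1. Find current rows whose tracked attribute changed.
cur = dim[dim["is_current"]].merge(src, on="customer_id", suffixes=("_old", "_new"))
changed = cur[cur["city_old"] != cur["city_new"]]

# 2. Expire the old versions instead of overwriting them.
expire = dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"]
dim.loc[expire, "valid_to"] = today
dim.loc[expire, "is_current"] = False

# 3. Append the new versions; the full history is preserved.
new_rows = changed[["customer_id", "city_new"]].rename(columns={"city_new": "city"})
new_rows = new_rows.assign(valid_from=today,
                           valid_to=pd.Timestamp("9999-12-31"),
                           is_current=True)
dim = pd.concat([dim, new_rows], ignore_index=True)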

Leveraging My Experience

What excites me most is how my existing knowledge translates into immediate value. I know PeopleSoft's data structures intimately—where the critical data lives, which tables are clean versus messy, and where the proverbial "bodies are buried." This knowledge is gold during a migration. I can bridge the gap between our legacy systems and the new architecture, helping ensure that nothing important gets lost in translation.

I'm actively seeking ways to get involved in the migration project: expressing my interest to management, volunteering for pilot projects, attending internal training sessions, and connecting with the team leading the Azure implementation. My goal is to position myself as someone who understands both the old and new worlds.

The Broader Context

This transition isn't happening in a vacuum. Azure skills are in high demand across Canada and beyond. Universities, government agencies, healthcare organizations, and various industry sectors are all moving to the cloud, with many choosing Microsoft due to existing relationships and integration needs.

For our institution specifically, this migration makes strategic sense. Our existing Microsoft infrastructure—Active Directory, Office 365, Teams, SharePoint—integrates seamlessly with Azure, simplifying authentication and data flows. Microsoft's strong hybrid cloud capabilities also mean we can maintain some on-premises systems while gradually moving to the cloud, reducing risk during the transition.

Looking Ahead

What's remarkable about this career evolution is that I'm not starting from scratch—I'm building on a solid foundation. My 20 years of database experience, combined with deep institutional knowledge of our systems, positions me uniquely for this transition. The cloud migration isn't making my skills obsolete; it's creating an opportunity to apply them in more powerful and modern ways.

The next few years will be transformative, both for my institution and for my career. As we move from PeopleSoft to a modern, cloud-based data platform, I have the chance to be at the forefront of that change, helping shape how our organization manages and leverages data in the decades to come.

For anyone else in a similar position—a database professional watching their organization move to the cloud—my advice is simple: embrace the change. Your existing skills are more valuable than you might think. The fundamental concepts haven't changed; we're just applying them at a larger scale with more powerful tools. The best time to start learning these new technologies is now, while the migration is still in progress and opportunities for hands-on learning abound.

The future of data is in the cloud, and for those of us willing to grow with it, the opportunities are boundless.

Tuesday, January 20, 2026

Google is Moving!

During the year-end break of 2025, I finally gave myself permission to slow down a little—and as expected for someone who enjoys technology, that “slow down” turned into time spent exploring Google AI tools. What started as casual curiosity quickly became hands-on experimentation, personal projects, and even deeper thoughts about career direction. This post is a reflection on that journey, what I learned, and where it might lead.

1. Google Gemini: A Name That Stuck with Me

Let’s start with Google Gemini.

First of all, the name itself is genuinely cool. “Gemini” implies duality, intelligence, adaptability, and multiple perspectives—qualities that fit perfectly with what modern AI is trying to achieve. On a more personal note, Gemini also happens to be my wife’s astrological sign, which instantly made the name feel familiar and warm rather than abstract or corporate. 😊

From a usage perspective, Gemini feels like Google’s answer to a new generation of AI interactions. It’s not just about answering questions—it’s about reasoning, summarizing, coding, brainstorming, and creating. I found myself using it in three main ways:

  • Drafting and refining text
  • Exploring technical ideas quickly
  • Acting as a thinking partner rather than just a search engine

What impressed me most is how natural it feels to collaborate with Gemini. It doesn’t replace thinking; instead, it accelerates it. That distinction matters.



2. Google AI Studio: Learning by Building

The most fun part of my exploration was Google AI Studio.

I played with the free version, and honestly, it was more powerful than I expected. Rather than just reading documentation, I jumped straight into building small but real projects. This “learn by doing” approach made the experience much more meaningful.

Here are two projects I created during that time:

🔹 Personal Website Revamp

https://yanaihome.com/aiStudio/

I used AI Studio to rethink and rework parts of my personal website. Instead of starting from scratch, I experimented with layouts, content structure, and interaction ideas. AI accelerated the brainstorming phase and helped me move faster from idea to execution.

🔹 Maple Leaf ETF Tracker (For Fun)

https://yanaihome.com/MapleLeafETF/

This was more of a playful experiment—tracking ETFs related to Canadian themes. It wasn’t meant to be production-grade software, but it helped me connect AI tooling with financial data concepts, UI generation, and backend logic. Even “just for fun” projects can teach real skills if you approach them seriously.

The biggest takeaway from AI Studio is this: AI lowers the barrier between an idea and a working prototype. That’s incredibly empowering.


3. Google Developer Platform and Skill Growth 

Another pleasant surprise was Google Developer resources, especially skills.google/

The platform is clean, structured, and encouraging. Instead of overwhelming users, it offers guided learning paths that feel achievable—even for someone balancing a full-time job. As I explored it, something unexpected happened: I started thinking seriously about career expansion, not just skill upgrades.



4. Career "Pivot" Strategy: From Backend to Data Engineering

Having worked at the University of Alberta for nearly 20 years, I’ve accumulated something that’s hard to replicate: institutional knowledge. I know how legacy systems work, where the pain points are, and—figuratively speaking—where the “bodies are buried.”

That knowledge is not a liability in the age of AI; it’s an asset.

Rather than a radical career change, what makes sense for me is a pivot—leveraging my backend experience while moving toward data engineering and cloud platforms.

Here’s how I now see the transition:

The Tech Shift: How to Move Forward

Current Skill → New Skill → Learning Path

  • SQL Queries / Stored Procedures → BigQuery / Snowflake
    Learn Analytical SQL: window functions, CTEs, and query optimization at scale (see the sketch after this list).

  • Database Administration → Cloud Architecture
    Pursue a GCP Associate Cloud Engineer certification to understand infrastructure, security, and scalability.

  • Manual Data Cleaning → Python (Pandas / PySpark)
    Python is the glue of the cloud—connecting data, APIs, transformation logic, and automation.

  • Fixed Schemas → Data Lakes (Parquet, NoSQL)
    Learn how to store and process data before the schema is fully defined—a key modern data concept.
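To ground the first and last items on that list, here's a Python sketch: it runs an analytical query (a CTE plus a window function) through the google-cloud-bigquery client, pulls the result into pandas, and lands it as Parquet. The project, dataset, and columns are all made up for illustration:

from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # picks up your default GCP credentials

# A CTE plus a window function: rank each student's terms by GPA.
# `my-project.student_mart.term_gpa` and its columns are hypothetical.
sql = """
WITH term_gpa AS (
    SELECT emplid, term, gpa
    FROM `my-project.student_mart.term_gpa`
)
SELECT emplid, term, gpa,
       RANK() OVER (PARTITION BY emplid ORDER BY gpa DESC) AS gpa_rank
FROM term_gpa
"""

# Python as glue: query -> DataFrame -> Parquet in the data lake.
df = client.query(sql).to_dataframe()
df.to_parquet("gpa_ranks.parquet", index=False)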

This isn’t abandoning what I know; it’s extending it into environments where AI and data naturally live.


Final Thoughts

What began as playful experimentation during a holiday break turned into real inspiration. Google AI tools—Gemini, AI Studio, and Developer learning paths—did more than showcase technology. They reminded me that learning doesn’t stop with seniority, and careers don’t have to follow a straight line.

Sometimes, all it takes is curiosity, a bit of free time, and the willingness to build something—even “just for fun.”

2026 suddenly feels a lot more interesting.