merledu/genify

Genify

A Comprehensive Course on Artificial Intelligence & Chip Design

Genify is a structured, end-to-end learning program that bridges Artificial Intelligence and Computer Architecture, with a strong focus on Machine Learning, Deep Learning, the RISC-V architecture, and hardware design using hardware description languages (HDLs).


📘 Course Curriculum

Lecture 01 – Foundations

Topics Covered:

  • Genify Overview
  • Python Basics
  • Digital Logic Design
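To give a flavor of how the Python and digital-logic threads meet, here is a minimal sketch (function name is hypothetical, chosen for this example) of a half adder expressed with Python's bitwise operators:

```python
# Illustrative sketch: a digital-logic primitive written with Python basics.
# A half adder produces sum = a XOR b and carry = a AND b.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for two 1-bit inputs."""
    return a ^ b, a & b

# Print the full truth table using a basic loop and f-strings
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"a={a} b={b} -> sum={s} carry={c}")
```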

Lecture 02 – Programming & Architecture

Topics Covered:

  • Object-Oriented Programming (OOP) in Python
  • RISC-V Instruction Set Architecture (ISA)
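The two topics above combine naturally: the sketch below (class and method names are hypothetical) models the RISC-V integer register file as a Python class, including the ISA's rule that register x0 is hard-wired to zero.

```python
# Illustrative sketch: an OOP model of the RISC-V integer register file.

class RegisterFile:
    """32 integer registers; x0 always reads as zero, as the ISA requires."""

    def __init__(self):
        self._regs = [0] * 32

    def read(self, idx: int) -> int:
        return self._regs[idx]

    def write(self, idx: int, value: int) -> None:
        if idx != 0:                              # writes to x0 are discarded
            self._regs[idx] = value & 0xFFFFFFFF  # keep values 32 bits wide

rf = RegisterFile()
rf.write(5, 42)
rf.write(0, 99)                # silently ignored
print(rf.read(5), rf.read(0))  # 42 0
```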

Lecture 03 – Machine Learning Basics

Topics Covered:

  • Data Cleaning
  • Feature Engineering
  • Data Preprocessing
  • Types of Machine Learning
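As a taste of the preprocessing material, here is a sketch of min-max scaling in pure Python (the function name is hypothetical; real projects typically reach for scikit-learn's `MinMaxScaler`):

```python
# Illustrative sketch of a common preprocessing step: min-max scaling to [0, 1].

def min_max_scale(values):
    lo, hi = min(values), max(values)
    if hi == lo:                       # avoid division by zero on constant columns
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_scale([10, 20, 30]))   # [0.0, 0.5, 1.0]
```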

Lecture 04 – Supervised Learning

Topics Covered:

  • Machine Learning Fundamentals (recap)
  • Supervised Machine Learning
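Supervised learning at its simplest: fitting a line y = w·x + b to labeled data by ordinary least squares. The closed-form sketch below (names hypothetical) uses one feature and pure Python:

```python
# Illustrative sketch: one-feature linear regression via the least-squares
# closed form, w = cov(x, y) / var(x), b = mean(y) - w * mean(x).

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - w * mx
    return w, b

w, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])   # data generated by y = 2x + 1
print(w, b)  # 2.0 1.0
```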

Lecture 05 – Unsupervised & Reinforcement Learning

Topics Covered:

  • Unsupervised Machine Learning
  • Reinforcement Learning
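For the unsupervised side, this sketch (function name hypothetical) runs one-dimensional k-means with k = 2: assign each point to its nearest center, move each center to its cluster's mean, repeat.

```python
# Illustrative sketch: 1-D k-means clustering with two centers.

def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            # assignment step: nearest center wins
            i = 0 if abs(p - centers[0]) <= abs(p - centers[1]) else 1
            clusters[i].append(p)
        # update step: move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

print(kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 5.0]))  # centers near 1 and 9
```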

Lecture 06 – Neural Networks & Architecture Review

Topics Covered:

  • Artificial Neural Networks (ANNs)
  • RISC-V Revision

Lecture 07 – RISC-V Core Design

Topics Covered:

  • RISC-V Single-Cycle Core Deep Dive
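A single-cycle core's decoder slices fixed bit fields out of each 32-bit instruction word. The sketch below (helper name hypothetical) does the same slicing in Python for an R-type instruction:

```python
# Illustrative sketch: decoding the fields of a 32-bit R-type RISC-V instruction.

def decode_r_type(word: int) -> dict:
    return {
        "opcode": word & 0x7F,          # bits 6:0
        "rd":     (word >> 7)  & 0x1F,  # bits 11:7
        "funct3": (word >> 12) & 0x7,   # bits 14:12
        "rs1":    (word >> 15) & 0x1F,  # bits 19:15
        "rs2":    (word >> 20) & 0x1F,  # bits 24:20
        "funct7": (word >> 25) & 0x7F,  # bits 31:25
    }

# 0x00B50533 encodes `add a0, a0, a1` (a0 = x10, a1 = x11)
print(decode_r_type(0x00B50533))
```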

Lecture 08 – Artificial Neural Networks

Topics Covered:

  • Artificial Neural Networks (ANNs)
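The building block of every ANN is a single neuron: a weighted sum plus a bias, passed through an activation. A minimal sketch (names hypothetical) with a sigmoid activation:

```python
import math

# Illustrative sketch: the forward pass of one artificial neuron.

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Weighted sum is 1.0*0.5 + 2.0*(-0.25) + 0.0 = 0.0, so the output is sigmoid(0) = 0.5
print(neuron([1.0, 2.0], [0.5, -0.25], 0.0))  # 0.5
```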

Lecture 09 – Convolutional Neural Networks (CNNs)

Topics Covered:

  • Convolutional Neural Networks (CNNs)
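The core CNN operation is a kernel sliding over an image. This pure-Python sketch (names hypothetical) implements a "valid" 2-D convolution, computed as cross-correlation the way deep-learning frameworks do:

```python
# Illustrative sketch: "valid" 2-D convolution of an image with a small kernel.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # sum of elementwise products over the kernel window
            out[i][j] = sum(image[i + u][j + v] * kernel[u][v]
                            for u in range(kh) for v in range(kw))
    return out

edge = [[1, -1], [1, -1]]        # crude vertical-edge detector
img = [[0, 0, 5, 5],
       [0, 0, 5, 5],
       [0, 0, 5, 5]]
print(conv2d(img, edge))         # strong response where 0s meet 5s
```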

Lecture 10 – CNNs Continued

Topics Covered:

  • Advanced CNN Concepts

Lecture 11 – Hardware Description Languages

Topics Covered:

  • Introduction to HDLs
  • Chisel HDL
  • Verilog HDL

Lecture 12 – Recurrent Neural Networks

Topics Covered:

  • Recurrent Neural Networks (RNNs)
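What makes an RNN recurrent is that each step's hidden state feeds into the next. A scalar-state sketch (names and weight values hypothetical):

```python
import math

# Illustrative sketch: one recurrent unit, h_t = tanh(w_x * x_t + w_h * h_{t-1}),
# applied across a sequence with a scalar hidden state.

def rnn(sequence, w_x=0.5, w_h=0.8, h=0.0):
    for x in sequence:
        h = math.tanh(w_x * x + w_h * h)   # new state depends on the old state
    return h

print(rnn([1.0, 0.0, -1.0]))
```

tanh keeps the state bounded in (-1, 1), which is one reason plain RNNs struggle with long sequences and motivate the LSTMs covered later.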

Lecture 13 – Advanced RNNs & HDLs

Topics Covered:

  • Long Short-Term Memory (LSTM) Networks
  • HDLs Continued
    • Chisel
    • SystemVerilog

Lecture 14 – Transformers (Part I)

Topics Covered:

  • Transformers Overview
  • Self-Attention Mechanism
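Self-attention is softmax(QKᵀ/√d)·V: each query scores every key, and the scores weight the values. A pure-Python sketch on toy 2-d vectors (all names and inputs are made up for illustration):

```python
import math

# Illustrative sketch: scaled dot-product self-attention on Python lists.

def softmax(xs):
    m = max(xs)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)        # attention weights sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

Q = K = V = [[1.0, 0.0], [0.0, 1.0]]     # self-attention: Q, K, V share inputs
print(attention(Q, K, V))
```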

Lecture 15 – Transformers (Part II)

Topics Covered:

  • Deep Dive into Self-Attention

Lecture 16 – Transformer Architecture

Topics Covered:

  • Multi-Head Attention
  • Positional Encoding
  • Encoder Architecture
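Positional encoding gives the otherwise order-blind attention layers a sense of token position. A sketch of the sinusoidal scheme, PE(pos, 2i) = sin(pos/10000^(2i/d)) and PE(pos, 2i+1) = cos(pos/10000^(2i/d)) (function name hypothetical):

```python
import math

# Illustrative sketch: sinusoidal positional encoding for one position.

def positional_encoding(pos: int, d_model: int):
    pe = []
    for i in range(d_model):
        # paired dimensions (2i, 2i+1) share the same frequency
        angle = pos / (10000 ** ((i // 2 * 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

print(positional_encoding(0, 4))   # [0.0, 1.0, 0.0, 1.0]
```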

Lecture 17 – Transformers (Decoder)

Topics Covered:

  • Transformers Recap
  • Decoder Architecture

Lecture 18 – Large Language Models

Topics Covered:

  • LLM Inferencing
  • Prompt Engineering
  • Vector Databases

Lecture 19 – LLM Ecosystem

Topics Covered:

  • Tokenization
  • Vector Databases
  • Semantic Search
  • LLM Ecosystem Overview
  • LangChain
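Semantic search over a vector database reduces to ranking stored embeddings by similarity to a query embedding. The sketch below uses a toy in-memory dict and cosine similarity; the 3-d "embeddings" and all names are made up for illustration:

```python
import math

# Illustrative sketch: top-k semantic search over a toy in-memory vector store.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a))
                  * math.sqrt(sum(y * y for y in b)))

docs = {
    "riscv":        [0.9, 0.1, 0.0],
    "transformers": [0.1, 0.9, 0.2],
    "verilog":      [0.8, 0.2, 0.1],
}

def search(query_vec, k=2):
    ranked = sorted(docs, key=lambda name: cosine(docs[name], query_vec),
                    reverse=True)
    return ranked[:k]

print(search([1.0, 0.0, 0.0]))  # hardware-leaning query ranks riscv, verilog first
```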

Lecture 20 – Retrieval Augmented Generation

Topics Covered:

  • Retrieval Augmented Generation (RAG)
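The RAG pattern is: retrieve passages relevant to the question, then place them in the prompt sent to the LLM. This sketch uses naive keyword-overlap retrieval over a three-sentence corpus; the corpus, function names, and prompt template are all hypothetical, and a real system would call an embedding model and an LLM API.

```python
# Illustrative sketch of Retrieval Augmented Generation: retrieve, then prompt.

corpus = [
    "RISC-V is an open instruction set architecture.",
    "Chisel embeds hardware construction in Scala.",
    "Transformers rely on self-attention.",
]

def retrieve(question: str, k: int = 1):
    words = set(question.lower().split())
    # rank documents by how many words they share with the question
    return sorted(corpus,
                  key=lambda doc: len(words & set(doc.lower().split())),
                  reverse=True)[:k]

def build_prompt(question: str) -> str:
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

print(build_prompt("What is RISC-V?"))
```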

Lecture Extra – AI Agents

Topics Covered:

  • AI Agents
  • AI Agent Frameworks
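At its core, an agent loops: the model chooses a tool, the runtime executes it, and the observation feeds back. In the sketch below a rule-based `decide` stands in for a real LLM, and the tools, arguments, and names are all hypothetical:

```python
# Illustrative sketch: the decide -> execute -> observe core of an AI agent.

TOOLS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
}

def decide(goal: str):
    # A real agent would ask an LLM to pick a tool; a string rule does it here.
    op = "mul" if "product" in goal else "add"
    return op, (6, 7)

def run_agent(goal: str):
    tool, args = decide(goal)
    observation = TOOLS[tool](*args)     # execute the chosen tool
    return f"{tool}{args} -> {observation}"

print(run_agent("compute the product of 6 and 7"))  # mul(6, 7) -> 42
```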

🛠 Technologies Covered

  • Python
  • Machine Learning & Deep Learning
  • RISC-V Architecture
  • Verilog & SystemVerilog
  • Chisel HDL
  • Transformers & LLMs
  • Vector Databases & RAG

📌 License

This project is intended for educational purposes.
