
the don

Published on • 🕑2 min


Role of AI in Mass Censorship

The rise of GenAI tools such as ChatGPT and Gemini has ushered in a new era for humanity, one that many have predicted will carry catastrophic consequences. Yet while people worry about AI enslaving us, they often overlook one major problem: mass censorship!

Search engines like Google and Bing are integrating AI, promising a smoother, more relevant online experience. But who decides what's "relevant"? The corporations behind these platforms have a troubling history of cooperating with governments to restrict information access.

Take the recent Gemini catastrophe. The launch of Gemini in February 2024 proved this point: Google issued a public apology after critics such as Elon Musk lambasted Gemini for generating mixed-race images of America's founding fathers.

While this incident might seem isolated, it shows how readily AI could be weaponized for mass censorship. Efficient as AI is, it could be used to filter search results based on undisclosed criteria, quietly hiding dissenting ideas.

While this might sound paranoid, these things are already happening in nations such as China and South Korea. Conditions for internet users in these nations remain oppressive, with users facing criminal penalties for accessing or sharing certain information. This is a severe violation of freedom of speech.

Imagine what such governments could achieve with AI censorship. They would not only be able to filter information and spoon-feed the public, but could also decide what you see and hear.

Therefore, as AI advances, we must demand transparency from search engines about how AI algorithms rank information. We need strong legal safeguards against government influence on online searches.
Governments are currently rushing to regulate AI, but such regulations may only empower their censorship capabilities. And as GenAI becomes ubiquitous in everyday technology, it is not a given that search, word processing, and email will continue to leave humans fully in control.

The prospects are frightening.

Imagine a world where your word processor prevents you from analyzing, criticizing, lauding, or reporting on a topic deemed “harmful” by an AI programmed to only process ideas that are “respectful and appropriate for all.”  

 


the don

Published on • 🕑3 min


Data Fetching and Caching in NextJS

In the realm of web development, data fetching stands tall as a fundamental aspect, akin to the very fabric that weaves together intricate applications. In this journey through the corridors of React and Next.js, let’s delve into the art of fetching, caching, and revalidating data, unraveling the mysteries and unveiling the potentials that lie within.

Fetching Data on the Server with fetch

The saga begins with the native fetch API, a trusty companion in the quest for data. Next.js, with its adept prowess, extends fetch to the server, empowering developers to configure caching and revalidating behaviors with finesse. Whether it’s within Server Components, Route Handlers, or Server Actions, the landscape is ripe for exploration.

async function fetchData() {
  try {
    const res = await fetch('https://api.example.com/...')
    if (!res.ok) {
      throw new Error('Failed to fetch data')
    }
    return res.json()
  } catch (error) {
    console.error('Error fetching data:', error)
    throw error // re-throw so callers are not silently handed undefined
  }
}

Caching: The Guardian of Data

As data flows into our applications, caching emerges as a stalwart guardian, ensuring swift access and reducing the burden on our sources. Next.js, ever vigilant, automatically caches fetch responses on the server, unleashing the power of the Data Cache.
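The idea behind the Data Cache can be sketched in plain JavaScript: a keyed store that returns a saved result when one exists and only calls the fetcher on a miss. This is an illustration of the concept, not Next.js's actual implementation; `fetchFn` is a hypothetical fetcher you supply.

```javascript
// Minimal sketch of a data cache: results are stored by key, and the
// underlying fetcher only runs on a cache miss (akin to 'force-cache').
function createDataCache(fetchFn) {
  const store = new Map()
  return async function cachedFetch(key) {
    if (store.has(key)) return store.get(key) // cache hit: no network call
    const data = await fetchFn(key)           // cache miss: fetch and save
    store.set(key, data)
    return data
  }
}
```

Calling the wrapper twice with the same key runs the fetcher only once; every later call is served from the store.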

// Cache the response in the Data Cache (explicitly opting in)
fetch('https://...', { cache: 'force-cache' })

Revalidating Data: Keeping It Fresh

Yet, in the ever-changing landscape of data, staleness looms as a lurking shadow. Fear not, for revalidation comes to our aid, breathing life into our cache and ensuring our data remains as fresh as the morning dew. With Next.js, we wield the power to revalidate data based on time or demand, sculpting experiences that stand the test of time.

// Revalidate at most once every hour (3600 seconds)
fetch('https://...', { next: { revalidate: 3600 } })
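Time-based revalidation amounts to stamping each cache entry and refetching once the entry is older than the revalidation window. A rough plain-JavaScript sketch of that logic follows; it is illustrative only, with `fetchFn` as an assumed fetcher and an injectable clock so expiry can be simulated.

```javascript
// Sketch of time-based revalidation: a cached entry is served until it is
// older than `revalidateMs`, after which the fetcher runs again.
function createRevalidatingCache(fetchFn, revalidateMs, now = Date.now) {
  const store = new Map() // key -> { data, fetchedAt }
  return async function get(key) {
    const entry = store.get(key)
    if (entry && now() - entry.fetchedAt < revalidateMs) {
      return entry.data // entry is still fresh
    }
    const data = await fetchFn(key) // stale or missing: refetch
    store.set(key, { data, fetchedAt: now() })
    return data
  }
}
```

The injectable `now` parameter is only there to make the freshness window easy to test; in real use the default `Date.now` applies.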

On-Demand Revalidation: A Call to Action

In the throes of interaction, on-demand revalidation emerges as our trusted ally. Whether it’s a form submission or an event trigger, the ability to summon forth the latest data at our command ensures our applications remain responsive and relevant.

import { NextRequest } from 'next/server'
import { revalidateTag } from 'next/cache'

export async function GET(request: NextRequest) {
  const tag = request.nextUrl.searchParams.get('tag')
  if (!tag) {
    return Response.json({ revalidated: false, now: Date.now() })
  }
  revalidateTag(tag)
  return Response.json({ revalidated: true, now: Date.now() })
}
// Summoning on-demand revalidation
revalidateTag('collection')

Opting Out and Beyond

Yet, in the vast tapestry of options, lies the freedom to choose. Next.js bestows upon us the power to opt out of caching, to dance to our own rhythm, and to dictate the flow of data according to our needs. From individual fetch requests to segment-wide configurations, the canvas is ours to paint.

// Opt out of caching for all data requests in the route segment
export const dynamic = 'force-dynamic'
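Opting out does not have to be segment-wide: passing `cache: 'no-store'` to fetch skips the Data Cache for that single request. A minimal sketch (the URL is a placeholder):

```javascript
// Opt a single request out of caching instead of the whole segment.
// This init object tells Next.js's extended fetch to skip the Data Cache.
const noStore = { cache: 'no-store' }
// fetch('https://api.example.com/...', noStore)
```

This is handy when only one call in a route needs live data while everything else can stay cached.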

Journey Beyond: Client-Side Adventures

As we traverse the realms of React and Next.js, we encounter the duality of client-side fetching. From the hallowed halls of Route Handlers to the enigmatic realms of third-party libraries like SWR and TanStack Query, the journey continues, each path leading to new discoveries and boundless possibilities.

In the grand tapestry of web development, data fetching, caching, and revalidation stand as pillars of strength, guiding our journey and shaping our experiences. With Next.js as our companion, let us embark on this odyssey, where each line of code is a step closer to unlocking the full potential of the digital realm.


the don

Published on • 🕑2 min


How Artificial Intelligence (AI) will Affect the Future of Education

The arrival of artificially intelligent chatbots that can complete students' assignments has been a game changer, at least for students. While most educational institutions are rushing to implement AI detectors, there is a possibility that current AI will develop beyond the simple metrics tested by the detectors. Here is my opinion on how AI will shape the future of education.

Academic Dishonesty

One possible impact of AI on education is a drop in its quality, since students can easily have ChatGPT and other LLMs handle their assignments.


While schools are getting clever at addressing such issues, it is possible that students will easily get away with it. Increased academic dishonesty will lead to superficial knowledge and a decline in critical thinking, problem-solving, and communication skills – all essential for academic and professional success.

Decline In Comprehension

Students' overreliance on AI tools is likely to affect their ability to learn and remember things, so more caution must be taken when examining students on a topic. An abundance of information could indeed make individuals more stupid.

The ease and speed of access to information through AI tools might encourage students to skim over information or focus on surface-level understanding rather than delving deeper into concepts and making connections. This can hinder the development of deeper cognitive skills like analysis, synthesis, and evaluation.

Beyond the Worries

Despite the current challenges, the education sector can also fully leverage the power of AI to accelerate learning. AI tools can help provide personalized learning materials based on each student's specific needs. For example, if a student is interested in biology and ancient history, AI can create learning modules that seamlessly integrate both subjects.

AI tools can also help students with research and explanation, potentially leading to a deeper understanding of the subject matter for those who actively engage with the information provided. This can also stimulate curiosity and critical thinking, making learning more engaging and encouraging independent exploration.

However, as we look into the future, schools must be prepared to adopt AI and become smarter than their students. By being early adopters of AI, educators can help fight challenges such as academic dishonesty. After all, it takes a thief to catch a thief.

There is also a likelihood that legislation related to data privacy, fairness and bias, transparency, and ethics will help shape how educators interact with AI and AI-generated materials. These regulations will help ensure schools provide better-quality education to their students and guarantee their rights.


tech girl

Published on • 🕑2 min


Why My Opinion on NextJS is Changing

I have been the greatest fan of NextJS, but I now believe there is more hype than substance to the framework. My opinion about NextJS is changing, and I am considering switching to Remix or just plain old React. Here is why I think NextJS is overhyped.


NextJS, a React framework that enables developers to create full-stack React-based web applications with server-side rendering and static site generation, is currently on version 14. Ironically, React, released back in 2013, is only on version 18, while NextJS, released in 2016, is already on 14. Undoubtedly, the framework is in over its head.

One of the major concerns is that NextJS packages and ships experimental React features and marks them as stable, ultimately using developers as guinea pigs. Shipping experimental React features such as 'use server' introduces a lot of bugs into NextJS code, requiring the work of a bounty hunter!


Call me a noob, but I have not experienced any performance improvement after switching to NextJS. As a matter of fact, my application has become slower compared to applications created using React. Other developers, however, have reported performance improvements after switching, so I presume I have not fully leveraged NextJS server-side rendering and data fetching.

Another significant issue is that Vercel has made it exceedingly difficult to host NextJS code on other hosting providers such as Netlify. Part of the problem is that the line between Next.js and Vercel is very thin: if you are not deploying on Vercel, you are effectively using a different framework from the one documented in the Next.js docs, and it is not always clear what those differences are, because Vercel is not incentivized to invest time in documenting them.
We can argue about whether Vercel is right or wrong in its current approach. But the fact remains that if Vercel's pricing or anything else becomes a problem for you, getting off Vercel will also be a problem. It comes back down to incentives, even though the company remains unprofitable.

Let me know what you guys think about NextJS. PS: this website is built using NextJS!


Tech Wizard

Published on • 🕑3 min


My Experience with PRISMA so Far and Why I am Considering Switching to Drizzle

I recently switched to using Prisma ORM for the backend of my blog. This decision came after encountering difficulties hosting my Ruby on Rails API for free. Most hosting platforms, including my favorite Railway, require a monthly payment, which exceeds my current budget. Here is my experience with Prisma so far.

About Prisma

Prisma is a next-generation ORM that makes working with databases easy for application developers.

  • Object-relational mapper
  • A way to interact with our database
  • Completely type safe

Prisma is super easy to work with, and I was able to switch my whole database in just 4 days. The Prisma schema makes it super easy to declare relationships between models, and Prisma's seeding feature also makes it easy to populate the database with existing data.

Here is a sample Prisma schema:

(image: sample Prisma schema)

However, I initially struggled with the seeding process and had to create data manually via API routes. Together with NextJS API routes, Prisma is super fun to work with. One major benefit of NextJS is that expensive queries can be cached so that refreshing the route does not re-fetch the data from the database.
Another super cool feature is Prisma Studio, which opens up a new tab where users can easily visualize and even manage their data – creating new records, updating existing ones, and deleting records. You can launch Prisma Studio by running:

npx prisma studio

(image: Prisma Studio)

Challenges

Immediately after switching to Prisma, I realized my server was taking too long to respond. A single login request, which previously took about 700ms to complete, now took over 10s. This delay is significant and could hurt the user experience.

Requests made with Prisma were ten times slower than normal!

I looked around to understand why this was happening and whether I was making some mistake. The Prisma documentation suggests indexing frequently queried columns, such as username and email, by adding @@index([username, email]) to the model.

(See prisma/prisma issue #11130 on GitHub: "Query is 5 times slower when using findMany compared to running identical query via queryRaw.")

However, this did not solve the issue. Prisma is too slow, especially for queries that select data from more than one table.

The database also has a cold start, which means that after the Prisma client disconnects, new requests will take time.

Prisma is also not designed to run on the edge, where Vercel hosts its serverless functions. Servers are not up and running all the time in a serverless environment, hence the cold starts while Vercel spins up the function and the database wakes.

Here is what I have realized so far:

  • Prisma is not suited to run on the edge, so the database sleeps when the Prisma client disconnects, and Prisma takes time to connect again.
  • Every new insert via Prisma appears to open a database-level transaction – an extremely weird design choice that exhausts your connection pool very fast.
  • There is no concept of SQL-level joins in Prisma. For every query that requires a join, Prisma fetches both tables, merges the data in the application, and returns it to the user. This makes findMany queries take too long.
  • Prisma is addressing SQL-level joins with a new feature called relation joins. However, this is still a preview feature; users can enable it by adding previewFeatures = ["relationJoins"] to the generator block in their schema.
  • On every insert, Prisma opens a transaction in which it inserts and then returns the inserted record, even if we do not want it.
  • For every action (Create, Read, Update, and Delete), Prisma returns the object by default, even if you do not want it. Even for delete!
  • Prisma supports JSONB datatypes in PostgreSQL and MySQL, but there is no way to partially update such a value. The existing workaround is to fetch the whole object, spread it, add or delete data, and then write the same object back. This is utterly ridiculous.
  • Prisma has no way of hiding sensitive data, such as not returning password fields when you fetch a user object.
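Since the query gives back every column, the usual workaround for that last point is to strip sensitive fields by hand before the object leaves the server. A plain-JavaScript sketch, where `sanitizeUser` is a hypothetical helper of my own, not a Prisma API:

```javascript
// Strip sensitive fields from a fetched user before returning it,
// since the query itself returns every column by default.
function sanitizeUser(user) {
  const { password, ...safe } = user // rest-spread drops the password
  return safe
}
```

The same rest-spread pattern works for any other field you never want to send to the client.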

I have heard good things about other ORMs such as Drizzle and Kysely, although they are said not to be as easy to work with as Prisma. I will try them out before making a concrete decision.

Update:

I switched to Prisma Accelerate, and now Prisma is very fast with queries, although it still does not beat ActiveRecord.

Maybe Prisma is intentionally slow so individuals will switch to the paid Prisma Accelerate plan! However, I do agree that simplicity can get you hooked.
