Feat: improve SEO #79
Conversation
Walkthrough
The pull request introduces a new asynchronous function.
✅ Deploy Preview for kleros-website-v2 ready!
To edit notification comments on pull requests, go to your Netlify site configuration.
Actionable comments posted: 3
🧹 Nitpick comments (1)
frontend/src/app/home/page.tsx (1)
22-28: Enhance SEO meta tags coverage.
While the basic meta tags are good, consider adding these additional tags for better SEO and social sharing:
```diff
 <Head>
   <title>{title}</title>
   <meta name="description" content={subtitle} />
+  <meta name="viewport" content="width=device-width, initial-scale=1" />
+  <meta name="robots" content="index, follow" />
   <meta property="og:title" content={title} />
   <meta property="og:description" content={subtitle} />
   <meta property="og:image" content={background.url} />
+  <meta property="og:type" content="website" />
+  <meta name="twitter:card" content="summary_large_image" />
+  <meta name="twitter:title" content={title} />
+  <meta name="twitter:description" content={subtitle} />
+  <meta name="twitter:image" content={background.url} />
 </Head>
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- frontend/src/app/icon.svg is excluded by !**/*.svg
📒 Files selected for processing (1)
- frontend/src/app/home/page.tsx (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Redirect rules - kleros-website-v2
- GitHub Check: Header rules - kleros-website-v2
- GitHub Check: Pages changed - kleros-website-v2
🔇 Additional comments (1)
frontend/src/app/home/page.tsx (1)
16-18: LGTM! Well-structured data fetching implementation.
The async component with GraphQL data fetching is well implemented, following the Next.js server components pattern.
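For readers less familiar with this pattern, an async server component with GraphQL data fetching looks roughly like the sketch below. This is illustrative only: the query name, response shape, and markup are assumptions rather than the PR's actual code; only the request helper path and the title/subtitle/background.url fields appear elsewhere in this review.

```tsx
// Illustrative sketch of an async Next.js server component (not the PR's code).
import { request } from "@/utils/graphQLClient";
// homeQuery and HomeQueryType are hypothetical names used for this example.
import { homeQuery, HomeQueryType } from "@/queries/home";

const HomePage = async () => {
  // The await runs on the server at render time; no client-side fetching needed.
  const { hero } = await request<HomeQueryType>(homeQuery);
  const { title, subtitle, background } = hero;

  return (
    <main>
      <h1>{title}</h1>
      <p>{subtitle}</p>
      <img src={background.url} alt="" />
    </main>
  );
};

export default HomePage;
```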
Actionable comments posted: 5
♻️ Duplicate comments (3)
frontend/src/app/home/page.tsx (1)
16-28: 🛠️ Refactor suggestion
Add error handling and data validation for SEO metadata.
The same error handling and data validation concerns apply to this implementation.
frontend/src/app/for-builders/page.tsx (1)
14-26: 🛠️ Refactor suggestion
Add error handling and data validation for SEO metadata.
The same error handling and data validation concerns apply to this implementation.
frontend/src/app/cooperative/page.tsx (1)
18-30: 🛠️ Refactor suggestion
Add error handling and data validation for SEO metadata.
The same error handling and data validation concerns apply to this implementation.
🧹 Nitpick comments (2)
frontend/src/queries/seo.ts (1)
13-43: Consider adding error handling for missing SEO fields.
While the query structure is good, consider adding fallback values or error handling for cases where SEO data might be missing.
Add null checks in the GraphQL query:
```diff
 const SEO_CONTENT = `
   SEO {
-    title
-    description
-    image {
-      url
+    title
+    description
+    image {
+      url
+    }
+    # Provide fallback values
+    ... on Error {
+      message
+    }
   }
 `;
```
frontend/src/app/cooperative/page.tsx (1)
1-1: Consider creating a shared utility for SEO metadata generation.
To avoid duplicating error handling and validation logic across multiple pages, consider creating a shared utility function:
```ts
// utils/seo.ts
import type { Metadata } from "next";
import { request } from "@/utils/graphQLClient";
import { seoQuery, SEOQueryType } from "@/queries/seo";

export async function generatePageMetadata(
  pageKey: string,
  fallbackTitle: string,
  fallbackDescription: string
): Promise<Metadata> {
  try {
    const seoData = await request<SEOQueryType>(seoQuery);
    const seo = seoData?.[pageKey]?.SEO;

    if (!seo) {
      throw new Error(`SEO data is missing for ${pageKey}`);
    }

    const { title, description, image } = seo;

    if (!title || !description) {
      throw new Error('Required SEO fields are missing');
    }

    return {
      title,
      description,
      openGraph: {
        title,
        description,
        ...(image?.url && { images: image.url }),
      },
    };
  } catch (error) {
    console.error(`Failed to fetch SEO data for ${pageKey}:`, error);
    return {
      title: fallbackTitle,
      description: fallbackDescription,
    };
  }
}
```
Usage in pages:
```ts
export const generateMetadata = async (): Promise<Metadata> => {
  return generatePageMetadata(
    'homePageSeo',
    'Home | Kleros',
    'Welcome to Kleros'
  );
};
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)
- frontend/src/app/brand-assets/page.tsx (2 hunks)
- frontend/src/app/community/page.tsx (1 hunks)
- frontend/src/app/cooperative/page.tsx (2 hunks)
- frontend/src/app/earn/page.tsx (1 hunks)
- frontend/src/app/for-builders/page.tsx (2 hunks)
- frontend/src/app/for-lawyers/page.tsx (1 hunks)
- frontend/src/app/home/page.tsx (2 hunks)
- frontend/src/app/pnk-token/page.tsx (2 hunks)
- frontend/src/app/r-and-d/page.tsx (1 hunks)
- frontend/src/queries/seo.ts (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Redirect rules - kleros-website-v2
- GitHub Check: Header rules - kleros-website-v2
- GitHub Check: Pages changed - kleros-website-v2
🔇 Additional comments (8)
frontend/src/queries/seo.ts (2)
3-11: LGTM! Well-structured SEO content fragment.
The SEO content fragment is well-defined with essential SEO fields (title, description, image).
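For context, such a fragment presumably looks close to the sketch below, reconstructed from the fields named in this review (title, description, image.url); exact naming and formatting may differ.

```ts
// Sketch of the shared SEO content fragment; field set taken from this review.
export const SEO_CONTENT = `
  SEO {
    title
    description
    image {
      url
    }
  }
`;
```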
45-65: LGTM! Well-typed SEO interfaces.
The type definitions are properly structured and provide good type safety for the SEO data.
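As a rough illustration, the typed SEO data could be shaped along these lines; the interface names and extra page keys are assumptions, while SEOQueryType and the title/description/image fields come from this review.

```ts
// Illustrative type definitions for the SEO query result (names partly assumed).
export interface SEOContent {
  title: string;
  description: string;
  image?: {
    url: string;
  };
}

// One entry per page; "homePageSeo" appears in this PR, other keys would follow
// the same shape (earnPageSeo is a hypothetical example).
export interface SEOQueryType {
  homePageSeo: { SEO: SEOContent };
  earnPageSeo: { SEO: SEOContent };
}
```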
frontend/src/app/earn/page.tsx (1)
10-22: Same error handling improvements needed as in Community page.
The metadata generation implementation needs the same error handling improvements as suggested for the Community page.
frontend/src/app/for-builders/page.tsx (1)
28-41: LGTM! Well-structured component with proper data passing.
The component properly fetches and passes data to child components using destructuring.
frontend/src/app/cooperative/page.tsx (1)
32-50: LGTM! Well-structured component with proper data handling.
The component properly fetches and passes data to child components using destructuring.
frontend/src/app/pnk-token/page.tsx (1)
1-1: LGTM!
The imports for the Metadata type and SEO-related types are correctly placed and necessary for implementing SEO metadata.
Also applies to: 20-20
frontend/src/app/brand-assets/page.tsx (2)
1-1: LGTM!
The imports for the Metadata type and SEO-related types are correctly placed and necessary for implementing SEO metadata.
Also applies to: 40-40
43-55: Verify consistency of SEO implementation across components.
According to the AI summary, similar changes have been made across multiple components. Let's verify the consistency of the SEO implementation.
Run the following script to check the implementation across all components:
✅ Verification successful
🏁 Script executed:
```bash
#!/bin/bash
# Description: Check consistency of SEO implementation across components.

# Test: Search for generateMetadata function implementations
echo "Checking generateMetadata implementations..."
rg -A 15 "export const generateMetadata = async \(\): Promise<Metadata>" frontend/src/app

# Test: Search for SEO query imports
echo "Checking SEO query imports..."
rg "import.*seoQuery.*from.*@/queries/seo" frontend/src/app
```
Length of output: 10030
The SEO metadata generation is consistent across all components.
All pages implement generateMetadata in the same way:
- They use the same import for seoQuery and SEOQueryType from "@/queries/seo".
- They all fetch SEO data using request<SEOQueryType>(seoQuery).
- They consistently extract title, description, and image from the corresponding page-specific SEO object and return metadata with an openGraph object that includes these properties (see the sketch below).
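Concretely, the common per-page shape described in the bullets above would look roughly like this sketch; the specific page key is a placeholder, not a claim about any particular file.

```ts
// Sketch of the per-page generateMetadata pattern summarized above.
import type { Metadata } from "next";
import { request } from "@/utils/graphQLClient";
import { seoQuery, SEOQueryType } from "@/queries/seo";

export const generateMetadata = async (): Promise<Metadata> => {
  const data = await request<SEOQueryType>(seoQuery);
  // "homePageSeo" stands in for whichever page-specific key the page owns.
  const { title, description, image } = data.homePageSeo.SEO;

  return {
    title,
    description,
    openGraph: {
      title,
      description,
      ...(image?.url && { images: image.url }),
    },
  };
};
```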
Actionable comments posted: 0
🧹 Nitpick comments (1)
frontend/src/app/layout.tsx (1)
20-24: Consider adding more SEO-related metadata properties.
While allowing indexing is good, consider enhancing SEO further by adding more metadata properties such as:
- title
- description
- openGraph
Here's an example of a more comprehensive metadata configuration:
```diff
 export const metadata: Metadata = {
   robots: {
     index: true,
   },
+  title: 'Your Site Title',
+  description: 'Your site description for search engines',
+  openGraph: {
+    title: 'Your Site Title',
+    description: 'Your site description for social sharing',
+    type: 'website',
+  },
+  twitter: {
+    card: 'summary_large_image',
+    title: 'Your Site Title',
+    description: 'Your site description for Twitter',
+  },
 };
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- frontend/src/app/layout.tsx (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Redirect rules - kleros-website-v2
- GitHub Check: Header rules - kleros-website-v2
- GitHub Check: Pages changed - kleros-website-v2
🔇 Additional comments (2)
frontend/src/app/layout.tsx (2)
9-9: Verify if this import is still needed.
According to the AI summary, this import should have been removed. However, it's still present and being used in the component. Let's verify its usage.
```bash
#!/bin/bash
# Check if these imports are used elsewhere in the codebase
rg --type ts --type tsx "HeroImagesQueryType|herosImagesQuery" -l
```
Likely an incorrect or invalid review comment.
31-33: LGTM! Good use of image preloading for performance.
The implementation correctly preloads hero images to improve performance. The usage of preload with proper image attributes will help with Core Web Vitals, which is beneficial for SEO.
Also applies to: 38-47
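For illustration, one common way to preload hero images from an App Router layout is react-dom's preload helper, sketched below; the PR may use a different mechanism (for example explicit <link rel="preload"> tags), and the image URL here is a placeholder.

```tsx
// Illustrative sketch: hinting the browser to fetch hero images early (better LCP).
import type { ReactNode } from "react";
import { preload } from "react-dom";

export default function RootLayout({ children }: { children: ReactNode }) {
  // In the real layout the URLs would come from the CMS query; this is a placeholder.
  const heroImageUrls = ["https://cdn.example.com/hero.webp"];
  heroImageUrls.forEach((url) => preload(url, { as: "image" }));

  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  );
}
```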
Actionable comments posted: 2
♻️ Duplicate comments (1)
frontend/src/app/home/page.tsx (1)
15-17: ⚠️ Potential issue
Add error handling to the metadata generation.
While the implementation follows Next.js conventions, it should include error handling to gracefully handle failures.
Apply this diff to add error handling:
```diff
 export const generateMetadata = async (): Promise<Metadata> => {
-  return await getPageMetadata("homePageSeo");
+  try {
+    return await getPageMetadata("homePageSeo");
+  } catch (error) {
+    console.error("Failed to fetch home page metadata:", error);
+    // Provide fallback metadata
+    return {
+      title: "Kleros",
+      description: "Welcome to Kleros",
+    };
+  }
 };
```
🧹 Nitpick comments (1)
frontend/src/utils/seo.ts (1)
8-21: Consider implementing caching for SEO metadata.
Since SEO data likely changes infrequently, implementing a caching strategy could improve performance by reducing unnecessary API calls.
Consider implementing a simple in-memory cache:
```ts
const metadataCache = new Map<PageKey, Metadata>();
const CACHE_TTL = 5 * 60 * 1000; // 5 minutes

export const getPageMetadata = async (pageKey: PageKey): Promise<Metadata> => {
  const cached = metadataCache.get(pageKey);
  if (cached) return cached;

  // ... existing implementation ...

  metadataCache.set(pageKey, metadata);
  setTimeout(() => metadataCache.delete(pageKey), CACHE_TTL);
  return metadata;
};
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (9)
- frontend/src/app/community/page.tsx (1 hunks)
- frontend/src/app/cooperative/page.tsx (2 hunks)
- frontend/src/app/earn/page.tsx (1 hunks)
- frontend/src/app/for-builders/page.tsx (2 hunks)
- frontend/src/app/for-lawyers/page.tsx (1 hunks)
- frontend/src/app/home/page.tsx (2 hunks)
- frontend/src/app/pnk-token/page.tsx (2 hunks)
- frontend/src/app/r-and-d/page.tsx (2 hunks)
- frontend/src/utils/seo.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (7)
- frontend/src/app/pnk-token/page.tsx
- frontend/src/app/cooperative/page.tsx
- frontend/src/app/earn/page.tsx
- frontend/src/app/community/page.tsx
- frontend/src/app/for-lawyers/page.tsx
- frontend/src/app/r-and-d/page.tsx
- frontend/src/app/for-builders/page.tsx
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Redirect rules - kleros-website-v2
- GitHub Check: Header rules - kleros-website-v2
- GitHub Check: Pages changed - kleros-website-v2
🔇 Additional comments (2)
frontend/src/app/home/page.tsx (1)
1-4: LGTM! Imports are correctly set up for Next.js Metadata API.
The imports are properly configured to support SEO improvements using Next.js' recommended Metadata API.
frontend/src/utils/seo.ts (1)
1-6: LGTM! Well-structured imports and type definitions.
The imports are well-organized and the PageKey type definition ensures type safety when accessing SEO data.
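For illustration, the PageKey type could be as simple as the sketch below; the keyof approach is an assumption about how the utility restricts its input, not a quote of the real file.

```ts
// Hypothetical sketch of utils/seo.ts's key typing (the real file may differ).
import { SEOQueryType } from "@/queries/seo";

// Restricting pageKey to the known keys of the query result means a typo like
// getPageMetadata("homPageSeo") fails at compile time rather than at runtime.
export type PageKey = keyof SEOQueryType;
```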
looks good!
Summary by CodeRabbit
- Added a robots.txt file to guide web crawlers on site accessibility.
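In the Next.js App Router such a file can be generated from app/robots.ts, as in the hedged sketch below; the PR may instead ship a static public/robots.txt, and the sitemap URL is a placeholder.

```ts
// app/robots.ts (illustrative): Next.js serves the returned object as /robots.txt.
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
    },
    // Placeholder sitemap location; adjust to the real deployment URL.
    sitemap: "https://example.com/sitemap.xml",
  };
}
```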