GSoC 2024: GitHub Discussion UI at Home, Discussion and Contributor's route #463

Merged (82 commits, Jul 25, 2024)

Commits (82)

b04efab
GSoC_Week_1:Refactor present scrapper into typescript and addition in…
dgparmar14 Jun 6, 2024
6669679
GSoC_Week_1:Refactor present scrapper into typescript and addition in…
dgparmar14 Jun 6, 2024
37965a3
Delete newGt.ts
dgparmar14 Jun 6, 2024
95804ed
remove unwated changes
dgparmar14 Jun 6, 2024
57e9445
Merge branch 'refactor-scrapper-gsoc' of https://github.com/dgparmar1…
dgparmar14 Jun 6, 2024
0bb6d00
Scraper githu.ts divided into different files for better understanding
dgparmar14 Jun 8, 2024
6cc5dac
fix scraper setup
rithviknishad Jun 14, 2024
a1491b9
Integrate Github Discussion in scraper and update scraper-dry-run wor…
dgparmar14 Jun 18, 2024
7583a62
Removing dry-run work flow error (date-fns)
dgparmar14 Jun 18, 2024
1d51cd1
Fixing scraper-dry-run date error by puting null as a default value f…
dgparmar14 Jun 24, 2024
0e5fde7
Fixing scraper-dry-run failing
dgparmar14 Jun 24, 2024
8596095
Fixing scraper-dry-run failing
dgparmar14 Jun 24, 2024
30b092a
Fixing scraper-dry-run failing
dgparmar14 Jun 24, 2024
8a3de03
Fixing scraper-dry-run failing (Genrate markdown files)
dgparmar14 Jun 24, 2024
6e31084
Fixing scraper-dry-run failing (Genrate markdown files)
dgparmar14 Jun 24, 2024
8c62674
Fixing scraper-dry-run failing (Genrate markdown files)
dgparmar14 Jun 24, 2024
0e807d8
Fixing scraper-dry-run failing (Genrate markdown files)
dgparmar14 Jun 24, 2024
a62a9d9
Merge branch 'main' into refactor-scrapper-gsoc
rithviknishad Jun 26, 2024
6c14a72
Suggested cahnges done
dgparmar14 Jun 27, 2024
60bf70e
Merge branch 'refactor-scrapper-gsoc' of https://github.com/dgparmar1…
dgparmar14 Jun 27, 2024
4fd59d4
Update test schema for discussion
dgparmar14 Jun 27, 2024
8980042
resolve-dry-run error with pnpm
dgparmar14 Jun 27, 2024
68f8fd4
Revert accidental cahnges in scraper0dry-run.yaml
dgparmar14 Jun 27, 2024
8675c01
Revert accidental cahnges in scraperdry-run.yaml
dgparmar14 Jun 27, 2024
638f272
dotenv used in generateNewContrbutors.js
dgparmar14 Jun 27, 2024
386448d
remove: dotenv used in generateNewContrbutors.js
dgparmar14 Jun 27, 2024
eaadb2a
Fix path for data repository to solve dry-run error
dgparmar14 Jun 27, 2024
a5f2371
update pnpm-lock.yaml
dgparmar14 Jun 27, 2024
d66fae9
Store seprately all discussion in discussion folder
dgparmar14 Jun 27, 2024
cac83cf
Update discussion schema
dgparmar14 Jun 28, 2024
af061d5
Update discussion schema
dgparmar14 Jun 28, 2024
83df93d
Merge branch 'coronasafe:main' into refactor-scrapper-gsoc
dgparmar14 Jun 28, 2024
122d662
Update scraper-dry-run.yaml
dgparmar14 Jun 28, 2024
3ca99f2
Merge branch 'refactor-scrapper-gsoc' of https://github.com/dgparmar1…
dgparmar14 Jun 28, 2024
1c57cc7
Update scraper-dry-run.yaml and fix some typos
dgparmar14 Jun 28, 2024
05a8b40
Update scraper-dry-run.yaml
dgparmar14 Jun 28, 2024
e678e68
Remove casting in fetchEvents.ts
dgparmar14 Jun 28, 2024
2dc5b94
fix type error
rithviknishad Jun 28, 2024
7ea2a3f
Fix type errors
dgparmar14 Jul 1, 2024
cdda148
Modify types and remove all types error from scraper
dgparmar14 Jul 2, 2024
4a5dd7e
Modify types and remove all types error from scraper
dgparmar14 Jul 2, 2024
6609994
Description added to discussion scraper
dgparmar14 Jul 5, 2024
e3bb35d
Description added to discussion scraper
dgparmar14 Jul 5, 2024
9337484
Discussion UI created at home, disucssions and cotrnbutors profile route
dgparmar14 Jul 12, 2024
464efd6
uncomment in scraper
dgparmar14 Jul 12, 2024
0141997
Merge pull request #458 from dgparmar14/refactor-scrapper-gsoc
rithviknishad Jul 12, 2024
68f9b5e
Update Github Dicussions to Discussion
dgparmar14 Jul 13, 2024
9d1c102
Point mechanism for discussions and responsiveness added
dgparmar14 Jul 16, 2024
5c90cd4
Merge branch 'coronasafe:main' into gsoc-discussion-ui
dgparmar14 Jul 16, 2024
59cddcf
Site map updated for gh-discussion
dgparmar14 Jul 16, 2024
a34f030
Merge branch 'gsoc/gh-discussions' into gsoc-discussion-ui
dgparmar14 Jul 16, 2024
a52d61e
type error fix in api.ts and modify logic of leaderboard for discussions
dgparmar14 Jul 16, 2024
92999ce
Merge branch 'gsoc-discussion-ui' of https://github.com/dgparmar14/le…
dgparmar14 Jul 16, 2024
9a9f63b
prose-h2 added to fix markdown bug
dgparmar14 Jul 16, 2024
83e062d
Modified suggested changes still one type error remaining
dgparmar14 Jul 18, 2024
a032253
Modified suggested changes still one type error remaining
dgparmar14 Jul 18, 2024
1ac9b0a
Chages done as per review
dgparmar14 Jul 19, 2024
02508f8
fix type error in api.ts mismatch in return type of discussion
dgparmar14 Jul 19, 2024
5cfe6c6
Add suspense boundary for discussions filter
rithviknishad Jul 19, 2024
d48fe37
fix open in github button responsiveness issue
rithviknishad Jul 19, 2024
f1bab51
fix incorrect roots and remove unused imports
rithviknishad Jul 19, 2024
b89aaeb
Enable empathy badge and merge discussions with old data
dgparmar14 Jul 19, 2024
b67e6ee
requested changes are done participants logic in progress
dgparmar14 Jul 22, 2024
cc6993c
remove unnecessary `useMemo`
rithviknishad Jul 22, 2024
355b5ad
Implement logic to fetch discussion within daterange (updated or crea…
dgparmar14 Jul 22, 2024
66a86ab
Removing previous logic of fetch participants
dgparmar14 Jul 22, 2024
61c0521
Remove unwanted changes
dgparmar14 Jul 22, 2024
1a16748
Handle nullable values while scraping discussion
dgparmar14 Jul 22, 2024
1f7f36a
Modify testing if the discussion.json is empty or notpresent
dgparmar14 Jul 22, 2024
aa64d41
Fixing testing logic
dgparmar14 Jul 22, 2024
951dd45
Change in discussion scraper logic and modify reposName to repository…
dgparmar14 Jul 23, 2024
c0027fe
fix naming convetntions and discussion-schema-testing when there is n…
dgparmar14 Jul 23, 2024
1f210f5
Update tests/github-discussion-schema.test.mjs
rithviknishad Jul 23, 2024
dfd6af1
remove unused packages, scripts and upgraded package versions
rithviknishad Jul 25, 2024
0f6c0e8
remove unused methods
rithviknishad Jul 25, 2024
dbc4827
move markdown render to server side
rithviknishad Jul 25, 2024
5b471b0
check if discussions dir. exists before reading
rithviknishad Jul 25, 2024
c0deaa1
Merge pull request #466 from dgparmar14/gsoc-discussion-ui
rithviknishad Jul 25, 2024
b3a222f
support for scraper workflow to run on non-main branches
rithviknishad Jul 25, 2024
322fdde
update scraper action workflow
rithviknishad Jul 25, 2024
9d773d5
update data repo dir. in scraper workflow
rithviknishad Jul 25, 2024
cd52504
update data dir. in scraper workflow
rithviknishad Jul 25, 2024

lib/discussion.ts (3 additions & 3 deletions)

@@ -70,12 +70,12 @@ export async function fetchGithubDiscussion(
 export async function checkAnsweredByUser(
   github: string,
   number: string,
-  repoName: string,
+  repository: string,
 ) {
   const org = env.NEXT_PUBLIC_GITHUB_ORG;

   const dicussion: Dicussion = await octokit.graphql(`query {
-    repository(owner: "${org}", name: "${repoName}") {
+    repository(owner: "${org}", name: "${repository}") {
       discussion (number: ${number}) {
         answer {
           author {
@@ -106,7 +106,7 @@ export async function getGithubDiscussions(githubHandle: string) {
     const isAnswered = await checkAnsweredByUser(
       githubHandle,
       discussion.link?.split("/").pop() ?? "",
-      discussion.repoName,
+      discussion.repository,
     );
     title = isAnswered ? "Discussion Answered" : "Commented on discussion";
     activityType = isAnswered
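
For context on the rename: checkAnsweredByUser still receives the discussion number parsed off the discussion URL; only the third argument is now called repository instead of repoName. A minimal usage sketch follows (the import path, handle, and URL are illustrative, not taken from the PR):

import { checkAnsweredByUser } from "@/lib/discussion"; // import path assumed

async function exampleUsage() {
  // Illustrative discussion URL; the number is the last path segment and the
  // repository name is the segment just before "discussions".
  const link = "https://github.com/example-org/example-repo/discussions/123";
  const parts = link.split("/");

  const number = parts.pop() ?? ""; // "123"
  const repository = parts[parts.length - 2]; // "example-repo"

  const isAnswered = await checkAnsweredByUser("octocat", number, repository);
  return isAnswered ? "Discussion Answered" : "Commented on discussion";
}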

schemas/discussion-data.yaml (2 additions & 2 deletions)

@@ -10,7 +10,7 @@ required:
   - url
   - category
   - time
-  - repoName
+  - repository
   - participants

 properties:
@@ -40,6 +40,6 @@ properties:
     type: array
     items:
       type: string
-  repoName:
+  repository:
     type: string
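
For illustration, a scraped discussion entry that satisfies the renamed schema field might look like the object below. All values are made up, and only fields visible in this diff (and in the ParsedDiscussion type further down) are spelled out:

// Hypothetical discussion entry after the rename; values are illustrative only.
const exampleDiscussion = {
  title: "How do I run the scraper locally?",
  url: "https://github.com/example-org/example-repo/discussions/42",
  time: "2024-07-16T10:00:00Z",
  category: { emoji: "🙏" }, // other category fields omitted in this sketch
  participants: ["dgparmar14", "rithviknishad"],
  repository: "example-repo", // previously repoName
};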

scraper/src/github-scraper/discussion.ts (27 additions & 27 deletions)

@@ -1,5 +1,5 @@
 import { octokit } from "./config.js";
-import { Discussion, ParsedDiscussion } from "./types.js";
+import { Discussion, ParsedDiscussion, Repository } from "./types.js";
 import { saveDiscussionData } from "./utils.js";

 const query = `query($org: String!, $cursor: String) {
@@ -50,44 +50,42 @@ async function fetchGitHubDiscussions(
   org: string,
   endDate: Date,
   startDate: Date,
-  cursor = null,
 ) {
-  const variables = {
-    org,
-    cursor,
-  };
-  for await (const response of octokit.graphql.paginate.iterator(
-    query,
-    variables,
-  )) {
-    const repositories = await response.organization.repositories.edges;
-    type repo = (typeof repositories)[0];
+  const iterator = octokit.graphql.paginate.iterator(query, { org });
+
+  let Discussions: { repository: string; discussion: Discussion }[] = [];
+
+  for await (const response of iterator) {
+    const repositories: Repository[] = response.organization.repositories.edges;
+
     for (const repo of repositories) {
-      const discussions = await repo.node.discussions.edges.map(
-        (discussion: repo) => ({
-          repoName: repo.node.name,
-          discussion: discussion.node,
-        }),
-      );
-      const discussionsWithinDateRange = await discussions.find((d: repo) => {
+      const discussions = repo.node.discussions.edges.map((discussion) => ({
+        repository: repo.node.name,
+        discussion: discussion.node,
+      }));
+      const discussionsWithinDateRange = discussions.find((d) => {
         const createdAt = new Date(d.discussion.createdAt);
         const updatedAt = new Date(d.discussion.updatedAt);

         return (
-          createdAt >= new Date(startDate) || updatedAt >= new Date(startDate)
+          // When created or updated date will be lower than the start date and greater than end date then return true
+          (createdAt <= new Date(startDate) &&
+            createdAt >= new Date(endDate)) ||
+          (updatedAt <= new Date(startDate) && updatedAt >= new Date(endDate))
         );
       });
       if (discussionsWithinDateRange) {
-        return discussions;
+        return Discussions;
       }
+      Discussions = Discussions.concat(discussions);
     }
   }

-  return null;
+  return Discussions;
 }

 async function parseDiscussionData(
-  allDiscussions: { repoName: string; discussion: Discussion }[],
+  allDiscussions: { repository: string; discussion: Discussion }[],
   endDate: Date,
   startDate: Date,
 ) {
@@ -119,7 +117,7 @@ async function parseDiscussionData(
           emoji: d.discussion.category.emojiHTML.replace(/<\/?div>/g, ""),
         },
         participants: participants || [],
-        repoName: d.repoName,
+        repository: d.repository,
       };
     },
   );
@@ -138,9 +136,11 @@ export async function scrapeDiscussions(
       endDate,
       startDate,
     );
-    const parsedDiscussions =
-      allDiscussions &&
-      (await parseDiscussionData(allDiscussions, endDate, startDate));
+    const parsedDiscussions = await parseDiscussionData(
+      allDiscussions,
+      endDate,
+      startDate,
+    );
     await saveDiscussionData(parsedDiscussions, dataDir);
   } catch (error: any) {
     throw new Error(`Error fetching discussions: ${error.message}`);
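
Note that in fetchGitHubDiscussions the startDate argument is the newer bound and endDate the older one, so the updated predicate keeps a discussion when its created or updated timestamp falls inside that window instead of merely being newer than startDate. A standalone sketch of the check, with made-up dates:

// Standalone sketch of the date-range predicate used above.
// In the scraper, startDate is the newer bound and endDate the older one.
type DatedDiscussion = { createdAt: string; updatedAt: string };

function isWithinRange(d: DatedDiscussion, startDate: Date, endDate: Date): boolean {
  const createdAt = new Date(d.createdAt);
  const updatedAt = new Date(d.updatedAt);
  return (
    (createdAt <= startDate && createdAt >= endDate) ||
    (updatedAt <= startDate && updatedAt >= endDate)
  );
}

// Illustrative one-week window ending on 2024-07-22.
const startDate = new Date("2024-07-22T00:00:00Z"); // newer bound
const endDate = new Date("2024-07-15T00:00:00Z"); // older bound
console.log(
  isWithinRange(
    { createdAt: "2024-07-18T12:00:00Z", updatedAt: "2024-07-19T09:30:00Z" },
    startDate,
    endDate,
  ),
); // true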

scraper/src/github-scraper/index.ts (0 additions & 1 deletion)

@@ -48,7 +48,6 @@ const scrapeGitHub = async (
   console.log("Scraping completed");
 };

-// Type Done and check done
 const main = async () => {
   // Extract command line arguments (skip the first two default arguments)
   const args: string[] = process.argv.slice(2);

scraper/src/github-scraper/types.ts (12 additions & 1 deletion)

@@ -197,6 +197,17 @@ export type Discussion = {
   updatedAt: string;
 };

+export type Repository = {
+  node: {
+    name: string;
+    discussions: {
+      edges: {
+        node: Discussion;
+      }[];
+    };
+  };
+};
+
 export type ParsedDiscussion = {
   source?: string;
   title: string;
@@ -211,5 +222,5 @@ export type ParsedDiscussion = {
     emoji: string;
   };
   participants?: string[];
-  repoName: string;
+  repository: string;
 };
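
The new Repository type mirrors the nesting of the GraphQL response that the scraper iterates over. One element of response.organization.repositories.edges would look roughly like this (values invented for illustration; Discussion fields other than the timestamps are elided):

const sampleRepositoryEdge = {
  node: {
    name: "example-repo",
    discussions: {
      edges: [
        {
          node: {
            createdAt: "2024-07-16T10:00:00Z",
            updatedAt: "2024-07-18T08:45:00Z",
            // ...remaining Discussion fields
          },
        },
      ],
    },
  },
};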

scraper/src/github-scraper/utils.ts (6 additions & 4 deletions)

@@ -205,16 +205,18 @@ export async function saveDiscussionData(
   dataDir: string,
 ) {
   // check data dir present or not and file is present or not if not then create it
-  await mkdir(dataDir + "/discussions", { recursive: true });
+  if (discussions.length === 0) {
+    return;
+  }
   const discussionsDir = path.join(dataDir, "discussions");
+  await mkdir(discussionsDir, { recursive: true });

   const file = path.join(discussionsDir, "discussions.json");
   try {
     // Try reading the file
     const response = await readFile(file);
     const oldData = JSON.parse(response.toString());
-    const newData = await mergeDiscussions(oldData, discussions);
-    const jsonData = JSON.stringify(newData, null, 2);
+    const mergedData = await mergeDiscussions(oldData, discussions);
+    const jsonData = JSON.stringify(mergedData, null, 2);
     await writeFile(file, jsonData);
   } catch (err) {
     // File doesn't exist, create it with initial data
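
The body of mergeDiscussions is not part of this diff; a plausible minimal version would deduplicate old and new scrape results by some stable key, letting newer entries win. The sketch below is an assumption about that behavior, not the repository's actual implementation:

// Hypothetical merge helper: deduplicate old and new scrape results by a key,
// letting later (newer) entries overwrite earlier ones.
function mergeByKey<T>(oldData: T[], newData: T[], key: (item: T) => string): T[] {
  const merged = new Map<string, T>();
  for (const item of [...oldData, ...newData]) {
    merged.set(key(item), item); // newer entries win on key collisions
  }
  return [...merged.values()];
}

// Illustrative usage, keyed on the discussion URL:
// const mergedData = mergeByKey(oldData, discussions, (d) => d.url);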

tests/github-discussion-schema.test.mjs (5 additions & 7 deletions)

@@ -26,13 +26,11 @@ const filesInDir = fs

 filesInDir.forEach((file) => {
   const content = fs.readFileSync(join(GH_DATA, file)).toString();
-  if (content !== "null") {
-    const data = JSON.parse(stripJsonComments(content));
+  const data = JSON.parse(stripJsonComments(content));

-    describe(`Validate '${file}'`, function () {
-      it("should be properly validated by the json schema", () => {
-        expect(data).to.be.jsonSchema(schema);
-      });
+  describe(`Validate '${file}'`, function () {
+    it("should be properly validated by the json schema", () => {
+      expect(data).to.be.jsonSchema(schema);
+    });
   });
-  }
 });
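
With the content !== "null" guard gone, an empty or missing discussions dataset has to be handled before this loop, so that filesInDir is simply an empty list when the scraper has produced nothing. A minimal sketch of that setup (the directory path and the .json filter are assumptions, not copied from the test):

import fs from "fs";
import { join } from "path";

// Sketch: skip schema validation cleanly when no discussions have been scraped yet.
// The path is illustrative; GH_DATA in the real test may point elsewhere.
const GH_DATA = join("data-repo", "data", "github", "discussions");

const filesInDir = fs.existsSync(GH_DATA)
  ? fs.readdirSync(GH_DATA).filter((file) => file.endsWith(".json"))
  : [];

// With an empty filesInDir, the forEach above registers no test cases.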