[VNEXT] feat: Multi-DB type support #291
Conversation
Walkthrough

The pull request introduces significant enhancements to the database connection logic across the backend, supporting both PostgreSQL and SQLite. Key changes include the addition of a new function to validate PostgreSQL SSL modes, improved error handling that logs connection details, and a new configuration structure for database settings. The migration tool has been updated to accommodate multiple database types, and several new tasks have been added to facilitate PostgreSQL operations. Additionally, frontend components were modified to include input length validation features.
The migration creation for postgres now works properly. Removed MySQL support, having too many issues with it at this time.
…age in language dropdown
Actionable comments posted: 12
🧹 Outside diff range and nitpick comments (6)
backend/internal/data/migrations/migrations.go (3)
Line range hint 22-41: Enhance security for file operations. The file operations need additional security measures:
- Path sanitization to prevent path traversal
- File content validation before writing
- Secure cleanup of temporary files
```diff
-fsDir, err := Files.ReadDir(dialect)
+// Ensure dialect path is clean and within allowed directory
+dialectPath := path.Clean(dialect)
+if path.IsAbs(dialectPath) || strings.Contains(dialectPath, "..") {
+	return fmt.Errorf("invalid dialect path: %s", dialect)
+}
+
+fsDir, err := Files.ReadDir(dialectPath)
 if err != nil {
 	return err
 }

 for _, f := range fsDir {
 	if f.IsDir() {
 		continue
 	}

-	b, err := Files.ReadFile(path.Join(dialect, f.Name()))
+	// Sanitize filename and validate extension
+	filename := path.Clean(f.Name())
+	if !strings.HasSuffix(filename, ".sql") {
+		continue
+	}
+
+	b, err := Files.ReadFile(path.Join(dialectPath, filename))
 	if err != nil {
 		return err
 	}

-	err = os.WriteFile(path.Join(temp, f.Name()), b, 0o644)
+	targetPath := path.Join(temp, filename)
+	if !strings.HasPrefix(targetPath, temp) {
+		return fmt.Errorf("invalid target path: %s", targetPath)
+	}
+
+	err = os.WriteFile(targetPath, b, 0o644)
 	if err != nil {
 		return err
 	}
 }
```
Line range hint 1-41: Consider adding cleanup functionality. The package should provide a cleanup function to securely remove temporary migration files after they're no longer needed.
```diff
+// Cleanup removes the temporary migration files
+func Cleanup(temp string) error {
+	// Ensure temp path is within allowed directory
+	tempPath := path.Clean(temp)
+	if !strings.HasPrefix(tempPath, os.TempDir()) {
+		return fmt.Errorf("invalid temp directory: %s", temp)
+	}
+	return os.RemoveAll(tempPath)
+}
```
Line range hint 1-41: Add documentation for supported dialects. Update package documentation to clearly specify supported database dialects and their corresponding migration file requirements.
```diff
 // Package migrations provides a way to embed the migrations into the binary.
+//
+// Supported database dialects:
+//   - sqlite3: SQLite migrations in backend/internal/data/migrations/sqlite3/
+//   - mysql: MySQL migrations in backend/internal/data/migrations/mysql/
+//   - postgres: PostgreSQL migrations in backend/internal/data/migrations/postgres/
+//
+// Migration files must have .sql extension and follow the naming convention:
+// YYYYMMDDHHMMSS_description.sql
 package migrations
```

backend/app/tools/migrations/main.go (1)
76-89: Consider adding build info validation. The build function handles commit truncation but doesn't validate other fields.
Consider adding validation to prevent potential injection via build variables:
```diff
 func build() string {
 	short := commit
 	if len(short) > 7 {
 		short = short[:7]
 	}
+	// Sanitize version and buildTime
+	version = strings.ReplaceAll(version, "\n", "")
+	buildTime = strings.ReplaceAll(buildTime, "\n", "")
 	return fmt.Sprintf("%s, commit %s, built at %s", version, short, buildTime)
 }
```

backend/internal/data/migrations/postgres/20241027025146_init.sql (2)
1-8: Add audit fields for tracking record modifications. Consider adding `created_by` and `updated_by` fields to track who creates and modifies records. This is crucial for security auditing and compliance. Example addition for the groups table:
```diff
 CREATE TABLE "groups" (
   "id" uuid NOT NULL,
   "created_at" timestamptz NOT NULL,
   "updated_at" timestamptz NOT NULL,
+  "created_by" uuid NOT NULL,
+  "updated_by" uuid NOT NULL,
   "name" character varying NOT NULL,
   "currency" character varying NOT NULL DEFAULT 'usd',
   PRIMARY KEY ("id")
 );
```
1-58: Performance: Consider table partitioning and field length limits. For better performance and resource management:
- Consider partitioning the `items` table by `created_at` if you expect high volume
- Add varchar length limits to prevent excessive storage usage
Example modifications:
```sql
-- Example of partitioning for items table
CREATE TABLE "items" (
  -- existing columns
) PARTITION BY RANGE (created_at);

-- Example of adding varchar limits
ALTER TABLE "users" ALTER COLUMN "name" TYPE character varying(255);
ALTER TABLE "users" ALTER COLUMN "email" TYPE character varying(255);
ALTER TABLE "documents" ALTER COLUMN "title" TYPE character varying(255);
ALTER TABLE "documents" ALTER COLUMN "path" TYPE character varying(1024);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (2)
- backend/internal/data/migrations/postgres/atlas.sum is excluded by `!**/*.sum`
- backend/internal/data/migrations/sqlite3/atlas.sum is excluded by `!**/*.sum`
📒 Files selected for processing (6)
- .gitignore (1 hunks)
- Taskfile.yml (3 hunks)
- backend/app/api/main.go (5 hunks)
- backend/app/tools/migrations/main.go (2 hunks)
- backend/internal/data/migrations/migrations.go (2 hunks)
- backend/internal/data/migrations/postgres/20241027025146_init.sql (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- backend/app/api/main.go
🔇 Additional comments (5)
backend/internal/data/migrations/migrations.go (1)
10-10: Verify migration files organization for multi-DB support. The embed directive now specifically points to the "sqlite3" directory. Ensure similar directories exist for other supported databases (PostgreSQL, MySQL).
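For orientation, a minimal sketch of what a multi-dialect embed setup can look like in Go; the directory layout and variable name are assumptions for illustration, not necessarily the repository's actual declaration:

```go
package migrations

import "embed"

// Files embeds one subdirectory of .sql files per supported dialect so the
// migration writer can select the right set at runtime (illustrative layout).
//
//go:embed sqlite3/*.sql postgres/*.sql
var Files embed.FS
```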
✅ Verification successful
Let me check the actual migration files to ensure they are properly organized and don't contain any security concerns before providing the final response.
Multi-DB support properly implemented with security considerations
The migration structure is correctly organized with dedicated directories for both PostgreSQL and SQLite3. A few security observations:
- Both databases implement proper foreign key constraints with CASCADE delete rules
- Password fields are properly defined as text/varchar (allowing for secure hashing)
- Role-based access control is consistently implemented across both dialects
- Proper indexing is in place for user authentication and lookup operations
However, please note:
- Default role values are hardcoded as 'user' in migrations
- Token storage uses bytea/blob types appropriately for secure token storage
- Cascading deletes are properly implemented to prevent orphaned records
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check if migration directories exist for all supported DB types
echo "Checking migration directories structure..."
fd -t d "^(sqlite3|mysql|postgres(ql)?)" backend/internal/data/migrations/
```

Length of output: 258
Script:
```bash
#!/bin/bash
# Check if migration files exist in both directories and compare their structure
echo "Checking migration files in postgres directory:"
ls -la backend/internal/data/migrations/postgres/
echo -e "\nChecking migration files in sqlite3 directory:"
ls -la backend/internal/data/migrations/sqlite3/

# Check for any potential security-sensitive patterns in migration files
echo -e "\nChecking for potential security issues in migration files:"
rg -i "grant|role|permission|user|password|credential" backend/internal/data/migrations/
```

Length of output: 14021
.gitignore (1)
5-7: LGTM! Important security measure for SQLite auxiliary files. These additions are crucial as they prevent sensitive database-related files from being accidentally committed. The SQLite journal, shared memory, and write-ahead log files can contain sensitive data and transaction details.
backend/app/tools/migrations/main.go (2)
23-26: LGTM: Config initialization looks good. The error handling for config initialization is appropriate, using panic for fatal startup errors.
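For readers skimming the diff, this is roughly the shape of fail-fast startup configuration being endorsed here; the environment variable names and the `dbConfig` type are hypothetical stand-ins, not the tool's actual config API:

```go
package main

import (
	"errors"
	"fmt"
	"os"
)

// dbConfig is a hypothetical stand-in for the real configuration struct.
type dbConfig struct {
	Driver string // e.g. "sqlite3" or "postgres"
	URL    string
}

// loadConfig is a hypothetical loader; the real tool reads more settings.
func loadConfig() (*dbConfig, error) {
	driver := os.Getenv("HBOX_DATABASE_DRIVER") // assumed variable name
	if driver == "" {
		return nil, errors.New("database driver not configured")
	}
	return &dbConfig{Driver: driver, URL: os.Getenv("HBOX_DATABASE_URL")}, nil
}

func main() {
	cfg, err := loadConfig()
	if err != nil {
		// Fail fast: a broken configuration at startup is unrecoverable.
		panic(err)
	}
	fmt.Println("using driver:", cfg.Driver)
}
```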
Line range hint 48-52: Verify impact of enabling drop operations. The migration options enable both `WithDropIndex` and `WithDropColumn`. This could lead to data loss if not carefully managed. Let's check for any existing migrations that might be affected:
Taskfile.yml (1)
5-6: LGTM: Database configuration changes look good. The environment variable changes align with the PR objectives for multi-DB support. The SQLite configuration includes important pragma settings for WAL mode and foreign key constraints.
Actionable comments posted: 16
🧹 Outside diff range and nitpick comments (21)
frontend/pages/home/statistics.ts (1)
Vue i18n translations need XSS protection measures
The codebase shows several concerning patterns:
- Direct use of
v-html
with translations ($t()
) in tools.vue without sanitization- No visible HTML sanitization in the i18n message compiler configuration
- While DOMPurify is used in the Markdown component, it's not applied to translations
To protect against XSS:
- Add DOMPurify sanitization in the message compiler (frontend/plugins/i18n.ts)
- Replace
v-html
usage withv-text
where possible- If HTML in translations is required, create a dedicated component that applies DOMPurify
🔗 Analysis chain
Line range hint 20-40: Consider XSS protection for translated content. While the translation keys look safe, it's important to ensure that the translation system has XSS protection for user-provided translations.
Consider adding input validation for the statistics values:
```diff
 return computed(() => {
+  const sanitizeNumber = (value: number | undefined) => {
+    return typeof value === 'number' && isFinite(value) ? value : 0;
+  };
   return [
     {
       label: t("home.total_value"),
-      value: statistics.value?.totalItemPrice || 0,
+      value: sanitizeNumber(statistics.value?.totalItemPrice),
       type: "currency",
     },
     // Apply similar changes to other statistics
   ] as StatCard[];
 });
```
Let me search for the i18n configuration and translation files to verify the XSS protection mechanisms. The previous search wasn't specific enough, so I'm broadening it to look for any i18n related code and translation files.
I see Vue i18n is configured, but I need to check for any sanitization mechanisms and the message compiler configuration to ensure proper XSS protection. Let me search for these security-related patterns.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Look for vue-i18n configuration and sanitization
rg -l "vue-i18n.*config|sanitize|escape"
```

Length of output: 5617
Script:
```bash
#!/bin/bash
# Look for vue-i18n configuration and usage
rg -A 5 "vue-i18n|createI18n|useI18n"

# Look for translation files to check sanitization
fd -e json -e yaml -e yml translations
```

Length of output: 11489
Script:
```bash
#!/bin/bash
# Check for DOMPurify or other sanitization usage
rg -A 5 "DOMPurify|sanitize|escapeHtml|v-html|innerHTML"

# Look for i18n message compiler configuration
rg -A 10 "messageCompiler"
```

Length of output: 3878
frontend/components/Location/Selector.vue (1)
Line range hint 8-17: Security: Add HTML sanitization for location data. The template directly renders `name` and `treeString` without sanitization. If these values come from user input or external sources, this could lead to XSS vulnerabilities. Consider using Vue's built-in `v-text` directive or implementing HTML sanitization:

```diff
- {{ cast(item.value).name }}
+ <span v-text="cast(item.value).name"></span>

- {{ cast(item.value).treeString }}
+ <span v-text="cast(item.value).treeString"></span>
```

frontend/pages/locations.vue (1)
Line range hint 39-45: ⚠️ Security: Unvalidated JSON parsing from URL parameters. The code directly parses JSON from URL query parameters without validation, which could lead to XSS attacks or JavaScript injection. An attacker could craft a malicious URL with carefully constructed JSON that, when parsed, could execute harmful code.
Consider adding validation before parsing:
```diff
 if (query && query[locationTreeId]) {
   console.debug("setting tree state from query params");
-  const data = JSON.parse(query[locationTreeId] as string);
+  try {
+    const rawData = query[locationTreeId] as string;
+    // Validate that the input matches expected format
+    if (!/^[\[{].*[\]}]$/.test(rawData)) {
+      throw new Error('Invalid JSON structure');
+    }
+    const data = JSON.parse(rawData);
+    // Validate the parsed data structure
+    if (typeof data !== 'object' || data === null) {
+      throw new Error('Invalid tree state data');
+    }
+    // Only allow boolean values
+    for (const key in data) {
+      if (typeof data[key] !== 'boolean') {
+        throw new Error('Invalid tree state values');
+      }
+    }
+    for (const key in data) {
+      treeState.value[key] = data[key];
+    }
+  } catch (e) {
+    console.error('Failed to parse tree state:', e);
+    // Reset to default state or handle error appropriately
+  }
-  for (const key in data) {
-    treeState.value[key] = data[key];
-  }
 }
```

frontend/plugins/i18n.ts (2)
Line range hint 6-19: Add input validation for language selection. The `checkDefaultLanguage` function directly uses browser-provided language values without validation. Consider implementing a whitelist of supported languages to prevent potential injection attacks or unexpected behavior. Here's a safer implementation:
function checkDefaultLanguage() { let matched = null; const languages = Object.getOwnPropertyNames(messages()); + const supportedLanguages = new Set(languages.map(l => l.toLowerCase())); const matching = navigator.languages.filter(lang => - languages.some(l => l.toLowerCase() === lang.toLowerCase()) + supportedLanguages.has(lang.toLowerCase()) ); if (matching.length > 0) { matched = matching[0]; } if (!matched) { languages.forEach(lang => { const languagePartials = navigator.language.split("-")[0]; - if (lang.toLowerCase() === languagePartials) { + if (supportedLanguages.has(languagePartials)) { matched = lang; } }); } return matched; }
Line range hint 44-71: Implement message content sanitization. The `messageCompiler` function directly passes values to `IntlMessageFormat` without sanitization. This could lead to XSS vulnerabilities if the message templates or context values contain malicious content. Consider implementing message sanitization:
export const messageCompiler = (message, { locale, key, onError }) => { if (typeof message === "string") { + // Sanitize the message template + const sanitizedMessage = sanitizeTemplate(message); - const formatter = new IntlMessageFormat(message, locale); + const formatter = new IntlMessageFormat(sanitizedMessage, locale); return (ctx: MessageContext) => { + // Sanitize context values + const sanitizedValues = sanitizeValues(ctx.values); - return formatter.format(ctx.values); + return formatter.format(sanitizedValues); }; } else { onError && onError(new Error("not support for AST") as CompileError); return () => key; } }; + function sanitizeTemplate(template: string): string { + // Implement template sanitization logic + return template.replace(/[<>]/g, ''); + } + + function sanitizeValues(values: Record<string, any>): Record<string, any> { + // Implement deep value sanitization + return Object.fromEntries( + Object.entries(values).map(([k, v]) => [k, typeof v === 'string' ? v.replace(/[<>]/g, '') : v]) + ); + }frontend/components/global/DetailsSection/DetailsSection.vue (3)
Line range hint 15-19: Security: Add rel attribute for external links. External links opened in new tabs without `rel="noopener noreferrer"` can be vulnerable to reverse tabnabbing attacks. Apply this fix:
```diff
- <a class="btn btn-primary btn-xs" :href="detail.href" target="_blank">
+ <a class="btn btn-primary btn-xs" :href="detail.href" target="_blank" rel="noopener noreferrer">
```
Line range hint 22-26: Security: Implement markdown sanitization. The markdown content is rendered without visible sanitization, which could lead to XSS vulnerabilities if the content includes malicious HTML or scripts.
Consider implementing one of these solutions:
- Use a secure markdown renderer like `markdown-it` with appropriate security options:

```diff
- <Markdown :source="detail.text" />
+ <Markdown :source="detail.text" :options="{ html: false, xhtmlOut: true }" />
```

- Sanitize the markdown content before rendering:

```diff
+import DOMPurify from 'dompurify';
+
+const sanitizeMarkdown = (content: string) => DOMPurify.sanitize(content);
+
- <Markdown :source="detail.text" />
+ <Markdown :source="sanitizeMarkdown(detail.text)" />
```
Line range hint 31-39: Add null check for copyable text. The current implementation might throw errors if `detail.text` is null or undefined before calling toString(). Apply this fix for more robust error handling:

```diff
 <CopyText
-  v-if="detail.text.toString()"
+  v-if="detail.text != null && detail.text.toString()"
   :text="detail.text.toString()"
   :icon-size="16"
   class="btn btn-circle btn-ghost btn-xs"
 />
```

frontend/pages/home/index.vue (1)
Add sanitization for user-generated content in item display
The codebase shows potential XSS vulnerabilities in item display:
- The backend API (`v1_ctrl_items.go`) returns raw item data without sanitization
- While Vue provides basic HTML escaping, fields containing user-generated content need additional protection
- Only the Markdown component uses DOMPurify for sanitization
Key areas requiring sanitization:
- `ItemCard` component when displaying item fields
- `ItemViewTable` component for tabular item data
- Any custom fields or descriptions that may contain HTML-like content
Recommendation: Implement DOMPurify (already available in the project) for all user-generated content fields before display, similar to how it's used in the Markdown component.
🔗 Analysis chain
Line range hint 38-46: Consider implementing data sanitization for item display. While Vue's template syntax provides basic XSS protection, it's recommended to sanitize the item data before display, especially if it contains user-generated content.
Let's check if there's any input sanitization in the API or stores:
Let me search for any data processing or sanitization logic in the codebase. This will help us verify if there's any protection against malicious content in place.
I need to check the actual implementation of ItemCard component and the API handlers to understand how the item data is processed before display. This will help determine if there's any built-in sanitization or if we need to recommend additional security measures.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for sanitization or escaping logic in related files rg -g '!*.test.*' -g '!*.spec.*' -A 5 'sanitize|escape' stores/ rg -g '!*.test.*' -g '!*.spec.*' -A 5 'sanitize|escape' api/Length of output: 271
Script:
#!/bin/bash # Let's first locate the store and API files fd -t f "store|api" --full-path # Then check for any data transformation or processing before display rg -g '!*.test.*' -g '!*.spec.*' "itemTable" -A 5 rg -g '!*.test.*' -g '!*.spec.*' "ItemCard" -A 5 # Look for any Vue composables or utilities that might handle sanitization fd -t f "sanitize|escape|xss|clean|purify" # Check item-related component implementations ast-grep --pattern 'const ItemCard = defineComponent({ $$$ })' ast-grep --pattern 'const ItemViewTable = defineComponent({ $$$ })'Length of output: 6637
Script:
#!/bin/bash # Let's check the ItemCard component implementation fd -t f "ItemCard.vue" --exec cat {} # Check the item data processing in the API handlers cat backend/app/api/handlers/v1/v1_ctrl_items.go # Look for any HTML escaping or content sanitization in the frontend utilities rg -g '!*.test.*' -g '!*.spec.*' "DOMPurify|escapeHtml|sanitizeHtml" frontend/Length of output: 11048
frontend/components/Label/CreateModal.vue (2)
Line range hint 65-86: Security: Add input sanitization before API submission. The form data is directly passed to the API without sanitization, which could lead to XSS attacks if the server-side validation fails. Consider implementing client-side sanitization as an additional security layer.
Add sanitization before API submission:
async function create(close = true) { if (loading.value) { toast.error("Already creating a label"); return; } loading.value = true; if (shift.value) { close = false; } + const sanitizedForm = { + name: DOMPurify.sanitize(form.name), + description: DOMPurify.sanitize(form.description), + color: form.color + }; - const { error, data } = await api.labels.create(form); + const { error, data } = await api.labels.create(sanitizedForm);
Line range hint 65-75: Add timeout handling for API calls. The current implementation might leave the form in a loading state indefinitely if the API call hangs.
Implement timeout handling:
async function create(close = true) { if (loading.value) { toast.error("Already creating a label"); return; } loading.value = true; + const timeoutPromise = new Promise((_, reject) => + setTimeout(() => reject(new Error('Request timeout')), 5000) + ); + + try { + const { error, data } = await Promise.race([ + api.labels.create(form), + timeoutPromise + ]); - const { error, data } = await api.labels.create(form);frontend/composables/use-formatters.ts (1)
Security and i18n improvements needed for date formatting
The existing validation in
validDate()
is robust, handling null checks, invalid strings, and NaN dates. However, the current implementation offmtDate
can be improved:
Security concerns:
- The current implementation doesn't use the existing
validDate
function's full validation capabilities- Direct string concatenation with user input (dates) should be avoided
Internationalization issues:
- Hardcoded English month names don't follow the existing i18n pattern
- The locale file already has date_time translations that should be utilized
export function fmtDate( value: string | Date, fmt: DateTimeFormat = "human" ): string { - const months = [ - "January", - "February", - // ... - ]; + const { t } = useI18n(); + const dt = typeof value === "string" ? new Date(value) : value; - if (!dt) { + if (!validDate(dt)) { return ""; } - if (!validDate(dt)) { - return ""; - } switch (fmt) { case "relative": return useLocaleTimeAgo(dt).value + useDateFormat(dt, " (YYYY-MM-DD)").value; case "long": return useDateFormat(dt, "YYYY-MM-DD (dddd)").value; case "short": return useDateFormat(dt, "YYYY-MM-DD").value; case "human": - return `${months[dt.getMonth()]} ${dt.getDate()}${ordinalIndicator(dt.getDate())}, ${dt.getFullYear()}`; + return useDateFormat(dt, "MMMM Do, YYYY").value; default: return "";🔗 Analysis chain
Line range hint 94-133: Enhance date validation and internationalization. A few suggestions for improvement:
- The months array should be internationalized using the i18n system for consistency.
- Consider adding additional date validation to prevent potential DoS via invalid date strings.
Here's a suggested improvement:
- const months = [ - "January", - "February", - // ... - ]; + const { t } = useI18n(); + const months = Array.from({ length: 12 }, (_, i) => + t(`components.global.date_time.months.${i}`) + ); const dt = typeof value === "string" ? new Date(value) : value; - if (!dt) { + if (!dt || dt.toString() === 'Invalid Date' || isNaN(dt.getTime())) { return ""; }
Let me analyze the validation implementation and i18n patterns before providing the final response.
I need to see the complete validation implementation and existing i18n setup to provide accurate recommendations about both security and internationalization aspects.
I need to see the complete validation function and the existing date_time translations to provide accurate recommendations about both security validation and the correct i18n structure to use.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Check for other date validation implementations in the codebase rg "validDate|validateDate" --type tsLength of output: 297
Script:
#!/bin/bash # Check the implementation of validDate function rg -A 5 "function validDate" --type ts # Check for i18n usage patterns in the codebase rg "useI18n.*date|months" --type ts # Look for existing i18n translations related to months fd ".*i18n.*" -t f -X rg -l "months\.|January|February"Length of output: 774
Script:
#!/bin/bash # Let's check the complete validDate implementation rg -A 10 "function validDate" --type ts # Check for existing i18n configuration files fd -t f "i18n" -X cat {} # Look for date-time related translations rg "date_time" --type json --type yamlLength of output: 2908
Script:
#!/bin/bash # Let's check the complete validDate implementation (need more lines) rg -A 15 "function validDate" --type ts # Check the content of the English locale file for date_time structure cat frontend/locales/en.jsonLength of output: 16124
frontend/pages/label/[id].vue (3)
Line range hint 22-33: Enhance security for label deletion. The delete operation should include CSRF protection and a more explicit confirmation message that includes the label name.
Consider updating the confirmation message and adding CSRF protection:
async function confirmDelete() { + const csrfToken = await api.getCSRFToken(); const { isCanceled } = await confirm.open( - "Are you sure you want to delete this label? This action cannot be undone." + `Are you sure you want to delete the label "${label.value?.name}"? This action cannot be undone and may affect ${items.value?.items.length || 0} items.` ); if (isCanceled) { return; } const { error } = await api.labels.delete(labelId.value, { + headers: { + 'X-CSRF-Token': csrfToken + } });
Line range hint 41-61: Strengthen form security measures. The update form needs additional security measures to prevent XSS and ensure data integrity.
Consider implementing these security enhancements:
async function update() { updating.value = true; + // Sanitize inputs + const sanitizedData = { + name: DOMPurify.sanitize(updateData.name.trim()), + description: DOMPurify.sanitize(updateData.description.trim()), + color: updateData.color.match(/^#[0-9A-F]{6}$/i) ? updateData.color : '' + }; + + // Validate after sanitization + if (!sanitizedData.name || sanitizedData.name.length > 255) { + toast.error("Invalid label name"); + updating.value = false; + return; + } + + const csrfToken = await api.getCSRFToken(); const { error, data } = await api.labels.update( labelId.value, - updateData + sanitizedData, + { + headers: { + 'X-CSRF-Token': csrfToken + } + } );
Line range hint 165-165: Secure Markdown rendering required. The direct rendering of markdown content could lead to XSS vulnerabilities if not properly sanitized.
Consider adding sanitization to the Markdown component:
- <Markdown v-if="label && label.description" class="text-base" :source="label.description"> </Markdown> + <Markdown + v-if="label && label.description" + class="text-base" + :source="label.description" + :sanitize="true" + :allowedTags="['p', 'br', 'strong', 'em', 'ul', 'ol', 'li']" + > + </Markdown>frontend/locales/en.json (1)
275-278: Consider improving menu organization. While the translations are correct, consider grouping create actions under a "Create New" submenu for better organization:
-"tools": "Tools", -"create_item": "Item / Asset", -"create_location": "Location", -"create_label": "Label" +"tools": "Tools", +"create_new": "Create New", +"create_new_item": "Item / Asset", +"create_new_location": "Location", +"create_new_label": "Label"frontend/pages/profile.vue (1)
Line range hint 146-159: Enhance security for invitation token handling. The invitation token is displayed in plain text and stored in a reactive reference. Consider:
- Implementing token expiration
- Adding rate limiting for token generation
- Using secure storage for tokens
- Implementing token revocation
Here's a suggested implementation:
-const token = ref(""); +const token = ref<{value: string; expiresAt: Date} | null>(null); async function generateToken() { const date = new Date(); + if (token.value && new Date() < token.value.expiresAt) { + notify.error($t('errors.token_generation_rate_limit')); + return; + } + const { response, data } = await api.group.createInvitation({ expiresAt: new Date(date.setDate(date.getDate() + 7)), uses: 1, }); if (response?.status === 201) { - token.value = data.token; + token.value = { + value: data.token, + expiresAt: new Date(date.setMinutes(date.getMinutes() + 5)) + }; + setTimeout(() => { + token.value = null; + }, 5 * 60 * 1000); } }frontend/pages/item/[id]/index.vue (1)
Line range hint 312-356: Enhance currency handling security. While the i18n changes look good, consider these security improvements for financial data:
- Use a decimal type library (like `decimal.js`) instead of String conversion
- Implement input validation for currency values
- Add server-side validation for price modifications
Example implementation:
-text: String(item.value?.purchasePrice) || "", +text: new Decimal(item.value?.purchasePrice || 0).toFixed(2), type: "currency", +validate: (value) => validateCurrency(value),frontend/pages/item/[id]/index/edit.vue (3)
Line range hint 441-476: Enhance file upload security measures. The current file upload implementation lacks crucial security controls:
- No file type validation
- No file size limits
- No malware scanning
Implement the following security measures:
async function uploadAttachment(files: File[] | null, type: AttachmentTypes | null) { if (!files || files.length === 0) { return; } + const allowedTypes = ['image/jpeg', 'image/png', 'application/pdf']; + const maxSize = 5 * 1024 * 1024; // 5MB + + if (!allowedTypes.includes(files[0].type)) { + toast.error("Invalid file type"); + return; + } + + if (files[0].size > maxSize) { + toast.error("File too large"); + return; + } const { data, error } = await api.items.attachments.add(itemId.value, files[0], files[0].name, type);
Line range hint 437-452: Improve keyboard shortcut handling. The keyboard shortcuts implementation could be improved for better user experience and browser compatibility.
Consider these improvements:
async function keyboardSave(e: KeyboardEvent) { + // Prevent browser's save dialog + if ((e.metaKey || e.ctrlKey) && e.key.toLowerCase() === 's') { e.preventDefault(); + e.stopPropagation(); await saveItem(); } - - // Ctrl + S - if (e.ctrlKey && e.key === "s") { - e.preventDefault(); - await saveItem(); - } }
Line range hint 779-781: Add missing translation for the "Sold Details" heading. The heading "Sold Details" is still hardcoded while other headings use i18n translations.
Apply this fix:
```diff
-<h3 class="text-lg font-medium leading-6">Sold Details</h3>
+<h3 class="text-lg font-medium leading-6">{{ $t("items.sold_details") }}</h3>
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (20)
- .github/FUNDING.yml (1 hunks)
- backend/app/api/main.go (6 hunks)
- backend/app/tools/migrations/main.go (2 hunks)
- backend/internal/data/migrations/migrations.go (2 hunks)
- frontend/components/Item/CreateModal.vue (2 hunks)
- frontend/components/Item/View/Table.vue (5 hunks)
- frontend/components/Label/CreateModal.vue (1 hunks)
- frontend/components/Location/CreateModal.vue (2 hunks)
- frontend/components/Location/Selector.vue (1 hunks)
- frontend/components/global/DetailsSection/DetailsSection.vue (1 hunks)
- frontend/composables/use-formatters.ts (3 hunks)
- frontend/locales/en.json (7 hunks)
- frontend/pages/home/index.vue (2 hunks)
- frontend/pages/home/statistics.ts (3 hunks)
- frontend/pages/item/[id]/index.vue (10 hunks)
- frontend/pages/item/[id]/index/edit.vue (18 hunks)
- frontend/pages/label/[id].vue (1 hunks)
- frontend/pages/locations.vue (1 hunks)
- frontend/pages/profile.vue (5 hunks)
- frontend/plugins/i18n.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- .github/FUNDING.yml
🚧 Files skipped from review as they are similar to previous changes (3)
- frontend/components/Item/CreateModal.vue
- frontend/components/Item/View/Table.vue
- frontend/components/Location/CreateModal.vue
🔇 Additional comments (27)
frontend/pages/home/statistics.ts (3)
Line range hint 1-8: LGTM: Clean import and type definitions. The type definitions are well-structured and the new import for `vue-i18n` is appropriate for the internationalization changes.
11-12: LGTM: Proper i18n initialization. The i18n hook is correctly initialized before use.
Line range hint 14-18: Verify error handling for stats API call. While the API call is wrapped in useAsyncData, there's no explicit error handling for failed requests.
```bash
#!/bin/bash
# Search for error handling patterns in API calls
rg -A 5 "useAsyncData|api\.stats"
```

Consider adding error handling:
```diff
 const { data: statistics } = useAsyncData(async () => {
-  const { data } = await api.stats.group();
+  try {
+    const { data } = await api.stats.group();
+    return data;
+  } catch (error) {
+    console.error('Failed to fetch statistics:', error);
+    return null;
+  }
-  return data;
 });
```

frontend/components/Location/Selector.vue (2)
2-2: LGTM: Internationalization enhancement. The change from static text to a translation key improves the component's localization support.
Line range hint 28-31: Add runtime type validation to prevent potential errors. The `cast` function performs type assertion without runtime validation. This could lead to runtime errors if the actual data structure doesn't match the expected `FlatTreeItem` type. Let's verify the `FlatTreeItem` type definition. Consider adding runtime type validation:
```ts
function cast(value: any): FlatTreeItem {
  if (!value?.name || !value?.treeString) {
    console.warn('Invalid FlatTreeItem structure:', value);
    return { name: '', treeString: '' } as FlatTreeItem;
  }
  return value as FlatTreeItem;
}
```

frontend/pages/locations.vue (1)
70-70: LGTM: Tooltip text properly internationalized. The change from static text to `$t('locations.collapse_tree')` follows i18n best practices.

frontend/plugins/i18n.ts (1)
Line range hint 35-42: Security: Validate locale file contents before loading. While using `import.meta.glob` for dynamic imports is convenient, it's important to ensure that the loaded JSON files don't contain malicious content that could lead to XSS attacks when rendered. Consider implementing content validation for locale files. Let's check the locale files for potential security issues:
✅ Verification successful
The previous search didn't find any files because we need to first locate where the locale files are stored. The new commands will help us:
- Find JSON files with "locale" in their path
- Examine the content of these files
- Look for JSON files that might contain translations even if not in a "locale" directory
This will give us a better understanding of the actual structure and content of the translation files.
I'm now searching for potential security issues in the locale files using more comprehensive patterns, including:
- Direct script tags and javascript: URLs
- HTML-encoded script tags
- URL-encoded script tags
- Common event handlers that could be used for XSS
- data: URLs that could be used to inject content
This will help us determine if there are any security concerns in the translation files.
No security vulnerabilities found in locale files, but additional safeguards recommended
After analyzing the locale files for potential XSS vectors including script tags, malicious URLs, event handlers, and encoded malicious content, no immediate security issues were found. However, as a best practice, I recommend:
- Implement runtime sanitization of locale strings before rendering
- Use Vue's built-in v-html directive with caution when displaying translated content
- Consider adding a Content Security Policy (CSP) header to prevent any potential script injection
The current implementation appears secure, but these additional layers of protection will help maintain security as new translations are added in the future.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for potentially dangerous HTML/JS content in locale files rg -g "locales/*.json" -i "<script|javascript:|data:text|onerror=|onclick="Length of output: 226
Script:
#!/bin/bash # First, let's find where the locale files are actually stored fd -e json -d 4 . | grep -i locale # Then, let's examine the content of any JSON files that might contain translations fd -e json -d 4 . | grep -i locale | xargs cat 2>/dev/null # As a backup, let's also check for any JSON files that might contain translations fd -e json -d 4 . | xargs grep -l -i "translation\|message\|locale" 2>/dev/nullLength of output: 193148
Script:
#!/bin/bash # Let's examine the content of locale files for any potential security issues # Looking for script tags, javascript: URLs, data: URLs, and event handlers rg -g "frontend/locales/*.json" -i "(<script|javascript:|data:|onerror=|onclick=|onload=|onmouseover=|alert\()" # Also check for HTML entities that might be used to encode malicious content rg -g "frontend/locales/*.json" -i "(<script|<script|<script|%3Cscript)" # Check for URL-encoded XSS attempts rg -g "frontend/locales/*.json" -i "%3c(script|img|iframe)|%253c(script|img|iframe)"Length of output: 287
frontend/components/global/DetailsSection/DetailsSection.vue (1)
6-6: LGTM: Proper implementation of internationalization. The change correctly implements localization using Vue's translation function.
frontend/pages/home/index.vue (3)
31-31: LGTM! Localization implemented correctly. The translation key follows the established naming convention and properly integrates with the existing stats display logic.
Line range hint 58-62: LGTM! Localization and component usage are correct. The translation implementation and component structure follow best practices.
50-55: Verify LocationCard component's security measures. Since storage locations might contain sensitive information (e.g., home addresses), ensure the LocationCard component implements proper data protection measures.
Let's examine the LocationCard implementation:
```bash
#!/bin/bash
# Search for LocationCard component definition
ast-grep --pattern 'const LocationCard = defineComponent({
  $$$
})'
```

frontend/components/Label/CreateModal.vue (1)
10-10: LGTM: Localization implementation looks good! The translation keys follow a consistent naming convention and properly implement i18n best practices.
Also applies to: 14-14
backend/app/tools/migrations/main.go (2)
6-20: LGTM: Import structure is clean and includes the necessary database drivers. The imports are well-organized and include all necessary packages for multi-DB support.
39-40: LGTM: Secure path construction implemented. The path construction is now secure using `filepath.Clean`.
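For illustration only, a tiny sketch of the kind of cleaned path construction being praised here; the directory names are assumptions, not the tool's actual values:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// filepath.Join normalizes separators, and filepath.Clean makes the
	// normalization explicit, so "." segments or doubled slashes cannot
	// leak into the final migration directory path.
	dir := filepath.Clean(filepath.Join(os.TempDir(), "migrations", "postgres"))
	fmt.Println(dir)
}
```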
frontend/composables/use-formatters.ts (1)
1-2: LGTM: Clean and focused imports. The imports are appropriate and well-scoped for the internationalization functionality being added.
frontend/pages/label/[id].vue (1)
98-98: LGTM: Localization changes properly implemented. The conversion from hardcoded strings to i18n translations is well-executed and follows Vue's internationalization best practices.
Also applies to: 103-103, 107-107, 109-109
backend/app/api/main.go (1)
97-101: LGTM: PostgreSQL SSL mode validation. The SSL mode validation is properly placed before the connection attempt.
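The PR summary mentions a new helper for validating PostgreSQL SSL modes, but its body isn't shown in this thread, so the following is only a plausible sketch under that assumption; the function name is hypothetical, while the accepted values are the sslmode settings libpq documents:

```go
package main

import "fmt"

// validPostgresSSLMode is a hypothetical stand-in for the validation helper
// described in the walkthrough. It accepts the sslmode values libpq
// understands and rejects anything else before a connection is attempted.
func validPostgresSSLMode(mode string) bool {
	switch mode {
	case "disable", "allow", "prefer", "require", "verify-ca", "verify-full":
		return true
	default:
		return false
	}
}

func main() {
	for _, m := range []string{"require", "bogus"} {
		fmt.Printf("sslmode %q valid: %v\n", m, validPostgresSSLMode(m))
	}
}
```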
frontend/locales/en.json (7)
18-40: LGTM: Comprehensive datetime localization added. The datetime translations provide complete coverage for relative time expressions and follow standard patterns.
104-113: LGTM: Consistent global action translations. The added global action translations maintain consistent casing and provide essential UI actions.
127-135: LGTM: Dashboard statistics translations complete. The translations provide comprehensive coverage for dashboard metrics and maintain consistency.
Line range hint 202-232: LGTM: Label and location translations consistent. The translations for labels and locations maintain consistency with the application's patterns.
223-226: LGTM: New language options added. The new language options follow ISO language codes and maintain consistency with existing entries.
76-77: ⚠️ Potential issue. Security: Potential XSS vulnerability in HTML string.
The location tree message contains raw HTML tags. While this might be intentional for styling, ensure your template engine or framework properly sanitizes this HTML to prevent XSS attacks.
```bash
#!/bin/bash
# Search for other instances of raw HTML in translation strings
rg -l "'<[^>]+>'" frontend/locales/
```

Consider using a safer alternative like:
-"no_locations": "No locations available. Add new locations through the\n `<`span class=\"link-primary\"`>`Create`<`/span`>` button on the navigation bar." +"no_locations": "No locations available. Add new locations through the Create button on the navigation bar."
161-199: ⚠️ Potential issue. Security: Consider data sensitivity in translations.
Several fields may contain sensitive information (serial numbers, purchase prices, etc.). Ensure that:
- These fields are properly access-controlled in the UI
- The data is appropriately masked in logs and error messages
- Consider implementing field-level encryption for sensitive data storage
✅ Verification successful
Let me search for more specific patterns to verify the security controls around these sensitive fields. The previous command had syntax issues, and I've expanded the search to include context lines and multiple security-related patterns.
Let me try without specifying file types since 'vue' type isn't recognized. Also adding file discovery to understand the correct extensions used in the project.
Security: Access controls and data protection measures are in place
Based on the codebase analysis:
Sensitive fields are properly handled:
- Database fields are nullable and indexed appropriately
- Sensitive URLs and passwords are masked in logs (`<sensitive>` in notifier.go and user.go)
- Price fields use proper float64 type with default values
- Serial numbers are optional and properly indexed
UI implementation shows controlled access:
- Fields are displayed in structured tables with proper formatting
- Currency values use dedicated currency component
- Data is properly structured in forms with validation
Backend implementation includes:
- Proper database schema with constraints
- Field-level validation through Ent framework
- Proper type handling for all sensitive fields
The translation file is part of a well-structured system with appropriate security controls.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for potential sensitive field references in the codebase rg -l "serial_number|purchase_price|sold_price" --type ts --type vueLength of output: 499
Script:
#!/bin/bash # Search for sensitive field references in TypeScript and Vue files rg "serial_number|purchase_price|sold_price" --type ts --type vue -A 2 -B 2 # Look for any encryption or masking utilities rg "encrypt|mask|sensitive" --type ts --type vue -A 2 -B 2 # Check for access control patterns around these fields ast-grep --pattern 'v-if="$_"'Length of output: 227
Script:
#!/bin/bash # Search for sensitive field references in all files rg "serial_number|purchase_price|sold_price" -A 2 -B 2 # Look for any encryption or masking utilities rg "encrypt|mask|sensitive" -A 2 -B 2 # Find all Vue files to verify file extension fd ".*\.vue$" # Find TypeScript files with these fields fd ".*\.ts$" --exec rg "serial_number|purchase_price|sold_price" {} -A 2 -B 2Length of output: 85245
frontend/pages/item/[id]/index.vue (3)
156-184: Consider masking sensitive information in the UI. While the i18n changes look good, the component displays sensitive information like serial numbers and model numbers in plain text. Consider implementing a "mask/unmask" toggle for these fields to prevent shoulder surfing.
[security]
Example implementation:

```diff
 {
   name: "items.serial_number",
-  text: item.value?.serialNumber,
+  text: showSensitive ? item.value?.serialNumber : item.value?.serialNumber.replace(/./g, '*'),
   copyable: true,
+  slot: "sensitive"
 }
```
397-407: LGTM! Consistent i18n implementation. The translation key updates for tabs and section titles are implemented consistently throughout the component.
Also applies to: 512-512, 521-521, 551-551, 562-562, 599-599, 604-604, 609-609
233-254: Verify file download security measures. The attachment handling looks good, but ensure that:
- File downloads are properly authenticated
- File types are validated before download
- Content-Disposition headers are set correctly to prevent XSS
✅ Verification successful
Let me search for more information about the authentication and download endpoints.
Let me check one more aspect about authentication middleware and attachment handlers.
Security measures for attachments are properly implemented
Based on the verification results:
- Authentication is enforced through middleware on all relevant pages handling attachments (`middleware: ["auth"]`)
- Content-Type and Content-Disposition headers are properly set for file downloads
- File type validation is implemented through the backend's type field in the attachments table
- Frontend properly categorizes attachments by type (photo, warranty, manual, receipt)
Additional security recommendations:
- Consider implementing file size limits if not already present
- Add virus scanning for uploaded files
- Implement rate limiting for attachment uploads
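As a rough illustration of the first recommendation above, here is a minimal Go sketch of enforcing an upload size limit in an HTTP handler; the 5 MB limit and route path are assumptions, not the project's actual attachment endpoint:

```go
package main

import "net/http"

const maxUploadSize = 5 << 20 // 5 MB, illustrative limit

func uploadHandler(w http.ResponseWriter, r *http.Request) {
	// Reject oversized bodies before reading the multipart form.
	r.Body = http.MaxBytesReader(w, r.Body, maxUploadSize)
	if err := r.ParseMultipartForm(maxUploadSize); err != nil {
		http.Error(w, "file too large", http.StatusRequestEntityTooLarge)
		return
	}
	w.WriteHeader(http.StatusCreated)
}

func main() {
	http.HandleFunc("/api/v1/items/attachments", uploadHandler) // hypothetical route
	_ = http.ListenAndServe(":8080", nil)
}
```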
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Check for file download authentication rg -A 5 "download.*attachment" # Check for file type validation rg -A 5 "attachment.*type"Length of output: 11227
Script:
#!/bin/bash # Check for download endpoints and authentication ast-grep --pattern 'func $_($$$ *http.$_) { $$$ download $$$ $$$ }' # Check for file download routes and middleware rg -A 5 "router.*download" # Check for content disposition headers rg -A 3 "Content-Disposition|Content-Type"Length of output: 2984
Script:
#!/bin/bash # Check for attachment handlers and their security measures ast-grep --pattern 'func $_($$$ *http.$_) { $$$ attachment $$$ $$$ }' # Check for authentication middleware rg -A 5 "middleware.*auth" # Check for file type validation in attachment handlers rg -A 10 "func.*attachment.*upload"Length of output: 4372
Actionable comments posted: 6
🧹 Outside diff range and nitpick comments (1)
backend/app/api/main.go (1)
Line range hint 132-156: ⚠️ Security: Potential race condition in migration handling. The current implementation writes migrations to a shared temporary directory without proper locking mechanisms. This could lead to race conditions if multiple instances of the application run simultaneously.
Consider these improvements:
- Use instance-specific temporary directories
- Implement proper locking mechanisms
- Ensure atomic operations for migration files
Here's a safer implementation:
```diff
-temp := filepath.Join(os.TempDir(), "homebox-migrations")
+// Create unique directory for this instance
+temp, err := os.MkdirTemp("", "homebox-migrations-*")
+if err != nil {
+	return fmt.Errorf("failed to create temp directory: %w", err)
+}
+
+// Create lock file
+lockFile := filepath.Join(temp, ".lock")
+if err := os.WriteFile(lockFile, []byte(fmt.Sprintf("pid:%d", os.Getpid())), 0600); err != nil {
+	return fmt.Errorf("failed to create lock file: %w", err)
+}

 err = migrations.Write(temp, cfg.Database.Driver)
 if err != nil {
+	os.RemoveAll(temp) // Clean up on error
 	return err
 }

 dir, err := atlas.NewLocalDir(temp)
 if err != nil {
+	os.RemoveAll(temp) // Clean up on error
 	return err
 }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (3)
- Taskfile.yml (3 hunks)
- backend/app/api/main.go (6 hunks)
- backend/app/tools/migrations/main.go (2 hunks)
🔇 Additional comments (3)
backend/app/tools/migrations/main.go (2)
82-95: LGTM: Version info implementation. The version info implementation and build string construction look good. The short commit hash truncation is handled correctly.
29-36: ⚠️ Potential issue. Add MySQL support as specified in PR objectives.
The switch statement is missing MySQL support despite it being mentioned in the PR objectives.
Apply this diff:
```diff
 switch strings.ToLower(cfg.Database.Driver) {
 case "sqlite3":
 	sqlDialect = dialect.SQLite
 case "postgres":
 	sqlDialect = dialect.Postgres
+case "mysql":
+	sqlDialect = dialect.MySQL
 default:
-	log.Fatalf("unsupported database driver: %s", cfg.Database.Driver)
+	log.Fatal("unsupported database driver configuration")
 }
```

Likely invalid or redundant comment.
Taskfile.yml (1)
5-6: LGTM! Well-configured SQLite settings. The SQLite configuration demonstrates good practices:
- WAL journaling mode for better concurrency
- Foreign key constraints enforcement
- Busy timeout handling for lock conflicts
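As a point of reference, a minimal Go sketch of a SQLite DSN carrying these pragmas; the exact parameter spelling depends on which SQLite driver the project uses, so treat the string below as an assumption rather than the Taskfile's actual value:

```go
package main

import "fmt"

func main() {
	// WAL journaling, enforced foreign keys, and a busy timeout expressed as
	// DSN query parameters (spelling shown is the mattn/go-sqlite3 style).
	dsn := "file:.data/homebox.db?_journal=WAL&_fk=true&_busy_timeout=999999"
	fmt.Println(dsn)
}
```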
Actionable comments posted: 14
🧹 Outside diff range and nitpick comments (6)
.github/workflows/docker-publish-rootless-arm.yaml (1)
Line range hint 33-94: Security recommendations for the Docker build process. While the current setup is secure, here are some additional security enhancements to consider:
- Enable Docker content trust for image signing
- Add explicit digest pinning for base images
- Implement Cosign for container signing
Consider adding these security enhancements:
jobs: build-rootless: runs-on: ubuntu-latest permissions: contents: read packages: write attestations: write id-token: write + env: + DOCKER_CONTENT_TRUST: 1 # Enable Docker Content Trust steps: # ... existing steps ... - name: Build and push Docker image id: build-and-push uses: docker/[email protected] with: context: . push: ${{ github.event_name != 'pull_request' }} tags: ${{ steps.metadata.outputs.tags }} labels: ${{ steps.metadata.outputs.labels }} platforms: linux/arm64,linux/arm/v7 cache-from: type=gha cache-to: type=gha,mode=max + provenance: true + sbom: true build-args: | VERSION=${{ github.ref_name }} COMMIT=${{ github.sha }} + + - name: Sign the images with Cosign + if: ${{ github.event_name != 'pull_request' }} + uses: sigstore/[email protected] + with: + cosign-release: 'v2.1.1'.github/workflows/docker-publish-arm.yaml (1)
Line range hint 33-45: Consider enhancing workflow security measures. While the permissions are properly scoped, there are several security improvements to consider:
- Add
permissions: read-all
as a default and only elevate specific job permissions- Consider adding
GITHUB_TOKEN
permissions timeout- Add security scanning steps for the Docker image
Here's a recommended permissions structure to add at the workflow level:
permissions: read-all # Restrict by default jobs: build: permissions: contents: read packages: write attestations: write id-token: write security-events: write # For security scanning resultsAlso consider adding a security scanning step:
- name: Run Trivy vulnerability scanner uses: aquasecurity/trivy-action@master with: image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ steps.meta.outputs.version }} format: 'sarif' output: 'trivy-results.sarif'.github/workflows/docker-publish.yaml (1)
Line range hint 1-114: Consider enhancing Docker image security measures. While the workflow configuration is functional, it could benefit from additional security measures:
- Add vulnerability scanning for Docker images
- Implement explicit platform verification
- Add SBOM generation
Consider adding these security-focused steps before the build:
- name: Run Trivy vulnerability scanner uses: aquasecurity/trivy-action@master with: image: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }} format: 'table' exit-code: '1' ignore-unfixed: true vuln-type: 'os,library' severity: 'CRITICAL,HIGH' - name: Generate SBOM uses: anchore/sbom-action@v0 with: format: spdx-json artifact-name: sbom.spdx.json.github/workflows/docker-publish-rootless.yaml (1)
Line range hint 1-108: Enhance container security with additional measures. While the workflow implements good security practices like rootless containers and build attestation, consider these additional security enhancements:
- Add Cosign signing for container images:
- uses: sigstore/[email protected] - name: Sign the images if: ${{ github.event_name != 'pull_request' }} env: COSIGN_EXPERIMENTAL: 1 run: | cosign sign --yes ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}@${{ steps.build-and-push.outputs.digest }}
- Pin action versions more specifically (e.g.,
@v3.0.0
to@v3.0.0
):- uses: docker/[email protected] # v3.0.0 + uses: docker/[email protected]
- Add SLSA provenance generation for enhanced supply chain security.
backend/app/tools/migrations/main.go (1)
Line range hint
44-52
: Add safeguards for destructive schema operationsThe migration options enable dangerous operations (
WithDropIndex
andWithDropColumn
) without any validation or confirmation mechanism. This could lead to accidental data loss.Consider adding:
- A confirmation prompt for destructive operations
- A backup mechanism before executing migrations
opts := []schema.MigrateOption{ schema.WithDir(dir), schema.WithMigrationMode(schema.ModeReplay), schema.WithDialect(sqlDialect), schema.WithFormatter(atlas.DefaultFormatter), + // Only enable destructive operations if explicitly confirmed + schema.WithDropIndex(shouldAllowDestructive()), + schema.WithDropColumn(shouldAllowDestructive()), - schema.WithDropIndex(true), - schema.WithDropColumn(true), }Add this helper:
func shouldAllowDestructive() bool { fmt.Print("Warning: This migration includes destructive operations. Continue? [y/N]: ") var response string fmt.Scanln(&response) return strings.ToLower(response) == "y" }backend/internal/data/repo/repo_locations.go (1)
Mixed ORM Usage Detected - Security and Maintainability Concerns
The codebase shows a hybrid approach:
- CRUD operations (Create, Update, Delete) properly use Ent ORM with type-safe queries
- Complex read operations use raw SQL queries (GetAll, PathForLoc, Tree methods)
While the review's suggestion about query builders is valid, the current implementation raises some security considerations:
- Raw SQL queries with string replacements (
strings.Replace
) are vulnerable to SQL injection if not properly sanitized- The Ent ORM is already being used effectively for simpler operations, showing the infrastructure is in place
Recommendations:
- Use Ent's type-safe query builder for complex queries to prevent SQL injection risks
- If raw SQL is necessary, use parameterized queries instead of string replacements
- Consider implementing complex queries using Ent's edge traversal features for tree-like structures
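To make the parameterized-query recommendation concrete, here is a minimal Go sketch contrasting string replacement with a bound parameter via database/sql; the table and column names are illustrative assumptions, not the repository's actual location queries:

```go
package main

import (
	"database/sql"
	"fmt"
	"strings"
)

// unsafeQuery shows the pattern to avoid: splicing input into SQL text.
func unsafeQuery(name string) string {
	return strings.Replace("SELECT id FROM locations WHERE name = '{{ name }}'", "{{ name }}", name, 1)
}

// safeQuery binds the value as a parameter so the driver escapes it.
// Note: the placeholder is "?" for SQLite/MySQL and "$1" for Postgres.
func safeQuery(db *sql.DB, name string) (*sql.Rows, error) {
	return db.Query("SELECT id FROM locations WHERE name = ?", name)
}

func main() {
	fmt.Println(unsafeQuery("x' OR '1'='1")) // demonstrates how injection slips in
	_ = safeQuery                            // would be called with a real *sql.DB
}
```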
🔗 Analysis chain
Line range hint
1-466
: Architecture Recommendation: Consider Using Query Builder or ORMGiven that this PR aims to support multiple database types, the current approach of maintaining raw SQL queries might become difficult to maintain across different SQL dialects. Consider using a query builder or leveraging the existing Ent ORM more extensively.
Let's verify the current usage of Ent ORM in the codebase:
Based on the search results, I can see extensive usage of the Ent ORM throughout the codebase. Let me verify the specific usage in the `LocationRepository` to provide a more targeted response.
Let me try a different approach to analyze the LocationRepository methods:
Let me check one more thing to see the complete picture of ORM vs raw SQL usage in this repository:
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Search for Ent query builder usage patterns
rg "ent\.Client|\.Query\(\)|\.Where\(" --type go
```

Length of output: 36067
Script:
```bash
#!/bin/bash
# Search for Ent query builder usage in LocationRepository
ast-grep --pattern 'func (r *LocationRepository) $method($_) $_ { $$$ }'
```

Length of output: 76
Script:
```bash
#!/bin/bash
# Search for raw SQL queries in LocationRepository
rg "r\.db\.Sql\(\)\.Query" backend/internal/data/repo/repo_locations.go -A 2 -B 2
```

Length of output: 609
Script:
```bash
#!/bin/bash
# Look for Ent ORM query methods in LocationRepository
rg "r\.db\.Location\." backend/internal/data/repo/repo_locations.go -A 2 -B 2
```

Length of output: 1083
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (1)
`backend/go.sum` is excluded by `!**/*.sum`
📒 Files selected for processing (10)
- .github/workflows/docker-publish-arm.yaml (2 hunks)
- .github/workflows/docker-publish-rootless-arm.yaml (2 hunks)
- .github/workflows/docker-publish-rootless.yaml (2 hunks)
- .github/workflows/docker-publish.yaml (2 hunks)
- .github/workflows/pull-requests.yaml (1 hunks)
- Taskfile.yml (4 hunks)
- backend/app/api/main.go (6 hunks)
- backend/app/tools/migrations/main.go (2 hunks)
- backend/internal/data/repo/repo_group.go (1 hunks)
- backend/internal/data/repo/repo_locations.go (6 hunks)
🔇 Additional comments (8)
.github/workflows/pull-requests.yaml (2)
`8-11`: Verify consistency of path patterns across workflows

The addition of `.github/workflows/**` aligns with CI best practices. Let's verify this pattern is consistent across all workflow files.
```bash
#!/bin/bash
# Check path patterns in all workflow files
echo "Checking path patterns in workflow files..."
rg -A 5 'paths:' .github/workflows/
```
`11-11`: ⚠️ Ensure secure workflow file handling

While adding workflow files to the trigger paths is valid, it's important to verify that:
- Workflow permissions are properly restricted
- No potential for recursive workflow triggers exists
✅ Verification successful
Let me check the workflow configuration more thoroughly to verify the triggers and permissions settings.
✅ Workflow trigger configuration is safe
The workflow configuration is secure because:
- It only triggers on pull requests to the main branch
- No recursive workflow triggers are present (`workflow_run` or `workflow_call` events are not used)
- The path filter for `.github/workflows/**` is appropriate as it ensures workflow changes are validated

However, as a security best practice, consider:
- Adding an explicit `permissions` block to restrict the workflow's GitHub token permissions
- Setting `permissions: read-all` as a baseline, then explicitly granting write permissions only where needed

🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check for potential recursive triggers in workflow files
echo "Checking for workflow files that might trigger each other..."
rg -l 'on:\s*\n\s*workflow_run:|on:\s*\n\s*workflow_call:' .github/workflows/
```

Length of output: 3783
Script:
```bash
#!/bin/bash
# Check for workflow triggers and permissions in all workflow files
echo "Checking workflow triggers and permissions..."
rg -A 10 'on:' .github/workflows/pull-requests.yaml

echo -e "\nChecking for workflow permissions..."
rg -A 5 'permissions:' .github/workflows/pull-requests.yaml
```

Length of output: 441
.github/workflows/docker-publish-arm.yaml (1)
`14-14`: Path filter change looks good but consider impact on workflow triggers.

The change from `.github/workflows` to `.github/workflows/**` provides better coverage for workflow file changes. This ensures the workflow runs when any workflow file is modified, not just the directory itself.

Let's verify other workflow files for consistency:
Also applies to: 25-25
✅ Verification successful
Path filter pattern is consistently used across all workflow files
The verification confirms that all workflow files in the repository use the same `.github/workflows/**` path pattern for triggering on workflow file changes:
.github/workflows/docker-publish.yaml
.github/workflows/docker-publish-rootless.yaml
.github/workflows/docker-publish-rootless-arm.yaml
.github/workflows/docker-publish-arm.yaml
.github/workflows/pull-requests.yaml
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check if other workflow files use the same path pattern
rg -l "\.github/workflows/\*\*" .github/workflows/
```

Length of output: 267
.github/workflows/docker-publish.yaml (1)
`14-14`: LGTM! Path filter changes look good.

The change from `.github/workflows` to `.github/workflows/**` ensures that changes to any workflow file will trigger this workflow appropriately.

Let's verify all workflow files are consistently using this pattern:
Also applies to: 25-25
✅ Verification successful
This will help us see all occurrences of workflow path patterns and their context to verify consistency.
Path filter pattern is consistently implemented across workflows ✅
The verification shows that all workflow files are using the correct `.github/workflows/**` pattern in their trigger paths. This ensures consistent behavior across the CI/CD pipeline.

Security recommendations:
- Consider adding container scanning steps (e.g., Trivy, Snyk) before publishing Docker images
- Implement least-privilege principle by scoping workflow permissions to specific jobs
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Description: Check for consistent path patterns across workflow files
# Expect: All workflow files should use the '**' pattern for workflow paths
echo "Checking workflow path patterns in all workflow files..."
rg -l "\.github/workflows(?!/\*\*)" .github/workflows/
```

Length of output: 428
Script:
```bash
#!/bin/bash
# Let's try a simpler pattern search approach
echo "Checking workflow path patterns in all workflow files..."
rg -p "\.github/workflows" .github/workflows/
```

Length of output: 1008
.github/workflows/docker-publish-rootless.yaml (1)
`14-14`: Consider the impact of wildcard path triggers.

The change from `.github/workflows` to `.github/workflows/**` makes the trigger more comprehensive but could lead to unnecessary builds. Consider if you need builds for changes to all workflow files, or if you can be more specific about which workflow changes should trigger this workflow.

Also applies to: 25-25
Taskfile.yml (3)
`5-6`: LGTM! Good SQLite configuration practices.

The SQLite configuration includes important pragmas:
- `busy_timeout` for handling concurrent access
- `journal_mode=WAL` for better write performance
- `_fk=1` for foreign key constraint enforcement
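As an aside on how those defaults could be kept consistent in code, a small helper might append any missing connection parameters to a user-supplied SQLite path. This is only a sketch: the parameter names mirror the DSN used in the Taskfile, and the function name is invented for illustration.

```go
package config

import "strings"

// withSQLiteDefaults appends default connection parameters to a user-supplied
// SQLite path when they are not already present. The parameter names mirror
// the DSN above; the function name is illustrative, not existing Homebox API.
func withSQLiteDefaults(path string) string {
	defaults := []string{
		"_pragma=busy_timeout=1000",
		"_pragma=journal_mode=WAL",
		"_fk=1",
	}
	for _, d := range defaults {
		if strings.Contains(path, d) {
			continue
		}
		if strings.Contains(path, "?") {
			path += "&" + d
		} else {
			path += "?" + d
		}
	}
	return path
}
```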
`64-79`: ⚠️ Potential issue

⚠️ Security: SSL is disabled for PostgreSQL connection

While using development credentials in tasks is accepted, running PostgreSQL without SSL (`HBOX_DATABASE_SSLMODE: disable`) is a security risk as it exposes database traffic to potential eavesdropping.

Consider this safer configuration:
```diff
  env:
    HBOX_DATABASE_DRIVER: postgres
    HBOX_DATABASE_USERNAME: homebox
    HBOX_DATABASE_PASSWORD: homebox
    HBOX_DATABASE_DATABASE: homebox
    HBOX_DATABASE_HOST: localhost
    HBOX_DATABASE_PORT: 5432
-   HBOX_DATABASE_SSLMODE: disable
+   # Development environments should mirror production security
+   HBOX_DATABASE_SSLMODE: require
```

Likely invalid or redundant comment.
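Related to enforcing an SSL mode, a small guard over the configured value can catch typos early. The sketch below is illustrative only: the function name is made up, and the accepted values are libpq's standard `sslmode` options.

```go
package config

import "fmt"

// validatePostgresSSLMode rejects unknown sslmode values early. The accepted
// values are libpq's standard options; the function name and error text are
// illustrative, not existing Homebox API.
func validatePostgresSSLMode(mode string) error {
	switch mode {
	case "disable", "allow", "prefer", "require", "verify-ca", "verify-full":
		return nil
	default:
		return fmt.Errorf("invalid HBOX_DATABASE_SSLMODE %q: expected one of disable, allow, prefer, require, verify-ca, verify-full", mode)
	}
}
```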
`135-149`: 🛠️ Refactor suggestion

Reduce configuration duplication

The PostgreSQL configuration is duplicated across tasks. Consider moving shared configuration to the top-level `env` section.

```diff
  env:
    HBOX_LOG_LEVEL: debug
    HBOX_DATABASE_DRIVER: sqlite3
    HBOX_DATABASE_SQLITE_PATH: .data/homebox.db?_pragma=busy_timeout=1000&_pragma=journal_mode=WAL&_fk=1
+   # PostgreSQL shared configuration
+   HBOX_DATABASE_USERNAME: homebox
+   HBOX_DATABASE_PASSWORD: homebox
+   HBOX_DATABASE_DATABASE: homebox
+   HBOX_DATABASE_HOST: localhost
+   HBOX_DATABASE_PORT: 5432
+   HBOX_DATABASE_SSLMODE: require

  tasks:
    go:run:postgresql:
      env:
-       HBOX_DATABASE_USERNAME: homebox
-       HBOX_DATABASE_PASSWORD: homebox
-       HBOX_DATABASE_DATABASE: homebox
-       HBOX_DATABASE_HOST: localhost
-       HBOX_DATABASE_PORT: 5432
-       HBOX_DATABASE_SSLMODE: disable
+       HBOX_DATABASE_DRIVER: postgres

    db:migration:postgresql:
      env:
-       HBOX_DATABASE_USERNAME: homebox
-       HBOX_DATABASE_PASSWORD: homebox
-       HBOX_DATABASE_DATABASE: homebox
-       HBOX_DATABASE_HOST: localhost
-       HBOX_DATABASE_PORT: 5432
-       HBOX_DATABASE_SSLMODE: disable
+       HBOX_DATABASE_DRIVER: postgres
```

Likely invalid or redundant comment.
Actionable comments posted: 4
🧹 Outside diff range and nitpick comments (1)
backend/internal/data/repo/repo_group.go (1)
`232-247`: Query structure improvements needed for multi-DB compatibility

While the transition to a single query with LEFT JOINs is a performance improvement, the current implementation has compatibility issues:
Date comparison syntax varies across databases:
- PostgreSQL: `warranty_expires > CURRENT_DATE`
- MySQL: `warranty_expires > CURDATE()`
- SQLite: `warranty_expires > date('now')`
String concatenation for quantity calculations may behave differently:
- PostgreSQL: `purchase_price * quantity`
- MySQL: `purchase_price * quantity`
- SQLite: `CAST(purchase_price AS DECIMAL) * quantity`
Consider implementing a database abstraction layer that handles these differences:
```go
type DBDialect interface {
    CurrentDate() string
    MultiplyDecimal(price, quantity string) string
}

// Example implementation
func buildStatsQuery(dialect DBDialect) string {
    return fmt.Sprintf(`
        SELECT
            COUNT(DISTINCT u.id) as total_users,
            ...
            AND (i.lifetime_warranty = true OR i.warranty_expires > %s)
    `, dialect.CurrentDate())
}
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
⛔ Files ignored due to path filters (1)
`backend/go.sum` is excluded by `!**/*.sum`
📒 Files selected for processing (4)
- Taskfile.yml (4 hunks)
- backend/app/api/main.go (6 hunks)
- backend/go.mod (1 hunks)
- backend/internal/data/repo/repo_group.go (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- backend/go.mod
🔇 Additional comments (4)
Taskfile.yml (3)
`5-6`: LGTM! Good SQLite configuration practices.

The SQLite configuration includes important pragmas:
- WAL mode for better concurrency
- Foreign key constraints enabled
- Busy timeout set to prevent "database is locked" errors
`135-149`: 🛠️ Refactor suggestion

Reduce configuration duplication

The PostgreSQL configuration is duplicated between tasks. Consider moving common configuration to the top-level `env` section to improve maintainability.

Apply this diff:

```diff
  env:
    HBOX_LOG_LEVEL: debug
    HBOX_DATABASE_DRIVER: sqlite3
    HBOX_DATABASE_SQLITE_PATH: .data/homebox.db?_pragma=busy_timeout=1000&_pragma=journal_mode=WAL&_fk=1
+   # PostgreSQL default configuration
+   HBOX_DATABASE_USERNAME: homebox
+   HBOX_DATABASE_PASSWORD: homebox
+   HBOX_DATABASE_DATABASE: homebox
+   HBOX_DATABASE_HOST: localhost
+   HBOX_DATABASE_PORT: 5432
+   HBOX_DATABASE_SSLMODE: require

  tasks:
    go:run:postgresql:
      env:
-       HBOX_DATABASE_USERNAME: homebox
-       HBOX_DATABASE_PASSWORD: homebox
-       HBOX_DATABASE_DATABASE: homebox
-       HBOX_DATABASE_HOST: localhost
-       HBOX_DATABASE_PORT: 5432
-       HBOX_DATABASE_SSLMODE: disable
+       HBOX_DATABASE_DRIVER: postgres

    db:migration:postgresql:
      env:
-       HBOX_DATABASE_USERNAME: homebox
-       HBOX_DATABASE_PASSWORD: homebox
-       HBOX_DATABASE_DATABASE: homebox
-       HBOX_DATABASE_HOST: localhost
-       HBOX_DATABASE_PORT: 5432
-       HBOX_DATABASE_SSLMODE: disable
+       HBOX_DATABASE_DRIVER: postgres
```

Likely invalid or redundant comment.
`64-79`: ⚠️ Potential issue

⚠️ Security: PostgreSQL connection is not encrypted

While hardcoded credentials are accepted for development tasks, running PostgreSQL with `HBOX_DATABASE_SSLMODE: disable` poses a security risk as it transmits data in plaintext.

Consider using encrypted connections even in development to:
- Match production behavior
- Prevent potential man-in-the-middle attacks
- Protect sensitive data in transit
Apply this diff to enable SSL:
```diff
  HBOX_DATABASE_DRIVER: postgres
  HBOX_DATABASE_USERNAME: homebox
  HBOX_DATABASE_PASSWORD: homebox
  HBOX_DATABASE_DATABASE: homebox
  HBOX_DATABASE_HOST: localhost
  HBOX_DATABASE_PORT: 5432
- HBOX_DATABASE_SSLMODE: disable
+ # Enable SSL for secure database connections
+ HBOX_DATABASE_SSLMODE: require
```

Likely invalid or redundant comment.
backend/internal/data/repo/repo_group.go (1)
`250-252`: ⚠️ Potential issue

SECURITY RISK: SQL Injection vulnerability with string templates

The current implementation using `strings.ReplaceAll` for query parameters is unsafe and incompatible with the multi-DB support feature:
- SQL injection risks through unescaped parameters
- Different databases may require different parameter placeholders:
  - PostgreSQL: `$1, $2, ...`
  - MySQL/SQLite: `?, ?, ...`
Replace string templates with database-specific parameter binding:
```diff
-q = strings.ReplaceAll(q, "{{ GROUP_ID }}", gid.String())
-q = strings.Replace(q, "{{ CURRENT_DATE }}", time.Now().Format("2006-01-02"), 1)
-row := r.db.Sql().QueryRowContext(ctx, q)
+currentDate := time.Now().Format("2006-01-02")
+row := r.db.Sql().QueryRowContext(ctx, q, gid, currentDate)
```

Let's verify if this pattern exists elsewhere in the codebase:
```bash
#!/bin/bash
# Search for similar string template usage in SQL queries
rg -l "strings\.(Replace|ReplaceAll).*SELECT|INSERT|UPDATE|DELETE" --type go

# Search for SQL template patterns
rg "{{.*}}" --type go -C 3 | grep -E "SELECT|INSERT|UPDATE|DELETE"
```
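Since PostgreSQL and SQLite drivers disagree on placeholder syntax, shared raw queries also need their placeholders adapted per dialect. The helper below is a deliberately naive sketch (the name is invented, and it does not skip `?` characters inside string literals), kept as a pure function so a single `?`-style query can be rebound only when the configured driver is postgres.

```go
package repo

import (
	"fmt"
	"strings"
)

// rebindForPostgres rewrites "?" placeholders to "$1", "$2", ... so a single
// "?"-style query can be reused against PostgreSQL. It is intentionally naive:
// it does not skip "?" characters inside string literals or comments, so it
// only suits trusted, hand-written queries.
func rebindForPostgres(query string) string {
	var b strings.Builder
	n := 0
	for _, r := range query {
		if r == '?' {
			n++
			fmt.Fprintf(&b, "$%d", n)
			continue
		}
		b.WriteRune(r)
	}
	return b.String()
}
```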
Actionable comments posted: 3
🧹 Outside diff range and nitpick comments (2)
backend/app/api/main.go (2)
`105-114`: Consider implementing a database URL builder interface

The comment on line 105 indicates a need for standardized URL handling. Instead of handling different URL formats in the switch statement, consider implementing a more robust solution.
Consider implementing a URL builder interface:
```go
type DBURLBuilder interface {
    BuildURL() string
}

type PostgresURLBuilder struct {
    config *config.Config
}

type SQLiteURLBuilder struct {
    config *config.Config
}

func NewDBURLBuilder(cfg *config.Config) DBURLBuilder {
    switch strings.ToLower(cfg.Database.Driver) {
    case "postgres":
        return &PostgresURLBuilder{config: cfg}
    case "sqlite3":
        return &SQLiteURLBuilder{config: cfg}
    default:
        return nil
    }
}
```

This approach would:
- Encapsulate URL building logic for each database type
- Make it easier to add support for new database types
- Allow for better testing of URL construction
- Provide a cleaner way to handle driver-specific options
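To make the benefit concrete, the two builders might produce connection strings along these lines. The config struct and field names here are assumptions based on the environment variables above, not the actual Homebox config type; the PostgreSQL form follows the standard libpq keyword/value format.

```go
package main

import "fmt"

// dbConfig stands in for the real config struct; the field names are assumptions.
type dbConfig struct {
	Host, Port, Username, Password, Database, SSLMode, SqlitePath string
}

// PostgresURLBuilder builds a libpq-style keyword/value connection string.
type PostgresURLBuilder struct{ cfg dbConfig }

func (b PostgresURLBuilder) BuildURL() string {
	return fmt.Sprintf("host=%s port=%s user=%s password=%s dbname=%s sslmode=%s",
		b.cfg.Host, b.cfg.Port, b.cfg.Username, b.cfg.Password, b.cfg.Database, b.cfg.SSLMode)
}

// SQLiteURLBuilder only needs the file path (plus any pragma query parameters).
type SQLiteURLBuilder struct{ cfg dbConfig }

func (b SQLiteURLBuilder) BuildURL() string {
	return b.cfg.SqlitePath
}

func main() {
	pg := PostgresURLBuilder{cfg: dbConfig{
		Host: "localhost", Port: "5432", Username: "homebox",
		Password: "homebox", Database: "homebox", SSLMode: "require",
	}}
	fmt.Println(pg.BuildURL())
}
```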
Line range hint `149-155`: ⚠️ Security: Review schema migration options

The current migration options allow dropping columns and indexes without any safeguards. This could lead to accidental data loss or service disruption.
Consider implementing safeguards:
```diff
+ // Define allowed operations based on environment
+ allowDestructiveOps := os.Getenv("ENV") != "production"
+
  options := []schema.MigrateOption{
      schema.WithDir(dir),
-     schema.WithDropColumn(true),
-     schema.WithDropIndex(true),
+     schema.WithDropColumn(allowDestructiveOps),
+     schema.WithDropIndex(allowDestructiveOps),
  }
+
+ if !allowDestructiveOps {
+     log.Info().Msg("destructive schema operations disabled in production")
+ }
```

Also consider:
- Adding a backup mechanism before migrations (see the sketch after this list)
- Implementing a rollback strategy
- Adding validation for critical columns/indexes that shouldn't be dropped
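For the SQLite driver, a pre-migration backup can be as simple as copying the database file aside; a PostgreSQL backup would instead shell out to something like `pg_dump`. The sketch below is illustrative only, with invented names, and assumes the application is stopped (a live WAL database should be backed up via the SQLite backup API or `VACUUM INTO`).

```go
package migrate

import (
	"fmt"
	"io"
	"os"
	"time"
)

// backupSQLite copies the database file aside before migrations run. It
// assumes the application is stopped; a live WAL database should be backed
// up with the SQLite backup API or VACUUM INTO rather than a plain file copy.
func backupSQLite(dbPath string) (string, error) {
	src, err := os.Open(dbPath)
	if err != nil {
		return "", err
	}
	defer src.Close()

	backupPath := fmt.Sprintf("%s.%s.bak", dbPath, time.Now().Format("20060102-150405"))
	dst, err := os.Create(backupPath)
	if err != nil {
		return "", err
	}
	defer dst.Close()

	if _, err := io.Copy(dst, src); err != nil {
		return "", err
	}
	return backupPath, dst.Sync()
}
```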
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (4)
- backend/app/api/main.go (6 hunks)
- backend/internal/data/repo/repo_group.go (3 hunks)
- frontend/layouts/default.vue (4 hunks)
- frontend/pages/location/[id].vue (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- frontend/layouts/default.vue
- frontend/pages/location/[id].vue
Actionable comments posted: 3
🧹 Outside diff range and nitpick comments (1)
frontend/components/Form/TextArea.vue (1)
Line range hint `6-13`: Improve character count UX

The character count is only shown when validation bounds exist. Consider:
- Always showing character count for better UX
- Adding aria-labels for accessibility
- Using a progress indicator for visual feedback
```diff
  <span
+   aria-label="Character count"
    :class="{
      'text-red-600':
        typeof value === 'string' &&
        ((maxLength !== -1 && value.length > maxLength) || (minLength !== -1 && value.length < minLength)),
    }"
  >
-   {{ typeof value === "string" && (maxLength !== -1 || minLength !== -1) ? `${value.length}/${maxLength}` : "" }}
+   {{ typeof value === "string" ? `${value.length}${maxLength !== -1 ? '/' + maxLength : ''}` : "" }}
  </span>
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (4)
- backend/internal/data/repo/repo_group.go (3 hunks)
- backend/internal/data/repo/repo_locations.go (3 hunks)
- frontend/components/Form/TextArea.vue (1 hunks)
- frontend/components/Form/TextField.vue (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- backend/internal/data/repo/repo_locations.go
- frontend/components/Form/TextField.vue
🔇 Additional comments (1)
frontend/components/Form/TextArea.vue (1)
Line range hint `1-42`: Add input sanitization for security

The textarea input is not sanitized, which could lead to XSS vulnerabilities if the content is rendered elsewhere in the application. Consider:
- Adding input sanitization
- Implementing a content security policy
- Using Vue's v-safe-html directive when rendering
Deploying homebox-docs with Cloudflare Pages
* feat: Multi-DB type URL formats and config
* fix: remove legacy sqlite path config and minor other things
* fix: dumb eslint issues
* fix: dumb eslint issues
* fix: application can be tested with sqlite
* fix: minor config formatting
* chore: some cleanup
* feat: postgres migration creation now works

  The migration creation for postgres now works properly. Removed MySQL support, having too many issues with it at this time.
* chore: revert some strings back to bytes as they should be
* feat: improve languages support
* feat: add locale time ago formatting and the local name for the language in language dropdown
* Update FUNDING.yml
* chore: remove some more mysql stuff
* fix: coderabbit security recommendations
* fix: validate postgres sslmode
* Update migrations.go
* fix: postgres migration creation now works
* fix: errors in raw sql queries
* fix: lint error, and simpler SQL query
* fix: migrations directory string
* fix: stats related test
* fix: sql query
* Update TextArea.vue
* Update TextField.vue
* chore: run integration testing on multiple postgresql versions
* chore: jobs should run for vnext branch PRs
* fix: missed $ for Postgres testing
* fix: environment variable for db ssl mode
* fix: lint issue from a merge
* chore: trying to fix postgresql testing
* chore: trying to fix postgresql testing
* fix: trying to fix postgresql testing
* fix: trying to fix postgresql testing

---------

Co-authored-by: tonya <[email protected]>
What type of PR is this?
What this PR does / why we need it:
Adds multi-db type support
Which issue(s) this PR fixes:
Fixes: #2
Breaking Changes

- `HBOX_DATABASE_DRIVER`
- The `HBOX_STORAGE_SQLITE_URL` variable is now `HBOX_DATABASE_SQLITE_PATH` (defaults to the original path)

Summary by CodeRabbit

Release Notes

New Features
- `TextArea` and `TextField` components.

Bug Fixes

Documentation

Chores
- `.gitignore` for database-related files.