Compare commits

...

106 Commits

Author SHA1 Message Date
sriramveeraghanta
70cdd49ea9 fix: update avatar component 2025-08-06 20:01:47 +05:30
Dancia
6450793d72 docs: updated README images (#7547) 2025-08-05 17:24:33 +05:30
Akshita Goyal
dffe3a844f [WEB-4632] chore: pinned your work to the sidebar #7537 2025-08-04 20:06:03 +05:30
Aaryan Khandelwal
7ead606798 [WIKI-578] refactor: editor types structure #7536 2025-08-04 18:01:51 +05:30
Vipin Chaudhary
fa150c2b47 [WIKI-576] fix: trail node (#7527)
* fix: trail node

* remove flagged

* refactor: add disable flagging

* refactor: update disabled extension

* refactor: additional disabled

* refactor: update enum

* chore: add description key to page response type

* chore: update base page instance

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2025-08-04 16:12:46 +05:30
Aaryan Khandelwal
c3273b1a85 [WIKI-568] refactor: add touch device support to editor (#7439)
* refactor: add isTouchDevice prop

* chore: handle event propagation in touch devices

* refactor: isTouchDevice implementation

* chore: misc editor updates and utility functions (#7455)

* chore: misc editor updates and utility functions

* fix: code review

* passed isTouchDevice prop to editor-wrapper

* added more props to editor-wrapper.

* chore: update types

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>

* fix: remove unnecessary deps

---------

Co-authored-by: Lakhan Baheti <94619783+1akhanBaheti@users.noreply.github.com>
2025-08-04 16:04:09 +05:30
Bavisetti Narayan
7cec92113f [WEB-4628] chore: return 200 response for work item comment #7532 2025-08-04 15:27:05 +05:30
Vipin Chaudhary
8cc513ba5e [WIKI-577] fix: unexpected emoji input (#7534)
* fix: unexpected emoji input

* update array
2025-08-04 15:21:10 +05:30
P B
784b8da9be docs: update README.md (#7531)
changed tagline
2025-08-01 18:19:02 +05:30
Vihar Kurama
1d443310ce feat: update new logo on readme (#7530)
* feat: update new logo on readme

Changed logo on Readme

* fix: image url

---------

Co-authored-by: sriramveeraghanta <veeraghanta.sriram@gmail.com>
2025-08-01 17:32:34 +05:30
Akshat Jain
cc49a2ca4f [INFRA-219] fix: update Dockerfile and docker-compose for proxy service (#7523)
* fix: update Dockerfile and docker-compose for version v0.28.0 and improve curl commands in install script

* fix: update docker-compose to use 'stable' tag for all services

* fix: improve curl command options in install script for better reliability
2025-07-31 13:27:34 +05:30
sriram veeraghanta
ee53ee33d0 Potential fix for code scanning alert no. 631: Incomplete URL scheme check (#7514)
* Potential fix for code scanning alert no. 631: Incomplete URL scheme check

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* fix: ignore warning in this file

---------

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-07-31 13:23:59 +05:30
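For context on the alert class this commit addresses: CodeQL's "Incomplete URL scheme check" fires when code deny-lists `javascript:` URLs but overlooks other executable schemes such as `data:` and `vbscript:`. Below is a minimal Python illustration of the gap and of the allow-list alternative; this is not the project's actual fix, only the general pattern.

```python
# Illustration of CodeQL's "Incomplete URL scheme check" alert class,
# not the actual change from #7514. A deny-list that blocks only
# "javascript:" misses "data:" and "vbscript:"; an allow-list avoids the gap.
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Allow only http(s), scheme-relative, and relative URLs."""
    scheme = urlparse(url.strip()).scheme
    return scheme in {"", "http", "https"}

assert not is_safe_url("javascript:alert(1)")
assert not is_safe_url("DaTa:text/html;base64,AAAA")  # urlparse lowercases the scheme
assert is_safe_url("https://plane.so")
```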
sriram veeraghanta
99f9337f35 fix: enable email notification by default for new users (#7521) 2025-07-31 13:02:41 +05:30
sriram veeraghanta
1458c758a3 fix: adding proxy command in compose file #7518
fix: adding proxy command in compose file
2025-07-30 21:01:34 +05:30
sriramveeraghanta
2d9988f584 fix: adding proxy command 2025-07-30 21:00:16 +05:30
Sangeetha
27fa439c8d [WEB-4602] fix: 500 error on draft wi labels update #7515 2025-07-30 20:18:48 +05:30
sriramveeraghanta
57935a94cc fix: header observer for state changes 2025-07-30 17:24:40 +05:30
Vamsi Krishna
876ccce86b [WEB-4597] fix: project icon render in switcher #7504 2025-07-30 17:18:18 +05:30
Nikhil
a67dba45f8 [WEB-4599] feat: add marketing email consent field to Profile model and timezone migration #7510 2025-07-30 17:17:16 +05:30
Prateek Shourya
ef8e613358 [WEB-4603] feat: add missing event trackers (#7513)
* feat: add event trackers for password creation

* feat: add event trackers for project views

* feat: add event trackers for product updates and changelogs

* chore: use element name instead of event name for changelog redirect link
2025-07-30 17:15:59 +05:30
sriramveeraghanta
c4d2c5b1bb sync: canary changes to preview 2025-07-30 15:44:41 +05:30
Bavisetti Narayan
d4705e1649 [WEB-4544] chore: added field validations in serializer (#7460)
* chore: added field validations in serializer

* chore: added enum for roles
2025-07-30 15:41:44 +05:30
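A hedged sketch of the pattern this commit names, pairing serializer-level field validation with a role enum; the class, fields, and role values below are illustrative, not Plane's actual serializers.

```python
# Illustrative only; the actual serializers in #7460 differ.
from enum import IntEnum
from rest_framework import serializers

class Role(IntEnum):
    ADMIN = 20
    MEMBER = 15
    GUEST = 5

class WorkspaceMemberSerializer(serializers.Serializer):
    email = serializers.EmailField()
    # ChoiceField rejects anything outside the enum at the field level.
    role = serializers.ChoiceField(choices=[(r.value, r.name) for r in Role])

    def validate_email(self, value):
        # Field-level validation hook: normalize before saving.
        return value.strip().lower()
```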
Rishi
8228ecc087 fix(cli): improve API service readiness check in install script (#7468)
* fix: improve API service readiness check in install script

* fix(cli): correct python indentation in api health check

* fix(cli): prevent false positive api ready message on timeout
2025-07-30 15:08:30 +05:30
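The thread running through these three commits is one failure mode: printing the "ready" message even when the wait loop times out. A minimal Python sketch of a probe that exits non-zero on timeout instead; the URL and limits are assumptions, and the real install script embeds its own variant.

```python
# Hypothetical readiness probe in the spirit of #7468; not the actual script.
import sys
import time
import urllib.error
import urllib.request

URL = "http://localhost:8000/"  # assumed API address
DEADLINE = time.monotonic() + 120  # overall timeout in seconds

while time.monotonic() < DEADLINE:
    try:
        with urllib.request.urlopen(URL, timeout=5):
            print("API service is ready")
            sys.exit(0)
    except urllib.error.HTTPError:
        # The server answered with an HTTP error status, so it is up.
        print("API service is ready")
        sys.exit(0)
    except OSError:
        time.sleep(5)  # connection refused or timed out; keep polling

# The fix in the last commit above: never claim readiness on timeout.
print("Timed out waiting for the API service", file=sys.stderr)
sys.exit(1)
```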
Aaryan Khandelwal
ce08c9193c [WIKI-573] regression: comment card padding #7512 2025-07-30 15:02:55 +05:30
Bavisetti Narayan
69d5cd183f chore: added validation for description (#7507)
* added PageBinaryUpdateSerializer for binary data validation and update

* chore: added validation for description

* chore: removed the duplicated file

* Fixed coderabbit comments

- Improve content validation by consolidating patterns and enhancing recursion checks

- Updated `PageBinaryUpdateSerializer` to simplify assignment of validated data.
- Enhanced `content_validator.py` with consolidated dangerous patterns and added recursion depth checks to prevent stack overflow during validation.
- Improved readability and maintainability of validation functions by using constants for patterns.

---------

Co-authored-by: Dheeraj Kumar Ketireddy <dheeru0198@gmail.com>
2025-07-30 14:19:49 +05:30
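A minimal sketch of the validator shape described in the commit body, with the dangerous patterns consolidated into constants and a recursion-depth guard; the patterns, limit, and function names are illustrative, not the actual `content_validator.py`.

```python
# Illustrative only; Plane's content_validator.py in #7507 differs.
import re

# Consolidated "dangerous pattern" constants rather than scattered literals.
DANGEROUS_PATTERNS = [
    re.compile(r"<script\b", re.IGNORECASE),
    re.compile(r"\bjavascript:", re.IGNORECASE),
    re.compile(r"\bon\w+\s*=", re.IGNORECASE),  # inline event handlers
]
MAX_RECURSION_DEPTH = 50  # guard against stack overflow on deep documents

def validate_node(node, depth=0):
    """Recursively validate a JSON-like document tree."""
    if depth > MAX_RECURSION_DEPTH:
        raise ValueError("Document nesting too deep")
    if isinstance(node, str):
        if any(p.search(node) for p in DANGEROUS_PATTERNS):
            raise ValueError("Dangerous content detected")
    elif isinstance(node, dict):
        for value in node.values():
            validate_node(value, depth + 1)
    elif isinstance(node, list):
        for item in node:
            validate_node(item, depth + 1)
```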
Aaryan Khandelwal
6e1734c86e [WIKI-570] fix: image alignment for read only editor #7509 2025-07-30 14:12:33 +05:30
dependabot[bot]
05f56d04ba chore(deps): bump linkifyjs in the npm_and_yarn group across 1 directory (#7506)
Bumps the npm_and_yarn group with 1 update in the / directory: [linkifyjs](https://github.com/nfrasser/linkifyjs/tree/HEAD/packages/linkifyjs).


Updates `linkifyjs` from 4.3.1 to 4.3.2
- [Release notes](https://github.com/nfrasser/linkifyjs/releases)
- [Changelog](https://github.com/nfrasser/linkifyjs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nfrasser/linkifyjs/commits/v4.3.2/packages/linkifyjs)

---
updated-dependencies:
- dependency-name: linkifyjs
  dependency-version: 4.3.2
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-30 13:12:00 +05:30
Aaryan Khandelwal
3f60f14bb1 [WIKI-539] regression: toolbar visible in comments card #7508 2025-07-30 13:11:56 +05:30
Aaryan Khandelwal
e0b0aafc2b [WIKI-539] regression: editor build errors #7503 2025-07-29 19:10:39 +05:30
Aaryan Khandelwal
e0fa6553ae [WIKI-539] refactor: remove lite text read only editor (#7481)
* refactor: remove lite text read only editor

* chore: update types
2025-07-29 18:09:39 +05:30
Vamsi Krishna
55f06cf546 [WEB-4563] fix: image full screen modal #7491 2025-07-28 19:55:13 +05:30
Sangeetha
84879ee3bd [WEB-4533] feat: read replica functionality (#7453)
* feat: read replica functionality

* fix: set use_read_replica to false

* chore: add use_read_replica to external APIs

* chore: remove use_read_replica on read endpoints

* chore: remove md files

* Updated all the necessary endpoints to use read replica

---------

Co-authored-by: Dheeraj Kumar Ketireddy <dheeru0198@gmail.com>
2025-07-28 17:41:02 +05:30
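A minimal sketch of what opt-in read-replica routing can look like, matching the `use_read_replica` flag the bullets mention; the mixin and database alias are assumptions, not Plane's implementation.

```python
# Illustrative only; assumes a "replica" entry in settings.DATABASES.
READ_REPLICA_ALIAS = "replica"

class ReadReplicaMixin:
    """Views opt in per endpoint; write paths keep the default database."""

    use_read_replica = False

    def get_queryset(self):
        queryset = super().get_queryset()
        if self.use_read_replica:
            # Route this view's SELECTs to the replica.
            queryset = queryset.using(READ_REPLICA_ALIAS)
        return queryset
```

The later bullets suggest the flag was removed from some endpoints and then applied selectively, which is exactly the per-view tuning this shape allows.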
Sriram Veeraghanta
b1162395ed sync: canary into preview 2025-07-28 12:53:16 +05:30
Sriram Veeraghanta
b93883fc14 chore: version upgrade 2025-07-28 12:51:19 +05:30
Nikhil
4913c116d1 chore: correct migration file 0098 (#7486)
* feat: restore background color and goals fields in Profile and Workspace models in migration 0099

* chore: update migration dependency to reflect recent changes in profile model migrations
2025-07-28 12:32:47 +05:30
Nikhil
4044ce25ce [WEB-4566] feat: add background color and goals field to Profile and Workspace models #7485 2025-07-27 21:10:31 +05:30
Anmol Singh Bhatia
a5f3bd15b1 [WEB-4513] refactor: consolidate password strength meter into shared ui package (#7462)
* refactor: consolidate password strength indicator into shared UI package

* chore: remove old password strength meter implementations

* chore: update package dependencies for password strength refactor

* chore: code refactor

* fix: lock file

---------

Co-authored-by: sriramveeraghanta <veeraghanta.sriram@gmail.com>
2025-07-25 16:56:46 +05:30
Bavisetti Narayan
63d025cbf4 [WEB-4544] chore: added field validations in serializer (#7460)
* chore: added field validations in serializer

* chore: added enum for roles
2025-07-25 16:50:35 +05:30
Aaryan Khandelwal
27f74206a3 [WIKI-537] refactor: document editor (#7384)
* refactor: document editor

* chore: update user prop

* fix: type warning

* chore: update value prop name

* chore: remove unnecessary exports

* chore: update initialValue type

* chore: revert initialValue type

* refactor: unnecessary string handlers
2025-07-25 13:57:45 +05:30
Aaron Heckmann
e20bfa55d6 chore: add ci job names (#7478)
* chore: add ci job names

This makes them easier to find in the Github UI.

* chore: increase ci timeout
2025-07-25 13:41:03 +05:30
Aaryan Khandelwal
f8353d3468 fix: eslint warnings and errors (#7479) 2025-07-25 13:40:25 +05:30
Aaron Heckmann
57479f4554 fix: lint (#7433)
* chore: fix lint

* fix: constants check:lint command

* chore(lint): permit unused vars which begin w/ _

* chore: rm dead code

* fix(lint): more lint fixes to constants pkg

* fix(lint): lint the live server

- fix lint issues

* chore: improve clean script

* fix(lint): more lint

* chore: set live server process title

* chore(deps): update to turbo@2.5.5

* chore(live): target node22

* fix(dev): add missing ui pkg dependency

* fix(dev): lint decorators

* fix(dev): lint space app

* fix(dev): address lint issues in types pkg

* fix(dev): lint editor pkg

* chore(dev): moar lint

* fix(dev): live server exit code

* chore: address PR feedback

* fix(lint): better TPageExtended type

* chore: refactor

* chore: revert most live server changes

* fix: few more lint issues

* chore: enable ci checks

Ensure we can build + confirm that lint is not getting worse.

* chore: address PR feedback

* fix: web lint warning added to package.json

* fix: ci:lint command

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2025-07-25 01:44:51 +05:30
Dheeraj Kumar Ketireddy
514686d9d5 [WEB-4045] feat: restructuring of the external APIs for better maintainability (#7477)
* Basic setup for drf-spectacular

* Updated to only handle /api/v1 endpoints

* feat: add asset and user endpoints with URL routing

- Introduced new asset-related endpoints for user assets and server assets, allowing for asset uploads and management.
- Added user endpoint to retrieve current user information.
- Updated URL routing to include new asset and user patterns.
- Enhanced issue handling with a new search endpoint for issues across multiple fields.
- Expanded member management with a new endpoint for workspace members.

* Group endpoints by tags

* Detailed schema definitions and examples for asset endpoints

* Removed unnecessary extension

* Specify avatar_url field separately

* chore: add project docs

* chore: correct all errors

* chore: added open spec in work items

* feat: enhance cycle API endpoints with detailed OpenAPI specifications

- Updated CycleAPIEndpoint and CycleIssueAPIEndpoint to include detailed OpenAPI schema definitions for GET, POST, PATCH, and DELETE operations.
- Specified allowed HTTP methods for each endpoint in the URL routing.
- Improved documentation for cycle creation, updating, and deletion, including request and response examples.

* chore: added open spec in labels

* chore: work item properties

* feat: enhance API endpoints with OpenAPI specifications and HTTP method definitions

- Added detailed OpenAPI schema definitions for various API endpoints including Intake, Module, and State.
- Specified allowed HTTP methods for each endpoint in the URL routing for better clarity and documentation.
- Improved request and response examples for better understanding of API usage.
- Introduced unarchive functionality for cycles and modules with appropriate endpoint definitions.

* chore: run formatter

* Removed unnecessary settings for authentication

* Refactors OpenAPI documentation structure

Improves the organization and maintainability of the OpenAPI documentation by modularizing the `openapi_spec_helpers.py` file.

The changes include:
- Migrates common parameters, responses, examples, and authentication extensions to separate modules.
- Introduces helper decorators for different endpoint types.
- Updates view imports to use the new module paths.
- Removes the legacy `openapi_spec_helpers.py` file.

This refactoring results in a more structured and easier-to-maintain OpenAPI documentation setup.

* Refactor OpenAPI endpoint specifications

- Removed unnecessary parameters from the OpenAPI documentation for various endpoints in the asset, cycle, and project views.
- Updated request structures to improve clarity and consistency across the API documentation.
- Enhanced response formatting for better readability and maintainability.

* Enhance API documentation with detailed endpoint descriptions

Updated various API endpoints across the application to include comprehensive docstrings that clarify their functionality. Each endpoint now features a summary and detailed description, improving the overall understanding of their purpose and usage. This change enhances the OpenAPI specifications for better developer experience and documentation clarity.

* Enhance API serializers and views with new request structures

- Added new serializers for handling cycle and module issue requests, including `CycleIssueRequestSerializer`, `TransferCycleIssueRequestSerializer`, `ModuleIssueRequestSerializer`, and intake issue creation/updating serializers.
- Updated existing serializers to improve clarity and maintainability, including the `UserAssetUploadSerializer` and `IssueAttachmentUploadSerializer`.
- Refactored API views to utilize the new serializers, enhancing the request handling for cycle and intake issue endpoints.
- Improved OpenAPI documentation by replacing inline request definitions with serializer references for better consistency and readability.

* Refactor OpenAPI documentation and endpoint specifications

- Replaced inline schema definitions with dedicated decorators for various endpoint types, enhancing clarity and maintainability.
- Updated API views to utilize new decorators for user, cycle, intake, module, and project endpoints, improving consistency in OpenAPI documentation.
- Removed unnecessary parameters and responses from endpoint specifications, streamlining the documentation for better readability.
- Enhanced the organization of OpenAPI documentation by modularizing endpoint-specific decorators and parameters.

* chore: correct formatting

* chore: correct formatting for all api folder files

* refactor: clean up serializer imports and test setup

- Removed unused `StateLiteSerializer` import from the serializer module.
- Updated test setup to include a noqa comment for the `django_db_setup` fixture, ensuring clarity in the code.
- Added missing commas in user data dictionary for consistency.

* feat: add project creation and update serializers with validation

- Introduced `ProjectCreateSerializer` and `ProjectUpdateSerializer` to handle project creation and updates, respectively.
- Implemented validation to ensure project leads and default assignees are members of the workspace.
- Updated API views to utilize the new serializers for creating and updating projects, enhancing request handling.
- Added OpenAPI documentation references for the new serializers in the project API endpoints.

* feat: update serializers to include additional read-only fields

* refactor: rename intake issue serializers and enhance structure

- Renamed `CreateIntakeIssueRequestSerializer` to `IntakeIssueCreateSerializer` and `UpdateIntakeIssueRequestSerializer` to `IntakeIssueUpdateSerializer` for clarity.
- Introduced `IssueSerializer` for nested issue data in intake requests, improving the organization of serializer logic.
- Updated API views to utilize the new serializer names, ensuring consistency across the codebase.

* refactor: rename issue serializer for intake and enhance API documentation

- Renamed `IssueSerializer` to `IssueForIntakeSerializer` for better clarity in the context of intake issues.
- Updated references in `IntakeIssueCreateSerializer` and `IntakeIssueUpdateSerializer` to use the new `IssueForIntakeSerializer`.
- Added OpenAPI documentation for the `get_workspace_work_item` endpoint, detailing parameters and responses for improved clarity.

* chore: modules and cycles serializers

* feat: add new serializers for label and issue link management

- Introduced `LabelCreateUpdateSerializer`, `IssueLinkCreateSerializer`, `IssueLinkUpdateSerializer`, and `IssueCommentCreateSerializer` to enhance the handling of label and issue link data.
- Updated existing API views to utilize the new serializers for creating and updating labels, issue links, and comments, improving request handling and validation.
- Added `IssueSearchSerializer` for searching issues, streamlining the search functionality in the API.

* Don't consider read only fields as required

* Add setting to separate request and response definitions

* Fixed avatar_url warning on openapi spec generation

* Made spectacular disabled by default

* Moved spectacular settings into separate file and added detailed descriptions to tags

* Specify methods for asset urls

* Better server names

* Enhance API documentation with summaries for various endpoints

- Added summary descriptions for user asset, cycle, intake, issue, member, module, project, state, and user API endpoints to improve clarity and usability of the API documentation.
- Updated the OpenAPI specifications to reflect these changes, ensuring better understanding for developers interacting with the API.

* Add contact information to OpenAPI settings

- Included contact details for Plane in the OpenAPI settings to enhance API documentation and provide developers with a direct point of contact for support.
- This addition aims to improve the overall usability and accessibility of the API documentation.

* Reordered tags and improved description relevance

* Enhance OpenAPI documentation for cycle and issue endpoints

- Added response definitions for the `get_cycle_issues` and `delete_cycle_issue` methods in the CycleIssueAPIEndpoint to clarify expected outcomes.
- Included additional response codes for the IssueSearchEndpoint to handle various error scenarios, improving the overall API documentation and usability.

* Enhance serializer documentation across multiple files

- Updated docstrings for various serializers including UserAssetUploadSerializer, AssetUpdateSerializer, and others to provide clearer descriptions of their functionality and usage.
- Improved consistency in formatting and language across serializer classes to enhance readability and maintainability.
- Added detailed explanations for new serializers related to project, module, and cycle management, ensuring comprehensive documentation for developers.

* Refactor API endpoints for cycles, intake, modules, projects, and states

- Replaced existing API endpoint classes with more descriptive names such as CycleListCreateAPIEndpoint, CycleDetailAPIEndpoint, IntakeIssueListCreateAPIEndpoint, and others to enhance clarity.
- Updated URL patterns to reflect the new endpoint names, ensuring consistency across the API.
- Improved documentation and method summaries for better understanding of endpoint functionalities.
- Enhanced query handling in the new endpoint classes to streamline data retrieval and improve performance.

* Refactor issue and label API endpoints for clarity and functionality

- Renamed existing API endpoint classes to more descriptive names such as IssueListCreateAPIEndpoint, IssueDetailAPIEndpoint, LabelListCreateAPIEndpoint, and LabelDetailAPIEndpoint to enhance clarity.
- Updated URL patterns to reflect the new endpoint names, ensuring consistency across the API.
- Improved method summaries and documentation for better understanding of endpoint functionalities.
- Streamlined query handling in the new endpoint classes to enhance data retrieval and performance.

* Refactor asset API endpoint methods and introduce new status enums

- Updated the GenericAssetEndpoint to only allow POST requests for asset creation, removing the GET method.
- Modified the get method to require asset_id, ensuring that asset retrieval is always tied to a specific asset.
- Added new IntakeIssueStatus and ModuleStatus enums to improve clarity and management of asset and module states.
- Enhanced OpenAPI settings to include these new enums for better documentation and usability.

* enforce naming convention

* Added LICENSE to openapi spec

* Enhance OpenAPI documentation for various API endpoints

- Updated API endpoints in asset, cycle, intake, issue, module, project, and state views to include OpenApiRequest and OpenApiExample for better request documentation.
- Added example requests for creating and updating resources, improving clarity for API consumers.
- Ensured consistent use of OpenApi utilities across all relevant endpoints to enhance overall API documentation quality.

* Enhance OpenAPI documentation for various API endpoints

- Added detailed descriptions to multiple API endpoints across asset, cycle, intake, issue, module, project, state, and user views to improve clarity for API consumers.
- Ensured consistent documentation practices by including descriptions that outline the purpose and functionality of each endpoint.
- This update aims to enhance the overall usability and understanding of the API documentation.

* Update OpenAPI examples and enhance project queryset logic

- Changed example fields in OpenAPI documentation for issue comments from "content" to "comment_html" to reflect the correct structure.
- Introduced a new `get_queryset` method in the ProjectDetailAPIEndpoint to filter projects based on user membership and workspace, while also annotating additional project-related data such as total members, cycles, and modules.
- Updated permission checks to use the correct attribute name for project identifiers, ensuring accurate permission handling.

* Enhance OpenAPI documentation and add response examples

- Updated multiple API endpoints across asset, cycle, intake, issue, module, project, state, and user views to include new OpenApiResponse examples for better clarity on expected outcomes.
- Introduced new parameters for project and issue identifiers to improve request handling and documentation consistency.
- Enhanced existing responses with detailed examples to aid API consumers in understanding the expected data structure and error handling.
- This update aims to improve the overall usability and clarity of the API documentation.

* refactor: update terminology from 'issues' to 'work items' across multiple API endpoints for consistency and clarity

* use common timezones from pytz for choices

* Moved the openapi utils to the new folder structure

* Added exception logging in GenericAssetEndpoint to improve error handling

* Fixed code rabbit suggestions

* Refactored IssueDetailAPIEndpoint to streamline issue retrieval and response handling, removing redundant external ID checks and custom ordering logic.

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2025-07-25 00:17:05 +05:30
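Most of this commit is drf-spectacular plumbing. A compact sketch of the `@extend_schema` pattern it describes, reusing one endpoint name from the summary; the serializer fields, example payload, and response shape are illustrative, not the actual Plane code.

```python
# Illustrative sketch of the drf-spectacular pattern from #7477.
from drf_spectacular.utils import OpenApiExample, OpenApiResponse, extend_schema
from rest_framework import serializers
from rest_framework.response import Response
from rest_framework.views import APIView

class LabelCreateUpdateSerializer(serializers.Serializer):
    name = serializers.CharField(max_length=255)
    color = serializers.CharField(required=False)

class LabelListCreateAPIEndpoint(APIView):
    @extend_schema(
        summary="Create a label",
        description="Creates a new label in the given project.",
        request=LabelCreateUpdateSerializer,
        responses={201: OpenApiResponse(description="Label created")},
        examples=[OpenApiExample("Basic", value={"name": "Bug", "color": "#ff0000"})],
        tags=["Labels"],
    )
    def post(self, request, *args, **kwargs):
        serializer = LabelCreateUpdateSerializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        return Response(serializer.validated_data, status=201)
```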
Vamsi Krishna
98a00f5bde [WEB-4523] fix: sub issues loader #7474 2025-07-24 19:25:37 +05:30
sriram veeraghanta
ed4ee3ad7e fix: remove commands from the docker compose file (#7476) 2025-07-24 18:28:12 +05:30
Prateek Shourya
18e4c60b42 [WEB-4536] fix: remove inactive workspace members from lists and dropdowns (#7473) 2025-07-24 15:29:18 +05:30
sriram veeraghanta
849b7b7bf3 [WIKI-565] chore: handle 404 messages in live server (#7472)
* chore: handle 404 messages in live server

* chore: removed error stack from response
2025-07-24 15:13:44 +05:30
sriram veeraghanta
2746bad86b [WEB-4561] fix: SVG attributes in react components (#7470)
* chore: svg attribute fix (#7446)

* fix: replace invalid --env-file with dotenv-cli for tsup onSuccess

* fix: replace invalid --env-file usage in start script and move dotenv-cli to dependencies

* chore: update SVG attributes to camelCase for JSX compatibility

* chore: update SVG attributes to camelCase

* fix: removed dotenv-cli

* fix: svg attribute fixes

* chore: minor attribute customizations

---------

Co-authored-by: Donal Noye <68372408+Donal-Noye@users.noreply.github.com>
2025-07-24 14:46:55 +05:30
Vamsi Krishna
008727d393 [WEB-4547] fix: archived work item redirection #7471 2025-07-24 14:43:31 +05:30
merrime-n
56e4d2c6f1 docs: add file paths to the new language guide (#7454)
* improve the comprehensibility of translation contribution docs

* Update CONTRIBUTING.md to fix the code blocks' appearance
2025-07-24 14:12:49 +05:30
Vipin Chaudhary
d7e58a60fa [WIKI-563] refactor: subscription styles (#7444)
* refactor: billing UI

* refactor: fix typo
2025-07-24 14:11:23 +05:30
Sangeetha
af81064961 [WEB-4558] fix: project test cases #7464 2025-07-24 14:07:37 +05:30
Akshita Goyal
3c6e2b4447 [WEB-4552] fix: stickies layout + minified layout filter (#7466) 2025-07-24 14:06:46 +05:30
sriram veeraghanta
b7be45d08a chore: upgrade axios version to 1.11.0 (#7469) 2025-07-24 13:50:37 +05:30
Akshita Goyal
d2629d723c [WEB-4552] chore: minor design changes for home #7459 2025-07-23 14:30:25 +05:30
Akshita Goyal
9007ec3709 [WEB-4546] chore: home update #7456 2025-07-23 10:11:18 +05:30
Akshita Goyal
763a28ab60 [WEB-4546] chore: header enhancements + quickstart guide ui update + breadcrumb #7451 2025-07-22 16:47:14 +05:30
Aaryan Khandelwal
5c22a6cecc [WIKI-484] chore: update editor packages (#7265) 2025-07-21 19:25:49 +05:30
Vamsi Krishna
4c3af7f8a1 [WEB-4531] chore: refactor for timeline chart (#7440) 2025-07-21 19:22:58 +05:30
Aaryan Khandelwal
cc673a17a0 [WIKI-357] fix: created by filter for pages list #7445 2025-07-21 19:22:27 +05:30
Prateek Shourya
2377474823 fix: build errors in work item modal (#7450) 2025-07-21 16:18:51 +05:30
Prateek Shourya
f3daac6f95 [WEB-4457] refactor: decouple work item properties from mobx store (#7363) 2025-07-18 20:38:21 +05:30
Anmol Singh Bhatia
5660b28574 [WEB-4507] fix: webhooks translation #7424 2025-07-18 20:09:39 +05:30
Vamsi Krishna
d7d1545801 [WEB-4192] fix: inactive member is hidden in created by #7435 2025-07-18 20:08:59 +05:30
Vamsi Krishna
3ab1f0de84 [WEB-4522] fix: removed redundant state indicator (#7434)
* fix: removed redundant state indicator

* fix: removed redundant state indicator for relation list item
2025-07-18 20:08:36 +05:30
Bavisetti Narayan
a75ae71ff0 [WEB-4442] fix: removed the deleted labels from webhook payload (#7337)
* chore: removed the deleted label from payload

* chore: optimised the query

* chore: optimised the logic

* chore: defined function for prefetch
2025-07-18 20:08:01 +05:30
Aaryan Khandelwal
d5eb374217 [WEB-561] refactor: editor build config (#7442)
* refactor: editor build config

* fix: type errors
2025-07-18 20:06:48 +05:30
sriram veeraghanta
df4ea1f7ac fix: update tsup build packages (#7438) 2025-07-18 15:04:31 +05:30
Anmol Singh Bhatia
07c80bb02c [WEB-4525] fix: project create/update error handling (#7429)
* chore: project create/update error handling

* chore: code refactor
2025-07-18 14:25:28 +05:30
Prateek Shourya
1ad792b4bb [WEB-4528] fix: calendar options interactions (#7437) 2025-07-18 14:25:08 +05:30
sriram veeraghanta
0eb4af9d19 fix: on-headers vulnerability (#7436) 2025-07-18 13:27:53 +05:30
Aaryan Khandelwal
7136b3129b [WIKI-543] fix: unable to delete last cell of a table (#7394)
* fix: delete last cell of a table

* refactor: last cell selection logic
2025-07-17 21:19:54 +05:30
Aaryan Khandelwal
48f1999c95 [WIKI-557] fix: keyboard navigation in tables (#7428) 2025-07-17 15:15:18 +05:30
Akshat Jain
3783e34ae8 [INFRA-213] Fix aio dockerfile with new file structure (#7427)
* fix: update paths in Dockerfile and supervisor.conf for application directories

* fix: update live command in supervisor.conf to use server.js
2025-07-17 14:59:13 +05:30
Sangeetha
e313aee3df [WEB-4346] chore: refactor copy_s3_object bg task (#7298) 2025-07-17 13:10:34 +05:30
Vipin Chaudhary
6bb79df0eb [WIKI-547] fix: update find suggestion logic for emoji extension (#7411)
* fix: update find suggestion logic

* refactor: remove logs

* refactor: make logic simpler

* feat: check for one char to show suggestion

* refactor: import types from extension

* refactor: add early return

* refactor: put custom suggestion in helper

* fix: char

* fix: types
2025-07-17 13:07:12 +05:30
Sangeetha
ec0ef98c1b [WEB-4281] chore: error code on project updation endpoint (#7218) 2025-07-17 13:05:24 +05:30
Vamsi Krishna
9523c28c3e [WEB-4514] chore: moved EIssueLayoutTypes enum to types #7416 2025-07-17 13:01:24 +05:30
Anmol Singh Bhatia
156ed329ac [WEB-4510] fix: peek view ui layout (#7422) 2025-07-17 12:53:37 +05:30
Bavisetti Narayan
ef0e3dca12 [WEB-4520] chore: changed the serializer for cycle issues #7423 2025-07-17 12:53:04 +05:30
Aaryan Khandelwal
d8f2c97810 [WIKI-554] feat: delete multiple rows/columns from a table (#7426) 2025-07-17 12:52:33 +05:30
Anmol Singh Bhatia
89983b06d2 [WEB-4515] fix: duplicate work item quick action #7421 2025-07-16 15:19:55 +05:30
Bavisetti Narayan
3224122df0 [WEB-1643] chore: support cyrillic characters in subject #7336 2025-07-16 01:04:59 +05:30
Sangeetha
99127ff8e4 [WEB-4479] feat: enable/disable SMTP configuration (#7393)
* feat: api update instance configuration

* chore: add enable_smtp key

* fix: empty string for enable_smtp key

* chore: update email_port and email_from

* fix: handled smtp enable disable

* fix: error handling

* fix: refactor

* fix: removed enabled toast

* fix: refactor

---------

Co-authored-by: gakshita <akshitagoyal1516@gmail.com>
Co-authored-by: Akshita Goyal <36129505+gakshita@users.noreply.github.com>
2025-07-16 01:04:18 +05:30
Anmol Singh Bhatia
da5390fa03 [WEB-4497] chore: reduce breadcrumb height (#7413)
* chore: reduce breadcrumb height from 3.75rem to 3.25rem

* chore: code refactor

* chore: code refactor
2025-07-16 01:03:31 +05:30
Anmol Singh Bhatia
ac22df3f88 [WEB-4451] chore: work item header quick action enhancements (#7414)
* chore: work item header quick action enhancements

* chore: code refactor
2025-07-16 00:52:30 +05:30
Aaryan Khandelwal
df762afaef [WIKI-551] refactor: work item activity logic #7417 2025-07-15 20:49:20 +05:30
Aaryan Khandelwal
2058f06b8a [WIKI-549] fix: comment scroll logic #7412 2025-07-15 15:12:27 +05:30
Anmol Singh Bhatia
bfd4bd5e75 [WEB-4506] fix: date range modal z-index #7410 2025-07-15 13:51:16 +05:30
Nikhil
ac928f263a [WIKI-520] fix: update content disposition handling in asset download endpoints (#7380) 2025-07-15 13:24:17 +05:30
Bavisetti Narayan
62065a6ebb [WEB-4504] chore: added migration for app rail and reactions #7408 2025-07-14 20:57:29 +05:30
Prateek Shourya
67b62dcbe3 [WEB-4498] improvement: remove workspace details from workspace members list API (#7407)
* [WEB-4498] improvement: remove workspace details from workspace members list API

* refactor: update select_related usage in workspace invitation and member views

---------

Co-authored-by: Dheeraj Kumar Ketireddy <dheeru0198@gmail.com>
2025-07-14 20:45:23 +05:30
Aaryan Khandelwal
a427367720 [WIKI-277] fix: unable to delete last table column/row in editor (#7389)
* fix: delete column not working

* refactor: delete column command

* fix: delete last row logic

* refactor: row count logic

* refactor: isCellSelection utility function
2025-07-14 17:27:09 +05:30
Aaryan Khandelwal
c067eaa1ed [WIKI-542] refactor: editor extensions (#7345)
* refactor: editor extensions

* fix: placeholder extension
2025-07-14 17:10:32 +05:30
Aaryan Khandelwal
2c70c1aaa8 [WIKI-509] feat: comment copy link option (#7385)
* feat: comment copy link option

* chore: add translations

* chore: update block position

* chore: rename use id scroll hook

* refactor: setTimeout function

* refactor: use-hash-scroll hook
2025-07-14 17:07:44 +05:30
Akshat Jain
f90e553881 [INFRA-209] Remove nginx related configurations from plane community (#7406)
* Remove deprecated Nginx configuration files and scripts, including Dockerfiles, environment scripts, and configuration templates, to streamline the project structure.

* Update environment configuration and Docker setup for proxy services

- Added LISTEN_PORT and LISTEN_SSL_PORT variables to .env.example and related files.
- Updated Docker Compose files to reference new port variables instead of deprecated NGINX_PORT.
- Adjusted README and variable documentation to reflect changes in port configuration.
- Changed build context for proxy services to use the new directory structure.

* Refactor port configuration in environment and Docker files

- Renamed LISTEN_PORT and LISTEN_SSL_PORT to LISTEN_HTTP_PORT and LISTEN_HTTPS_PORT in .env.example and related files.
- Updated Docker Compose configurations to reflect the new port variable names.
- Adjusted documentation in README and variables.env to ensure consistency with the new naming conventions.
2025-07-14 16:38:27 +05:30
Aaryan Khandelwal
0af0e52275 [WIKI-545] fix: shrinking headings in page table of contents #7404 2025-07-14 16:30:19 +05:30
Anmol Singh Bhatia
7829e3adf5 [WEB-4494] fix: kanban delete modal ux #7405 2025-07-14 16:29:26 +05:30
Anmol Singh Bhatia
47354f0e91 [WEB-4468] fix: setting sidebar truncate #7403 2025-07-14 16:26:04 +05:30
M. Palanikannan
db2f783d33 [WEB-4407] chore: add project page creation event tracking #7378 2025-07-14 15:30:45 +05:30
Manish Gupta
6d01622663 [INFRA-208] Reorganize deployment structure and update build workflows (#7391)
* refactor: reorganize deployment structure and update build workflows

- Restructure deployment directories from deploy/ to deployments/
- Move selfhost files to deployments/cli/community/
- Add new AIO community deployment setup
- Update GitHub Actions workflows for new directory structure
- Add Caddy proxy configuration for CE deployment
- Remove deprecated AIO build files and workflows
- Update build context paths in install scripts

* chore: update Dockerfile and supervisor configuration

- Changed `apk add` command in Dockerfile to use `--no-cache` for better image size management.
- Updated `build.sh` to ensure proper directory navigation with quotes around `dirname "$0"`.
- Modified `supervisor.conf` to set `stderr_logfile_maxbytes` to 50MB and added `stderr_logfile_backups` for better log management across multiple services.

* chore: consistent node and python version

---------

Co-authored-by: sriramveeraghanta <veeraghanta.sriram@gmail.com>
2025-07-14 14:38:43 +05:30
Anmol Singh Bhatia
dff2f8ae12 [WEB-4482] fix: kanban drag and delete #7402 2025-07-14 14:18:38 +05:30
sriram veeraghanta
4c57ed4336 fix: local dev environment setup using yarn v4 (#7400)
* chore: adding typescript dependency to constants.

* chore: package deps update
2025-07-14 13:08:41 +05:30
Nikhil
c8dab1cc9c [WEB-4484] chore: streamline issue saving process with advisory locks for sequence management (#7396)
* chore: streamline issue saving process with advisory locks for sequence management

* fix: update advisory lock usage in issue model for improved concurrency management
2025-07-14 13:08:26 +05:30
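A minimal sketch of what "advisory locks for sequence management" can look like in Django on Postgres: concurrent writers to the same project serialize on a transaction-scoped lock while computing the next sequence number. The lock-key derivation and table name are assumptions, not the actual model code.

```python
# Illustrative only; Plane's model code in #7396 differs.
import zlib
from django.db import connection, transaction

def next_sequence_id(project_id: str) -> int:
    # Derive a stable integer lock key from the project identifier.
    lock_key = zlib.crc32(str(project_id).encode())
    with transaction.atomic():
        with connection.cursor() as cursor:
            # Released automatically at commit; concurrent writers to the
            # same project block here instead of racing on MAX(sequence_id).
            cursor.execute("SELECT pg_advisory_xact_lock(%s)", [lock_key])
            cursor.execute(
                "SELECT COALESCE(MAX(sequence_id), 0) + 1"
                " FROM issues WHERE project_id = %s",  # table name assumed
                [project_id],
            )
            return cursor.fetchone()[0]
```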
Aaron Heckmann
17c90a9d93 [WEB-4485] - fix(dev): get local dev migrator working again #7397 2025-07-12 19:45:45 +05:30
Masoom Wahid
a3714c8e3e feat: added Kabul to the list of timezones (#7370) 2025-07-12 19:30:39 +05:30
686 changed files with 19360 additions and 8001 deletions

View File

@@ -15,12 +15,15 @@ RABBITMQ_USER="plane"
RABBITMQ_PASSWORD="plane"
RABBITMQ_VHOST="plane"
LISTEN_HTTP_PORT=80
LISTEN_HTTPS_PORT=443
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
AWS_SECRET_ACCESS_KEY="secret-key"
AWS_S3_ENDPOINT_URL="http://plane-minio:9000"
# Changing this requires change in the nginx.conf for uploads if using minio setup
# Changing this requires change in the proxy config for uploads if using minio setup
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
FILE_SIZE_LIMIT=5242880
@@ -36,8 +39,15 @@ DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
# If an SSL cert is to be generated, set CERT_EMAIL="email <EMAIL_ADDRESS>"
CERT_ACME_CA=https://acme-v02.api.letsencrypt.org/directory
TRUSTED_PROXIES=0.0.0.0/0
SITE_ADDRESS=:80
CERT_EMAIL=
# For DNS Challenge based certificate generation, set the CERT_ACME_DNS, CERT_EMAIL
# CERT_ACME_DNS="acme_dns <CERT_DNS_PROVIDER> <CERT_DNS_PROVIDER_API_KEY>"
CERT_ACME_DNS=
# Force HTTPS for handling SSL Termination
MINIO_ENDPOINT_SSL=0

View File

@@ -1,139 +0,0 @@
name: Build AIO Base Image
on:
workflow_dispatch:
inputs:
base_tag_name:
description: 'Base Tag Name'
required: false
default: ''
env:
TARGET_BRANCH: ${{ github.ref_name }}
jobs:
base_build_setup:
name: Build Preparation
runs-on: ubuntu-latest
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
gh_buildx_version: ${{ steps.set_env_variables.outputs.BUILDX_VERSION }}
gh_buildx_platforms: ${{ steps.set_env_variables.outputs.BUILDX_PLATFORMS }}
gh_buildx_endpoint: ${{ steps.set_env_variables.outputs.BUILDX_ENDPOINT }}
image_tag: ${{ steps.set_env_variables.outputs.IMAGE_TAG }}
steps:
- id: set_env_variables
name: Set Environment Variables
run: |
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
if [ "${{ github.event.inputs.base_tag_name }}" != "" ]; then
echo "IMAGE_TAG=${{ github.event.inputs.base_tag_name }}" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
echo "IMAGE_TAG=latest" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
echo "IMAGE_TAG=preview" >> $GITHUB_OUTPUT
else
echo "IMAGE_TAG=develop" >> $GITHUB_OUTPUT
fi
if [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=makeplane/plane-dev" >> $GITHUB_OUTPUT
else
echo "BUILDX_DRIVER=docker-container" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
fi
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
full_base_build_push:
runs-on: ubuntu-latest
needs: [base_build_setup]
env:
BASE_IMG_TAG: makeplane/plane-aio-base:full-${{ needs.base_build_setup.outputs.image_tag }}
BUILDX_DRIVER: ${{ needs.base_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.base_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.base_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.base_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Check out the repo
uses: actions/checkout@v4
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-full
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.BASE_IMG_TAG }}
push: true
cache-from: type=gha
cache-to: type=gha,mode=max
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
slim_base_build_push:
runs-on: ubuntu-latest
needs: [base_build_setup]
env:
BASE_IMG_TAG: makeplane/plane-aio-base:slim-${{ needs.base_build_setup.outputs.image_tag }}
BUILDX_DRIVER: ${{ needs.base_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.base_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.base_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.base_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Check out the repo
uses: actions/checkout@v4
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-slim
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.BASE_IMG_TAG }}
push: true
cache-from: type=gha
cache-to: type=gha,mode=max
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

View File

@@ -1,207 +0,0 @@
name: Branch Build AIO
on:
workflow_dispatch:
inputs:
full:
description: 'Run full build'
type: boolean
required: false
default: false
slim:
description: 'Run slim build'
type: boolean
required: false
default: false
base_tag_name:
description: 'Base Tag Name'
required: false
default: ''
release:
types: [released, prereleased]
env:
TARGET_BRANCH: ${{ github.ref_name || github.event.release.target_commitish }}
FULL_BUILD_INPUT: ${{ github.event.inputs.full }}
SLIM_BUILD_INPUT: ${{ github.event.inputs.slim }}
jobs:
branch_build_setup:
name: Build Setup
runs-on: ubuntu-latest
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
flat_branch_name: ${{ steps.set_env_variables.outputs.FLAT_BRANCH_NAME }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
gh_buildx_version: ${{ steps.set_env_variables.outputs.BUILDX_VERSION }}
gh_buildx_platforms: ${{ steps.set_env_variables.outputs.BUILDX_PLATFORMS }}
gh_buildx_endpoint: ${{ steps.set_env_variables.outputs.BUILDX_ENDPOINT }}
aio_base_tag: ${{ steps.set_env_variables.outputs.AIO_BASE_TAG }}
do_full_build: ${{ steps.set_env_variables.outputs.DO_FULL_BUILD }}
do_slim_build: ${{ steps.set_env_variables.outputs.DO_SLIM_BUILD }}
steps:
- id: set_env_variables
name: Set Environment Variables
run: |
if [ "${{ env.TARGET_BRANCH }}" == "master" ] || [ "${{ github.event_name }}" == "release" ]; then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=makeplane/plane-dev" >> $GITHUB_OUTPUT
echo "AIO_BASE_TAG=latest" >> $GITHUB_OUTPUT
else
echo "BUILDX_DRIVER=docker-container" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
if [ "${{ github.event_name}}" == "workflow_dispatch" ] && [ "${{ github.event.inputs.base_tag_name }}" != "" ]; then
echo "AIO_BASE_TAG=${{ github.event.inputs.base_tag_name }}" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
echo "AIO_BASE_TAG=preview" >> $GITHUB_OUTPUT
else
echo "AIO_BASE_TAG=develop" >> $GITHUB_OUTPUT
fi
fi
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
if [ "${{ env.FULL_BUILD_INPUT }}" == "true" ] || [ "${{github.event_name}}" == "push" ] || [ "${{github.event_name}}" == "release" ]; then
echo "DO_FULL_BUILD=true" >> $GITHUB_OUTPUT
else
echo "DO_FULL_BUILD=false" >> $GITHUB_OUTPUT
fi
if [ "${{ env.SLIM_BUILD_INPUT }}" == "true" ] || [ "${{github.event_name}}" == "push" ] || [ "${{github.event_name}}" == "release" ]; then
echo "DO_SLIM_BUILD=true" >> $GITHUB_OUTPUT
else
echo "DO_SLIM_BUILD=false" >> $GITHUB_OUTPUT
fi
FLAT_BRANCH_NAME=$(echo "${{ env.TARGET_BRANCH }}" | sed 's/[^a-zA-Z0-9]/-/g')
echo "FLAT_BRANCH_NAME=$FLAT_BRANCH_NAME" >> $GITHUB_OUTPUT
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
full_build_push:
if: ${{ needs.branch_build_setup.outputs.do_full_build == 'true' }}
runs-on: ubuntu-22.04
needs: [branch_build_setup]
env:
BUILD_TYPE: full
AIO_BASE_TAG: ${{ needs.branch_build_setup.outputs.aio_base_tag }}
AIO_IMAGE_TAGS: makeplane/plane-aio:full-${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-aio:${{env.BUILD_TYPE}}-stable,makeplane/plane-aio:${{env.BUILD_TYPE}}-${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-aio:${{env.BUILD_TYPE}}-latest
else
TAG=${{ env.AIO_IMAGE_TAGS }}
fi
echo "AIO_IMAGE_TAGS=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.AIO_IMAGE_TAGS }}
push: true
build-args: |
BASE_TAG=${{ env.AIO_BASE_TAG }}
BUILD_TYPE=${{env.BUILD_TYPE}}
cache-from: type=gha
cache-to: type=gha,mode=max
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
slim_build_push:
if: ${{ needs.branch_build_setup.outputs.do_slim_build == 'true' }}
runs-on: ubuntu-22.04
needs: [branch_build_setup]
env:
BUILD_TYPE: slim
AIO_BASE_TAG: ${{ needs.branch_build_setup.outputs.aio_base_tag }}
AIO_IMAGE_TAGS: makeplane/plane-aio:slim-${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-aio:${{env.BUILD_TYPE}}-stable,makeplane/plane-aio:${{env.BUILD_TYPE}}-${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-aio:${{env.BUILD_TYPE}}-latest
else
TAG=${{ env.AIO_IMAGE_TAGS }}
fi
echo "AIO_IMAGE_TAGS=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.AIO_IMAGE_TAGS }}
push: true
build-args: |
BASE_TAG=${{ env.AIO_BASE_TAG }}
BUILD_TYPE=${{env.BUILD_TYPE}}
cache-from: type=gha
cache-to: type=gha,mode=max
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

View File

@@ -25,6 +25,11 @@ on:
required: false
default: false
type: boolean
aio_build:
description: "Build for AIO docker image"
required: false
default: false
type: boolean
push:
branches:
- preview
@@ -36,6 +41,7 @@ env:
BUILD_TYPE: ${{ github.event.inputs.build_type }}
RELEASE_VERSION: ${{ github.event.inputs.releaseVersion }}
IS_PRERELEASE: ${{ github.event.inputs.isPrerelease }}
AIO_BUILD: ${{ github.event.inputs.aio_build }}
jobs:
branch_build_setup:
@@ -54,11 +60,13 @@ jobs:
dh_img_live: ${{ steps.set_env_variables.outputs.DH_IMG_LIVE }}
dh_img_backend: ${{ steps.set_env_variables.outputs.DH_IMG_BACKEND }}
dh_img_proxy: ${{ steps.set_env_variables.outputs.DH_IMG_PROXY }}
dh_img_aio: ${{ steps.set_env_variables.outputs.DH_IMG_AIO }}
build_type: ${{steps.set_env_variables.outputs.BUILD_TYPE}}
build_release: ${{ steps.set_env_variables.outputs.BUILD_RELEASE }}
build_prerelease: ${{ steps.set_env_variables.outputs.BUILD_PRERELEASE }}
release_version: ${{ steps.set_env_variables.outputs.RELEASE_VERSION }}
aio_build: ${{ steps.set_env_variables.outputs.AIO_BUILD }}
steps:
- id: set_env_variables
@@ -84,12 +92,15 @@ jobs:
echo "DH_IMG_LIVE=plane-live" >> $GITHUB_OUTPUT
echo "DH_IMG_BACKEND=plane-backend" >> $GITHUB_OUTPUT
echo "DH_IMG_PROXY=plane-proxy" >> $GITHUB_OUTPUT
echo "DH_IMG_AIO=plane-aio-community" >> $GITHUB_OUTPUT
echo "BUILD_TYPE=${{env.BUILD_TYPE}}" >> $GITHUB_OUTPUT
BUILD_RELEASE=false
BUILD_PRERELEASE=false
RELVERSION="latest"
BUILD_AIO=${{ env.AIO_BUILD }}
if [ "${{ env.BUILD_TYPE }}" == "Release" ]; then
FLAT_RELEASE_VERSION=$(echo "${{ env.RELEASE_VERSION }}" | sed 's/[^a-zA-Z0-9.-]//g')
echo "FLAT_RELEASE_VERSION=${FLAT_RELEASE_VERSION}" >> $GITHUB_OUTPUT
@@ -108,10 +119,14 @@ jobs:
if [ "${{ env.IS_PRERELEASE }}" == "true" ]; then
BUILD_PRERELEASE=true
fi
BUILD_AIO=true
fi
echo "BUILD_RELEASE=${BUILD_RELEASE}" >> $GITHUB_OUTPUT
echo "BUILD_PRERELEASE=${BUILD_PRERELEASE}" >> $GITHUB_OUTPUT
echo "RELEASE_VERSION=${RELVERSION}" >> $GITHUB_OUTPUT
echo "AIO_BUILD=${BUILD_AIO}" >> $GITHUB_OUTPUT
- id: checkout_files
name: Checkout Files
@@ -242,13 +257,102 @@ jobs:
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_proxy }}
build-context: ./nginx
dockerfile-path: ./nginx/Dockerfile
build-context: ./apps/proxy
dockerfile-path: ./apps/proxy/Dockerfile.ce
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_aio:
if: ${{ needs.branch_build_setup.outputs.aio_build == 'true' }}
name: Build-Push AIO Docker Image
runs-on: ubuntu-22.04
needs: [
branch_build_setup,
branch_build_push_admin,
branch_build_push_web,
branch_build_push_space,
branch_build_push_live,
branch_build_push_api,
branch_build_push_proxy
]
steps:
- name: Checkout Files
uses: actions/checkout@v4
- name: Prepare AIO Assets
id: prepare_aio_assets
run: |
cd deployments/aio/community
if [ "${{ needs.branch_build_setup.outputs.build_type }}" == "Release" ]; then
aio_version=${{ needs.branch_build_setup.outputs.release_version }}
else
aio_version=${{ needs.branch_build_setup.outputs.gh_branch_name }}
fi
bash ./build.sh --release $aio_version
echo "AIO_BUILD_VERSION=${aio_version}" >> $GITHUB_OUTPUT
- name: Upload AIO Assets
uses: actions/upload-artifact@v4
with:
path: ./deployments/aio/community/dist
name: aio-assets-dist
- name: AIO Build and Push
uses: makeplane/actions/build-push@v1.1.0
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
dockerhub-username: ${{ secrets.DOCKERHUB_USERNAME }}
dockerhub-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_aio }}
build-context: ./deployments/aio/community
dockerfile-path: ./deployments/aio/community/Dockerfile
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
additional-assets: aio-assets-dist
additional-assets-dir: ./deployments/aio/community/dist
build-args: |
PLANE_VERSION=${{ steps.prepare_aio_assets.outputs.AIO_BUILD_VERSION }}
upload_build_assets:
name: Upload Build Assets
runs-on: ubuntu-22.04
needs: [branch_build_setup, branch_build_push_admin, branch_build_push_web, branch_build_push_space, branch_build_push_live, branch_build_push_api, branch_build_push_proxy]
steps:
- name: Checkout Files
uses: actions/checkout@v4
- name: Update Assets
run: |
if [ "${{ needs.branch_build_setup.outputs.build_type }}" == "Release" ]; then
REL_VERSION=${{ needs.branch_build_setup.outputs.release_version }}
else
REL_VERSION=${{ needs.branch_build_setup.outputs.gh_branch_name }}
fi
cp ./deployments/cli/community/install.sh deployments/cli/community/setup.sh
sed -i 's/${APP_RELEASE:-stable}/${APP_RELEASE:-'${REL_VERSION}'}/g' deployments/cli/community/docker-compose.yml
# sed -i 's/APP_RELEASE=stable/APP_RELEASE='${REL_VERSION}'/g' deployments/cli/community/variables.env
- name: Upload Assets
uses: actions/upload-artifact@v4
with:
name: community-assets
path: |
./deployments/cli/community/setup.sh
./deployments/cli/community/restore.sh
./deployments/cli/community/restore-airgapped.sh
./deployments/cli/community/docker-compose.yml
./deployments/cli/community/variables.env
./deployments/swarm/community/swarm.sh
publish_release:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Release' }}
name: Build Release
@@ -271,9 +375,9 @@ jobs:
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
sed -i 's/${APP_RELEASE:-stable}/${APP_RELEASE:-'${REL_VERSION}'}/g' deploy/selfhost/docker-compose.yml
# sed -i 's/APP_RELEASE=stable/APP_RELEASE='${REL_VERSION}'/g' deploy/selfhost/variables.env
cp ./deployments/cli/community/install.sh deployments/cli/community/setup.sh
sed -i 's/${APP_RELEASE:-stable}/${APP_RELEASE:-'${REL_VERSION}'}/g' deployments/cli/community/docker-compose.yml
# sed -i 's/APP_RELEASE=stable/APP_RELEASE='${REL_VERSION}'/g' deployments/cli/community/variables.env
- name: Create Release
id: create_release
@@ -287,9 +391,10 @@ jobs:
prerelease: ${{ env.IS_PRERELEASE }}
generate_release_notes: true
files: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/swarm.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/restore-airgapped.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env
${{ github.workspace }}/deployments/cli/community/setup.sh
${{ github.workspace }}/deployments/cli/community/restore.sh
${{ github.workspace }}/deployments/cli/community/restore-airgapped.sh
${{ github.workspace }}/deployments/cli/community/docker-compose.yml
${{ github.workspace }}/deployments/cli/community/variables.env
${{ github.workspace }}/deployments/swarm/community/swarm.sh

View File

@@ -1,95 +0,0 @@
name: Build and Lint on Pull Request
on:
workflow_dispatch:
pull_request:
types: ["opened", "synchronize", "ready_for_review"]
jobs:
lint-server:
if: github.event.pull_request.draft == false
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x" # Specify the Python version you need
- name: Install Pylint
run: python -m pip install ruff
- name: Install Server Dependencies
run: cd apps/server && pip install -r requirements.txt
- name: Lint apps/server
run: ruff check --fix apps/server
lint-admin:
if: github.event.pull_request.draft == false
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn lint --filter=admin
lint-space:
if: github.event.pull_request.draft == false
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn lint --filter=space
lint-web:
if: github.event.pull_request.draft == false
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn lint --filter=web
build-admin:
needs: lint-admin
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn build --filter=admin
build-space:
needs: lint-space
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn build --filter=space
build-web:
needs: lint-web
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 20.x
- run: yarn install
- run: yarn build --filter=web


@@ -0,0 +1,30 @@
name: Build and lint API
on:
workflow_dispatch:
pull_request:
branches: ["preview"]
types: ["opened", "synchronize", "ready_for_review", "review_requested", "reopened"]
paths:
- "apps/api/**"
jobs:
lint-api:
name: Lint API
runs-on: ubuntu-latest
timeout-minutes: 25
if: |
github.event.pull_request.draft == false &&
github.event.pull_request.requested_reviewers != null
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install Ruff
run: python -m pip install ruff
- name: Install API Dependencies
run: cd apps/api && pip install -r requirements.txt
- name: Lint apps/api
run: ruff check --fix apps/api


@@ -0,0 +1,43 @@
name: Build and lint web apps
on:
workflow_dispatch:
pull_request:
branches: ["preview"]
types: ["opened", "synchronize", "ready_for_review", "review_requested", "reopened"]
paths:
- "**.tsx?"
- "**.jsx?"
- "**.css"
- "**.json"
- "!apps/api/**"
jobs:
build-and-lint:
name: Build and lint web apps
runs-on: ubuntu-latest
timeout-minutes: 25
if: |
github.event.pull_request.draft == false &&
github.event.pull_request.requested_reviewers != null
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 2
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version-file: ".nvmrc"
- name: Install dependencies
run: yarn install --frozen-lockfile
- name: Build web apps
run: yarn run build
- name: Lint web apps
run: yarn run ci:lint

.nvmrc Normal file

@@ -0,0 +1 @@
lts/jod


@@ -186,41 +186,39 @@ Adding a new language involves several steps to ensure it integrates seamlessly
1. **Update type definitions**
Add the new language to the TLanguage type in the language definitions file:
```ts
// packages/i18n/src/types/language.ts
export type TLanguage = "en" | "fr" | "your-lang";
```
2. **Add language configuration**
Include the new language in the list of supported languages:
```ts
// packages/i18n/src/constants/language.ts
export const SUPPORTED_LANGUAGES: ILanguageOption[] = [
  { label: "English", value: "en" },
  { label: "Your Language", value: "your-lang" }
];
```
3. **Create translation files**
1. Create a new folder for your language under locales (e.g., `locales/your-lang/`).
2. Add a `translations.json` file inside the folder.
3. Copy the structure from an existing translation file and translate all keys.
4. **Update import logic**
Modify the language import logic to include your new language:
```ts
private importLanguageFile(language: TLanguage): Promise<any> {
  switch (language) {
    case "your-lang":
      return import("../locales/your-lang/translations.json");
    // ...
  }
}
```
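For orientation, here is a minimal, self-contained sketch of how the pieces above can fit together. The `LanguageService` class and its `load` helper are illustrative stand-ins, not Plane's actual implementation; only the switch-based dynamic import mirrors step 4:
```ts
// Hypothetical wrapper around the import logic above.
type TLanguage = "en" | "fr" | "your-lang";

class LanguageService {
  // Mirrors the switch-based dynamic import from step 4.
  private importLanguageFile(language: TLanguage): Promise<any> {
    switch (language) {
      case "fr":
        return import("../locales/fr/translations.json");
      case "your-lang":
        return import("../locales/your-lang/translations.json");
      default:
        // Fall back to English so an unsupported locale never breaks the UI.
        return import("../locales/en/translations.json");
    }
  }

  async load(language: TLanguage): Promise<Record<string, unknown>> {
    const module = await this.importLanguageFile(language);
    // JSON modules may expose their contents on `default`, depending on bundler settings.
    return module.default ?? module;
  }
}
```
Keeping English as the `default` branch means an unhandled value fails soft at runtime, while the `TLanguage` union still catches typos at compile time.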
### Quality checklist


@@ -2,11 +2,10 @@
<p align="center">
<a href="https://plane.so">
<img src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_logo_.webp" alt="Plane Logo" width="70">
<img src="https://media.docs.plane.so/logo/plane_github_readme.png" alt="Plane Logo" width="400">
</a>
</p>
<h1 align="center"><b>Plane</b></h1>
<p align="center"><b>Open-source project management that unlocks customer value</b></p>
<p align="center"><b>Modern project management for all teams</b></p>
<p align="center">
<a href="https://discord.com/invite/A92xrEGCge">
@@ -25,14 +24,7 @@
<p>
<a href="https://app.plane.so/#gh-light-mode-only" target="_blank">
<img
src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_screen.webp"
alt="Plane Screens"
width="100%"
/>
</a>
<a href="https://app.plane.so/#gh-dark-mode-only" target="_blank">
<img
src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_screens_dark_mode.webp"
src="https://media.docs.plane.so/GitHub-readme/github-top.webp"
alt="Plane Screens"
width="100%"
/>
@@ -48,13 +40,13 @@ Meet [Plane](https://plane.so/), an open-source project management tool to track
Getting started with Plane is simple. Choose the setup that works best for you:
- **Plane Cloud**
Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
- **Self-host Plane**
Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
| Installation methods | Docs link |
| -------------------- | --------- |
| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://developers.plane.so/self-hosting/methods/docker-compose) |
| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://developers.plane.so/self-hosting/methods/kubernetes) |
@@ -63,58 +55,58 @@ Prefer full control over your data and infrastructure? Install and run Plane on
## 🌟 Features
- **Issues**
Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
- **Cycles**
Maintain your team's momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Views**
Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
- **Pages**
Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
- **Analytics**
Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
- **Drive** (_coming soon_): Share documents, images, videos, or any other files with your team and stay aligned on the problem and solution.
## 🛠️ Local development
See [CONTRIBUTING](./CONTRIBUTING.md)
## ⚙️ Built with
[![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)
[![Django](https://img.shields.io/badge/Django-092E20?style=for-the-badge&logo=django&logoColor=green)](https://www.djangoproject.com/)
[![Node JS](https://img.shields.io/badge/node.js-339933?style=for-the-badge&logo=Node.js&logoColor=white)](https://nodejs.org/en)
## 📸 Screenshots
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://media.docs.plane.so/GitHub-readme/github-work-items.webp"
alt="Plane Views"
width="100%"
/>
</a>
</p>
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://media.docs.plane.so/GitHub-readme/github-cycles.webp"
width="100%"
/>
</a>
</p>
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://media.docs.plane.so/GitHub-readme/github-modules.webp"
alt="Plane Cycles and Modules"
width="100%"
/>
@@ -123,7 +115,7 @@ See [CONTRIBUTING](./CONTRIBUTING.md)
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://ik.imagekit.io/w2okwbtu2/Views_uxXsRatS4.png?updatedAt=1709298834522"
src="https://media.docs.plane.so/GitHub-readme/github-views.webp"
alt="Plane Analytics"
width="100%"
/>
@@ -132,25 +124,16 @@ See [CONTRIBUTING](./CONTRIBUTING.md)
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://ik.imagekit.io/w2okwbtu2/Analytics_0o22gLRtp.png?updatedAt=1709298834389"
src="https://media.docs.plane.so/GitHub-readme/github-analytics.webp"
alt="Plane Pages"
width="100%"
/>
</a>
</p>
</p>
## 📝 Documentation
Explore Plane's [product documentation](https://docs.plane.so/) and [developer documentation](https://developers.plane.so/) to learn about features, setup, and usage.
## ❤️ Community
@@ -186,6 +169,6 @@ Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CON
<img src="https://contrib.rocks/image?repo=makeplane/plane" />
</a>
## License
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).


@@ -1,182 +0,0 @@
ARG BASE_TAG=develop
ARG BUILD_TYPE=full
# *****************************************************************************
# STAGE 1: Build the project
# *****************************************************************************
FROM node:18-alpine AS builder
RUN apk add --no-cache libc6-compat
# Set working directory
WORKDIR /app
RUN yarn global add turbo
COPY . .
RUN turbo prune --scope=web --scope=space --scope=admin --docker
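# "turbo prune --docker" writes out/json (package manifests + lockfile) and out/full (full sources) for the next stage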
# *****************************************************************************
# STAGE 2: Install dependencies & build the project
# *****************************************************************************
# Add lockfile and package.json's of isolated subworkspace
FROM node:18-alpine AS installer
RUN apk add --no-cache libc6-compat
WORKDIR /app
# First install the dependencies (as they change less often)
COPY .gitignore .gitignore
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install
# Build the project
COPY --from=builder /app/out/full/ .
COPY turbo.json turbo.json
ARG NEXT_PUBLIC_API_BASE_URL=""
ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
ARG NEXT_PUBLIC_SPACE_BASE_URL=""
ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
ARG NEXT_PUBLIC_WEB_BASE_URL=""
ENV NEXT_PUBLIC_WEB_BASE_URL=$NEXT_PUBLIC_WEB_BASE_URL
ENV NEXT_TELEMETRY_DISABLED=1
ENV TURBO_TELEMETRY_DISABLED=1
RUN yarn turbo run build --filter=web --filter=space --filter=admin
# *****************************************************************************
# STAGE 3: Copy the project and start it
# *****************************************************************************
FROM makeplane/plane-aio-base:${BUILD_TYPE}-${BASE_TAG} AS runner
WORKDIR /app
SHELL [ "/bin/bash", "-c" ]
# PYTHON APPLICATION SETUP
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
COPY apps/api/requirements.txt ./api/
COPY apps/api/requirements ./api/requirements
RUN pip install -r ./api/requirements.txt --compile --no-cache-dir
# Add in Django deps and generate Django's static files
COPY apps/api/manage.py ./api/manage.py
COPY apps/api/plane ./api/plane/
COPY apps/api/templates ./api/templates/
COPY package.json ./api/package.json
COPY apps/api/bin ./api/bin/
RUN chmod +x ./api/bin/*
RUN chmod -R 777 ./api/
# NEXTJS BUILDS
COPY --from=installer /app/web/next.config.js ./web/
COPY --from=installer /app/web/package.json ./web/
COPY --from=installer /app/web/.next/standalone ./web
COPY --from=installer /app/web/.next/static ./web/web/.next/static
COPY --from=installer /app/web/public ./web/web/public
COPY --from=installer /app/space/next.config.js ./space/
COPY --from=installer /app/space/package.json ./space/
COPY --from=installer /app/space/.next/standalone ./space
COPY --from=installer /app/space/.next/static ./space/space/.next/static
COPY --from=installer /app/space/public ./space/space/public
COPY --from=installer /app/admin/next.config.js ./admin/
COPY --from=installer /app/admin/package.json ./admin/
COPY --from=installer /app/admin/.next/standalone ./admin
COPY --from=installer /app/admin/.next/static ./admin/admin/.next/static
COPY --from=installer /app/admin/public ./admin/admin/public
ARG NEXT_PUBLIC_API_BASE_URL=""
ENV NEXT_PUBLIC_API_BASE_URL=$NEXT_PUBLIC_API_BASE_URL
ARG NEXT_PUBLIC_ADMIN_BASE_URL=""
ENV NEXT_PUBLIC_ADMIN_BASE_URL=$NEXT_PUBLIC_ADMIN_BASE_URL
ARG NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
ENV NEXT_PUBLIC_ADMIN_BASE_PATH=$NEXT_PUBLIC_ADMIN_BASE_PATH
ARG NEXT_PUBLIC_SPACE_BASE_URL=""
ENV NEXT_PUBLIC_SPACE_BASE_URL=$NEXT_PUBLIC_SPACE_BASE_URL
ARG NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
ENV NEXT_PUBLIC_SPACE_BASE_PATH=$NEXT_PUBLIC_SPACE_BASE_PATH
ARG NEXT_PUBLIC_WEB_BASE_URL=""
ENV NEXT_PUBLIC_WEB_BASE_URL=$NEXT_PUBLIC_WEB_BASE_URL
ENV NEXT_TELEMETRY_DISABLED=1
ENV TURBO_TELEMETRY_DISABLED=1
ARG BUILD_TYPE=full
ENV BUILD_TYPE=$BUILD_TYPE
COPY aio/supervisord-${BUILD_TYPE}-base /app/supervisord.conf
COPY aio/supervisord-app /app/supervisord-app
RUN cat /app/supervisord-app >> /app/supervisord.conf && \
rm /app/supervisord-app
COPY ./aio/nginx.conf /etc/nginx/nginx.conf.template
# For full builds, pg-setup.sh (copied below) bootstraps the bundled PostgreSQL
COPY aio/postgresql.conf /etc/postgresql/postgresql.conf
COPY aio/pg-setup.sh /app/pg-setup.sh
RUN chmod +x /app/pg-setup.sh
# *****************************************************************************
# APPLICATION ENVIRONMENT SETTINGS
# *****************************************************************************
ENV APP_DOMAIN=localhost
ENV WEB_URL=http://${APP_DOMAIN}
ENV DEBUG=0
ENV CORS_ALLOWED_ORIGINS=http://${APP_DOMAIN},https://${APP_DOMAIN}
# Secret Key
ENV SECRET_KEY=60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5
# Gunicorn Workers
ENV GUNICORN_WORKERS=1
ENV POSTGRES_USER="plane"
ENV POSTGRES_PASSWORD="plane"
ENV POSTGRES_DB="plane"
ENV POSTGRES_HOST="localhost"
ENV POSTGRES_PORT="5432"
ENV DATABASE_URL="postgresql://plane:plane@localhost:5432/plane"
ENV REDIS_HOST="localhost"
ENV REDIS_PORT="6379"
ENV REDIS_URL="redis://localhost:6379"
ENV USE_MINIO="1"
ENV AWS_REGION=""
ENV AWS_ACCESS_KEY_ID="access-key"
ENV AWS_SECRET_ACCESS_KEY="secret-key"
ENV AWS_S3_ENDPOINT_URL="http://localhost:9000"
ENV AWS_S3_BUCKET_NAME="uploads"
ENV MINIO_ROOT_USER="access-key"
ENV MINIO_ROOT_PASSWORD="secret-key"
ENV BUCKET_NAME="uploads"
ENV FILE_SIZE_LIMIT="5242880"
# *****************************************************************************
RUN /app/pg-setup.sh
CMD ["/usr/bin/supervisord", "-c", "/app/supervisord.conf"]


@@ -1,73 +0,0 @@
FROM --platform=$BUILDPLATFORM tonistiigi/binfmt AS binfmt
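# The binfmt stage provides QEMU emulation support for cross-architecture (multi-arch) builds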
FROM python:3.12-slim
# Set environment variables to non-interactive for apt
ENV DEBIAN_FRONTEND=noninteractive
ENV BUILD_TYPE=full
SHELL [ "/bin/bash", "-c" ]
WORKDIR /app
RUN mkdir -p /app/{data,logs} && \
mkdir -p /app/data/{redis,pg,minio,nginx} && \
mkdir -p /app/logs/{access,error} && \
mkdir -p /etc/supervisor/conf.d
# Update the package list and install prerequisites
RUN apt-get update && \
apt-get install -y \
gnupg2 curl ca-certificates lsb-release software-properties-common \
build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev \
libsqlite3-dev wget llvm libncurses5-dev libncursesw5-dev xz-utils \
tk-dev libffi-dev liblzma-dev supervisor nginx nano vim ncdu \
sudo lsof net-tools libpq-dev procps gettext
# Install Redis 7.2
RUN echo "deb http://deb.debian.org/debian $(lsb_release -cs)-backports main" > /etc/apt/sources.list.d/backports.list && \
curl -fsSL https://packages.redis.io/gpg | gpg --dearmor -o /usr/share/keyrings/redis-archive-keyring.gpg && \
echo "deb [signed-by=/usr/share/keyrings/redis-archive-keyring.gpg] https://packages.redis.io/deb $(lsb_release -cs) main" > /etc/apt/sources.list.d/redis.list && \
apt-get update && \
apt-get install -y redis-server
# Install PostgreSQL 15
ENV POSTGRES_VERSION=15
RUN curl -fsSL https://www.postgresql.org/media/keys/ACCC4CF8.asc | gpg --dearmor -o /usr/share/keyrings/pgdg-archive-keyring.gpg && \
echo "deb [signed-by=/usr/share/keyrings/pgdg-archive-keyring.gpg] http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list && \
apt-get update && \
apt-get install -y postgresql-$POSTGRES_VERSION postgresql-client-$POSTGRES_VERSION && \
mkdir -p /var/lib/postgresql/data && \
chown -R postgres:postgres /var/lib/postgresql
COPY postgresql.conf /etc/postgresql/postgresql.conf
RUN sudo -u postgres /usr/lib/postgresql/$POSTGRES_VERSION/bin/initdb -D /var/lib/postgresql/data
# Install MinIO
ARG TARGETARCH
RUN if [ "$TARGETARCH" = "amd64" ]; then \
curl -fsSL https://dl.min.io/server/minio/release/linux-amd64/minio -o /usr/local/bin/minio; \
elif [ "$TARGETARCH" = "arm64" ]; then \
curl -fsSL https://dl.min.io/server/minio/release/linux-arm64/minio -o /usr/local/bin/minio; \
else \
echo "Unsupported architecture: $TARGETARCH"; exit 1; \
fi && \
chmod +x /usr/local/bin/minio
# Install Node.js 18
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - && \
apt-get install -y nodejs && \
python -m pip install --upgrade pip && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
# Create Supervisor configuration file
COPY supervisord-full-base /app/supervisord.conf
COPY nginx.conf /etc/nginx/nginx.conf.template
COPY env.sh /app/nginx-start.sh
RUN chmod +x /app/nginx-start.sh
# Expose ports for Redis, PostgreSQL, and MinIO
EXPOSE 6379 5432 9000 80 443
# Start Supervisor
CMD ["/usr/bin/supervisord", "-c", "/app/supervisord.conf"]


@@ -1,45 +0,0 @@
FROM --platform=$BUILDPLATFORM tonistiigi/binfmt AS binfmt
FROM python:3.12-slim
# Set environment variables to non-interactive for apt
ENV DEBIAN_FRONTEND=noninteractive
ENV BUILD_TYPE=slim
SHELL [ "/bin/bash", "-c" ]
WORKDIR /app
RUN mkdir -p /app/{data,logs} && \
mkdir -p /app/data/nginx && \
mkdir -p /app/logs/{access,error} && \
mkdir -p /etc/supervisor/conf.d
# Update the package list and install prerequisites
RUN apt-get update && \
apt-get install -y \
gnupg2 curl ca-certificates lsb-release software-properties-common \
build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev \
libsqlite3-dev wget llvm libncurses5-dev libncursesw5-dev xz-utils \
tk-dev libffi-dev liblzma-dev supervisor nginx nano vim ncdu \
sudo lsof net-tools libpq-dev procps gettext
# Install Node.js 18
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - && \
apt-get install -y nodejs
RUN python -m pip install --upgrade pip && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*
# Create Supervisor configuration file
COPY supervisord-slim-base /app/supervisord.conf
COPY nginx.conf /etc/nginx/nginx.conf.template
COPY env.sh /app/nginx-start.sh
RUN chmod +x /app/nginx-start.sh
# Expose HTTP/HTTPS ports for nginx
EXPOSE 80 443
# Start Supervisor
CMD ["/usr/bin/supervisord", "-c", "/app/supervisord.conf"]


@@ -1,7 +0,0 @@
#!/bin/bash
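# envsubst would otherwise swallow nginx's runtime variables; exporting dollar="$" lets the template spell them as ${dollar}http_upgrade so they come out as $http_upgrade.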
export dollar="$"
export http_upgrade="http_upgrade"
export scheme="scheme"
envsubst < /etc/nginx/nginx.conf.template > /etc/nginx/nginx.conf
exec nginx -g 'daemon off;'


@@ -1,72 +0,0 @@
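# nginx template: ${dollar}, ${FILE_SIZE_LIMIT}, and ${BUCKET_NAME} are substituted by envsubst (see env.sh) before nginx starts.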
events {
}
http {
sendfile on;
server {
listen 80;
root /www/data/;
access_log /var/log/nginx/access.log;
client_max_body_size ${FILE_SIZE_LIMIT};
add_header X-Content-Type-Options "nosniff" always;
add_header Referrer-Policy "no-referrer-when-downgrade" always;
add_header Permissions-Policy "interest-cohort=()" always;
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header X-Forwarded-Proto "${dollar}scheme";
add_header X-Forwarded-Host "${dollar}host";
add_header X-Forwarded-For "${dollar}proxy_add_x_forwarded_for";
add_header X-Real-IP "${dollar}remote_addr";
location / {
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:3001/;
}
location /spaces/ {
rewrite ^/spaces/?$ /spaces/login break;
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:3002/spaces/;
}
location /god-mode/ {
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:3003/god-mode/;
}
location /api/ {
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:8000/api/;
}
location /auth/ {
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:8000/auth/;
}
location /${BUCKET_NAME}/ {
proxy_http_version 1.1;
proxy_set_header Upgrade ${dollar}http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host ${dollar}http_host;
proxy_pass http://localhost:9000/uploads/;
}
}
}


@@ -1,14 +0,0 @@
#!/bin/bash
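# One-time bootstrap for full builds: start the bundled PostgreSQL, create the application role and database, grant privileges, then stop the server.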
if [ "$BUILD_TYPE" == "full" ]; then
export PGHOST=localhost
sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/pg_ctl" -D /var/lib/postgresql/data start
sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/psql" --command "CREATE USER $POSTGRES_USER WITH SUPERUSER PASSWORD '$POSTGRES_PASSWORD';" && \
sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/createdb" -O "$POSTGRES_USER" "$POSTGRES_DB" && \
sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/psql" --command "GRANT ALL PRIVILEGES ON DATABASE $POSTGRES_DB TO $POSTGRES_USER;" && \
sudo -u postgres "/usr/lib/postgresql/${POSTGRES_VERSION}/bin/pg_ctl" -D /var/lib/postgresql/data stop
fi


@@ -1,815 +0,0 @@
# -----------------------------
# PostgreSQL configuration file
# -----------------------------
#
# This file consists of lines of the form:
#
# name = value
#
# (The "=" is optional.) Whitespace may be used. Comments are introduced with
# "#" anywhere on a line. The complete list of parameter names and allowed
# values can be found in the PostgreSQL documentation.
#
# The commented-out settings shown in this file represent the default values.
# Re-commenting a setting is NOT sufficient to revert it to the default value;
# you need to reload the server.
#
# This file is read on server startup and when the server receives a SIGHUP
# signal. If you edit the file on a running system, you have to SIGHUP the
# server for the changes to take effect, run "pg_ctl reload", or execute
# "SELECT pg_reload_conf()". Some parameters, which are marked below,
# require a server shutdown and restart to take effect.
#
# Any parameter can also be given as a command-line option to the server, e.g.,
# "postgres -c log_connections=on". Some parameters can be changed at run time
# with the "SET" SQL command.
#
# Memory units: B = bytes Time units: us = microseconds
# kB = kilobytes ms = milliseconds
# MB = megabytes s = seconds
# GB = gigabytes min = minutes
# TB = terabytes h = hours
# d = days
#------------------------------------------------------------------------------
# FILE LOCATIONS
#------------------------------------------------------------------------------
# The default values of these variables are driven from the -D command-line
# option or PGDATA environment variable, represented here as ConfigDir.
data_directory = '/var/lib/postgresql/data' # use data in another directory
# (change requires restart)
hba_file = '/etc/postgresql/15/main/pg_hba.conf' # host-based authentication file
# (change requires restart)
ident_file = '/etc/postgresql/15/main/pg_ident.conf' # ident configuration file
# (change requires restart)
# If external_pid_file is not explicitly set, no extra PID file is written.
external_pid_file = '/var/run/postgresql/15-main.pid' # write an extra PID file
# (change requires restart)
#------------------------------------------------------------------------------
# CONNECTIONS AND AUTHENTICATION
#------------------------------------------------------------------------------
# - Connection Settings -
listen_addresses = 'localhost' # what IP address(es) to listen on;
# comma-separated list of addresses;
# defaults to 'localhost'; use '*' for all
# (change requires restart)
port = 5432 # (change requires restart)
max_connections = 200 # (change requires restart)
#superuser_reserved_connections = 3 # (change requires restart)
unix_socket_directories = '/var/run/postgresql' # comma-separated list of directories
# (change requires restart)
#unix_socket_group = '' # (change requires restart)
#unix_socket_permissions = 0777 # begin with 0 to use octal notation
# (change requires restart)
#bonjour = off # advertise server via Bonjour
# (change requires restart)
#bonjour_name = '' # defaults to the computer name
# (change requires restart)
# - TCP settings -
# see "man tcp" for details
#tcp_keepalives_idle = 0 # TCP_KEEPIDLE, in seconds;
# 0 selects the system default
#tcp_keepalives_interval = 0 # TCP_KEEPINTVL, in seconds;
# 0 selects the system default
#tcp_keepalives_count = 0 # TCP_KEEPCNT;
# 0 selects the system default
#tcp_user_timeout = 0 # TCP_USER_TIMEOUT, in milliseconds;
# 0 selects the system default
#client_connection_check_interval = 0 # time between checks for client
# disconnection while running queries;
# 0 for never
# - Authentication -
#authentication_timeout = 1min # 1s-600s
#password_encryption = scram-sha-256 # scram-sha-256 or md5
#db_user_namespace = off
# GSSAPI using Kerberos
#krb_server_keyfile = 'FILE:${sysconfdir}/krb5.keytab'
#krb_caseins_users = off
# - SSL -
ssl = on
#ssl_ca_file = ''
ssl_cert_file = '/etc/ssl/certs/ssl-cert-snakeoil.pem'
#ssl_crl_file = ''
#ssl_crl_dir = ''
ssl_key_file = '/etc/ssl/private/ssl-cert-snakeoil.key'
#ssl_ciphers = 'HIGH:MEDIUM:+3DES:!aNULL' # allowed SSL ciphers
#ssl_prefer_server_ciphers = on
#ssl_ecdh_curve = 'prime256v1'
#ssl_min_protocol_version = 'TLSv1.2'
#ssl_max_protocol_version = ''
#ssl_dh_params_file = ''
#ssl_passphrase_command = ''
#ssl_passphrase_command_supports_reload = off
#------------------------------------------------------------------------------
# RESOURCE USAGE (except WAL)
#------------------------------------------------------------------------------
# - Memory -
shared_buffers = 256MB # min 128kB
# (change requires restart)
#huge_pages = try # on, off, or try
# (change requires restart)
#huge_page_size = 0 # zero for system default
# (change requires restart)
#temp_buffers = 8MB # min 800kB
#max_prepared_transactions = 0 # zero disables the feature
# (change requires restart)
# Caution: it is not advisable to set max_prepared_transactions nonzero unless
# you actively intend to use prepared transactions.
#work_mem = 4MB # min 64kB
#hash_mem_multiplier = 2.0 # 1-1000.0 multiplier on hash table work_mem
#maintenance_work_mem = 64MB # min 1MB
#autovacuum_work_mem = -1 # min 1MB, or -1 to use maintenance_work_mem
#logical_decoding_work_mem = 64MB # min 64kB
#max_stack_depth = 2MB # min 100kB
#shared_memory_type = mmap # the default is the first option
# supported by the operating system:
# mmap
# sysv
# windows
# (change requires restart)
dynamic_shared_memory_type = posix # the default is usually the first option
# supported by the operating system:
# posix
# sysv
# windows
# mmap
# (change requires restart)
#min_dynamic_shared_memory = 0MB # (change requires restart)
# - Disk -
#temp_file_limit = -1 # limits per-process temp file space
# in kilobytes, or -1 for no limit
# - Kernel Resources -
#max_files_per_process = 1000 # min 64
# (change requires restart)
# - Cost-Based Vacuum Delay -
#vacuum_cost_delay = 0 # 0-100 milliseconds (0 disables)
#vacuum_cost_page_hit = 1 # 0-10000 credits
#vacuum_cost_page_miss = 2 # 0-10000 credits
#vacuum_cost_page_dirty = 20 # 0-10000 credits
#vacuum_cost_limit = 200 # 1-10000 credits
# - Background Writer -
#bgwriter_delay = 200ms # 10-10000ms between rounds
#bgwriter_lru_maxpages = 100 # max buffers written/round, 0 disables
#bgwriter_lru_multiplier = 2.0 # 0-10.0 multiplier on buffers scanned/round
#bgwriter_flush_after = 512kB # measured in pages, 0 disables
# - Asynchronous Behavior -
#backend_flush_after = 0 # measured in pages, 0 disables
#effective_io_concurrency = 1 # 1-1000; 0 disables prefetching
#maintenance_io_concurrency = 10 # 1-1000; 0 disables prefetching
#max_worker_processes = 8 # (change requires restart)
#max_parallel_workers_per_gather = 2 # limited by max_parallel_workers
#max_parallel_maintenance_workers = 2 # limited by max_parallel_workers
#max_parallel_workers = 8 # number of max_worker_processes that
# can be used in parallel operations
#parallel_leader_participation = on
#old_snapshot_threshold = -1 # 1min-60d; -1 disables; 0 is immediate
# (change requires restart)
#------------------------------------------------------------------------------
# WRITE-AHEAD LOG
#------------------------------------------------------------------------------
# - Settings -
#wal_level = replica # minimal, replica, or logical
# (change requires restart)
#fsync = on # flush data to disk for crash safety
# (turning this off can cause
# unrecoverable data corruption)
#synchronous_commit = on # synchronization level;
# off, local, remote_write, remote_apply, or on
#wal_sync_method = fsync # the default is the first option
# supported by the operating system:
# open_datasync
# fdatasync (default on Linux and FreeBSD)
# fsync
# fsync_writethrough
# open_sync
#full_page_writes = on # recover from partial page writes
#wal_log_hints = off # also do full page writes of non-critical updates
# (change requires restart)
#wal_compression = off # enables compression of full-page writes;
# off, pglz, lz4, zstd, or on
#wal_init_zero = on # zero-fill new WAL files
#wal_recycle = on # recycle WAL files
#wal_buffers = -1 # min 32kB, -1 sets based on shared_buffers
# (change requires restart)
#wal_writer_delay = 200ms # 1-10000 milliseconds
#wal_writer_flush_after = 1MB # measured in pages, 0 disables
#wal_skip_threshold = 2MB
#commit_delay = 0 # range 0-100000, in microseconds
#commit_siblings = 5 # range 1-1000
# - Checkpoints -
#checkpoint_timeout = 5min # range 30s-1d
#checkpoint_completion_target = 0.9 # checkpoint target duration, 0.0 - 1.0
#checkpoint_flush_after = 256kB # measured in pages, 0 disables
#checkpoint_warning = 30s # 0 disables
max_wal_size = 1GB
min_wal_size = 80MB
# - Prefetching during recovery -
#recovery_prefetch = try # prefetch pages referenced in the WAL?
#wal_decode_buffer_size = 512kB # lookahead window used for prefetching
# (change requires restart)
# - Archiving -
#archive_mode = off # enables archiving; off, on, or always
# (change requires restart)
#archive_library = '' # library to use to archive a logfile segment
# (empty string indicates archive_command should
# be used)
#archive_command = '' # command to use to archive a logfile segment
# placeholders: %p = path of file to archive
# %f = file name only
# e.g. 'test ! -f /mnt/server/archivedir/%f && cp %p /mnt/server/archivedir/%f'
#archive_timeout = 0 # force a logfile segment switch after this
# number of seconds; 0 disables
# - Archive Recovery -
# These are only used in recovery mode.
#restore_command = '' # command to use to restore an archived logfile segment
# placeholders: %p = path of file to restore
# %f = file name only
# e.g. 'cp /mnt/server/archivedir/%f %p'
#archive_cleanup_command = '' # command to execute at every restartpoint
#recovery_end_command = '' # command to execute at completion of recovery
# - Recovery Target -
# Set these only when performing a targeted recovery.
#recovery_target = '' # 'immediate' to end recovery as soon as a
# consistent state is reached
# (change requires restart)
#recovery_target_name = '' # the named restore point to which recovery will proceed
# (change requires restart)
#recovery_target_time = '' # the time stamp up to which recovery will proceed
# (change requires restart)
#recovery_target_xid = '' # the transaction ID up to which recovery will proceed
# (change requires restart)
#recovery_target_lsn = '' # the WAL LSN up to which recovery will proceed
# (change requires restart)
#recovery_target_inclusive = on # Specifies whether to stop:
# just after the specified recovery target (on)
# just before the recovery target (off)
# (change requires restart)
#recovery_target_timeline = 'latest' # 'current', 'latest', or timeline ID
# (change requires restart)
#recovery_target_action = 'pause' # 'pause', 'promote', 'shutdown'
# (change requires restart)
#------------------------------------------------------------------------------
# REPLICATION
#------------------------------------------------------------------------------
# - Sending Servers -
# Set these on the primary and on any standby that will send replication data.
#max_wal_senders = 10 # max number of walsender processes
# (change requires restart)
#max_replication_slots = 10 # max number of replication slots
# (change requires restart)
#wal_keep_size = 0 # in megabytes; 0 disables
#max_slot_wal_keep_size = -1 # in megabytes; -1 disables
#wal_sender_timeout = 60s # in milliseconds; 0 disables
#track_commit_timestamp = off # collect timestamp of transaction commit
# (change requires restart)
# - Primary Server -
# These settings are ignored on a standby server.
#synchronous_standby_names = '' # standby servers that provide sync rep
# method to choose sync standbys, number of sync standbys,
# and comma-separated list of application_name
# from standby(s); '*' = all
#vacuum_defer_cleanup_age = 0 # number of xacts by which cleanup is delayed
# - Standby Servers -
# These settings are ignored on a primary server.
#primary_conninfo = '' # connection string to sending server
#primary_slot_name = '' # replication slot on sending server
#promote_trigger_file = '' # file name whose presence ends recovery
#hot_standby = on # "off" disallows queries during recovery
# (change requires restart)
#max_standby_archive_delay = 30s # max delay before canceling queries
# when reading WAL from archive;
# -1 allows indefinite delay
#max_standby_streaming_delay = 30s # max delay before canceling queries
# when reading streaming WAL;
# -1 allows indefinite delay
#wal_receiver_create_temp_slot = off # create temp slot if primary_slot_name
# is not set
#wal_receiver_status_interval = 10s # send replies at least this often
# 0 disables
#hot_standby_feedback = off # send info from standby to prevent
# query conflicts
#wal_receiver_timeout = 60s # time that receiver waits for
# communication from primary
# in milliseconds; 0 disables
#wal_retrieve_retry_interval = 5s # time to wait before retrying to
# retrieve WAL after a failed attempt
#recovery_min_apply_delay = 0 # minimum delay for applying changes during recovery
# - Subscribers -
# These settings are ignored on a publisher.
#max_logical_replication_workers = 4 # taken from max_worker_processes
# (change requires restart)
#max_sync_workers_per_subscription = 2 # taken from max_logical_replication_workers
#------------------------------------------------------------------------------
# QUERY TUNING
#------------------------------------------------------------------------------
# - Planner Method Configuration -
#enable_async_append = on
#enable_bitmapscan = on
#enable_gathermerge = on
#enable_hashagg = on
#enable_hashjoin = on
#enable_incremental_sort = on
#enable_indexscan = on
#enable_indexonlyscan = on
#enable_material = on
#enable_memoize = on
#enable_mergejoin = on
#enable_nestloop = on
#enable_parallel_append = on
#enable_parallel_hash = on
#enable_partition_pruning = on
#enable_partitionwise_join = off
#enable_partitionwise_aggregate = off
#enable_seqscan = on
#enable_sort = on
#enable_tidscan = on
# - Planner Cost Constants -
#seq_page_cost = 1.0 # measured on an arbitrary scale
#random_page_cost = 4.0 # same scale as above
#cpu_tuple_cost = 0.01 # same scale as above
#cpu_index_tuple_cost = 0.005 # same scale as above
#cpu_operator_cost = 0.0025 # same scale as above
#parallel_setup_cost = 1000.0 # same scale as above
#parallel_tuple_cost = 0.1 # same scale as above
#min_parallel_table_scan_size = 8MB
#min_parallel_index_scan_size = 512kB
#effective_cache_size = 4GB
#jit_above_cost = 100000 # perform JIT compilation if available
# and query more expensive than this;
# -1 disables
#jit_inline_above_cost = 500000 # inline small functions if query is
# more expensive than this; -1 disables
#jit_optimize_above_cost = 500000 # use expensive JIT optimizations if
# query is more expensive than this;
# -1 disables
# - Genetic Query Optimizer -
#geqo = on
#geqo_threshold = 12
#geqo_effort = 5 # range 1-10
#geqo_pool_size = 0 # selects default based on effort
#geqo_generations = 0 # selects default based on effort
#geqo_selection_bias = 2.0 # range 1.5-2.0
#geqo_seed = 0.0 # range 0.0-1.0
# - Other Planner Options -
#default_statistics_target = 100 # range 1-10000
#constraint_exclusion = partition # on, off, or partition
#cursor_tuple_fraction = 0.1 # range 0.0-1.0
#from_collapse_limit = 8
#jit = on # allow JIT compilation
#join_collapse_limit = 8 # 1 disables collapsing of explicit
# JOIN clauses
#plan_cache_mode = auto # auto, force_generic_plan or
# force_custom_plan
#recursive_worktable_factor = 10.0 # range 0.001-1000000
#------------------------------------------------------------------------------
# REPORTING AND LOGGING
#------------------------------------------------------------------------------
# - Where to Log -
#log_destination = 'stderr' # Valid values are combinations of
# stderr, csvlog, jsonlog, syslog, and
# eventlog, depending on platform.
# csvlog and jsonlog require
# logging_collector to be on.
# This is used when logging to stderr:
#logging_collector = off # Enable capturing of stderr, jsonlog,
# and csvlog into log files. Required
# to be on for csvlogs and jsonlogs.
# (change requires restart)
# These are only used if logging_collector is on:
#log_directory = 'log' # directory where log files are written,
# can be absolute or relative to PGDATA
#log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log' # log file name pattern,
# can include strftime() escapes
#log_file_mode = 0600 # creation mode for log files,
# begin with 0 to use octal notation
#log_rotation_age = 1d # Automatic rotation of logfiles will
# happen after that time. 0 disables.
#log_rotation_size = 10MB # Automatic rotation of logfiles will
# happen after that much log output.
# 0 disables.
#log_truncate_on_rotation = off # If on, an existing log file with the
# same name as the new log file will be
# truncated rather than appended to.
# But such truncation only occurs on
# time-driven rotation, not on restarts
# or size-driven rotation. Default is
# off, meaning append to existing files
# in all cases.
# These are relevant when logging to syslog:
#syslog_facility = 'LOCAL0'
#syslog_ident = 'postgres'
#syslog_sequence_numbers = on
#syslog_split_messages = on
# This is only relevant when logging to eventlog (Windows):
# (change requires restart)
#event_source = 'PostgreSQL'
# - When to Log -
#log_min_messages = warning # values in order of decreasing detail:
# debug5
# debug4
# debug3
# debug2
# debug1
# info
# notice
# warning
# error
# log
# fatal
# panic
#log_min_error_statement = error # values in order of decreasing detail:
# debug5
# debug4
# debug3
# debug2
# debug1
# info
# notice
# warning
# error
# log
# fatal
# panic (effectively off)
#log_min_duration_statement = -1 # -1 is disabled, 0 logs all statements
# and their durations, > 0 logs only
# statements running at least this number
# of milliseconds
#log_min_duration_sample = -1 # -1 is disabled, 0 logs a sample of statements
# and their durations, > 0 logs only a sample of
# statements running at least this number
# of milliseconds;
# sample fraction is determined by log_statement_sample_rate
#log_statement_sample_rate = 1.0 # fraction of logged statements exceeding
# log_min_duration_sample to be logged;
# 1.0 logs all such statements, 0.0 never logs
#log_transaction_sample_rate = 0.0 # fraction of transactions whose statements
# are logged regardless of their duration; 1.0 logs all
# statements from all transactions, 0.0 never logs
#log_startup_progress_interval = 10s # Time between progress updates for
# long-running startup operations.
# 0 disables the feature, > 0 indicates
# the interval in milliseconds.
# - What to Log -
#debug_print_parse = off
#debug_print_rewritten = off
#debug_print_plan = off
#debug_pretty_print = on
#log_autovacuum_min_duration = 10min # log autovacuum activity;
# -1 disables, 0 logs all actions and
# their durations, > 0 logs only
# actions running at least this number
# of milliseconds.
#log_checkpoints = on
#log_connections = off
#log_disconnections = off
#log_duration = off
#log_error_verbosity = default # terse, default, or verbose messages
#log_hostname = off
log_line_prefix = '%m [%p] %q%u@%d ' # special values:
# %a = application name
# %u = user name
# %d = database name
# %r = remote host and port
# %h = remote host
# %b = backend type
# %p = process ID
# %P = process ID of parallel group leader
# %t = timestamp without milliseconds
# %m = timestamp with milliseconds
# %n = timestamp with milliseconds (as a Unix epoch)
# %Q = query ID (0 if none or not computed)
# %i = command tag
# %e = SQL state
# %c = session ID
# %l = session line number
# %s = session start timestamp
# %v = virtual transaction ID
# %x = transaction ID (0 if none)
# %q = stop here in non-session
# processes
# %% = '%'
# e.g. '<%u%%%d> '
#log_lock_waits = off # log lock waits >= deadlock_timeout
#log_recovery_conflict_waits = off # log standby recovery conflict waits
# >= deadlock_timeout
#log_parameter_max_length = -1 # when logging statements, limit logged
# bind-parameter values to N bytes;
# -1 means print in full, 0 disables
#log_parameter_max_length_on_error = 0 # when logging an error, limit logged
# bind-parameter values to N bytes;
# -1 means print in full, 0 disables
#log_statement = 'none' # none, ddl, mod, all
#log_replication_commands = off
#log_temp_files = -1 # log temporary files equal or larger
# than the specified size in kilobytes;
# -1 disables, 0 logs all temp files
log_timezone = 'Etc/UTC'
#------------------------------------------------------------------------------
# PROCESS TITLE
#------------------------------------------------------------------------------
cluster_name = '15/main' # added to process titles if nonempty
# (change requires restart)
#update_process_title = on
#------------------------------------------------------------------------------
# STATISTICS
#------------------------------------------------------------------------------
# - Cumulative Query and Index Statistics -
#track_activities = on
#track_activity_query_size = 1024 # (change requires restart)
#track_counts = on
#track_io_timing = off
#track_wal_io_timing = off
#track_functions = none # none, pl, all
#stats_fetch_consistency = cache
# - Monitoring -
#compute_query_id = auto
#log_statement_stats = off
#log_parser_stats = off
#log_planner_stats = off
#log_executor_stats = off
#------------------------------------------------------------------------------
# AUTOVACUUM
#------------------------------------------------------------------------------
#autovacuum = on # Enable autovacuum subprocess? 'on'
# requires track_counts to also be on.
#autovacuum_max_workers = 3 # max number of autovacuum subprocesses
# (change requires restart)
#autovacuum_naptime = 1min # time between autovacuum runs
#autovacuum_vacuum_threshold = 50 # min number of row updates before
# vacuum
#autovacuum_vacuum_insert_threshold = 1000 # min number of row inserts
# before vacuum; -1 disables insert
# vacuums
#autovacuum_analyze_threshold = 50 # min number of row updates before
# analyze
#autovacuum_vacuum_scale_factor = 0.2 # fraction of table size before vacuum
#autovacuum_vacuum_insert_scale_factor = 0.2 # fraction of inserts over table
# size before insert vacuum
#autovacuum_analyze_scale_factor = 0.1 # fraction of table size before analyze
#autovacuum_freeze_max_age = 200000000 # maximum XID age before forced vacuum
# (change requires restart)
#autovacuum_multixact_freeze_max_age = 400000000 # maximum multixact age
# before forced vacuum
# (change requires restart)
#autovacuum_vacuum_cost_delay = 2ms # default vacuum cost delay for
# autovacuum, in milliseconds;
# -1 means use vacuum_cost_delay
#autovacuum_vacuum_cost_limit = -1 # default vacuum cost limit for
# autovacuum, -1 means use
# vacuum_cost_limit
#------------------------------------------------------------------------------
# CLIENT CONNECTION DEFAULTS
#------------------------------------------------------------------------------
# - Statement Behavior -
#client_min_messages = notice # values in order of decreasing detail:
# debug5
# debug4
# debug3
# debug2
# debug1
# log
# notice
# warning
# error
#search_path = '"$user", public' # schema names
#row_security = on
#default_table_access_method = 'heap'
#default_tablespace = '' # a tablespace name, '' uses the default
#default_toast_compression = 'pglz' # 'pglz' or 'lz4'
#temp_tablespaces = '' # a list of tablespace names, '' uses
# only default tablespace
#check_function_bodies = on
#default_transaction_isolation = 'read committed'
#default_transaction_read_only = off
#default_transaction_deferrable = off
#session_replication_role = 'origin'
#statement_timeout = 0 # in milliseconds, 0 is disabled
#lock_timeout = 0 # in milliseconds, 0 is disabled
#idle_in_transaction_session_timeout = 0 # in milliseconds, 0 is disabled
#idle_session_timeout = 0 # in milliseconds, 0 is disabled
#vacuum_freeze_table_age = 150000000
#vacuum_freeze_min_age = 50000000
#vacuum_failsafe_age = 1600000000
#vacuum_multixact_freeze_table_age = 150000000
#vacuum_multixact_freeze_min_age = 5000000
#vacuum_multixact_failsafe_age = 1600000000
#bytea_output = 'hex' # hex, escape
#xmlbinary = 'base64'
#xmloption = 'content'
#gin_pending_list_limit = 4MB
# - Locale and Formatting -
datestyle = 'iso, mdy'
#intervalstyle = 'postgres'
timezone = 'Etc/UTC'
#timezone_abbreviations = 'Default' # Select the set of available time zone
# abbreviations. Currently, there are
# Default
# Australia (historical usage)
# India
# You can create your own file in
# share/timezonesets/.
#extra_float_digits = 1 # min -15, max 3; any value >0 actually
# selects precise output mode
#client_encoding = sql_ascii # actually, defaults to database
# encoding
# These settings are initialized by initdb, but they can be changed.
lc_messages = 'C.UTF-8' # locale for system error message
# strings
lc_monetary = 'C.UTF-8' # locale for monetary formatting
lc_numeric = 'C.UTF-8' # locale for number formatting
lc_time = 'C.UTF-8' # locale for time formatting
# default configuration for text search
default_text_search_config = 'pg_catalog.english'
# - Shared Library Preloading -
#local_preload_libraries = ''
#session_preload_libraries = ''
#shared_preload_libraries = '' # (change requires restart)
#jit_provider = 'llvmjit' # JIT library to use
# - Other Defaults -
#dynamic_library_path = '$libdir'
#extension_destdir = '' # prepend path when loading extensions
# and shared objects (added by Debian)
#gin_fuzzy_search_limit = 0
#------------------------------------------------------------------------------
# LOCK MANAGEMENT
#------------------------------------------------------------------------------
#deadlock_timeout = 1s
#max_locks_per_transaction = 64 # min 10
# (change requires restart)
#max_pred_locks_per_transaction = 64 # min 10
# (change requires restart)
#max_pred_locks_per_relation = -2 # negative values mean
# (max_pred_locks_per_transaction
# / -max_pred_locks_per_relation) - 1
#max_pred_locks_per_page = 2 # min 0
#------------------------------------------------------------------------------
# VERSION AND PLATFORM COMPATIBILITY
#------------------------------------------------------------------------------
# - Previous PostgreSQL Versions -
#array_nulls = on
#backslash_quote = safe_encoding # on, off, or safe_encoding
#escape_string_warning = on
#lo_compat_privileges = off
#quote_all_identifiers = off
#standard_conforming_strings = on
#synchronize_seqscans = on
# - Other Platforms and Clients -
#transform_null_equals = off
#------------------------------------------------------------------------------
# ERROR HANDLING
#------------------------------------------------------------------------------
#exit_on_error = off # terminate session on any error?
#restart_after_crash = on # reinitialize after backend crash?
#data_sync_retry = off # retry or panic on failure to fsync
# data?
# (change requires restart)
#recovery_init_sync_method = fsync # fsync, syncfs (Linux 5.8+)
#------------------------------------------------------------------------------
# CONFIG FILE INCLUDES
#------------------------------------------------------------------------------
# These options allow settings to be loaded from files other than the
# default postgresql.conf. Note that these are directives, not variable
# assignments, so they can usefully be given more than once.
# include_dir = 'conf.d' # include files ending in '.conf' from
# a directory, e.g., 'conf.d'
#include_if_exists = '...' # include file only if it exists
#include = '...' # include file
#------------------------------------------------------------------------------
# CUSTOMIZED OPTIONS
#------------------------------------------------------------------------------
# Add settings for extensions here


@@ -1,71 +0,0 @@
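; Every program below streams its logs to the container's stdout; maxbytes=0 disables supervisor's log rotation.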
[program:web]
command=node /app/web/web/server.js
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
environment=PORT=3001,HOSTNAME=0.0.0.0
[program:space]
command=node /app/space/space/server.js
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
environment=PORT=3002,HOSTNAME=0.0.0.0
[program:admin]
command=node /app/admin/admin/server.js
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
environment=PORT=3003,HOSTNAME=0.0.0.0
[program:migrator]
directory=/app/api
command=sh -c "./bin/docker-entrypoint-migrator.sh"
autostart=true
autorestart=false
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
[program:api]
directory=/app/api
command=sh -c "./bin/docker-entrypoint-api.sh"
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
[program:worker]
directory=/app/api
command=sh -c "./bin/docker-entrypoint-worker.sh"
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0
[program:beat]
directory=/app/api
command=sh -c "./bin/docker-entrypoint-beat.sh"
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stdout
stderr_logfile_maxbytes=0


@@ -1,38 +0,0 @@
[supervisord]
user=root
nodaemon=true
stderr_logfile=/app/logs/error/supervisor.err.log
stdout_logfile=/app/logs/access/supervisor.log
[program:redis]
directory=/app/data/redis
command=redis-server
autostart=true
autorestart=true
stderr_logfile=/app/logs/error/redis.err.log
stdout_logfile=/app/logs/access/redis.log
[program:postgresql]
user=postgres
command=/usr/lib/postgresql/15/bin/postgres --config-file=/etc/postgresql/postgresql.conf
autostart=true
autorestart=true
stderr_logfile=/app/logs/error/postgresql.err.log
stdout_logfile=/app/logs/access/postgresql.log
[program:minio]
directory=/app/data/minio
command=minio server /app/data/minio
autostart=true
autorestart=true
stderr_logfile=/app/logs/error/minio.err.log
stdout_logfile=/app/logs/access/minio.log
[program:nginx]
directory=/app/data/nginx
command=/app/nginx-start.sh
autostart=true
autorestart=true
stderr_logfile=/app/logs/error/nginx.err.log
stdout_logfile=/app/logs/access/nginx.log


@@ -1,14 +0,0 @@
[supervisord]
user=root
nodaemon=true
stderr_logfile=/app/logs/error/supervisor.err.log
stdout_logfile=/app/logs/access/supervisor.log
[program:nginx]
directory=/app/data/nginx
command=/app/nginx-start.sh
autostart=true
autorestart=true
stderr_logfile=/app/logs/error/nginx.err.log
stdout_logfile=/app/logs/access/nginx.log


@@ -66,9 +66,11 @@ const InstanceGitlabAuthenticationPage = observer(() => {
<ToggleSwitch
value={Boolean(parseInt(enableGitlabConfig))}
onChange={() => {
if (Boolean(parseInt(enableGitlabConfig)) === true) {
updateConfig("IS_GITLAB_ENABLED", "0");
} else {
updateConfig("IS_GITLAB_ENABLED", "1");
}
}}
size="sm"
disabled={isSubmitting || !formattedConfig}


@@ -67,9 +67,11 @@ const InstanceGoogleAuthenticationPage = observer(() => {
<ToggleSwitch
value={Boolean(parseInt(enableGoogleConfig))}
onChange={() => {
if (Boolean(parseInt(enableGoogleConfig)) === true) {
updateConfig("IS_GOOGLE_ENABLED", "0");
} else {
updateConfig("IS_GOOGLE_ENABLED", "1");
}
}}
size="sm"
disabled={isSubmitting || !formattedConfig}


@@ -49,9 +49,9 @@ export const InstanceEmailForm: FC<IInstanceEmailForm> = (props) => {
EMAIL_USE_TLS: config["EMAIL_USE_TLS"],
EMAIL_USE_SSL: config["EMAIL_USE_SSL"],
EMAIL_FROM: config["EMAIL_FROM"],
ENABLE_SMTP: config["ENABLE_SMTP"],
},
});
const emailFormFields: TControllerInputFormField[] = [
{
key: "EMAIL_HOST",
@@ -101,7 +101,7 @@ export const InstanceEmailForm: FC<IInstanceEmailForm> = (props) => {
];
const onSubmit = async (formData: EmailFormValues) => {
const payload: Partial<EmailFormValues> = { ...formData };
const payload: Partial<EmailFormValues> = { ...formData, ENABLE_SMTP: "1" };
await updateInstanceConfigurations(payload)
.then(() =>

View File

@@ -1,8 +1,9 @@
"use client";
import { useEffect, useState } from "react";
import { observer } from "mobx-react";
import useSWR from "swr";
import { Loader } from "@plane/ui";
import { Loader, setToast, TOAST_TYPE, ToggleSwitch } from "@plane/ui";
// hooks
import { useInstance } from "@/hooks/store";
// components
@@ -10,36 +11,80 @@ import { InstanceEmailForm } from "./email-config-form";
const InstanceEmailPage = observer(() => {
// store
const { fetchInstanceConfigurations, formattedConfig } = useInstance();
const { fetchInstanceConfigurations, formattedConfig, disableEmail } = useInstance();
useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
const { isLoading } = useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
const [isSubmitting, setIsSubmitting] = useState(false);
const [isSMTPEnabled, setIsSMTPEnabled] = useState(false);
const handleToggle = async () => {
if (isSMTPEnabled) {
setIsSubmitting(true);
try {
await disableEmail();
setIsSMTPEnabled(false);
setToast({
title: "Email feature disabled",
message: "Email feature has been disabled",
type: TOAST_TYPE.SUCCESS,
});
} catch (error) {
setToast({
title: "Error disabling email",
message: "Failed to disable email feature. Please try again.",
type: TOAST_TYPE.ERROR,
});
} finally {
setIsSubmitting(false);
}
return;
}
setIsSMTPEnabled(true);
};
useEffect(() => {
if (formattedConfig) {
setIsSMTPEnabled(formattedConfig.ENABLE_SMTP === "1");
}
}, [formattedConfig]);
return (
<>
<div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
<div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<div className="text-xl font-medium text-custom-text-100">Secure emails from your own instance</div>
<div className="text-sm font-normal text-custom-text-300">
Plane can send useful emails to you and your users from your own instance without talking to the Internet.
<div className="flex items-center justify-between gap-4 border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<div className="py-4 space-y-1 flex-shrink-0">
<div className="text-xl font-medium text-custom-text-100">Secure emails from your own instance</div>
<div className="text-sm font-normal text-custom-text-300">
Set it up below and please test your settings before you save them.&nbsp;
<span className="text-red-400">Misconfigs can lead to email bounces and errors.</span>
Plane can send useful emails to you and your users from your own instance without talking to the Internet.
<div className="text-sm font-normal text-custom-text-300">
Set it up below and please test your settings before you save them.&nbsp;
<span className="text-red-400">Misconfigs can lead to email bounces and errors.</span>
</div>
</div>
</div>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
{formattedConfig ? (
<InstanceEmailForm config={formattedConfig} />
) : (
<Loader className="space-y-10">
<Loader.Item height="50px" width="75%" />
<Loader.Item height="50px" width="75%" />
<Loader.Item height="50px" width="40%" />
<Loader.Item height="50px" width="40%" />
<Loader.Item height="50px" width="20%" />
{isLoading ? (
<Loader>
<Loader.Item width="24px" height="16px" className="rounded-full" />
</Loader>
) : (
<ToggleSwitch value={isSMTPEnabled} onChange={handleToggle} size="sm" disabled={isSubmitting} />
)}
</div>
{isSMTPEnabled && !isLoading && (
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
{formattedConfig ? (
<InstanceEmailForm config={formattedConfig} />
) : (
<Loader className="space-y-10">
<Loader.Item height="50px" width="75%" />
<Loader.Item height="50px" width="75%" />
<Loader.Item height="50px" width="40%" />
<Loader.Item height="50px" width="40%" />
<Loader.Item height="50px" width="20%" />
</Loader>
)}
</div>
)}
</div>
</>
);

View File

@@ -72,7 +72,7 @@ export const AdminHeader: FC = observer(() => {
const breadcrumbItems = generateBreadcrumbItems(pathName);
return (
<div className="relative z-10 flex h-[3.75rem] w-full flex-shrink-0 flex-row items-center justify-between gap-x-2 gap-y-4 border-b border-custom-sidebar-border-200 bg-custom-sidebar-background-100 p-4">
<div className="relative z-10 flex h-header w-full flex-shrink-0 flex-row items-center justify-between gap-x-2 gap-y-4 border-b border-custom-sidebar-border-200 bg-custom-sidebar-background-100 p-4">
<div className="flex w-full flex-grow items-center gap-2 overflow-ellipsis whitespace-nowrap">
<HamburgerToggle />
{breadcrumbItems.length >= 0 && (

View File

@@ -77,7 +77,7 @@ export const AdminSidebarDropdown = observer(() => {
}, [csrfToken]);
return (
<div className="flex max-h-[3.75rem] items-center gap-x-5 gap-y-2 border-b border-custom-sidebar-border-200 px-4 py-3.5">
<div className="flex max-h-header items-center gap-x-5 gap-y-2 border-b border-custom-sidebar-border-200 px-4 py-3.5">
<div className="h-full w-full truncate">
<div
className={`flex flex-grow items-center gap-x-2 truncate rounded py-1 ${

View File

@@ -1,89 +0,0 @@
"use client";
import { FC, useMemo } from "react";
// plane internal packages
import { E_PASSWORD_STRENGTH } from "@plane/constants";
import { cn, getPasswordStrength } from "@plane/utils";
type TPasswordStrengthMeter = {
password: string;
isFocused?: boolean;
};
export const PasswordStrengthMeter: FC<TPasswordStrengthMeter> = (props) => {
const { password, isFocused = false } = props;
// derived values
const strength = useMemo(() => getPasswordStrength(password), [password]);
const strengthBars = useMemo(() => {
switch (strength) {
case E_PASSWORD_STRENGTH.EMPTY: {
return {
bars: [`bg-custom-text-100`, `bg-custom-text-100`, `bg-custom-text-100`],
text: "Please enter your password.",
textColor: "text-custom-text-100",
};
}
case E_PASSWORD_STRENGTH.LENGTH_NOT_VALID: {
return {
bars: [`bg-red-500`, `bg-custom-text-100`, `bg-custom-text-100`],
text: "Password length should me more than 8 characters.",
textColor: "text-red-500",
};
}
case E_PASSWORD_STRENGTH.STRENGTH_NOT_VALID: {
return {
bars: [`bg-red-500`, `bg-custom-text-100`, `bg-custom-text-100`],
text: "Password is weak.",
textColor: "text-red-500",
};
}
case E_PASSWORD_STRENGTH.STRENGTH_VALID: {
return {
bars: [`bg-green-500`, `bg-green-500`, `bg-green-500`],
text: "Password is strong.",
textColor: "text-green-500",
};
}
default: {
return {
bars: [`bg-custom-text-100`, `bg-custom-text-100`, `bg-custom-text-100`],
text: "Please enter your password.",
textColor: "text-custom-text-100",
};
}
}
}, [strength]);
const isPasswordMeterVisible = isFocused ? true : strength === E_PASSWORD_STRENGTH.STRENGTH_VALID ? false : true;
if (!isPasswordMeterVisible) return <></>;
return (
<div className="w-full space-y-2 pt-2">
<div className="space-y-1.5">
<div className="relative flex items-center gap-2">
{strengthBars?.bars.map((color, index) => (
<div key={`${color}-${index}`} className={cn("w-full h-1 rounded-full", color)} />
))}
</div>
<div className={cn(`text-xs font-medium text-custom-text-100`, strengthBars?.textColor)}>
{strengthBars?.text}
</div>
</div>
{/* <div className="relative flex flex-wrap gap-x-4 gap-y-2">
{PASSWORD_CRITERIA.map((criteria) => (
<div
key={criteria.key}
className={cn(
"relative flex items-center gap-1 text-xs",
criteria.isCriteriaValid(password) ? `text-green-500/70` : "text-custom-text-300"
)}
>
<CircleCheck width={14} height={14} />
{criteria.label}
</div>
))}
</div> */}
</div>
);
};

View File

@@ -7,11 +7,10 @@ import { Eye, EyeOff } from "lucide-react";
// plane internal packages
import { API_BASE_URL, E_PASSWORD_STRENGTH } from "@plane/constants";
import { AuthService } from "@plane/services";
import { Button, Checkbox, Input, Spinner } from "@plane/ui";
import { Button, Checkbox, Input, PasswordStrengthIndicator, Spinner } from "@plane/ui";
import { getPasswordStrength } from "@plane/utils";
// components
import { Banner } from "@/components/common/banner";
import { PasswordStrengthMeter } from "@/components/common/password-strength-meter";
// service initialization
const authService = new AuthService();
@@ -274,7 +273,7 @@ export const InstanceSetupForm: FC = (props) => {
{errorData.type && errorData.type === EErrorCodes.INVALID_PASSWORD && errorData.message && (
<p className="px-1 text-xs text-red-500">{errorData.message}</p>
)}
<PasswordStrengthMeter password={formData.password} isFocused={isPasswordInputFocused} />
<PasswordStrengthIndicator password={formData.password} isFocused={isPasswordInputFocused} />
</div>
<div className="w-full space-y-1">

View File

@@ -32,6 +32,7 @@ export interface IInstanceStore {
fetchInstanceAdmins: () => Promise<IInstanceAdmin[] | undefined>;
fetchInstanceConfigurations: () => Promise<IInstanceConfiguration[] | undefined>;
updateInstanceConfigurations: (data: Partial<IFormattedInstanceConfiguration>) => Promise<IInstanceConfiguration[]>;
disableEmail: () => Promise<void>;
}
export class InstanceStore implements IInstanceStore {
@@ -187,4 +188,30 @@ export class InstanceStore implements IInstanceStore {
throw error;
}
};
disableEmail = async () => {
const instanceConfigurations = this.instanceConfigurations;
try {
runInAction(() => {
this.instanceConfigurations = this.instanceConfigurations?.map((config) => {
if (
[
"EMAIL_HOST",
"EMAIL_PORT",
"EMAIL_HOST_USER",
"EMAIL_HOST_PASSWORD",
"EMAIL_FROM",
"ENABLE_SMTP",
].includes(config.key)
)
return { ...config, value: "" };
return config;
});
});
await this.instanceService.disableEmail();
} catch (error) {
console.error("Error disabling the email");
this.instanceConfigurations = instanceConfigurations;
}
};
}

View File

@@ -1,7 +1,7 @@
{
"name": "admin",
"description": "Admin UI for Plane",
"version": "0.27.1",
"version": "0.28.0",
"license": "AGPL-3.0",
"private": true,
"scripts": {
@@ -10,7 +10,7 @@
"preview": "next build && next start",
"start": "next start",
"clean": "rm -rf .turbo && rm -rf .next && rm -rf node_modules && rm -rf dist",
"check:lint": "eslint . --max-warnings 0",
"check:lint": "eslint . --max-warnings 19",
"check:types": "tsc --noEmit",
"check:format": "prettier --check \"**/*.{ts,tsx,md,json,css,scss}\"",
"fix:lint": "eslint . --fix",
@@ -28,7 +28,7 @@
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
"axios": "^1.8.3",
"axios": "1.11.0",
"lodash": "^4.17.21",
"lucide-react": "^0.469.0",
"mobx": "^6.12.0",
@@ -40,8 +40,7 @@
"react-dom": "^18.3.1",
"react-hook-form": "7.51.5",
"swr": "^2.2.4",
"uuid": "^9.0.1",
"zxcvbn": "^4.4.2"
"uuid": "^9.0.1"
},
"devDependencies": {
"@plane/eslint-config": "*",
@@ -51,7 +50,6 @@
"@types/react": "^18.3.11",
"@types/react-dom": "^18.2.18",
"@types/uuid": "^9.0.8",
"@types/zxcvbn": "^4.4.4",
"typescript": "5.8.3"
}
}

View File

@@ -28,7 +28,7 @@ AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
AWS_SECRET_ACCESS_KEY="secret-key"
AWS_S3_ENDPOINT_URL="http://localhost:9000"
# Changing this requires change in the nginx.conf for uploads if using minio setup
# Changing this requires change in the proxy config for uploads if using minio setup
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
FILE_SIZE_LIMIT=5242880
@@ -39,8 +39,7 @@ DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=0
# Nginx Configuration
NGINX_PORT=80
# Email redirections and minio domain settings
WEB_URL="http://localhost:8000"

View File

@@ -1,6 +1,6 @@
{
"name": "plane-api",
"version": "0.27.1",
"version": "0.28.0",
"license": "AGPL-3.0",
"private": true,
"description": "API server powering Plane's backend"

View File

@@ -3,3 +3,10 @@ from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "plane.api"
def ready(self):
# Import authentication extensions to register them with drf-spectacular
try:
import plane.utils.openapi.auth # noqa
except ImportError:
pass
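The module body of plane.utils.openapi.auth is not part of this diff; the import alone is enough because drf-spectacular extensions self-register when their class is defined. A hypothetical sketch of what such an extension looks like, following drf-spectacular's documented API; the target_class path, scheme name, and header are assumptions, not taken from this repo:

from drf_spectacular.extensions import OpenApiAuthenticationExtension

class APIKeyAuthenticationScheme(OpenApiAuthenticationExtension):
    # Hypothetical dotted path to the DRF authentication class being documented.
    target_class = "plane.api.authentication.APIKeyAuthentication"
    name = "ApiKeyAuthentication"

    def get_security_definition(self, auto_schema):
        # Describes an API-key-in-header scheme in the generated OpenAPI spec.
        return {"type": "apiKey", "in": "header", "name": "X-API-Key"}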

View File

@@ -1,8 +1,14 @@
from .user import UserLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectSerializer, ProjectLiteSerializer
from .project import (
ProjectSerializer,
ProjectLiteSerializer,
ProjectCreateSerializer,
ProjectUpdateSerializer,
)
from .issue import (
IssueSerializer,
LabelCreateUpdateSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueCommentSerializer,
@@ -10,9 +16,40 @@ from .issue import (
IssueActivitySerializer,
IssueExpandSerializer,
IssueLiteSerializer,
IssueAttachmentUploadSerializer,
IssueSearchSerializer,
IssueCommentCreateSerializer,
IssueLinkCreateSerializer,
IssueLinkUpdateSerializer,
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .intake import IntakeIssueSerializer
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
CycleLiteSerializer,
CycleIssueRequestSerializer,
TransferCycleIssueRequestSerializer,
CycleCreateSerializer,
CycleUpdateSerializer,
)
from .module import (
ModuleSerializer,
ModuleIssueSerializer,
ModuleLiteSerializer,
ModuleIssueRequestSerializer,
ModuleCreateSerializer,
ModuleUpdateSerializer,
)
from .intake import (
IntakeIssueSerializer,
IntakeIssueCreateSerializer,
IntakeIssueUpdateSerializer,
)
from .estimate import EstimatePointSerializer
from .asset import (
UserAssetUploadSerializer,
AssetUpdateSerializer,
GenericAssetUploadSerializer,
GenericAssetUpdateSerializer,
FileAssetSerializer,
)

View File

@@ -0,0 +1,123 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import FileAsset
class UserAssetUploadSerializer(serializers.Serializer):
"""
Serializer for user asset upload requests.
This serializer validates the metadata required to generate a presigned URL
for uploading user profile assets (avatar or cover image) directly to S3 storage.
Supports JPEG, PNG, WebP, JPG, and GIF image formats with size validation.
"""
name = serializers.CharField(help_text="Original filename of the asset")
type = serializers.ChoiceField(
choices=[
("image/jpeg", "JPEG"),
("image/png", "PNG"),
("image/webp", "WebP"),
("image/jpg", "JPG"),
("image/gif", "GIF"),
],
default="image/jpeg",
help_text="MIME type of the file",
style={"placeholder": "image/jpeg"},
)
size = serializers.IntegerField(help_text="File size in bytes")
entity_type = serializers.ChoiceField(
choices=[
(FileAsset.EntityTypeContext.USER_AVATAR, "User Avatar"),
(FileAsset.EntityTypeContext.USER_COVER, "User Cover"),
],
help_text="Type of user asset",
)
class AssetUpdateSerializer(serializers.Serializer):
"""
Serializer for asset status updates after successful upload completion.
Handles post-upload asset metadata updates including attribute modifications
and upload confirmation for S3-based file storage workflows.
"""
attributes = serializers.JSONField(
required=False, help_text="Additional attributes to update for the asset"
)
class GenericAssetUploadSerializer(serializers.Serializer):
"""
Serializer for generic asset upload requests with project association.
Validates metadata for generating presigned URLs for workspace assets including
project association, external system tracking, and file validation for
document management and content storage workflows.
"""
name = serializers.CharField(help_text="Original filename of the asset")
type = serializers.CharField(required=False, help_text="MIME type of the file")
size = serializers.IntegerField(help_text="File size in bytes")
project_id = serializers.UUIDField(
required=False,
help_text="UUID of the project to associate with the asset",
style={"placeholder": "123e4567-e89b-12d3-a456-426614174000"},
)
external_id = serializers.CharField(
required=False,
help_text="External identifier for the asset (for integration tracking)",
)
external_source = serializers.CharField(
required=False, help_text="External source system (for integration tracking)"
)
class GenericAssetUpdateSerializer(serializers.Serializer):
"""
Serializer for generic asset upload confirmation and status management.
Handles post-upload status updates for workspace assets including
upload completion marking and metadata finalization.
"""
is_uploaded = serializers.BooleanField(
default=True, help_text="Whether the asset has been successfully uploaded"
)
class FileAssetSerializer(BaseSerializer):
"""
Comprehensive file asset serializer with complete metadata and URL generation.
Provides full file asset information including storage metadata, access URLs,
relationship data, and upload status for complete asset management workflows.
"""
asset_url = serializers.CharField(read_only=True)
class Meta:
model = FileAsset
fields = "__all__"
read_only_fields = [
"id",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
"comment",
"page",
"draft_issue",
"user",
"is_deleted",
"deleted_at",
"storage_metadata",
"asset_url",
]
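The upload-request serializers above are plain Serializer classes, so they validate without touching the database. A minimal sketch, run inside a configured Django shell; the payload values are illustrative:

from plane.api.serializers import UserAssetUploadSerializer
from plane.db.models import FileAsset

serializer = UserAssetUploadSerializer(
    data={
        "name": "avatar.png",
        "type": "image/png",
        "size": 204800,
        "entity_type": FileAsset.EntityTypeContext.USER_AVATAR,
    }
)
serializer.is_valid(raise_exception=True)
# A view would now use validated_data to build the presigned upload URL.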

View File

@@ -3,6 +3,13 @@ from rest_framework import serializers
class BaseSerializer(serializers.ModelSerializer):
"""
Base serializer providing common functionality for all model serializers.
Features field filtering, dynamic expansion of related fields, and standardized
primary key handling for consistent API responses across the application.
"""
id = serializers.PrimaryKeyRelatedField(read_only=True)
def __init__(self, *args, **kwargs):
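The docstring above promises field filtering and dynamic expansion, and the truncated __init__ is where those kwargs would be consumed. A hypothetical one-line sketch; the fields kwarg name is inferred from the docstring, not confirmed by this diff:

# Hypothetical: serialize only a subset of fields on any BaseSerializer child.
data = StateLiteSerializer(state, fields=("id", "name")).data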

View File

@@ -8,16 +8,13 @@ from plane.db.models import Cycle, CycleIssue
from plane.utils.timezone_converter import convert_to_utc
class CycleSerializer(BaseSerializer):
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
total_estimates = serializers.FloatField(read_only=True)
completed_estimates = serializers.FloatField(read_only=True)
started_estimates = serializers.FloatField(read_only=True)
class CycleCreateSerializer(BaseSerializer):
"""
Serializer for creating cycles with timezone handling and date validation.
Manages cycle creation including project timezone conversion, date range validation,
and UTC normalization for time-bound iteration planning and sprint management.
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
@@ -27,6 +24,29 @@ class CycleSerializer(BaseSerializer):
self.fields["start_date"].timezone = project_timezone
self.fields["end_date"].timezone = project_timezone
class Meta:
model = Cycle
fields = [
"name",
"description",
"start_date",
"end_date",
"owned_by",
"external_source",
"external_id",
"timezone",
]
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"deleted_at",
]
def validate(self, data):
if (
data.get("start_date", None) is not None
@@ -59,6 +79,40 @@ class CycleSerializer(BaseSerializer):
)
return data
class CycleUpdateSerializer(CycleCreateSerializer):
"""
Serializer for updating cycles with enhanced ownership management.
Extends cycle creation with update-specific features including ownership
assignment and modification tracking for cycle lifecycle management.
"""
class Meta(CycleCreateSerializer.Meta):
model = Cycle
fields = CycleCreateSerializer.Meta.fields + [
"owned_by",
]
class CycleSerializer(BaseSerializer):
"""
Cycle serializer with comprehensive project metrics and time tracking.
Provides cycle details including work item counts by status, progress estimates,
and time-bound iteration data for project management and sprint planning.
"""
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
total_estimates = serializers.FloatField(read_only=True)
completed_estimates = serializers.FloatField(read_only=True)
started_estimates = serializers.FloatField(read_only=True)
class Meta:
model = Cycle
fields = "__all__"
@@ -76,6 +130,13 @@ class CycleSerializer(BaseSerializer):
class CycleIssueSerializer(BaseSerializer):
"""
Serializer for cycle-issue relationships with sub-issue counting.
Manages the association between cycles and work items, including
hierarchical issue tracking for nested work item structures.
"""
sub_issues_count = serializers.IntegerField(read_only=True)
class Meta:
@@ -85,6 +146,39 @@ class CycleIssueSerializer(BaseSerializer):
class CycleLiteSerializer(BaseSerializer):
"""
Lightweight cycle serializer for minimal data transfer.
Provides essential cycle information without computed metrics,
optimized for list views and reference lookups.
"""
class Meta:
model = Cycle
fields = "__all__"
class CycleIssueRequestSerializer(serializers.Serializer):
"""
Serializer for bulk work item assignment to cycles.
Validates work item ID lists for batch operations including
cycle assignment and sprint planning workflows.
"""
issues = serializers.ListField(
child=serializers.UUIDField(), help_text="List of issue IDs to add to the cycle"
)
class TransferCycleIssueRequestSerializer(serializers.Serializer):
"""
Serializer for transferring work items between cycles.
Handles work item migration between cycles including validation
and relationship updates for sprint reallocation workflows.
"""
new_cycle_id = serializers.UUIDField(
help_text="ID of the target cycle to transfer issues to"
)
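Both request serializers above are plain Serializer classes, so their validation runs without a database. A minimal sketch, run inside a configured Django shell; the UUID values are illustrative:

import uuid
from plane.api.serializers import (
    CycleIssueRequestSerializer,
    TransferCycleIssueRequestSerializer,
)

# Bulk assignment payload: "issues" must be a list of UUIDs.
payload = {"issues": [str(uuid.uuid4()), str(uuid.uuid4())]}
serializer = CycleIssueRequestSerializer(data=payload)
serializer.is_valid(raise_exception=True)
# UUIDField coerces each entry to a uuid.UUID instance.
assert all(isinstance(i, uuid.UUID) for i in serializer.validated_data["issues"])

# Transfer payload: a single target cycle ID.
transfer = TransferCycleIssueRequestSerializer(data={"new_cycle_id": str(uuid.uuid4())})
transfer.is_valid(raise_exception=True)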

View File

@@ -4,6 +4,13 @@ from .base import BaseSerializer
class EstimatePointSerializer(BaseSerializer):
"""
Serializer for project estimation points and story point values.
Handles numeric estimation data for work item sizing and sprint planning,
providing standardized point values for project velocity calculations.
"""
class Meta:
model = EstimatePoint
fields = ["id", "value"]

View File

@@ -1,11 +1,77 @@
# Module imports
from .base import BaseSerializer
from .issue import IssueExpandSerializer
from plane.db.models import IntakeIssue
from plane.db.models import IntakeIssue, Issue
from rest_framework import serializers
class IssueForIntakeSerializer(BaseSerializer):
"""
Serializer for work item data within intake submissions.
Handles essential work item fields for intake processing including
content validation and priority assignment for triage workflows.
"""
class Meta:
model = Issue
fields = [
"name",
"description",
"description_html",
"priority",
]
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IntakeIssueCreateSerializer(BaseSerializer):
"""
Serializer for creating intake work items with embedded issue data.
Manages intake work item creation including nested issue creation,
status assignment, and source tracking for issue queue management.
"""
issue = IssueForIntakeSerializer(help_text="Issue data for the intake issue")
class Meta:
model = IntakeIssue
fields = [
"issue",
"intake",
"status",
"snoozed_till",
"duplicate_to",
"source",
"source_email",
]
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IntakeIssueSerializer(BaseSerializer):
"""
Comprehensive serializer for intake work items with expanded issue details.
Provides full intake work item data including embedded issue information,
status tracking, and triage metadata for issue queue management.
"""
issue_detail = IssueExpandSerializer(read_only=True, source="issue")
inbox = serializers.UUIDField(source="intake.id", read_only=True)
@@ -22,3 +88,53 @@ class IntakeIssueSerializer(BaseSerializer):
"created_at",
"updated_at",
]
class IntakeIssueUpdateSerializer(BaseSerializer):
"""
Serializer for updating intake work items and their associated issues.
Handles intake work item modifications including status changes, triage decisions,
and embedded issue updates for issue queue processing workflows.
"""
issue = IssueForIntakeSerializer(
required=False, help_text="Issue data to update in the intake issue"
)
class Meta:
model = IntakeIssue
fields = [
"status",
"snoozed_till",
"duplicate_to",
"source",
"source_email",
"issue",
]
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueDataSerializer(serializers.Serializer):
"""
Serializer for nested work item data in intake request payloads.
Validates core work item fields within intake requests including
content formatting, priority levels, and metadata for issue creation.
"""
name = serializers.CharField(max_length=255, help_text="Issue name")
description_html = serializers.CharField(
required=False, allow_null=True, help_text="Issue description HTML"
)
priority = serializers.ChoiceField(
choices=Issue.PRIORITY_CHOICES, default="none", help_text="Issue priority"
)
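IssueDataSerializer is likewise a plain Serializer, so the nested work item payload can be validated in isolation. A minimal sketch; the module path and field values are illustrative, and "urgent" is assumed to be among Issue.PRIORITY_CHOICES:

from plane.api.serializers.intake import IssueDataSerializer  # assumed module path

serializer = IssueDataSerializer(data={"name": "Login page 500s", "priority": "urgent"})
serializer.is_valid(raise_exception=True)
# priority falls back to "none" when omitted, per the default above.
print(serializer.validated_data["priority"])  # "urgent"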

View File

@@ -20,6 +20,12 @@ from plane.db.models import (
ProjectMember,
State,
User,
EstimatePoint,
)
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
validate_binary_data,
)
from .base import BaseSerializer
@@ -34,6 +40,13 @@ from django.core.validators import URLValidator
class IssueSerializer(BaseSerializer):
"""
Comprehensive work item serializer with full relationship management.
Handles complete work item lifecycle including assignees, labels, validation,
and related model updates. Supports dynamic field expansion and HTML content processing.
"""
assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(
queryset=User.objects.values_list("id", flat=True)
@@ -75,6 +88,22 @@ class IssueSerializer(BaseSerializer):
except Exception:
raise serializers.ValidationError("Invalid HTML passed")
# Validate description content for security
if data.get("description"):
is_valid, error_msg = validate_json_content(data["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if data.get("description_html"):
is_valid, error_msg = validate_html_content(data["description_html"])
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
if data.get("description_binary"):
is_valid, error_msg = validate_binary_data(data["description_binary"])
if not is_valid:
raise serializers.ValidationError({"description_binary": error_msg})
# Validate assignees are from project
if data.get("assignees", []):
data["assignees"] = ProjectMember.objects.filter(
@@ -105,13 +134,27 @@ class IssueSerializer(BaseSerializer):
if (
data.get("parent")
and not Issue.objects.filter(
workspace_id=self.context.get("workspace_id"), pk=data.get("parent").id
workspace_id=self.context.get("workspace_id"),
project_id=self.context.get("project_id"),
pk=data.get("parent").id,
).exists()
):
raise serializers.ValidationError(
"Parent is not valid issue_id please pass a valid issue_id"
)
if (
data.get("estimate_point")
and not EstimatePoint.objects.filter(
workspace_id=self.context.get("workspace_id"),
project_id=self.context.get("project_id"),
pk=data.get("estimate_point").id,
).exists()
):
raise serializers.ValidationError(
"Estimate point is not valid please pass a valid estimate_point_id"
)
return data
def create(self, validated_data):
@@ -300,13 +343,58 @@ class IssueSerializer(BaseSerializer):
class IssueLiteSerializer(BaseSerializer):
"""
Lightweight work item serializer for minimal data transfer.
Provides essential work item identifiers optimized for list views,
references, and performance-critical operations.
"""
class Meta:
model = Issue
fields = ["id", "sequence_id", "project_id"]
read_only_fields = fields
class LabelCreateUpdateSerializer(BaseSerializer):
"""
Serializer for creating and updating work item labels.
Manages label metadata including colors, descriptions, hierarchy,
and sorting for work item categorization and filtering.
"""
class Meta:
model = Label
fields = [
"name",
"color",
"description",
"external_source",
"external_id",
"parent",
"sort_order",
]
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"deleted_at",
]
class LabelSerializer(BaseSerializer):
"""
Full serializer for work item labels with complete metadata.
Provides comprehensive label information including hierarchical relationships,
visual properties, and organizational data for work item tagging.
"""
class Meta:
model = Label
fields = "__all__"
@@ -322,10 +410,17 @@ class LabelSerializer(BaseSerializer):
]
class IssueLinkSerializer(BaseSerializer):
class IssueLinkCreateSerializer(BaseSerializer):
"""
Serializer for creating work item external links with validation.
Handles URL validation, format checking, and duplicate prevention
for attaching external resources to work items.
"""
class Meta:
model = IssueLink
fields = "__all__"
fields = ["url", "issue_id"]
read_only_fields = [
"id",
"workspace",
@@ -361,6 +456,22 @@ class IssueLinkSerializer(BaseSerializer):
)
return IssueLink.objects.create(**validated_data)
class IssueLinkUpdateSerializer(IssueLinkCreateSerializer):
"""
Serializer for updating work item external links.
Extends link creation with update-specific validation to prevent
URL conflicts and maintain link integrity during modifications.
"""
class Meta(IssueLinkCreateSerializer.Meta):
model = IssueLink
fields = IssueLinkCreateSerializer.Meta.fields + [
"issue_id",
]
read_only_fields = IssueLinkCreateSerializer.Meta.read_only_fields
def update(self, instance, validated_data):
if (
IssueLink.objects.filter(
@@ -376,7 +487,37 @@ class IssueLinkSerializer(BaseSerializer):
return super().update(instance, validated_data)
class IssueLinkSerializer(BaseSerializer):
"""
Full serializer for work item external links.
Provides complete link information including metadata and timestamps
for managing external resource associations with work items.
"""
class Meta:
model = IssueLink
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueAttachmentSerializer(BaseSerializer):
"""
Serializer for work item file attachments.
Manages file asset associations with work items including metadata,
storage information, and access control for document management.
"""
class Meta:
model = FileAsset
fields = "__all__"
@@ -390,7 +531,47 @@ class IssueAttachmentSerializer(BaseSerializer):
]
class IssueCommentCreateSerializer(BaseSerializer):
"""
Serializer for creating work item comments.
Handles comment creation with JSON and HTML content support,
access control, and external integration tracking.
"""
class Meta:
model = IssueComment
fields = [
"comment_json",
"comment_html",
"access",
"external_source",
"external_id",
]
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"deleted_at",
"actor",
"comment_stripped",
"edited_at",
]
class IssueCommentSerializer(BaseSerializer):
"""
Full serializer for work item comments with membership context.
Provides complete comment data including member status, content formatting,
and edit tracking for collaborative work item discussions.
"""
is_member = serializers.BooleanField(read_only=True)
class Meta:
@@ -420,12 +601,26 @@ class IssueCommentSerializer(BaseSerializer):
class IssueActivitySerializer(BaseSerializer):
"""
Serializer for work item activity and change history.
Tracks and represents work item modifications, state changes,
and user interactions for audit trails and activity feeds.
"""
class Meta:
model = IssueActivity
exclude = ["created_by", "updated_by"]
class CycleIssueSerializer(BaseSerializer):
"""
Serializer for work items within cycles.
Provides cycle context for work items including cycle metadata
and timing information for sprint and iteration management.
"""
cycle = CycleSerializer(read_only=True)
class Meta:
@@ -433,6 +628,13 @@ class CycleIssueSerializer(BaseSerializer):
class ModuleIssueSerializer(BaseSerializer):
"""
Serializer for work items within modules.
Provides module context for work items including module metadata
and organizational information for feature-based work grouping.
"""
module = ModuleSerializer(read_only=True)
class Meta:
@@ -440,18 +642,50 @@ class ModuleIssueSerializer(BaseSerializer):
class LabelLiteSerializer(BaseSerializer):
"""
Lightweight label serializer for minimal data transfer.
Provides essential label information with visual properties,
optimized for UI display and performance-critical operations.
"""
class Meta:
model = Label
fields = ["id", "name", "color"]
class IssueExpandSerializer(BaseSerializer):
"""
Extended work item serializer with full relationship expansion.
Provides work items with expanded related data including cycles, modules,
labels, assignees, and states for comprehensive data representation.
"""
cycle = CycleLiteSerializer(source="issue_cycle.cycle", read_only=True)
module = ModuleLiteSerializer(source="issue_module.module", read_only=True)
labels = LabelLiteSerializer(read_only=True, many=True)
assignees = UserLiteSerializer(read_only=True, many=True)
labels = serializers.SerializerMethodField()
assignees = serializers.SerializerMethodField()
state = StateLiteSerializer(read_only=True)
def get_labels(self, obj):
expand = self.context.get("expand", [])
if "labels" in expand:
# Use prefetched data
return LabelLiteSerializer(
[il.label for il in obj.label_issue.all()], many=True
).data
return [il.label_id for il in obj.label_issue.all()]
def get_assignees(self, obj):
expand = self.context.get("expand", [])
if "assignees" in expand:
return UserLiteSerializer(
[ia.assignee for ia in obj.issue_assignee.all()], many=True
).data
return [ia.assignee_id for ia in obj.issue_assignee.all()]
class Meta:
model = Issue
fields = "__all__"
@@ -464,3 +698,41 @@ class IssueExpandSerializer(BaseSerializer):
"created_at",
"updated_at",
]
class IssueAttachmentUploadSerializer(serializers.Serializer):
"""
Serializer for work item attachment upload request validation.
Handles file upload metadata validation including size, type, and external
integration tracking for secure work item document attachment workflows.
"""
name = serializers.CharField(help_text="Original filename of the asset")
type = serializers.CharField(required=False, help_text="MIME type of the file")
size = serializers.IntegerField(help_text="File size in bytes")
external_id = serializers.CharField(
required=False,
help_text="External identifier for the asset (for integration tracking)",
)
external_source = serializers.CharField(
required=False, help_text="External source system (for integration tracking)"
)
class IssueSearchSerializer(serializers.Serializer):
"""
Serializer for work item search result data formatting.
Provides standardized search result structure including work item identifiers,
project context, and workspace information for search API responses.
"""
id = serializers.CharField(required=True, help_text="Issue ID")
name = serializers.CharField(required=True, help_text="Issue name")
sequence_id = serializers.CharField(required=True, help_text="Issue sequence ID")
project__identifier = serializers.CharField(
required=True, help_text="Project identifier"
)
project_id = serializers.CharField(required=True, help_text="Project ID")
workspace__slug = serializers.CharField(required=True, help_text="Workspace slug")
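The two SerializerMethodFields in IssueExpandSerializer above make expansion opt-in through serializer context. A sketch, assuming issue is an Issue instance fetched with the label_issue and issue_assignee prefetches the code comments rely on:

# Default: bare foreign-key IDs, cheap to serialize.
ids_only = IssueExpandSerializer(issue).data["labels"]  # [label_id, ...]

# With "labels" in context["expand"]: full LabelLiteSerializer payloads
# built from the prefetched relation.
expanded = IssueExpandSerializer(issue, context={"expand": ["labels"]}).data["labels"]
# [{"id": ..., "name": ..., "color": ...}, ...]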

View File

@@ -13,24 +13,33 @@ from plane.db.models import (
)
class ModuleSerializer(BaseSerializer):
class ModuleCreateSerializer(BaseSerializer):
"""
Serializer for creating modules with member validation and date checking.
Handles module creation including member assignment validation, date range verification,
and duplicate name prevention for feature-based project organization setup.
"""
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(
queryset=User.objects.values_list("id", flat=True)
),
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
class Meta:
model = Module
fields = "__all__"
fields = [
"name",
"description",
"start_date",
"target_date",
"status",
"lead",
"members",
"external_source",
"external_id",
]
read_only_fields = [
"id",
"workspace",
@@ -42,11 +51,6 @@ class ModuleSerializer(BaseSerializer):
"deleted_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data["members"] = [str(member.id) for member in instance.members.all()]
return data
def validate(self, data):
if (
data.get("start_date", None) is not None
@@ -96,6 +100,22 @@ class ModuleSerializer(BaseSerializer):
return module
class ModuleUpdateSerializer(ModuleCreateSerializer):
"""
Serializer for updating modules with enhanced validation and member management.
Extends module creation with update-specific validations including member reassignment,
name conflict checking, and relationship management for module modifications.
"""
class Meta(ModuleCreateSerializer.Meta):
model = Module
fields = ModuleCreateSerializer.Meta.fields + [
"members",
]
read_only_fields = ModuleCreateSerializer.Meta.read_only_fields
def update(self, instance, validated_data):
members = validated_data.pop("members", None)
module_name = validated_data.get("name")
@@ -131,7 +151,54 @@ class ModuleSerializer(BaseSerializer):
return super().update(instance, validated_data)
class ModuleSerializer(BaseSerializer):
"""
Comprehensive module serializer with work item metrics and member management.
Provides complete module data including work item counts by status, member relationships,
and progress tracking for feature-based project organization.
"""
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
class Meta:
model = Module
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
"deleted_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data["members"] = [str(member.id) for member in instance.members.all()]
return data
class ModuleIssueSerializer(BaseSerializer):
"""
Serializer for module-work item relationships with sub-item counting.
Manages the association between modules and work items, including
hierarchical issue tracking for nested work item structures.
"""
sub_issues_count = serializers.IntegerField(read_only=True)
class Meta:
@@ -149,6 +216,13 @@ class ModuleIssueSerializer(BaseSerializer):
class ModuleLinkSerializer(BaseSerializer):
"""
Serializer for module external links with URL validation.
Handles external resource associations with modules including
URL validation and duplicate prevention for reference management.
"""
class Meta:
model = ModuleLink
fields = "__all__"
@@ -174,6 +248,27 @@ class ModuleLinkSerializer(BaseSerializer):
class ModuleLiteSerializer(BaseSerializer):
"""
Lightweight module serializer for minimal data transfer.
Provides essential module information without computed metrics,
optimized for list views and reference lookups.
"""
class Meta:
model = Module
fields = "__all__"
class ModuleIssueRequestSerializer(serializers.Serializer):
"""
Serializer for bulk work item assignment to modules.
Validates work item ID lists for batch operations including
module assignment and work item organization workflows.
"""
issues = serializers.ListField(
child=serializers.UUIDField(),
help_text="List of issue IDs to add to the module",
)
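One consequence of the ModuleSerializer definition above: members is declared write-only, yet to_representation() re-injects it as string IDs, so reads still expose membership. A sketch, assuming module is a Module instance:

out = ModuleSerializer(module).data
# "members" reappears on output as string UUIDs despite write_only=True.
assert all(isinstance(m, str) for m in out["members"])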

View File

@@ -2,12 +2,150 @@
from rest_framework import serializers
# Module imports
from plane.db.models import Project, ProjectIdentifier, WorkspaceMember
from plane.db.models import (
Project,
ProjectIdentifier,
WorkspaceMember,
State,
Estimate,
)
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
)
from .base import BaseSerializer
class ProjectCreateSerializer(BaseSerializer):
"""
Serializer for creating projects with workspace validation.
Handles project creation including identifier validation, member verification,
and workspace association for new project initialization.
"""
class Meta:
model = Project
fields = [
"name",
"description",
"project_lead",
"default_assignee",
"identifier",
"icon_prop",
"emoji",
"cover_image",
"module_view",
"cycle_view",
"issue_views_view",
"page_view",
"intake_view",
"guest_view_all_features",
"archive_in",
"close_in",
"timezone",
]
read_only_fields = [
"id",
"workspace",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
def validate(self, data):
if data.get("project_lead", None) is not None:
# Check if the project lead is a member of the workspace
if not WorkspaceMember.objects.filter(
workspace_id=self.context["workspace_id"],
member_id=data.get("project_lead"),
).exists():
raise serializers.ValidationError(
"Project lead should be a user in the workspace"
)
if data.get("default_assignee", None) is not None:
# Check if the default assignee is a member of the workspace
if not WorkspaceMember.objects.filter(
workspace_id=self.context["workspace_id"],
member_id=data.get("default_assignee"),
).exists():
raise serializers.ValidationError(
"Default assignee should be a user in the workspace"
)
return data
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
raise serializers.ValidationError(detail="Project Identifier is required")
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
raise serializers.ValidationError(detail="Project Identifier is taken")
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
)
return project
class ProjectUpdateSerializer(ProjectCreateSerializer):
"""
Serializer for updating projects with enhanced state and estimation management.
Extends project creation with update-specific validations including default state
assignment, estimation configuration, and project setting modifications.
"""
class Meta(ProjectCreateSerializer.Meta):
model = Project
fields = ProjectCreateSerializer.Meta.fields + [
"default_state",
"estimate",
]
read_only_fields = ProjectCreateSerializer.Meta.read_only_fields
def update(self, instance, validated_data):
"""Update a project"""
if (
validated_data.get("default_state", None) is not None
and not State.objects.filter(
project=instance, id=validated_data.get("default_state")
).exists()
):
# Check if the default state is a state in the project
raise serializers.ValidationError(
"Default state should be a state in the project"
)
if (
validated_data.get("estimate", None) is not None
and not Estimate.objects.filter(
project=instance, id=validated_data.get("estimate")
).exists()
):
# Check if the estimate is an estimate in the project
raise serializers.ValidationError(
"Estimate should be an estimate in the project"
)
return super().update(instance, validated_data)
class ProjectSerializer(BaseSerializer):
"""
Comprehensive project serializer with metrics and member context.
Provides complete project data including member counts, cycle/module totals,
deployment status, and user-specific context for project management.
"""
total_members = serializers.IntegerField(read_only=True)
total_cycles = serializers.IntegerField(read_only=True)
total_modules = serializers.IntegerField(read_only=True)
@@ -57,6 +195,29 @@ class ProjectSerializer(BaseSerializer):
"Default assignee should be a user in the workspace"
)
# Validate description content for security
if "description" in data and data["description"]:
# For Project, description might be a text field, not JSON
if isinstance(data["description"], dict):
is_valid, error_msg = validate_json_content(data["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if "description_text" in data and data["description_text"]:
is_valid, error_msg = validate_json_content(data["description_text"])
if not is_valid:
raise serializers.ValidationError({"description_text": error_msg})
if "description_html" in data and data["description_html"]:
if isinstance(data["description_html"], dict):
is_valid, error_msg = validate_json_content(data["description_html"])
else:
is_valid, error_msg = validate_html_content(
str(data["description_html"])
)
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
return data
def create(self, validated_data):
@@ -81,6 +242,13 @@ class ProjectSerializer(BaseSerializer):
class ProjectLiteSerializer(BaseSerializer):
"""
Lightweight project serializer for minimal data transfer.
Provides essential project information including identifiers, visual properties,
and basic metadata optimized for list views and references.
"""
cover_image_url = serializers.CharField(read_only=True)
class Meta:
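ProjectCreateSerializer.create() above normalizes and reserves the identifier before creating the project. A sketch of driving it from a view, assuming a configured Django environment and that workspace_id is a valid workspace UUID supplied via context (as validate() and create() expect):

serializer = ProjectCreateSerializer(
    data={"name": "Docs", "identifier": "docs"},
    context={"workspace_id": workspace_id},
)
serializer.is_valid(raise_exception=True)
project = serializer.save()  # identifier is stripped and upper-cased to "DOCS"
# A second save with "DOCS" (or " docs ") in the same workspace raises
# ValidationError("Project Identifier is taken").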

View File

@@ -4,6 +4,13 @@ from plane.db.models import State
class StateSerializer(BaseSerializer):
"""
Serializer for work item states with default state management.
Handles state creation and updates including default state validation
and automatic default state switching for workflow management.
"""
def validate(self, data):
# If the default is being provided then make all other states default False
if data.get("default", False):
@@ -24,10 +31,18 @@ class StateSerializer(BaseSerializer):
"workspace",
"project",
"deleted_at",
"slug",
]
class StateLiteSerializer(BaseSerializer):
"""
Lightweight state serializer for minimal data transfer.
Provides essential state information including visual properties
and grouping data optimized for UI display and filtering.
"""
class Meta:
model = State
fields = ["id", "name", "color", "group"]

View File

@@ -1,3 +1,5 @@
from rest_framework import serializers
# Module imports
from plane.db.models import User
@@ -5,6 +7,18 @@ from .base import BaseSerializer
class UserLiteSerializer(BaseSerializer):
"""
Lightweight user serializer for minimal data transfer.
Provides essential user information including names, avatar, and contact details
optimized for member lists, assignee displays, and user references.
"""
avatar_url = serializers.CharField(
help_text="Avatar URL",
read_only=True,
)
class Meta:
model = User
fields = [

View File

@@ -4,7 +4,12 @@ from .base import BaseSerializer
class WorkspaceLiteSerializer(BaseSerializer):
"""Lite serializer with only required fields"""
"""
Lightweight workspace serializer for minimal data transfer.
Provides essential workspace identifiers including name, slug, and ID
optimized for navigation, references, and performance-critical operations.
"""
class Meta:
model = Workspace

View File

@@ -5,8 +5,11 @@ from .cycle import urlpatterns as cycle_patterns
from .module import urlpatterns as module_patterns
from .intake import urlpatterns as intake_patterns
from .member import urlpatterns as member_patterns
from .asset import urlpatterns as asset_patterns
from .user import urlpatterns as user_patterns
urlpatterns = [
*asset_patterns,
*project_patterns,
*state_patterns,
*issue_patterns,
@@ -14,4 +17,5 @@ urlpatterns = [
*module_patterns,
*intake_patterns,
*member_patterns,
*user_patterns,
]

View File

@@ -0,0 +1,40 @@
from django.urls import path
from plane.api.views import (
UserAssetEndpoint,
UserServerAssetEndpoint,
GenericAssetEndpoint,
)
urlpatterns = [
path(
"assets/user-assets/",
UserAssetEndpoint.as_view(http_method_names=["post"]),
name="user-assets",
),
path(
"assets/user-assets/<uuid:asset_id>/",
UserAssetEndpoint.as_view(http_method_names=["patch", "delete"]),
name="user-assets-detail",
),
path(
"assets/user-assets/server/",
UserServerAssetEndpoint.as_view(http_method_names=["post"]),
name="user-server-assets",
),
path(
"assets/user-assets/<uuid:asset_id>/server/",
UserServerAssetEndpoint.as_view(http_method_names=["patch", "delete"]),
name="user-server-assets-detail",
),
path(
"workspaces/<str:slug>/assets/",
GenericAssetEndpoint.as_view(http_method_names=["post"]),
name="generic-asset",
),
path(
"workspaces/<str:slug>/assets/<uuid:asset_id>/",
GenericAssetEndpoint.as_view(http_method_names=["get", "patch"]),
name="generic-asset-detail",
),
]

View File

@@ -1,8 +1,10 @@
from django.urls import path
from plane.api.views.cycle import (
CycleAPIEndpoint,
CycleIssueAPIEndpoint,
CycleListCreateAPIEndpoint,
CycleDetailAPIEndpoint,
CycleIssueListCreateAPIEndpoint,
CycleIssueDetailAPIEndpoint,
TransferCycleIssueAPIEndpoint,
CycleArchiveUnarchiveAPIEndpoint,
)
@@ -10,37 +12,42 @@ from plane.api.views.cycle import (
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleAPIEndpoint.as_view(),
CycleListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/",
CycleAPIEndpoint.as_view(),
CycleDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueAPIEndpoint.as_view(),
CycleIssueListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
CycleIssueAPIEndpoint.as_view(),
CycleIssueDetailAPIEndpoint.as_view(http_method_names=["get", "delete"]),
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/transfer-issues/",
TransferCycleIssueAPIEndpoint.as_view(),
TransferCycleIssueAPIEndpoint.as_view(http_method_names=["post"]),
name="transfer-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/archive/",
CycleArchiveUnarchiveAPIEndpoint.as_view(),
CycleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["post"]),
name="cycle-archive-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-cycles/",
CycleArchiveUnarchiveAPIEndpoint.as_view(),
CycleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["get"]),
name="cycle-archive-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-cycles/<uuid:pk>/unarchive/",
CycleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["delete"]),
name="cycle-archive-unarchive",
),
]
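A pattern worth noting across these URL modules: each class-based view is narrowed per route with http_method_names at as_view() time. This is stock Django behaviour, since as_view() accepts any existing class attribute as an initkwarg, and unlisted methods get a 405 before the handler runs. A standalone sketch:

from django.http import JsonResponse
from django.views import View

class PingView(View):
    def get(self, request):
        return JsonResponse({"ok": True})

# Routed as path("ping/", PingView.as_view(http_method_names=["get"])),
# POST/PATCH/DELETE to this route return 405 Method Not Allowed.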

View File

@@ -1,17 +1,22 @@
from django.urls import path
from plane.api.views import IntakeIssueAPIEndpoint
from plane.api.views import (
IntakeIssueListCreateAPIEndpoint,
IntakeIssueDetailAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/",
IntakeIssueAPIEndpoint.as_view(),
IntakeIssueListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="intake-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
IntakeIssueDetailAPIEndpoint.as_view(
http_method_names=["get", "patch", "delete"]
),
name="intake-issue",
),
]

View File

@@ -1,79 +1,95 @@
from django.urls import path
from plane.api.views import (
IssueAPIEndpoint,
LabelAPIEndpoint,
IssueLinkAPIEndpoint,
IssueCommentAPIEndpoint,
IssueActivityAPIEndpoint,
IssueListCreateAPIEndpoint,
IssueDetailAPIEndpoint,
LabelListCreateAPIEndpoint,
LabelDetailAPIEndpoint,
IssueLinkListCreateAPIEndpoint,
IssueLinkDetailAPIEndpoint,
IssueCommentListCreateAPIEndpoint,
IssueCommentDetailAPIEndpoint,
IssueActivityListAPIEndpoint,
IssueActivityDetailAPIEndpoint,
IssueAttachmentListCreateAPIEndpoint,
IssueAttachmentDetailAPIEndpoint,
WorkspaceIssueAPIEndpoint,
IssueAttachmentEndpoint,
IssueSearchEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/issues/<str:project__identifier>-<str:issue__identifier>/",
WorkspaceIssueAPIEndpoint.as_view(),
"workspaces/<str:slug>/issues/search/",
IssueSearchEndpoint.as_view(http_method_names=["get"]),
name="issue-search",
),
path(
"workspaces/<str:slug>/issues/<str:project_identifier>-<str:issue_identifier>/",
WorkspaceIssueAPIEndpoint.as_view(http_method_names=["get"]),
name="issue-by-identifier",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueAPIEndpoint.as_view(),
IssueListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueAPIEndpoint.as_view(),
IssueDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/",
LabelAPIEndpoint.as_view(),
LabelListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/<uuid:pk>/",
LabelAPIEndpoint.as_view(),
LabelDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/",
IssueLinkAPIEndpoint.as_view(),
IssueLinkListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="link",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/<uuid:pk>/",
IssueLinkAPIEndpoint.as_view(),
IssueLinkDetailAPIEndpoint.as_view(
http_method_names=["get", "patch", "delete"]
),
name="link",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/",
IssueCommentAPIEndpoint.as_view(),
IssueCommentListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/comments/<uuid:pk>/",
IssueCommentAPIEndpoint.as_view(),
IssueCommentDetailAPIEndpoint.as_view(
http_method_names=["get", "patch", "delete"]
),
name="comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/",
IssueActivityAPIEndpoint.as_view(),
IssueActivityListAPIEndpoint.as_view(http_method_names=["get"]),
name="activity",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/<uuid:pk>/",
IssueActivityAPIEndpoint.as_view(),
IssueActivityDetailAPIEndpoint.as_view(http_method_names=["get"]),
name="activity",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/",
IssueAttachmentEndpoint.as_view(),
IssueAttachmentListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="attachment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-attachments/<uuid:pk>/",
IssueAttachmentEndpoint.as_view(),
IssueAttachmentDetailAPIEndpoint.as_view(http_method_names=["get", "delete"]),
name="issue-attachment",
),
]

View File

@@ -1,11 +1,16 @@
from django.urls import path
from plane.api.views import ProjectMemberAPIEndpoint
from plane.api.views import ProjectMemberAPIEndpoint, WorkspaceMemberAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<str:project_id>/members/",
ProjectMemberAPIEndpoint.as_view(),
name="users",
)
ProjectMemberAPIEndpoint.as_view(http_method_names=["get"]),
name="project-members",
),
path(
"workspaces/<str:slug>/members/",
WorkspaceMemberAPIEndpoint.as_view(http_method_names=["get"]),
name="workspace-members",
),
]

View File

@@ -1,40 +1,47 @@
from django.urls import path
from plane.api.views import (
ModuleAPIEndpoint,
ModuleIssueAPIEndpoint,
ModuleListCreateAPIEndpoint,
ModuleDetailAPIEndpoint,
ModuleIssueListCreateAPIEndpoint,
ModuleIssueDetailAPIEndpoint,
ModuleArchiveUnarchiveAPIEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
ModuleAPIEndpoint.as_view(),
ModuleListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="modules",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
ModuleAPIEndpoint.as_view(),
name="modules",
ModuleDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="modules-detail",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueAPIEndpoint.as_view(),
ModuleIssueListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:issue_id>/",
-        ModuleIssueAPIEndpoint.as_view(),
-        name="module-issues",
+        ModuleIssueDetailAPIEndpoint.as_view(http_method_names=["delete"]),
+        name="module-issues-detail",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/archive/",
-        ModuleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="module-archive-unarchive",
+        ModuleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["post"]),
+        name="module-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-modules/",
-        ModuleArchiveUnarchiveAPIEndpoint.as_view(),
-        name="module-archive-unarchive",
+        ModuleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["get"]),
+        name="module-archive-list",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-modules/<uuid:pk>/unarchive/",
ModuleArchiveUnarchiveAPIEndpoint.as_view(http_method_names=["delete"]),
name="module-unarchive",
),
]

View File

@@ -1,19 +1,27 @@
from django.urls import path
-from plane.api.views import ProjectAPIEndpoint, ProjectArchiveUnarchiveAPIEndpoint
+from plane.api.views import (
+    ProjectListCreateAPIEndpoint,
+    ProjectDetailAPIEndpoint,
+    ProjectArchiveUnarchiveAPIEndpoint,
+)
urlpatterns = [
    path(
-        "workspaces/<str:slug>/projects/", ProjectAPIEndpoint.as_view(), name="project"
+        "workspaces/<str:slug>/projects/",
+        ProjectListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
+        name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
-        ProjectAPIEndpoint.as_view(),
+        ProjectDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archive/",
-        ProjectArchiveUnarchiveAPIEndpoint.as_view(),
+        ProjectArchiveUnarchiveAPIEndpoint.as_view(
+            http_method_names=["post", "delete"]
+        ),
name="project-archive-unarchive",
),
]

View File

@@ -0,0 +1,20 @@
from drf_spectacular.views import (
SpectacularAPIView,
SpectacularRedocView,
SpectacularSwaggerView,
)
from django.urls import path
urlpatterns = [
path("schema/", SpectacularAPIView.as_view(), name="schema"),
path(
"schema/swagger-ui/",
SpectacularSwaggerView.as_view(url_name="schema"),
name="swagger-ui",
),
path(
"schema/redoc/",
SpectacularRedocView.as_view(url_name="schema"),
name="redoc",
),
]
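These routes assume drf-spectacular is registered as DRF's schema generator; a minimal settings sketch under that assumption (title and version are placeholders, not values from this changeset):

# settings.py (sketch)
INSTALLED_APPS = [
    # ...
    "drf_spectacular",
]

REST_FRAMEWORK = {
    "DEFAULT_SCHEMA_CLASS": "drf_spectacular.openapi.AutoSchema",
}

SPECTACULAR_SETTINGS = {
    "TITLE": "Plane API",  # placeholder
    "VERSION": "0.0.1",    # placeholder
}

With that in place, /schema/ serves the OpenAPI document, and the swagger-ui and redoc routes render it by resolving url_name="schema".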

View File

@@ -1,16 +1,19 @@
from django.urls import path
-from plane.api.views import StateAPIEndpoint
+from plane.api.views import (
+    StateListCreateAPIEndpoint,
+    StateDetailAPIEndpoint,
+)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/",
-        StateAPIEndpoint.as_view(),
+        StateListCreateAPIEndpoint.as_view(http_method_names=["get", "post"]),
name="states",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:state_id>/",
-        StateAPIEndpoint.as_view(),
+        StateDetailAPIEndpoint.as_view(http_method_names=["get", "patch", "delete"]),
name="states",
),
]

View File

@@ -0,0 +1,11 @@
from django.urls import path
from plane.api.views import UserEndpoint
urlpatterns = [
path(
"users/me/",
UserEndpoint.as_view(http_method_names=["get"]),
name="users",
),
]

View File

@@ -1,30 +1,55 @@
-from .project import ProjectAPIEndpoint, ProjectArchiveUnarchiveAPIEndpoint
+from .project import (
+    ProjectListCreateAPIEndpoint,
+    ProjectDetailAPIEndpoint,
+    ProjectArchiveUnarchiveAPIEndpoint,
+)
-from .state import StateAPIEndpoint
+from .state import (
+    StateListCreateAPIEndpoint,
+    StateDetailAPIEndpoint,
+)
from .issue import (
WorkspaceIssueAPIEndpoint,
-    IssueAPIEndpoint,
-    LabelAPIEndpoint,
-    IssueLinkAPIEndpoint,
-    IssueCommentAPIEndpoint,
-    IssueActivityAPIEndpoint,
-    IssueAttachmentEndpoint,
+    IssueListCreateAPIEndpoint,
+    IssueDetailAPIEndpoint,
+    LabelListCreateAPIEndpoint,
+    LabelDetailAPIEndpoint,
+    IssueLinkListCreateAPIEndpoint,
+    IssueLinkDetailAPIEndpoint,
+    IssueCommentListCreateAPIEndpoint,
+    IssueCommentDetailAPIEndpoint,
+    IssueActivityListAPIEndpoint,
+    IssueActivityDetailAPIEndpoint,
+    IssueAttachmentListCreateAPIEndpoint,
+    IssueAttachmentDetailAPIEndpoint,
+    IssueSearchEndpoint,
)
from .cycle import (
-    CycleAPIEndpoint,
-    CycleIssueAPIEndpoint,
+    CycleListCreateAPIEndpoint,
+    CycleDetailAPIEndpoint,
+    CycleIssueListCreateAPIEndpoint,
+    CycleIssueDetailAPIEndpoint,
TransferCycleIssueAPIEndpoint,
CycleArchiveUnarchiveAPIEndpoint,
)
from .module import (
-    ModuleAPIEndpoint,
-    ModuleIssueAPIEndpoint,
+    ModuleListCreateAPIEndpoint,
+    ModuleDetailAPIEndpoint,
+    ModuleIssueListCreateAPIEndpoint,
+    ModuleIssueDetailAPIEndpoint,
ModuleArchiveUnarchiveAPIEndpoint,
)
-from .member import ProjectMemberAPIEndpoint
+from .member import ProjectMemberAPIEndpoint, WorkspaceMemberAPIEndpoint
-from .intake import IntakeIssueAPIEndpoint
+from .intake import (
+    IntakeIssueListCreateAPIEndpoint,
+    IntakeIssueDetailAPIEndpoint,
+)
from .asset import UserAssetEndpoint, UserServerAssetEndpoint, GenericAssetEndpoint
from .user import UserEndpoint

View File

@@ -0,0 +1,631 @@
# Python Imports
import uuid
# Django Imports
from django.utils import timezone
from django.conf import settings
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiExample, OpenApiRequest, OpenApiTypes
# Module Imports
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
from plane.settings.storage import S3Storage
from plane.db.models import FileAsset, User, Workspace
from plane.api.views.base import BaseAPIView
from plane.api.serializers import (
UserAssetUploadSerializer,
AssetUpdateSerializer,
GenericAssetUploadSerializer,
GenericAssetUpdateSerializer,
)
from plane.utils.openapi import (
ASSET_ID_PARAMETER,
WORKSPACE_SLUG_PARAMETER,
PRESIGNED_URL_SUCCESS_RESPONSE,
GENERIC_ASSET_UPLOAD_SUCCESS_RESPONSE,
GENERIC_ASSET_VALIDATION_ERROR_RESPONSE,
ASSET_CONFLICT_RESPONSE,
ASSET_DOWNLOAD_SUCCESS_RESPONSE,
ASSET_DOWNLOAD_ERROR_RESPONSE,
ASSET_UPDATED_RESPONSE,
ASSET_DELETED_RESPONSE,
VALIDATION_ERROR_RESPONSE,
ASSET_NOT_FOUND_RESPONSE,
NOT_FOUND_RESPONSE,
UNAUTHORIZED_RESPONSE,
asset_docs,
)
from plane.utils.exception_logger import log_exception
class UserAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload user profile images."""
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
if asset is None:
return
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save(update_fields=["is_deleted", "deleted_at"])
return
def entity_asset_delete(self, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar_asset_id = None
user.save()
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image_asset_id = None
user.save()
return
return
@asset_docs(
operation_id="create_user_asset_upload",
summary="Generate presigned URL for user asset upload",
description="Generate presigned URL for user asset upload",
request=OpenApiRequest(
request=UserAssetUploadSerializer,
examples=[
OpenApiExample(
"User Avatar Upload",
value={
"name": "profile.jpg",
"type": "image/jpeg",
"size": 1024000,
"entity_type": "USER_AVATAR",
},
description="Example request for uploading a user avatar",
),
OpenApiExample(
"User Cover Upload",
value={
"name": "cover.jpg",
"type": "image/jpeg",
"size": 1024000,
"entity_type": "USER_COVER",
},
description="Example request for uploading a user cover",
),
],
),
responses={
200: PRESIGNED_URL_SUCCESS_RESPONSE,
400: VALIDATION_ERROR_RESPONSE,
401: UNAUTHORIZED_RESPONSE,
},
)
def post(self, request):
"""Generate presigned URL for user asset upload.
Create a presigned URL for uploading user profile assets (avatar or cover image).
This endpoint generates the necessary credentials for direct S3 upload.
"""
# get the asset key
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", False)
# Check if the file size is within the limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Check if the entity type is allowed
if not entity_type or entity_type not in ["USER_AVATAR", "USER_COVER"]:
return Response(
{"error": "Invalid entity type.", "status": False},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = [
"image/jpeg",
"image/png",
"image/webp",
"image/jpg",
"image/gif",
]
if type not in allowed_types:
return Response(
                {
                    "error": "Invalid file type. Only JPEG, PNG, WebP, and GIF files are allowed.",
                    "status": False,
                },
status=status.HTTP_400_BAD_REQUEST,
)
# asset key
asset_key = f"{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={"name": name, "type": type, "size": size_limit},
asset=asset_key,
size=size_limit,
user=request.user,
created_by=request.user,
entity_type=entity_type,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key, file_type=type, file_size=size_limit
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@asset_docs(
operation_id="update_user_asset",
summary="Mark user asset as uploaded",
description="Mark user asset as uploaded",
parameters=[ASSET_ID_PARAMETER],
request=OpenApiRequest(
request=AssetUpdateSerializer,
examples=[
OpenApiExample(
"Update Asset Attributes",
value={
"attributes": {
"name": "updated_profile.jpg",
"type": "image/jpeg",
"size": 1024000,
},
"entity_type": "USER_AVATAR",
},
description="Example request for updating asset attributes",
),
],
),
responses={
204: ASSET_UPDATED_RESPONSE,
404: NOT_FOUND_RESPONSE,
},
)
def patch(self, request, asset_id):
"""Update user asset after upload completion.
Update the asset status and attributes after the file has been uploaded to S3.
This endpoint should be called after completing the S3 upload to mark the asset as uploaded.
"""
# get the asset id
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
        # mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save(update_fields=["is_uploaded", "attributes"])
return Response(status=status.HTTP_204_NO_CONTENT)
@asset_docs(
operation_id="delete_user_asset",
summary="Delete user asset",
parameters=[ASSET_ID_PARAMETER],
responses={
204: ASSET_DELETED_RESPONSE,
404: NOT_FOUND_RESPONSE,
},
)
def delete(self, request, asset_id):
"""Delete user asset.
Delete a user profile asset (avatar or cover image) and remove its reference from the user profile.
This performs a soft delete by marking the asset as deleted and updating the user's profile.
"""
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
class UserServerAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload user profile images."""
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
if asset is None:
return
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save(update_fields=["is_deleted", "deleted_at"])
return
def entity_asset_delete(self, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar_asset_id = None
user.save()
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image_asset_id = None
user.save()
return
return
@asset_docs(
operation_id="create_user_server_asset_upload",
summary="Generate presigned URL for user server asset upload",
request=UserAssetUploadSerializer,
responses={
200: PRESIGNED_URL_SUCCESS_RESPONSE,
400: VALIDATION_ERROR_RESPONSE,
},
)
def post(self, request):
"""Generate presigned URL for user server asset upload.
Create a presigned URL for uploading user profile assets (avatar or cover image) using server credentials.
This endpoint generates the necessary credentials for direct S3 upload with server-side authentication.
"""
# get the asset key
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", False)
# Check if the file size is within the limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Check if the entity type is allowed
if not entity_type or entity_type not in ["USER_AVATAR", "USER_COVER"]:
return Response(
{"error": "Invalid entity type.", "status": False},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = [
"image/jpeg",
"image/png",
"image/webp",
"image/jpg",
"image/gif",
]
if type not in allowed_types:
return Response(
                {
                    "error": "Invalid file type. Only JPEG, PNG, WebP, and GIF files are allowed.",
                    "status": False,
                },
status=status.HTTP_400_BAD_REQUEST,
)
# asset key
asset_key = f"{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={"name": name, "type": type, "size": size_limit},
asset=asset_key,
size=size_limit,
user=request.user,
created_by=request.user,
entity_type=entity_type,
)
# Get the presigned URL
storage = S3Storage(request=request, is_server=True)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key, file_type=type, file_size=size_limit
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@asset_docs(
operation_id="update_user_server_asset",
summary="Mark user server asset as uploaded",
parameters=[ASSET_ID_PARAMETER],
request=AssetUpdateSerializer,
responses={
204: ASSET_UPDATED_RESPONSE,
404: NOT_FOUND_RESPONSE,
},
)
def patch(self, request, asset_id):
"""Update user server asset after upload completion.
Update the asset status and attributes after the file has been uploaded to S3 using server credentials.
This endpoint should be called after completing the S3 upload to mark the asset as uploaded.
"""
# get the asset id
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
        # mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save(update_fields=["is_uploaded", "attributes"])
return Response(status=status.HTTP_204_NO_CONTENT)
@asset_docs(
operation_id="delete_user_server_asset",
summary="Delete user server asset",
parameters=[ASSET_ID_PARAMETER],
responses={
204: ASSET_DELETED_RESPONSE,
404: NOT_FOUND_RESPONSE,
},
)
def delete(self, request, asset_id):
"""Delete user server asset.
Delete a user profile asset (avatar or cover image) using server credentials and remove its reference from the user profile.
This performs a soft delete by marking the asset as deleted and updating the user's profile.
"""
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
class GenericAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload generic assets that can be later bound to entities."""
use_read_replica = True
@asset_docs(
operation_id="get_generic_asset",
summary="Get presigned URL for asset download",
description="Get presigned URL for asset download",
parameters=[WORKSPACE_SLUG_PARAMETER],
responses={
200: ASSET_DOWNLOAD_SUCCESS_RESPONSE,
400: ASSET_DOWNLOAD_ERROR_RESPONSE,
404: ASSET_NOT_FOUND_RESPONSE,
},
)
def get(self, request, slug, asset_id):
"""Get presigned URL for asset download.
Generate a presigned URL for downloading a generic asset.
The asset must be uploaded and associated with the specified workspace.
"""
try:
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# Get the asset
asset = FileAsset.objects.get(
id=asset_id, workspace_id=workspace.id, is_deleted=False
)
# Check if the asset exists and is uploaded
if not asset.is_uploaded:
return Response(
{"error": "Asset not yet uploaded"},
status=status.HTTP_400_BAD_REQUEST,
)
# Generate presigned URL for GET
storage = S3Storage(request=request, is_server=True)
presigned_url = storage.generate_presigned_url(
object_name=asset.asset.name, filename=asset.attributes.get("name")
)
return Response(
{
"asset_id": str(asset.id),
"asset_url": presigned_url,
"asset_name": asset.attributes.get("name", ""),
"asset_type": asset.attributes.get("type", ""),
},
status=status.HTTP_200_OK,
)
except Workspace.DoesNotExist:
return Response(
{"error": "Workspace not found"}, status=status.HTTP_404_NOT_FOUND
)
except FileAsset.DoesNotExist:
return Response(
{"error": "Asset not found"}, status=status.HTTP_404_NOT_FOUND
)
except Exception as e:
log_exception(e)
return Response(
{"error": "Internal server error"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
@asset_docs(
operation_id="create_generic_asset_upload",
summary="Generate presigned URL for generic asset upload",
description="Generate presigned URL for generic asset upload",
parameters=[WORKSPACE_SLUG_PARAMETER],
request=OpenApiRequest(
request=GenericAssetUploadSerializer,
examples=[
OpenApiExample(
"GenericAssetUploadSerializer",
value={
"name": "image.jpg",
"type": "image/jpeg",
"size": 1024000,
"project_id": "123e4567-e89b-12d3-a456-426614174000",
"external_id": "1234567890",
"external_source": "github",
},
description="Example request for uploading a generic asset",
),
],
),
responses={
200: GENERIC_ASSET_UPLOAD_SUCCESS_RESPONSE,
400: GENERIC_ASSET_VALIDATION_ERROR_RESPONSE,
404: NOT_FOUND_RESPONSE,
409: ASSET_CONFLICT_RESPONSE,
},
)
def post(self, request, slug):
"""Generate presigned URL for generic asset upload.
Create a presigned URL for uploading generic assets that can be bound to entities like work items.
Supports various file types and includes external source tracking for integrations.
"""
name = request.data.get("name")
type = request.data.get("type")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
project_id = request.data.get("project_id")
external_id = request.data.get("external_id")
external_source = request.data.get("external_source")
# Check if the request is valid
if not name or not size:
return Response(
{"error": "Name and size are required fields.", "status": False},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file size is within the limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Check if the file type is allowed
if not type or type not in settings.ATTACHMENT_MIME_TYPES:
return Response(
{"error": "Invalid file type.", "status": False},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Check for existing asset with same external details if provided
if external_id and external_source:
existing_asset = FileAsset.objects.filter(
workspace__slug=slug,
external_source=external_source,
external_id=external_id,
is_deleted=False,
).first()
if existing_asset:
return Response(
{
"message": "Asset with same external id and source already exists",
"asset_id": str(existing_asset.id),
"asset_url": existing_asset.asset_url,
},
status=status.HTTP_409_CONFLICT,
)
# Create a File Asset
asset = FileAsset.objects.create(
attributes={"name": name, "type": type, "size": size_limit},
asset=asset_key,
size=size_limit,
workspace_id=workspace.id,
project_id=project_id,
created_by=request.user,
external_id=external_id,
external_source=external_source,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT, # Using ISSUE_ATTACHMENT since we'll bind it to issues
)
# Get the presigned URL
storage = S3Storage(request=request, is_server=True)
presigned_url = storage.generate_presigned_post(
object_name=asset_key, file_type=type, file_size=size_limit
)
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@asset_docs(
operation_id="update_generic_asset",
summary="Update generic asset after upload completion",
description="Update generic asset after upload completion",
parameters=[WORKSPACE_SLUG_PARAMETER, ASSET_ID_PARAMETER],
request=OpenApiRequest(
request=GenericAssetUpdateSerializer,
examples=[
OpenApiExample(
"GenericAssetUpdateSerializer",
value={"is_uploaded": True},
description="Example request for updating a generic asset",
)
],
),
responses={
204: ASSET_UPDATED_RESPONSE,
404: ASSET_NOT_FOUND_RESPONSE,
},
)
def patch(self, request, slug, asset_id):
"""Update generic asset after upload completion.
Update the asset status after the file has been uploaded to S3.
This endpoint should be called after completing the S3 upload to mark the asset as uploaded
and trigger metadata extraction.
"""
try:
asset = FileAsset.objects.get(
id=asset_id, workspace__slug=slug, is_deleted=False
)
# Update is_uploaded status
asset.is_uploaded = request.data.get("is_uploaded", asset.is_uploaded)
# Update storage metadata if not present
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
asset.save(update_fields=["is_uploaded"])
return Response(status=status.HTTP_204_NO_CONTENT)
except FileAsset.DoesNotExist:
return Response(
{"error": "Asset not found"}, status=status.HTTP_404_NOT_FOUND
)
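Taken together, these endpoints implement the standard three-step presigned-upload handshake: request a presigned POST, send the bytes straight to S3, then PATCH the asset so is_uploaded flips and metadata extraction is queued. A client-side sketch using requests; the base URL, route paths, and API-key header below are illustrative assumptions, not definitions from this diff:

import requests

API = "https://api.example.com/api/v1"  # hypothetical base URL
HEADERS = {"X-API-Key": "<api-key>"}    # header name assumed

# 1. Ask the API for presigned upload credentials.
meta = {
    "name": "profile.jpg",
    "type": "image/jpeg",
    "size": 1024000,
    "entity_type": "USER_AVATAR",
}
res = requests.post(f"{API}/assets/user-assets/", json=meta, headers=HEADERS)  # path assumed
res.raise_for_status()
upload = res.json()

# 2. Upload the file directly to S3 with the returned presigned POST
#    (boto3-style presigned posts return a "url" plus form "fields").
with open("profile.jpg", "rb") as fp:
    requests.post(
        upload["upload_data"]["url"],
        data=upload["upload_data"]["fields"],
        files={"file": fp},
    ).raise_for_status()

# 3. Mark the asset uploaded; the PATCH handler also queues
#    storage-metadata extraction if it is missing.
requests.patch(
    f"{API}/assets/user-assets/{upload['asset_id']}/",  # path assumed
    json={"attributes": meta},
    headers=HEADERS,
).raise_for_status()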

View File

@@ -13,13 +13,14 @@ from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
# Third party imports
from rest_framework.views import APIView
from rest_framework.generics import GenericAPIView
# Module imports
from plane.api.middleware.api_authentication import APIKeyAuthentication
from plane.api.rate_limit import ApiKeyRateThrottle, ServiceTokenRateThrottle
from plane.utils.exception_logger import log_exception
from plane.utils.paginator import BasePaginator
from plane.utils.core.mixins import ReadReplicaControlMixin
class TimezoneMixin:
@@ -36,11 +37,15 @@ class TimezoneMixin:
timezone.deactivate()
-class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
+class BaseAPIView(
+    TimezoneMixin, GenericAPIView, ReadReplicaControlMixin, BasePaginator
+):
authentication_classes = [APIKeyAuthentication]
permission_classes = [IsAuthenticated]
use_read_replica = False
def filter_queryset(self, queryset):
for backend in list(self.filter_backends):
queryset = backend().filter_queryset(self.request, queryset, self)
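BaseAPIView now inherits GenericAPIView (hence the explicit filter_queryset) and a ReadReplicaControlMixin gated by a per-view use_read_replica flag. The mixin's body is not part of this diff; one common shape for such a switch routes reads through a replica database alias, roughly like this sketch (an assumption for illustration, not Plane's actual implementation):

class ReadReplicaControlMixin:
    # Views opt in per class; writes keep going to the default database.
    use_read_replica = False

    def get_queryset(self):
        queryset = super().get_queryset()
        if self.use_read_replica:
            # Assumes a "replica" alias is configured in DATABASES.
            return queryset.using("replica")
        return queryset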

View File

@@ -23,9 +23,18 @@ from django.db import models
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiRequest, OpenApiResponse
# Module imports
-from plane.api.serializers import CycleIssueSerializer, CycleSerializer
+from plane.api.serializers import (
+    CycleIssueSerializer,
+    CycleSerializer,
+    CycleIssueRequestSerializer,
+    TransferCycleIssueRequestSerializer,
+    CycleCreateSerializer,
+    CycleUpdateSerializer,
+    IssueSerializer,
+)
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
@@ -42,19 +51,42 @@ from plane.utils.analytics_plot import burndown_plot
from plane.utils.host import base_host
from .base import BaseAPIView
from plane.bgtasks.webhook_task import model_activity
from plane.utils.openapi.decorators import cycle_docs
from plane.utils.openapi import (
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
CYCLE_VIEW_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
create_paginated_response,
# Request Examples
CYCLE_CREATE_EXAMPLE,
CYCLE_UPDATE_EXAMPLE,
CYCLE_ISSUE_REQUEST_EXAMPLE,
TRANSFER_CYCLE_ISSUE_EXAMPLE,
# Response Examples
CYCLE_EXAMPLE,
CYCLE_ISSUE_EXAMPLE,
TRANSFER_CYCLE_ISSUE_SUCCESS_EXAMPLE,
TRANSFER_CYCLE_ISSUE_ERROR_EXAMPLE,
TRANSFER_CYCLE_COMPLETED_ERROR_EXAMPLE,
DELETED_RESPONSE,
ARCHIVED_RESPONSE,
CYCLE_CANNOT_ARCHIVE_RESPONSE,
UNARCHIVED_RESPONSE,
REQUIRED_FIELDS_RESPONSE,
)
-class CycleAPIEndpoint(BaseAPIView):
-    """
-    This viewset automatically provides `list`, `create`, `retrieve`,
-    `update` and `destroy` actions related to cycle.
-    """
+class CycleListCreateAPIEndpoint(BaseAPIView):
+    """Cycle List and Create Endpoint"""
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -136,17 +168,34 @@ class CycleAPIEndpoint(BaseAPIView):
.distinct()
)
-    def get(self, request, slug, project_id, pk=None):
@cycle_docs(
operation_id="list_cycles",
summary="List cycles",
description="Retrieve all cycles in a project. Supports filtering by cycle status like current, upcoming, completed, or draft.",
parameters=[
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
CYCLE_VIEW_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
CycleSerializer,
"PaginatedCycleResponse",
"Paginated list of cycles",
"Paginated Cycles",
),
},
)
def get(self, request, slug, project_id):
"""List cycles
Retrieve all cycles in a project.
Supports filtering by cycle status like current, upcoming, completed, or draft.
"""
project = Project.objects.get(workspace__slug=slug, pk=project_id)
-        if pk:
-            queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
-            data = CycleSerializer(
-                queryset,
-                fields=self.fields,
-                expand=self.expand,
-                context={"project": project},
-            ).data
-            return Response(data, status=status.HTTP_200_OK)
queryset = self.get_queryset().filter(archived_at__isnull=True)
cycle_view = request.GET.get("cycle_view", "all")
@@ -237,7 +286,28 @@ class CycleAPIEndpoint(BaseAPIView):
).data,
)
@cycle_docs(
operation_id="create_cycle",
summary="Create cycle",
description="Create a new development cycle with specified name, description, and date range. Supports external ID tracking for integration purposes.",
request=OpenApiRequest(
request=CycleCreateSerializer,
examples=[CYCLE_CREATE_EXAMPLE],
),
responses={
201: OpenApiResponse(
description="Cycle created",
response=CycleSerializer,
examples=[CYCLE_EXAMPLE],
),
},
)
def post(self, request, slug, project_id):
"""Create cycle
Create a new development cycle with specified name, description, and date range.
Supports external ID tracking for integration purposes.
"""
if (
request.data.get("start_date", None) is None
and request.data.get("end_date", None) is None
@@ -245,7 +315,7 @@ class CycleAPIEndpoint(BaseAPIView):
request.data.get("start_date", None) is not None
and request.data.get("end_date", None) is not None
):
-            serializer = CycleSerializer(data=request.data)
+            serializer = CycleCreateSerializer(data=request.data)
if serializer.is_valid():
if (
request.data.get("external_id")
@@ -274,13 +344,16 @@ class CycleAPIEndpoint(BaseAPIView):
# Send the model activity
model_activity.delay(
model_name="cycle",
-                model_id=str(serializer.data["id"]),
+                model_id=str(serializer.instance.id),
requested_data=request.data,
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=base_host(request=request, is_app=True),
)
cycle = Cycle.objects.get(pk=serializer.instance.id)
serializer = CycleSerializer(cycle)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
@@ -291,7 +364,148 @@ class CycleAPIEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
class CycleDetailAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `retrieve`, `update` and `destroy` actions related to cycle.
"""
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
Cycle.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.select_related("project")
.select_related("workspace")
.select_related("owned_by")
.annotate(
total_issues=Count(
"issue_cycle",
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
completed_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
cancelled_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
started_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
unstarted_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
backlog_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
@cycle_docs(
operation_id="retrieve_cycle",
summary="Retrieve cycle",
description="Retrieve details of a specific cycle by its ID. Supports cycle status filtering.",
responses={
200: OpenApiResponse(
description="Cycles",
response=CycleSerializer,
examples=[CYCLE_EXAMPLE],
),
},
)
    def get(self, request, slug, project_id, pk):
        """Retrieve cycle
        Retrieve details of a specific cycle by its ID.
        Supports cycle status filtering.
        """
project = Project.objects.get(workspace__slug=slug, pk=project_id)
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = CycleSerializer(
queryset,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data
return Response(data, status=status.HTTP_200_OK)
@cycle_docs(
operation_id="update_cycle",
summary="Update cycle",
description="Modify an existing cycle's properties like name, description, or date range. Completed cycles can only have their sort order changed.",
request=OpenApiRequest(
request=CycleUpdateSerializer,
examples=[CYCLE_UPDATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Cycle updated",
response=CycleSerializer,
examples=[CYCLE_EXAMPLE],
),
},
)
def patch(self, request, slug, project_id, pk):
"""Update cycle
Modify an existing cycle's properties like name, description, or date range.
Completed cycles can only have their sort order changed.
"""
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
current_instance = json.dumps(
@@ -320,7 +534,7 @@ class CycleAPIEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
-        serializer = CycleSerializer(cycle, data=request.data, partial=True)
+        serializer = CycleUpdateSerializer(cycle, data=request.data, partial=True)
if serializer.is_valid():
if (
request.data.get("external_id")
@@ -346,17 +560,32 @@ class CycleAPIEndpoint(BaseAPIView):
# Send the model activity
model_activity.delay(
model_name="cycle",
-                model_id=str(serializer.data["id"]),
+                model_id=str(serializer.instance.id),
requested_data=request.data,
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=base_host(request=request, is_app=True),
)
cycle = Cycle.objects.get(pk=serializer.instance.id)
serializer = CycleSerializer(cycle)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@cycle_docs(
operation_id="delete_cycle",
summary="Delete cycle",
description="Permanently remove a cycle and all its associated issue relationships",
responses={
204: DELETED_RESPONSE,
},
)
def delete(self, request, slug, project_id, pk):
"""Delete cycle
Permanently remove a cycle and all its associated issue relationships.
Only admins or the cycle creator can perform this action.
"""
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
if cycle.owned_by_id != request.user.id and (
not ProjectMember.objects.filter(
@@ -403,7 +632,10 @@ class CycleAPIEndpoint(BaseAPIView):
class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
"""Cycle Archive and Unarchive Endpoint"""
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -509,7 +741,27 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
.distinct()
)
@cycle_docs(
operation_id="list_archived_cycles",
description="Retrieve all cycles that have been archived in the project.",
summary="List archived cycles",
parameters=[CURSOR_PARAMETER, PER_PAGE_PARAMETER],
request={},
responses={
200: create_paginated_response(
CycleSerializer,
"PaginatedArchivedCycleResponse",
"Paginated list of archived cycles",
"Paginated Archived Cycles",
),
},
)
def get(self, request, slug, project_id):
"""List archived cycles
Retrieve all cycles that have been archived in the project.
Returns paginated results with cycle statistics and completion data.
"""
return self.paginate(
request=request,
queryset=(self.get_queryset()),
@@ -518,7 +770,22 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
).data,
)
@cycle_docs(
operation_id="archive_cycle",
summary="Archive cycle",
description="Move a completed cycle to archived status for historical tracking. Only cycles that have ended can be archived.",
request={},
responses={
204: ARCHIVED_RESPONSE,
400: CYCLE_CANNOT_ARCHIVE_RESPONSE,
},
)
def post(self, request, slug, project_id, cycle_id):
"""Archive cycle
Move a completed cycle to archived status for historical tracking.
Only cycles that have ended can be archived.
"""
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
@@ -537,7 +804,21 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@cycle_docs(
operation_id="unarchive_cycle",
summary="Unarchive cycle",
description="Restore an archived cycle to active status, making it available for regular use.",
request={},
responses={
204: UNARCHIVED_RESPONSE,
},
)
def delete(self, request, slug, project_id, cycle_id):
"""Unarchive cycle
Restore an archived cycle to active status, making it available for regular use.
The cycle will reappear in active cycle lists.
"""
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
@@ -546,18 +827,14 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
-class CycleIssueAPIEndpoint(BaseAPIView):
-    """
-    This viewset automatically provides `list`, `create`,
-    and `destroy` actions related to cycle issues.
-    """
+class CycleIssueListCreateAPIEndpoint(BaseAPIView):
+    """Cycle Issue List and Create Endpoint"""
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle_issue"
bulk = True
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -583,20 +860,27 @@ class CycleIssueAPIEndpoint(BaseAPIView):
.distinct()
)
-    def get(self, request, slug, project_id, cycle_id, issue_id=None):
-        # Get
-        if issue_id:
-            cycle_issue = CycleIssue.objects.get(
-                workspace__slug=slug,
-                project_id=project_id,
-                cycle_id=cycle_id,
-                issue_id=issue_id,
-            )
-            serializer = CycleIssueSerializer(
-                cycle_issue, fields=self.fields, expand=self.expand
-            )
-            return Response(serializer.data, status=status.HTTP_200_OK)
@cycle_docs(
operation_id="list_cycle_work_items",
summary="List cycle work items",
description="Retrieve all work items assigned to a cycle.",
parameters=[CURSOR_PARAMETER, PER_PAGE_PARAMETER],
request={},
responses={
200: create_paginated_response(
CycleIssueSerializer,
"PaginatedCycleIssueResponse",
"Paginated list of cycle work items",
"Paginated Cycle Work Items",
),
},
)
    def get(self, request, slug, project_id, cycle_id):
        """List cycle work items
        Retrieve all work items assigned to a cycle.
        Returns paginated results with work item details, assignees, and labels.
        """
# List
order_by = request.GET.get("order_by", "created_at")
issues = (
@@ -639,24 +923,46 @@ class CycleIssueAPIEndpoint(BaseAPIView):
return self.paginate(
request=request,
queryset=(issues),
-            on_results=lambda issues: CycleSerializer(
+            on_results=lambda issues: IssueSerializer(
issues, many=True, fields=self.fields, expand=self.expand
).data,
)
@cycle_docs(
operation_id="add_cycle_work_items",
summary="Add Work Items to Cycle",
description="Assign multiple work items to a cycle. Automatically handles bulk creation and updates with activity tracking.",
request=OpenApiRequest(
request=CycleIssueRequestSerializer,
examples=[CYCLE_ISSUE_REQUEST_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Cycle work items added",
response=CycleIssueSerializer,
examples=[CYCLE_ISSUE_EXAMPLE],
),
400: REQUIRED_FIELDS_RESPONSE,
},
)
def post(self, request, slug, project_id, cycle_id):
"""Add cycle issues
Assign multiple work items to a cycle or move them from another cycle.
Automatically handles bulk creation and updates with activity tracking.
"""
issues = request.data.get("issues", [])
if not issues:
return Response(
-                {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
+                {"error": "Work items are required"}, status=status.HTTP_400_BAD_REQUEST
)
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=cycle_id
)
-        # Get all CycleIssues already created
+        # Get all CycleWorkItems already created
cycle_issues = list(
CycleIssue.objects.filter(~Q(cycle_id=cycle_id), issue_id__in=issues)
)
@@ -730,7 +1036,88 @@ class CycleIssueAPIEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
class CycleIssueDetailAPIEndpoint(BaseAPIView):
    """Cycle Issue Detail Endpoint"""
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle_issue"
bulk = True
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
CycleIssue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.filter(cycle_id=self.kwargs.get("cycle_id"))
.select_related("project")
.select_related("workspace")
.select_related("cycle")
.select_related("issue", "issue__state", "issue__project")
.prefetch_related("issue__assignees", "issue__labels")
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
@cycle_docs(
operation_id="retrieve_cycle_work_item",
summary="Retrieve cycle work item",
description="Retrieve details of a specific cycle work item.",
responses={
200: OpenApiResponse(
description="Cycle work items",
response=CycleIssueSerializer,
examples=[CYCLE_ISSUE_EXAMPLE],
),
},
)
    def get(self, request, slug, project_id, cycle_id, issue_id):
        """Retrieve cycle work item
        Retrieve details of a specific cycle work item.
        Returns the work item details with assignees and labels.
        """
cycle_issue = CycleIssue.objects.get(
workspace__slug=slug,
project_id=project_id,
cycle_id=cycle_id,
issue_id=issue_id,
)
serializer = CycleIssueSerializer(
cycle_issue, fields=self.fields, expand=self.expand
)
return Response(serializer.data, status=status.HTTP_200_OK)
@cycle_docs(
operation_id="delete_cycle_work_item",
summary="Delete cycle work item",
description="Remove a work item from a cycle while keeping the work item in the project.",
responses={
204: DELETED_RESPONSE,
},
)
def delete(self, request, slug, project_id, cycle_id, issue_id):
"""Remove cycle work item
Remove a work item from a cycle while keeping the work item in the project.
Records the removal activity for tracking purposes.
"""
cycle_issue = CycleIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
@@ -764,7 +1151,54 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
permission_classes = [ProjectEntityPermission]
@cycle_docs(
operation_id="transfer_cycle_work_items",
summary="Transfer cycle work items",
description="Move incomplete work items from the current cycle to a new target cycle. Captures progress snapshot and transfers only unfinished work items.",
request=OpenApiRequest(
request=TransferCycleIssueRequestSerializer,
examples=[TRANSFER_CYCLE_ISSUE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Work items transferred successfully",
response={
"type": "object",
"properties": {
"message": {
"type": "string",
"description": "Success message",
"example": "Success",
},
},
},
examples=[TRANSFER_CYCLE_ISSUE_SUCCESS_EXAMPLE],
),
400: OpenApiResponse(
description="Bad request",
response={
"type": "object",
"properties": {
"error": {
"type": "string",
"description": "Error message",
"example": "New Cycle Id is required",
},
},
},
examples=[
TRANSFER_CYCLE_ISSUE_ERROR_EXAMPLE,
TRANSFER_CYCLE_COMPLETED_ERROR_EXAMPLE,
],
),
},
)
def post(self, request, slug, project_id, cycle_id):
"""Transfer cycle issues
Move incomplete issues from the current cycle to a new target cycle.
Captures progress snapshot and transfers only unfinished work items.
"""
new_cycle_id = request.data.get("new_cycle_id", False)
if not new_cycle_id:
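Per the transfer_cycle_work_items docs above, the endpoint requires new_cycle_id in the request body and answers 400 without it. A hedged client sketch; every path segment and the auth header are placeholders, not values from this diff:

import requests

url = (
    "https://api.example.com/api/v1/workspaces/<slug>/projects/"
    "<project_id>/cycles/<cycle_id>/transfer-issues/"  # path assumed
)
res = requests.post(
    url,
    json={"new_cycle_id": "<target_cycle_id>"},  # required, else 400
    headers={"X-API-Key": "<api-key>"},          # header name assumed
)
print(res.status_code, res.json())  # 200 {"message": "Success"} on success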

View File

@@ -12,30 +12,49 @@ from django.contrib.postgres.fields import ArrayField
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiResponse, OpenApiRequest
# Module imports
-from plane.api.serializers import IntakeIssueSerializer, IssueSerializer
+from plane.api.serializers import (
+    IntakeIssueSerializer,
+    IssueSerializer,
+    IntakeIssueCreateSerializer,
+    IntakeIssueUpdateSerializer,
+)
from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import Intake, IntakeIssue, Issue, Project, ProjectMember, State
from plane.utils.host import base_host
from .base import BaseAPIView
from plane.db.models.intake import SourceType
from plane.utils.openapi import (
intake_docs,
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
ISSUE_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
create_paginated_response,
# Request Examples
INTAKE_ISSUE_CREATE_EXAMPLE,
INTAKE_ISSUE_UPDATE_EXAMPLE,
# Response Examples
INTAKE_ISSUE_EXAMPLE,
INVALID_REQUEST_RESPONSE,
DELETED_RESPONSE,
)
-class IntakeIssueAPIEndpoint(BaseAPIView):
-    """
-    This viewset automatically provides `list`, `create`, `retrieve`,
-    `update` and `destroy` actions related to intake issues.
-    """
-    permission_classes = [ProjectLitePermission]
+class IntakeIssueListCreateAPIEndpoint(BaseAPIView):
+    """Intake Work Item List and Create Endpoint"""
    serializer_class = IntakeIssueSerializer
-    model = IntakeIssue
-    filterset_fields = ["status"]
+    model = Intake
+    permission_classes = [ProjectLitePermission]
+    use_read_replica = True
def get_queryset(self):
intake = Intake.objects.filter(
@@ -61,13 +80,33 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
.order_by(self.kwargs.get("order_by", "-created_at"))
)
-    def get(self, request, slug, project_id, issue_id=None):
-        if issue_id:
-            intake_issue_queryset = self.get_queryset().get(issue_id=issue_id)
-            intake_issue_data = IntakeIssueSerializer(
-                intake_issue_queryset, fields=self.fields, expand=self.expand
-            ).data
-            return Response(intake_issue_data, status=status.HTTP_200_OK)
@intake_docs(
operation_id="get_intake_work_items_list",
summary="List intake work items",
description="Retrieve all work items in the project's intake queue. Returns paginated results when listing all intake work items.",
parameters=[
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
IntakeIssueSerializer,
"PaginatedIntakeIssueResponse",
"Paginated list of intake work items",
"Paginated Intake Work Items",
),
},
)
def get(self, request, slug, project_id):
"""List intake work items
Retrieve all work items in the project's intake queue.
Returns paginated results when listing all intake work items.
"""
issue_queryset = self.get_queryset()
return self.paginate(
request=request,
@@ -77,7 +116,33 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
).data,
)
@intake_docs(
operation_id="create_intake_work_item",
summary="Create intake work item",
description="Submit a new work item to the project's intake queue for review and triage. Automatically creates the work item with default triage state and tracks activity.",
parameters=[
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
],
request=OpenApiRequest(
request=IntakeIssueCreateSerializer,
examples=[INTAKE_ISSUE_CREATE_EXAMPLE],
),
responses={
201: OpenApiResponse(
description="Intake work item created",
response=IntakeIssueSerializer,
examples=[INTAKE_ISSUE_EXAMPLE],
),
400: INVALID_REQUEST_RESPONSE,
},
)
def post(self, request, slug, project_id):
"""Create intake work item
Submit a new work item to the project's intake queue for review and triage.
Automatically creates the work item with default triage state and tracks activity.
"""
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
@@ -141,9 +206,100 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
)
serializer = IntakeIssueSerializer(intake_issue)
-        return Response(serializer.data, status=status.HTTP_200_OK)
+        return Response(serializer.data, status=status.HTTP_201_CREATED)
class IntakeIssueDetailAPIEndpoint(BaseAPIView):
"""Intake Issue API Endpoint"""
permission_classes = [ProjectLitePermission]
serializer_class = IntakeIssueSerializer
model = IntakeIssue
use_read_replica = True
filterset_fields = ["status"]
def get_queryset(self):
intake = Intake.objects.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
).first()
project = Project.objects.get(
workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
)
if intake is None and not project.intake_view:
return IntakeIssue.objects.none()
return (
IntakeIssue.objects.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
intake_id=intake.id,
)
.select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
)
@intake_docs(
operation_id="retrieve_intake_work_item",
summary="Retrieve intake work item",
description="Retrieve details of a specific intake work item.",
parameters=[
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
ISSUE_ID_PARAMETER,
],
responses={
200: OpenApiResponse(
description="Intake work item",
response=IntakeIssueSerializer,
examples=[INTAKE_ISSUE_EXAMPLE],
),
},
)
def get(self, request, slug, project_id, issue_id):
"""Retrieve intake work item
Retrieve details of a specific intake work item.
"""
intake_issue_queryset = self.get_queryset().get(issue_id=issue_id)
intake_issue_data = IntakeIssueSerializer(
intake_issue_queryset, fields=self.fields, expand=self.expand
).data
return Response(intake_issue_data, status=status.HTTP_200_OK)
@intake_docs(
operation_id="update_intake_work_item",
summary="Update intake work item",
description="Modify an existing intake work item's properties or status for triage processing. Supports status changes like accept, reject, or mark as duplicate.",
parameters=[
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
ISSUE_ID_PARAMETER,
],
request=OpenApiRequest(
request=IntakeIssueUpdateSerializer,
examples=[INTAKE_ISSUE_UPDATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Intake work item updated",
response=IntakeIssueSerializer,
examples=[INTAKE_ISSUE_EXAMPLE],
),
400: INVALID_REQUEST_RESPONSE,
},
)
def patch(self, request, slug, project_id, issue_id):
"""Update intake work item
Modify an existing intake work item's properties or status for triage processing.
Supports status changes like accept, reject, or mark as duplicate.
"""
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
@@ -180,7 +336,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
request.user.id
):
return Response(
-                {"error": "You cannot edit intake issues"},
+                {"error": "You cannot edit intake work items"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -251,7 +407,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
# Only project admins and members can edit intake issue attributes
if project_member.role > 15:
-            serializer = IntakeIssueSerializer(
+            serializer = IntakeIssueUpdateSerializer(
intake_issue, data=request.data, partial=True
)
current_instance = json.dumps(
@@ -301,7 +457,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
origin=base_host(request=request, is_app=True),
intake=str(intake_issue.id),
)
serializer = IntakeIssueSerializer(intake_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
@@ -309,7 +465,25 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
IntakeIssueSerializer(intake_issue).data, status=status.HTTP_200_OK
)
@intake_docs(
operation_id="delete_intake_work_item",
summary="Delete intake work item",
description="Permanently remove an intake work item from the triage queue. Also deletes the underlying work item if it hasn't been accepted yet.",
parameters=[
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
ISSUE_ID_PARAMETER,
],
responses={
204: DELETED_RESPONSE,
},
)
def delete(self, request, slug, project_id, issue_id):
"""Delete intake work item
Permanently remove an intake work item from the triage queue.
Also deletes the underlying work item if it hasn't been accepted yet.
"""
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
@@ -349,7 +523,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
).exists()
):
return Response(
-                {"error": "Only admin or creator can delete the issue"},
+                {"error": "Only admin or creator can delete the work item"},
status=status.HTTP_403_FORBIDDEN,
)
issue.delete()
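The create endpoint above only hard-requires issue.name; everything else can be triaged later, and a successful submission now returns 201. A hedged request sketch (route path and header are illustrative assumptions):

import requests

res = requests.post(
    "https://api.example.com/api/v1/workspaces/<slug>/projects/"
    "<project_id>/intake-issues/",  # path assumed
    json={
        "issue": {
            "name": "Crash when uploading large attachments",  # required
            "priority": "high",
        }
    },
    headers={"X-API-Key": "<api-key>"},  # header name assumed
)
assert res.status_code == 201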

File diff suppressed because it is too large.

View File

@@ -1,29 +1,122 @@
# Python imports
import uuid
# Django imports
from django.contrib.auth.hashers import make_password
from django.core.validators import validate_email
from django.core.exceptions import ValidationError
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from drf_spectacular.utils import (
extend_schema,
OpenApiResponse,
)
# Module imports
from .base import BaseAPIView
from plane.api.serializers import UserLiteSerializer
-from plane.db.models import User, Workspace, Project, WorkspaceMember, ProjectMember
+from plane.db.models import User, Workspace, WorkspaceMember, ProjectMember
+from plane.app.permissions import ProjectMemberPermission, WorkSpaceAdminPermission
from plane.utils.openapi import (
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
UNAUTHORIZED_RESPONSE,
FORBIDDEN_RESPONSE,
WORKSPACE_NOT_FOUND_RESPONSE,
PROJECT_NOT_FOUND_RESPONSE,
WORKSPACE_MEMBER_EXAMPLE,
PROJECT_MEMBER_EXAMPLE,
)
-from plane.app.permissions import ProjectMemberPermission
class WorkspaceMemberAPIEndpoint(BaseAPIView):
permission_classes = [WorkSpaceAdminPermission]
use_read_replica = True
@extend_schema(
operation_id="get_workspace_members",
summary="List workspace members",
description="Retrieve all users who are members of the specified workspace.",
tags=["Members"],
parameters=[WORKSPACE_SLUG_PARAMETER],
responses={
200: OpenApiResponse(
description="List of workspace members with their roles",
response={
"type": "array",
"items": {
"allOf": [
{"$ref": "#/components/schemas/UserLite"},
{
"type": "object",
"properties": {
"role": {
"type": "integer",
"description": "Member role in the workspace",
}
},
},
]
},
},
examples=[WORKSPACE_MEMBER_EXAMPLE],
),
401: UNAUTHORIZED_RESPONSE,
403: FORBIDDEN_RESPONSE,
404: WORKSPACE_NOT_FOUND_RESPONSE,
},
)
# Get all the users that are present inside the workspace
def get(self, request, slug):
"""List workspace members
Retrieve all users who are members of the specified workspace.
Returns user profiles with their respective workspace roles and permissions.
"""
# Check if the workspace exists
if not Workspace.objects.filter(slug=slug).exists():
return Response(
{"error": "Provided workspace does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace_members = WorkspaceMember.objects.filter(
workspace__slug=slug
).select_related("member")
# Get all the users with their roles
users_with_roles = []
for workspace_member in workspace_members:
user_data = UserLiteSerializer(workspace_member.member).data
user_data["role"] = workspace_member.role
users_with_roles.append(user_data)
return Response(users_with_roles, status=status.HTTP_200_OK)
# API endpoint to get the users inside the project
class ProjectMemberAPIEndpoint(BaseAPIView):
permission_classes = [ProjectMemberPermission]
use_read_replica = True
@extend_schema(
operation_id="get_project_members",
summary="List project members",
description="Retrieve all users who are members of the specified project.",
tags=["Members"],
parameters=[WORKSPACE_SLUG_PARAMETER, PROJECT_ID_PARAMETER],
responses={
200: OpenApiResponse(
description="List of project members with their roles",
response=UserLiteSerializer,
examples=[PROJECT_MEMBER_EXAMPLE],
),
401: UNAUTHORIZED_RESPONSE,
403: FORBIDDEN_RESPONSE,
404: PROJECT_NOT_FOUND_RESPONSE,
},
)
    # Get all the users that are present inside the project
def get(self, request, slug, project_id):
"""List project members
Retrieve all users who are members of the specified project.
Returns user profiles with their project-specific roles and access levels.
"""
# Check if the workspace exists
if not Workspace.objects.filter(slug=slug).exists():
return Response(
@@ -42,91 +135,3 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
).data
return Response(users, status=status.HTTP_200_OK)
# Insert a new user inside the workspace, and assign the user to the project
def post(self, request, slug, project_id):
# Check if user with email already exists, and send bad request if it's
# not present, check for workspace and valid project mandat
# ------------------- Validation -------------------
if (
request.data.get("email") is None
or request.data.get("display_name") is None
):
return Response(
{
"error": "Expected email, display_name, workspace_slug, project_id, one or more of the fields are missing."
},
status=status.HTTP_400_BAD_REQUEST,
)
email = request.data.get("email")
try:
validate_email(email)
except ValidationError:
return Response(
{"error": "Invalid email provided"}, status=status.HTTP_400_BAD_REQUEST
)
workspace = Workspace.objects.filter(slug=slug).first()
project = Project.objects.filter(pk=project_id).first()
if not all([workspace, project]):
return Response(
{"error": "Provided workspace or project does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if user exists
user = User.objects.filter(email=email).first()
workspace_member = None
project_member = None
if user:
# Check if user is part of the workspace
workspace_member = WorkspaceMember.objects.filter(
workspace=workspace, member=user
).first()
if workspace_member:
# Check if user is part of the project
project_member = ProjectMember.objects.filter(
project=project, member=user
).first()
if project_member:
return Response(
{"error": "User is already part of the workspace and project"},
status=status.HTTP_400_BAD_REQUEST,
)
# If user does not exist, create the user
if not user:
user = User.objects.create(
email=email,
display_name=request.data.get("display_name"),
first_name=request.data.get("first_name", ""),
last_name=request.data.get("last_name", ""),
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
is_active=False,
)
user.save()
# Create a workspace member for the user if not already a member
if not workspace_member:
workspace_member = WorkspaceMember.objects.create(
workspace=workspace, member=user, role=request.data.get("role", 5)
)
workspace_member.save()
# Create a project member for the user if not already a member
if not project_member:
project_member = ProjectMember.objects.create(
project=project, member=user, role=request.data.get("role", 5)
)
project_member.save()
# Serialize the user and return the response
user_data = UserLiteSerializer(user).data
return Response(user_data, status=status.HTTP_201_CREATED)
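A companion sketch of the invite flow this POST handler implements: an unknown email yields an inactive user, then workspace and project memberships with the requested role (falling back to 5 when omitted). Route and auth are the same assumptions as in the earlier sketch.

    import requests

    BASE = "https://plane.example.com/api/v1"        # hypothetical host
    HEADERS = {"X-API-Key": "<api-key>"}             # auth header name assumed

    payload = {
        "email": "new.member@example.com",
        "display_name": "New Member",
        "role": 15,                                  # optional; the view defaults to 5
    }
    resp = requests.post(
        f"{BASE}/workspaces/my-workspace/projects/<project-id>/members/",
        json=payload,
        headers=HEADERS,
    )
    assert resp.status_code == 201                   # body is the UserLiteSerializer payload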

View File

@@ -10,12 +10,16 @@ from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiResponse, OpenApiExample, OpenApiRequest
# Module imports
from plane.api.serializers import (
IssueSerializer,
ModuleIssueSerializer,
ModuleSerializer,
ModuleIssueRequestSerializer,
ModuleCreateSerializer,
ModuleUpdateSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activities_task import issue_activity
@@ -34,19 +38,49 @@ from plane.db.models import (
from .base import BaseAPIView
from plane.bgtasks.webhook_task import model_activity
from plane.utils.host import base_host
from plane.utils.openapi import (
module_docs,
module_issue_docs,
WORKSPACE_SLUG_PARAMETER,
PROJECT_ID_PARAMETER,
MODULE_ID_PARAMETER,
MODULE_PK_PARAMETER,
ISSUE_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
create_paginated_response,
# Request Examples
MODULE_CREATE_EXAMPLE,
MODULE_UPDATE_EXAMPLE,
MODULE_ISSUE_REQUEST_EXAMPLE,
# Response Examples
MODULE_EXAMPLE,
MODULE_ISSUE_EXAMPLE,
INVALID_REQUEST_RESPONSE,
PROJECT_NOT_FOUND_RESPONSE,
EXTERNAL_ID_EXISTS_RESPONSE,
MODULE_NOT_FOUND_RESPONSE,
DELETED_RESPONSE,
ADMIN_ONLY_RESPONSE,
REQUIRED_FIELDS_RESPONSE,
MODULE_ISSUE_NOT_FOUND_RESPONSE,
ARCHIVED_RESPONSE,
CANNOT_ARCHIVE_RESPONSE,
UNARCHIVED_RESPONSE,
)
class ModuleAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module.
"""
class ModuleListCreateAPIEndpoint(BaseAPIView):
"""Module List and Create Endpoint"""
model = Module
permission_classes = [ProjectEntityPermission]
serializer_class = ModuleSerializer
model = Module
webhook_event = "module"
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -136,9 +170,33 @@ class ModuleAPIEndpoint(BaseAPIView):
.order_by(self.kwargs.get("order_by", "-created_at"))
)
@module_docs(
operation_id="create_module",
summary="Create module",
description="Create a new project module with specified name, description, and timeline.",
request=OpenApiRequest(
request=ModuleCreateSerializer,
examples=[MODULE_CREATE_EXAMPLE],
),
responses={
201: OpenApiResponse(
description="Module created",
response=ModuleSerializer,
examples=[MODULE_EXAMPLE],
),
400: INVALID_REQUEST_RESPONSE,
404: PROJECT_NOT_FOUND_RESPONSE,
409: EXTERNAL_ID_EXISTS_RESPONSE,
},
)
def post(self, request, slug, project_id):
"""Create module
Create a new project module with specified name, description, and timeline.
Automatically assigns the creator as module lead and tracks activity.
"""
project = Project.objects.get(pk=project_id, workspace__slug=slug)
serializer = ModuleSerializer(
serializer = ModuleCreateSerializer(
data=request.data,
context={"project_id": project_id, "workspace_id": project.workspace_id},
)
@@ -170,19 +228,185 @@ class ModuleAPIEndpoint(BaseAPIView):
# Send the model activity
model_activity.delay(
model_name="module",
model_id=str(serializer.data["id"]),
model_id=str(serializer.instance.id),
requested_data=request.data,
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=base_host(request=request, is_app=True),
)
module = Module.objects.get(pk=serializer.data["id"])
module = Module.objects.get(pk=serializer.instance.id)
serializer = ModuleSerializer(module)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
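The recurring switch from serializer.data["id"] to serializer.instance.id in these hunks sidesteps a DRF subtlety: a write-oriented serializer such as ModuleCreateSerializer may not declare id among its output fields, so .data can lack the key entirely, while .instance is simply the model object returned by save(). A minimal illustration with a hypothetical serializer:

    from rest_framework import serializers

    class ThingCreateSerializer(serializers.ModelSerializer):
        class Meta:
            model = Thing          # hypothetical model
            fields = ["name"]      # "id" deliberately not exposed

    serializer = ThingCreateSerializer(data={"name": "demo"})
    serializer.is_valid(raise_exception=True)
    serializer.save()

    assert "id" not in serializer.data   # only declared fields are serialized
    print(serializer.instance.id)        # the saved row's primary key is always here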
@module_docs(
operation_id="list_modules",
summary="List modules",
description="Retrieve all modules in a project.",
parameters=[
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
ModuleSerializer,
"PaginatedModuleResponse",
"Paginated list of modules",
"Paginated Modules",
),
404: OpenApiResponse(description="Module not found"),
},
)
def get(self, request, slug, project_id):
"""List or retrieve modules
Retrieve all modules in a project or get details of a specific module.
Returns paginated results with module statistics and member information.
"""
return self.paginate(
request=request,
queryset=(self.get_queryset().filter(archived_at__isnull=True)),
on_results=lambda modules: ModuleSerializer(
modules, many=True, fields=self.fields, expand=self.expand
).data,
)
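create_paginated_response is imported from plane.utils.openapi but its body is outside this diff. One plausible shape for it, sketched with drf_spectacular's inline_serializer; the envelope field names are assumptions based on cursor pagination, and the real helper may differ:

    from drf_spectacular.utils import OpenApiResponse, inline_serializer
    from rest_framework import serializers

    def create_paginated_response(result_serializer, name, description, example_name):
        # example_name presumably labels an attached OpenApiExample; omitted here.
        envelope = inline_serializer(
            name=name,
            fields={
                "next_cursor": serializers.CharField(),
                "prev_cursor": serializers.CharField(),
                "next_page_results": serializers.BooleanField(),
                "prev_page_results": serializers.BooleanField(),
                "count": serializers.IntegerField(),
                "results": result_serializer(many=True),
            },
        )
        return OpenApiResponse(response=envelope, description=description)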
class ModuleDetailAPIEndpoint(BaseAPIView):
"""Module Detail Endpoint"""
model = Module
permission_classes = [ProjectEntityPermission]
serializer_class = ModuleSerializer
webhook_event = "module"
use_read_replica = True
def get_queryset(self):
return (
Module.objects.filter(project_id=self.kwargs.get("project_id"))
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("project")
.select_related("workspace")
.select_related("lead")
.prefetch_related("members")
.prefetch_related(
Prefetch(
"link_module",
queryset=ModuleLink.objects.select_related("module", "created_by"),
)
)
.annotate(
total_issues=Count(
"issue_module",
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.annotate(
completed_issues=Count(
"issue_module__issue__state__group",
filter=Q(
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.annotate(
cancelled_issues=Count(
"issue_module__issue__state__group",
filter=Q(
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.annotate(
started_issues=Count(
"issue_module__issue__state__group",
filter=Q(
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.annotate(
unstarted_issues=Count(
"issue_module__issue__state__group",
filter=Q(
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.annotate(
backlog_issues=Count(
"issue_module__issue__state__group",
filter=Q(
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
)
.order_by(self.kwargs.get("order_by", "-created_at"))
)
@module_docs(
operation_id="update_module",
summary="Update module",
description="Modify an existing module's properties like name, description, status, or timeline.",
parameters=[
MODULE_PK_PARAMETER,
],
request=OpenApiRequest(
request=ModuleUpdateSerializer,
examples=[MODULE_UPDATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Module updated successfully",
response=ModuleSerializer,
examples=[MODULE_EXAMPLE],
),
400: OpenApiResponse(
description="Invalid request data",
response=ModuleSerializer,
examples=[MODULE_UPDATE_EXAMPLE],
),
404: OpenApiResponse(description="Module not found"),
409: OpenApiResponse(
description="Module with same external ID already exists"
),
},
)
def patch(self, request, slug, project_id, pk):
"""Update module
Modify an existing module's properties like name, description, status, or timeline.
Tracks all changes in model activity logs for audit purposes.
"""
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
current_instance = json.dumps(
@@ -222,7 +446,7 @@ class ModuleAPIEndpoint(BaseAPIView):
# Send the model activity
model_activity.delay(
model_name="module",
model_id=str(serializer.data["id"]),
model_id=str(serializer.instance.id),
requested_data=request.data,
current_instance=current_instance,
actor_id=request.user.id,
@@ -233,22 +457,50 @@ class ModuleAPIEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = ModuleSerializer(
queryset, fields=self.fields, expand=self.expand
).data
return Response(data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset().filter(archived_at__isnull=True)),
on_results=lambda modules: ModuleSerializer(
modules, many=True, fields=self.fields, expand=self.expand
).data,
)
@module_docs(
operation_id="retrieve_module",
summary="Retrieve module",
description="Retrieve details of a specific module.",
parameters=[
MODULE_PK_PARAMETER,
],
responses={
200: OpenApiResponse(
description="Module",
response=ModuleSerializer,
examples=[MODULE_EXAMPLE],
),
404: OpenApiResponse(description="Module not found"),
},
)
def get(self, request, slug, project_id, pk):
"""Retrieve module
Retrieve details of a specific module.
"""
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = ModuleSerializer(queryset, fields=self.fields, expand=self.expand).data
return Response(data, status=status.HTTP_200_OK)
@module_docs(
operation_id="delete_module",
summary="Delete module",
description="Permanently remove a module and all its associated issue relationships.",
parameters=[
MODULE_PK_PARAMETER,
],
responses={
204: DELETED_RESPONSE,
403: ADMIN_ONLY_RESPONSE,
404: MODULE_NOT_FOUND_RESPONSE,
},
)
def delete(self, request, slug, project_id, pk):
"""Delete module
Permanently remove a module and all its associated issue relationships.
Only admins or the module creator can perform this action.
"""
module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
if module.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
@@ -293,19 +545,14 @@ class ModuleAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module issues.
"""
class ModuleIssueListCreateAPIEndpoint(BaseAPIView):
"""Module Work Item List and Create Endpoint"""
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module_issue"
bulk = True
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -333,7 +580,35 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
.distinct()
)
@module_issue_docs(
operation_id="list_module_work_items",
summary="List module work items",
description="Retrieve all work items assigned to a module with detailed information.",
parameters=[
MODULE_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
request={},
responses={
200: create_paginated_response(
IssueSerializer,
"PaginatedModuleIssueResponse",
"Paginated list of module work items",
"Paginated Module Work Items",
),
404: OpenApiResponse(description="Module not found"),
},
)
def get(self, request, slug, project_id, module_id):
"""List module work items
Retrieve all work items assigned to a module with detailed information.
Returns paginated results including assignees, labels, and attachments.
"""
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(
@@ -379,7 +654,33 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
).data,
)
@module_issue_docs(
operation_id="add_module_work_items",
summary="Add Work Items to Module",
description="Assign multiple work items to a module or move them from another module. Automatically handles bulk creation and updates with activity tracking.",
parameters=[
MODULE_ID_PARAMETER,
],
request=OpenApiRequest(
request=ModuleIssueRequestSerializer,
examples=[MODULE_ISSUE_REQUEST_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Module issues added",
response=ModuleIssueSerializer,
examples=[MODULE_ISSUE_EXAMPLE],
),
400: REQUIRED_FIELDS_RESPONSE,
404: MODULE_NOT_FOUND_RESPONSE,
},
)
def post(self, request, slug, project_id, module_id):
"""Add module work items
Assign multiple work items to a module or move them from another module.
Automatically handles bulk creation and updates with activity tracking.
"""
issues = request.data.get("issues", [])
if not len(issues):
return Response(
@@ -459,7 +760,143 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
class ModuleIssueDetailAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to module work items.
"""
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module_issue"
bulk = True
use_read_replica = True
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
ModuleIssue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(module_id=self.kwargs.get("module_id"))
.filter(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.filter(project__archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
.select_related("module")
.select_related("issue", "issue__state", "issue__project")
.prefetch_related("issue__assignees", "issue__labels")
.prefetch_related("module__members")
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
@module_issue_docs(
operation_id="retrieve_module_work_item",
summary="Retrieve module work item",
description="Retrieve details of a specific module work item.",
parameters=[
MODULE_ID_PARAMETER,
ISSUE_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
IssueSerializer,
"PaginatedModuleIssueDetailResponse",
"Paginated list of module work item details",
"Module Work Item Details",
),
404: OpenApiResponse(description="Module not found"),
},
)
def get(self, request, slug, project_id, module_id, issue_id):
"""List module work items
Retrieve all work items assigned to a module with detailed information.
Returns paginated results including assignees, labels, and attachments.
"""
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(
issue_module__module_id=module_id,
issue_module__deleted_at__isnull=True,
pk=issue_id,
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_module__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.order_by(order_by)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
return self.paginate(
request=request,
queryset=(issues),
on_results=lambda issues: IssueSerializer(
issues, many=True, fields=self.fields, expand=self.expand
).data,
)
@module_issue_docs(
operation_id="delete_module_work_item",
summary="Delete module work item",
description="Remove a work item from a module while keeping the work item in the project.",
parameters=[
MODULE_ID_PARAMETER,
ISSUE_ID_PARAMETER,
],
responses={
204: DELETED_RESPONSE,
404: MODULE_ISSUE_NOT_FOUND_RESPONSE,
},
)
def delete(self, request, slug, project_id, module_id, issue_id):
"""Remove module work item
Remove a work item from a module while keeping the work item in the project.
Records the removal activity for tracking purposes.
"""
module_issue = ModuleIssue.objects.get(
workspace__slug=slug,
project_id=project_id,
@@ -483,6 +920,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -573,7 +1011,34 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
.order_by(self.kwargs.get("order_by", "-created_at"))
)
def get(self, request, slug, project_id, pk):
@module_docs(
operation_id="list_archived_modules",
summary="List archived modules",
description="Retrieve all modules that have been archived in the project.",
parameters=[
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
request={},
responses={
200: create_paginated_response(
ModuleSerializer,
"PaginatedArchivedModuleResponse",
"Paginated list of archived modules",
"Paginated Archived Modules",
),
404: OpenApiResponse(description="Project not found"),
},
)
def get(self, request, slug, project_id):
"""List archived modules
Retrieve all modules that have been archived in the project.
Returns paginated results with module statistics.
"""
return self.paginate(
request=request,
queryset=(self.get_queryset()),
@@ -582,7 +1047,26 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
).data,
)
@module_docs(
operation_id="archive_module",
summary="Archive module",
description="Move a module to archived status for historical tracking.",
parameters=[
MODULE_PK_PARAMETER,
],
request={},
responses={
204: ARCHIVED_RESPONSE,
400: CANNOT_ARCHIVE_RESPONSE,
404: MODULE_NOT_FOUND_RESPONSE,
},
)
def post(self, request, slug, project_id, pk):
"""Archive module
Move a completed or cancelled module to archived status for historical tracking.
Only modules whose status is completed or cancelled can be archived.
"""
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
if module.status not in ["completed", "cancelled"]:
return Response(
@@ -599,7 +1083,24 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@module_docs(
operation_id="unarchive_module",
summary="Unarchive module",
description="Restore an archived module to active status, making it available for regular use.",
parameters=[
MODULE_PK_PARAMETER,
],
responses={
204: UNARCHIVED_RESPONSE,
404: MODULE_NOT_FOUND_RESPONSE,
},
)
def delete(self, request, slug, project_id, pk):
"""Unarchive module
Restore an archived module to active status, making it available for regular use.
The module will reappear in active module lists and become fully functional.
"""
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
module.archived_at = None
module.save()
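Splitting the old combined endpoint into ModuleListCreateAPIEndpoint and ModuleDetailAPIEndpoint implies a matching change in URL configuration, which this diff does not show. A hedged sketch of what the routing could look like (paths and names assumed):

    from django.urls import path

    urlpatterns = [
        path(
            "workspaces/<str:slug>/projects/<uuid:project_id>/modules/",
            ModuleListCreateAPIEndpoint.as_view(),
            name="modules",
        ),
        path(
            "workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:pk>/",
            ModuleDetailAPIEndpoint.as_view(),
            name="module-detail",
        ),
    ]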

View File

@@ -11,9 +11,8 @@ from django.core.serializers.json import DjangoJSONEncoder
from rest_framework import status
from rest_framework.response import Response
from rest_framework.serializers import ValidationError
from drf_spectacular.utils import OpenApiResponse, OpenApiExample, OpenApiRequest
from plane.api.serializers import ProjectSerializer
from plane.app.permissions import ProjectBasePermission
# Module imports
from plane.db.models import (
@@ -31,16 +30,44 @@ from plane.db.models import (
from plane.bgtasks.webhook_task import model_activity, webhook_activity
from .base import BaseAPIView
from plane.utils.host import base_host
from plane.api.serializers import (
ProjectSerializer,
ProjectCreateSerializer,
ProjectUpdateSerializer,
)
from plane.app.permissions import ProjectBasePermission
from plane.utils.openapi import (
project_docs,
PROJECT_ID_PARAMETER,
PROJECT_PK_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
create_paginated_response,
# Request Examples
PROJECT_CREATE_EXAMPLE,
PROJECT_UPDATE_EXAMPLE,
# Response Examples
PROJECT_EXAMPLE,
PROJECT_NOT_FOUND_RESPONSE,
WORKSPACE_NOT_FOUND_RESPONSE,
PROJECT_NAME_TAKEN_RESPONSE,
DELETED_RESPONSE,
ARCHIVED_RESPONSE,
UNARCHIVED_RESPONSE,
)
class ProjectAPIEndpoint(BaseAPIView):
"""Project Endpoints to create, update, list, retrieve and delete endpoint"""
class ProjectListCreateAPIEndpoint(BaseAPIView):
"""Project List and Create Endpoint"""
serializer_class = ProjectSerializer
model = Project
webhook_event = "project"
permission_classes = [ProjectBasePermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -104,42 +131,87 @@ class ProjectAPIEndpoint(BaseAPIView):
.distinct()
)
def get(self, request, slug, pk=None):
if pk is None:
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
.annotate(sort_order=Subquery(sort_order_query))
.prefetch_related(
Prefetch(
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug, is_active=True
).select_related("member"),
)
)
.order_by(request.GET.get("order_by", "sort_order"))
)
return self.paginate(
request=request,
queryset=(projects),
on_results=lambda projects: ProjectSerializer(
projects, many=True, fields=self.fields, expand=self.expand
).data,
)
project = self.get_queryset().get(workspace__slug=slug, pk=pk)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
@project_docs(
operation_id="list_projects",
summary="List or retrieve projects",
description="Retrieve all projects in a workspace or get details of a specific project.",
parameters=[
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
ORDER_BY_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
ProjectSerializer,
"PaginatedProjectResponse",
"Paginated list of projects",
"Paginated Projects",
),
404: PROJECT_NOT_FOUND_RESPONSE,
},
)
def get(self, request, slug):
"""List projects
Retrieve all projects in a workspace.
Returns projects ordered by user's custom sort order with member information.
"""
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
.annotate(sort_order=Subquery(sort_order_query))
.prefetch_related(
Prefetch(
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug, is_active=True
).select_related("member"),
)
)
.order_by(request.GET.get("order_by", "sort_order"))
)
return self.paginate(
request=request,
queryset=(projects),
on_results=lambda projects: ProjectSerializer(
projects, many=True, fields=self.fields, expand=self.expand
).data,
)
@project_docs(
operation_id="create_project",
summary="Create project",
description="Create a new project in the workspace with default states and member assignments.",
request=OpenApiRequest(
request=ProjectCreateSerializer,
examples=[PROJECT_CREATE_EXAMPLE],
),
responses={
201: OpenApiResponse(
description="Project created successfully",
response=ProjectSerializer,
examples=[PROJECT_EXAMPLE],
),
404: WORKSPACE_NOT_FOUND_RESPONSE,
409: PROJECT_NAME_TAKEN_RESPONSE,
},
)
def post(self, request, slug):
"""Create project
Create a new project in the workspace with default states and member assignments.
Automatically adds the creator as admin and sets up default workflow states.
"""
try:
workspace = Workspace.objects.get(slug=slug)
serializer = ProjectSerializer(
serializer = ProjectCreateSerializer(
data={**request.data}, context={"workspace_id": workspace.id}
)
if serializer.is_valid():
@@ -147,25 +219,25 @@ class ProjectAPIEndpoint(BaseAPIView):
# Add the user as Administrator to the project
_ = ProjectMember.objects.create(
project_id=serializer.data["id"], member=request.user, role=20
project_id=serializer.instance.id, member=request.user, role=20
)
# Also create the issue property for the user
_ = IssueUserProperty.objects.create(
project_id=serializer.data["id"], user=request.user
project_id=serializer.instance.id, user=request.user
)
if serializer.data["project_lead"] is not None and str(
serializer.data["project_lead"]
if serializer.instance.project_lead is not None and str(
serializer.instance.project_lead
) != str(request.user.id):
ProjectMember.objects.create(
project_id=serializer.data["id"],
member_id=serializer.data["project_lead"],
project_id=serializer.instance.id,
member_id=serializer.instance.project_lead,
role=20,
)
# Also create the issue property for the user
IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user_id=serializer.data["project_lead"],
project_id=serializer.instance.id,
user_id=serializer.instance.project_lead,
)
# Default states
@@ -219,7 +291,7 @@ class ProjectAPIEndpoint(BaseAPIView):
]
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = self.get_queryset().filter(pk=serializer.instance.id).first()
# Model activity
model_activity.delay(
@@ -251,7 +323,131 @@ class ProjectAPIEndpoint(BaseAPIView):
status=status.HTTP_409_CONFLICT,
)
class ProjectDetailAPIEndpoint(BaseAPIView):
"""Project Endpoints to update, retrieve and delete endpoint"""
serializer_class = ProjectSerializer
model = Project
webhook_event = "project"
permission_classes = [ProjectBasePermission]
use_read_replica = True
def get_queryset(self):
return (
Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(
Q(
project_projectmember__member=self.request.user,
project_projectmember__is_active=True,
)
| Q(network=2)
)
.select_related(
"workspace", "workspace__owner", "default_assignee", "project_lead"
)
.annotate(
is_member=Exists(
ProjectMember.objects.filter(
member=self.request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
)
)
)
.annotate(
total_members=ProjectMember.objects.filter(
project_id=OuterRef("id"), member__is_bot=False, is_active=True
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
total_cycles=Cycle.objects.filter(project_id=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
total_modules=Module.objects.filter(project_id=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
member_role=ProjectMember.objects.filter(
project_id=OuterRef("pk"),
member_id=self.request.user.id,
is_active=True,
).values("role")
)
.annotate(
is_deployed=Exists(
DeployBoard.objects.filter(
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
)
)
)
.order_by(self.kwargs.get("order_by", "-created_at"))
.distinct()
)
@project_docs(
operation_id="retrieve_project",
summary="Retrieve project",
description="Retrieve details of a specific project.",
parameters=[
PROJECT_PK_PARAMETER,
],
responses={
200: OpenApiResponse(
description="Project details",
response=ProjectSerializer,
examples=[PROJECT_EXAMPLE],
),
404: PROJECT_NOT_FOUND_RESPONSE,
},
)
def get(self, request, slug, pk):
"""Retrieve project
Retrieve details of a specific project.
"""
project = self.get_queryset().get(workspace__slug=slug, pk=pk)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
@project_docs(
operation_id="update_project",
summary="Update project",
description="Partially update an existing project's properties like name, description, or settings.",
parameters=[
PROJECT_PK_PARAMETER,
],
request=OpenApiRequest(
request=ProjectUpdateSerializer,
examples=[PROJECT_UPDATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="Project updated successfully",
response=ProjectSerializer,
examples=[PROJECT_EXAMPLE],
),
404: PROJECT_NOT_FOUND_RESPONSE,
409: PROJECT_NAME_TAKEN_RESPONSE,
},
)
def patch(self, request, slug, pk):
"""Update project
Partially update an existing project's properties like name, description, or settings.
Tracks changes in model activity logs for audit purposes.
"""
try:
workspace = Workspace.objects.get(slug=slug)
project = Project.objects.get(pk=pk)
@@ -267,7 +463,7 @@ class ProjectAPIEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
serializer = ProjectSerializer(
serializer = ProjectUpdateSerializer(
project,
data={**request.data, "intake_view": intake_view},
context={"workspace_id": workspace.id},
@@ -287,7 +483,7 @@ class ProjectAPIEndpoint(BaseAPIView):
is_default=True,
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = self.get_queryset().filter(pk=serializer.instance.id).first()
model_activity.delay(
model_name="project",
@@ -318,7 +514,23 @@ class ProjectAPIEndpoint(BaseAPIView):
status=status.HTTP_409_CONFLICT,
)
@project_docs(
operation_id="delete_project",
summary="Delete project",
description="Permanently remove a project and all its associated data from the workspace.",
parameters=[
PROJECT_PK_PARAMETER,
],
responses={
204: DELETED_RESPONSE,
},
)
def delete(self, request, slug, pk):
"""Delete project
Permanently remove a project and all its associated data from the workspace.
Only admins can delete projects and the action cannot be undone.
"""
project = Project.objects.get(pk=pk, workspace__slug=slug)
# Delete the user favorite cycle
UserFavorite.objects.filter(
@@ -342,16 +554,52 @@ class ProjectAPIEndpoint(BaseAPIView):
class ProjectArchiveUnarchiveAPIEndpoint(BaseAPIView):
"""Project Archive and Unarchive Endpoint"""
permission_classes = [ProjectBasePermission]
@project_docs(
operation_id="archive_project",
summary="Archive project",
description="Move a project to archived status, hiding it from active project lists.",
parameters=[
PROJECT_ID_PARAMETER,
],
request={},
responses={
204: ARCHIVED_RESPONSE,
},
)
def post(self, request, slug, project_id):
"""Archive project
Move a project to archived status, hiding it from active project lists.
Archived projects remain accessible but are excluded from regular workflows.
"""
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.archived_at = timezone.now()
project.save()
UserFavorite.objects.filter(workspace__slug=slug, project=project_id).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@project_docs(
operation_id="unarchive_project",
summary="Unarchive project",
description="Restore an archived project to active status, making it available in regular workflows.",
parameters=[
PROJECT_ID_PARAMETER,
],
request={},
responses={
204: UNARCHIVED_RESPONSE,
},
)
def delete(self, request, slug, project_id):
"""Unarchive project
Restore an archived project to active status, making it available in regular workflows.
The project will reappear in active project lists and become fully functional.
"""
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.archived_at = None
project.save()
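The archive endpoints map REST verbs onto a single resource: POST sets archived_at, DELETE clears it. A client-side sketch under the same illustrative host and auth assumptions as the earlier snippets:

    import requests

    BASE = "https://plane.example.com/api/v1"        # hypothetical host
    HEADERS = {"X-API-Key": "<api-key>"}             # auth header name assumed
    archive_url = f"{BASE}/workspaces/my-workspace/projects/<project-id>/archive/"

    requests.post(archive_url, headers=HEADERS)      # 204: archived_at = timezone.now()
    requests.delete(archive_url, headers=HEADERS)    # 204: archived_at = None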

View File

@@ -4,19 +4,41 @@ from django.db import IntegrityError
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiResponse, OpenApiExample, OpenApiRequest
# Module imports
from plane.api.serializers import StateSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import Issue, State
# Module imports
from .base import BaseAPIView
from plane.utils.openapi import (
state_docs,
STATE_ID_PARAMETER,
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
create_paginated_response,
# Request Examples
STATE_CREATE_EXAMPLE,
STATE_UPDATE_EXAMPLE,
# Response Examples
STATE_EXAMPLE,
INVALID_REQUEST_RESPONSE,
STATE_NAME_EXISTS_RESPONSE,
DELETED_RESPONSE,
STATE_CANNOT_DELETE_RESPONSE,
EXTERNAL_ID_EXISTS_RESPONSE,
)
class StateAPIEndpoint(BaseAPIView):
class StateListCreateAPIEndpoint(BaseAPIView):
"""State List and Create Endpoint"""
serializer_class = StateSerializer
model = State
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
@@ -33,7 +55,30 @@ class StateAPIEndpoint(BaseAPIView):
.distinct()
)
@state_docs(
operation_id="create_state",
summary="Create state",
description="Create a new workflow state for a project with specified name, color, and group.",
request=OpenApiRequest(
request=StateSerializer,
examples=[STATE_CREATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="State created",
response=StateSerializer,
examples=[STATE_EXAMPLE],
),
400: INVALID_REQUEST_RESPONSE,
409: STATE_NAME_EXISTS_RESPONSE,
},
)
def post(self, request, slug, project_id):
"""Create state
Create a new workflow state for a project with specified name, color, and group.
Supports external ID tracking for integration purposes.
"""
try:
serializer = StateSerializer(
data=request.data, context={"project_id": project_id}
@@ -80,14 +125,31 @@ class StateAPIEndpoint(BaseAPIView):
status=status.HTTP_409_CONFLICT,
)
def get(self, request, slug, project_id, state_id=None):
if state_id:
serializer = StateSerializer(
self.get_queryset().get(pk=state_id),
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
@state_docs(
operation_id="list_states",
summary="List states",
description="Retrieve all workflow states for a project.",
parameters=[
CURSOR_PARAMETER,
PER_PAGE_PARAMETER,
FIELDS_PARAMETER,
EXPAND_PARAMETER,
],
responses={
200: create_paginated_response(
StateSerializer,
"PaginatedStateResponse",
"Paginated list of states",
"Paginated States",
),
},
)
def get(self, request, slug, project_id):
"""List states
Retrieve all workflow states for a project.
Returns paginated results.
"""
return self.paginate(
request=request,
queryset=(self.get_queryset()),
@@ -96,7 +158,76 @@ class StateAPIEndpoint(BaseAPIView):
).data,
)
class StateDetailAPIEndpoint(BaseAPIView):
"""State Detail Endpoint"""
serializer_class = StateSerializer
model = State
permission_classes = [ProjectEntityPermission]
use_read_replica = True
def get_queryset(self):
return (
State.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.filter(is_triage=False)
.filter(project__archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
.distinct()
)
@state_docs(
operation_id="retrieve_state",
summary="Retrieve state",
description="Retrieve details of a specific state.",
parameters=[
STATE_ID_PARAMETER,
],
responses={
200: OpenApiResponse(
description="State retrieved",
response=StateSerializer,
examples=[STATE_EXAMPLE],
),
},
)
def get(self, request, slug, project_id, state_id):
"""Retrieve state
Retrieve details of a specific state.
"""
serializer = StateSerializer(
self.get_queryset().get(pk=state_id),
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
@state_docs(
operation_id="delete_state",
summary="Delete state",
description="Permanently remove a workflow state from a project. Default states and states with existing work items cannot be deleted.",
parameters=[
STATE_ID_PARAMETER,
],
responses={
204: DELETED_RESPONSE,
400: STATE_CANNOT_DELETE_RESPONSE,
},
)
def delete(self, request, slug, project_id, state_id):
"""Delete state
Permanently remove a workflow state from a project.
Default states and states with existing work items cannot be deleted.
"""
state = State.objects.get(
is_triage=False, pk=state_id, project_id=project_id, workspace__slug=slug
)
@@ -119,7 +250,33 @@ class StateAPIEndpoint(BaseAPIView):
state.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, project_id, state_id=None):
@state_docs(
operation_id="update_state",
summary="Update state",
description="Partially update an existing workflow state's properties like name, color, or group.",
parameters=[
STATE_ID_PARAMETER,
],
request=OpenApiRequest(
request=StateSerializer,
examples=[STATE_UPDATE_EXAMPLE],
),
responses={
200: OpenApiResponse(
description="State updated",
response=StateSerializer,
examples=[STATE_EXAMPLE],
),
400: INVALID_REQUEST_RESPONSE,
409: EXTERNAL_ID_EXISTS_RESPONSE,
},
)
def patch(self, request, slug, project_id, state_id):
"""Update state
Partially update an existing workflow state's properties like name, color, or group.
Validates external ID uniqueness if provided.
"""
state = State.objects.get(
workspace__slug=slug, project_id=project_id, pk=state_id
)

View File

@@ -0,0 +1,37 @@
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from drf_spectacular.utils import OpenApiResponse
# Module imports
from plane.api.serializers import UserLiteSerializer
from plane.api.views.base import BaseAPIView
from plane.db.models import User
from plane.utils.openapi.decorators import user_docs
from plane.utils.openapi import USER_EXAMPLE
class UserEndpoint(BaseAPIView):
serializer_class = UserLiteSerializer
model = User
@user_docs(
operation_id="get_current_user",
summary="Get current user",
description="Retrieve the authenticated user's profile information including basic details.",
responses={
200: OpenApiResponse(
description="Current user profile",
response=UserLiteSerializer,
examples=[USER_EXAMPLE],
),
},
)
def get(self, request):
"""Get current user
Retrieve the authenticated user's profile information including basic details.
Returns user data based on the current authentication context.
"""
serializer = UserLiteSerializer(request.user)
return Response(serializer.data, status=status.HTTP_200_OK)

View File

@@ -75,12 +75,12 @@ class ProjectEntityPermission(BasePermission):
return False
# Handle requests based on project__identifier
if hasattr(view, "project__identifier") and view.project__identifier:
if hasattr(view, "project_identifier") and view.project_identifier:
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
project__identifier=view.project__identifier,
project__identifier=view.project_identifier,
is_active=True,
).exists()
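The rename from project__identifier to project_identifier makes the permission class look for the attribute views actually set. A hedged sketch of a view opting into the identifier-based check; the attribute-setting hook is an assumption about how such views are wired:

    class IssueByIdentifierAPIEndpoint(BaseAPIView):    # hypothetical view
        permission_classes = [ProjectEntityPermission]

        def initial(self, request, *args, **kwargs):
            # check_permissions() runs inside super().initial(), so expose the
            # attributes ProjectEntityPermission reads before delegating.
            self.workspace_slug = kwargs.get("slug")
            self.project_identifier = kwargs.get("project_identifier")
            super().initial(request, *args, **kwargs)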

View File

@@ -96,6 +96,7 @@ from .page import (
SubPageSerializer,
PageDetailSerializer,
PageVersionSerializer,
PageBinaryUpdateSerializer,
PageVersionDetailSerializer,
)

View File

@@ -1,3 +1,5 @@
from lxml import html
# Django imports
from django.utils import timezone
@@ -16,7 +18,15 @@ from plane.db.models import (
DraftIssueLabel,
DraftIssueCycle,
DraftIssueModule,
ProjectMember,
EstimatePoint,
)
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
validate_binary_data,
)
from plane.app.permissions import ROLE
class DraftIssueCreateSerializer(BaseSerializer):
@@ -57,14 +67,84 @@ class DraftIssueCreateSerializer(BaseSerializer):
data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
def validate(self, attrs):
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
attrs.get("start_date", None) is not None
and attrs.get("target_date", None) is not None
and attrs.get("start_date", None) > attrs.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date")
return data
# Validate description content for security
if "description" in attrs and attrs["description"]:
is_valid, error_msg = validate_json_content(attrs["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if "description_html" in attrs and attrs["description_html"]:
is_valid, error_msg = validate_html_content(attrs["description_html"])
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
if "description_binary" in attrs and attrs["description_binary"]:
is_valid, error_msg = validate_binary_data(attrs["description_binary"])
if not is_valid:
raise serializers.ValidationError({"description_binary": error_msg})
# Validate assignees are from project
if attrs.get("assignee_ids", []):
attrs["assignee_ids"] = ProjectMember.objects.filter(
project_id=self.context["project_id"],
role__gte=ROLE.MEMBER.value,
is_active=True,
member_id__in=attrs["assignee_ids"],
).values_list("member_id", flat=True)
# Validate labels are from project
if attrs.get("label_ids"):
label_ids = [label.id for label in attrs["label_ids"]]
attrs["label_ids"] = list(
Label.objects.filter(
project_id=self.context.get("project_id"), id__in=label_ids
).values_list("id", flat=True)
)
# Check that the state belongs to the project, else raise a validation error
if (
attrs.get("state")
and not State.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("state").id,
).exists()
):
raise serializers.ValidationError(
"State is not valid please pass a valid state_id"
)
# Check that the parent issue belongs to the project, since parents can otherwise come from anywhere in the workspace
if (
attrs.get("parent")
and not Issue.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("parent").id,
).exists()
):
raise serializers.ValidationError(
"Parent is not valid issue_id please pass a valid issue_id"
)
if (
attrs.get("estimate_point")
and not EstimatePoint.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("estimate_point").id,
).exists()
):
raise serializers.ValidationError(
"Estimate point is not valid please pass a valid estimate_point_id"
)
return attrs
def create(self, validated_data):
assignees = validated_data.pop("assignee_ids", None)
@@ -89,14 +169,14 @@ class DraftIssueCreateSerializer(BaseSerializer):
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
assignee_id=assignee_id,
draft_issue=issue,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
for assignee_id in assignees
],
batch_size=10,
)
@@ -105,14 +185,14 @@ class DraftIssueCreateSerializer(BaseSerializer):
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
label_id=label_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
for label_id in labels
],
batch_size=10,
)
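The assignee_id/label_id changes above follow from the new validate(): assignee_ids now comes back as a ProjectMember values_list of raw member IDs rather than User instances, so the through-model rows must be built with *_id attributes. A small illustration (audit fields omitted):

    member_ids = ProjectMember.objects.filter(
        project_id=project_id, is_active=True
    ).values_list("member_id", flat=True)            # yields primary keys, not Users

    DraftIssueAssignee.objects.bulk_create(
        [
            # assignee=pk would fail, since Django expects a User instance there;
            # assignee_id accepts the raw primary key directly.
            DraftIssueAssignee(assignee_id=pk, draft_issue=issue)
            for pk in member_ids
        ],
        batch_size=10,
    )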
@@ -163,14 +243,14 @@ class DraftIssueCreateSerializer(BaseSerializer):
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
assignee_id=assignee_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
for assignee_id in assignees
],
batch_size=10,
)
@@ -180,7 +260,7 @@ class DraftIssueCreateSerializer(BaseSerializer):
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
label_id=label,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,

View File

@@ -1,3 +1,5 @@
from lxml import html
# Django imports
from django.utils import timezone
from django.core.validators import URLValidator
@@ -37,6 +39,12 @@ from plane.db.models import (
IssueVersion,
IssueDescriptionVersion,
ProjectMember,
EstimatePoint,
)
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
validate_binary_data,
)
@@ -119,6 +127,23 @@ class IssueCreateSerializer(BaseSerializer):
):
raise serializers.ValidationError("Start date cannot exceed target date")
# Validate description content for security
if "description" in attrs and attrs["description"]:
is_valid, error_msg = validate_json_content(attrs["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if "description_html" in attrs and attrs["description_html"]:
is_valid, error_msg = validate_html_content(attrs["description_html"])
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
if "description_binary" in attrs and attrs["description_binary"]:
is_valid, error_msg = validate_binary_data(attrs["description_binary"])
if not is_valid:
raise serializers.ValidationError({"description_binary": error_msg})
# Validate assignees are from project
if attrs.get("assignee_ids", []):
attrs["assignee_ids"] = ProjectMember.objects.filter(
project_id=self.context["project_id"],
@@ -127,6 +152,51 @@ class IssueCreateSerializer(BaseSerializer):
member_id__in=attrs["assignee_ids"],
).values_list("member_id", flat=True)
# Validate labels are from project
if attrs.get("label_ids"):
label_ids = [label.id for label in attrs["label_ids"]]
attrs["label_ids"] = list(
Label.objects.filter(
project_id=self.context.get("project_id"),
id__in=label_ids,
).values_list("id", flat=True)
)
# Check that the state belongs to the project, else raise a validation error
if (
attrs.get("state")
and not State.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("state").id,
).exists()
):
raise serializers.ValidationError(
"State is not valid please pass a valid state_id"
)
# Check that the parent issue belongs to the project, since parents can otherwise come from anywhere in the workspace
if (
attrs.get("parent")
and not Issue.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("parent").id,
).exists()
):
raise serializers.ValidationError(
"Parent is not valid issue_id please pass a valid issue_id"
)
if (
attrs.get("estimate_point")
and not EstimatePoint.objects.filter(
project_id=self.context.get("project_id"),
pk=attrs.get("estimate_point").id,
).exists()
):
raise serializers.ValidationError(
"Estimate point is not valid please pass a valid estimate_point_id"
)
return attrs
def create(self, validated_data):
@@ -190,14 +260,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueLabel.objects.bulk_create(
[
IssueLabel(
label=label,
label_id=label_id,
issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
for label_id in labels
],
batch_size=10,
)
@@ -243,14 +313,14 @@ class IssueCreateSerializer(BaseSerializer):
IssueLabel.objects.bulk_create(
[
IssueLabel(
label=label,
label_id=label_id,
issue=instance,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
for label_id in labels
],
batch_size=10,
ignore_conflicts=True,

View File

@@ -1,8 +1,14 @@
# Third party imports
from rest_framework import serializers
import base64
# Module imports
from .base import BaseSerializer
from plane.utils.content_validator import (
validate_binary_data,
validate_html_content,
validate_json_content,
)
from plane.db.models import (
Page,
PageLog,
@@ -186,3 +192,71 @@ class PageVersionDetailSerializer(BaseSerializer):
"updated_by",
]
read_only_fields = ["workspace", "page"]
class PageBinaryUpdateSerializer(serializers.Serializer):
"""Serializer for updating page binary description with validation"""
description_binary = serializers.CharField(required=False, allow_blank=True)
description_html = serializers.CharField(required=False, allow_blank=True)
description = serializers.JSONField(required=False, allow_null=True)
def validate_description_binary(self, value):
"""Validate the base64-encoded binary data"""
if not value:
return value
try:
# Decode the base64 data
binary_data = base64.b64decode(value)
# Validate the binary data
is_valid, error_message = validate_binary_data(binary_data)
if not is_valid:
raise serializers.ValidationError(
f"Invalid binary data: {error_message}"
)
return binary_data
except Exception as e:
if isinstance(e, serializers.ValidationError):
raise
raise serializers.ValidationError("Failed to decode base64 data")
def validate_description_html(self, value):
"""Validate the HTML content"""
if not value:
return value
# Use the validation function from utils
is_valid, error_message = validate_html_content(value)
if not is_valid:
raise serializers.ValidationError(error_message)
return value
def validate_description(self, value):
"""Validate the JSON description"""
if not value:
return value
# Use the validation function from utils
is_valid, error_message = validate_json_content(value)
if not is_valid:
raise serializers.ValidationError(error_message)
return value
def update(self, instance, validated_data):
"""Update the page instance with validated data"""
if "description_binary" in validated_data:
instance.description_binary = validated_data.get("description_binary")
if "description_html" in validated_data:
instance.description_html = validated_data.get("description_html")
if "description" in validated_data:
instance.description = validated_data.get("description")
instance.save()
return instance
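A sketch of exercising the new serializer directly, for example in a test; the payload values are invented, and whether the sample bytes pass validate_binary_data depends on that validator's rules:

    import base64

    payload = {
        "description_binary": base64.b64encode(b"\x01\x02sample-doc").decode(),
        "description_html": "<p>hello</p>",
        "description": {"type": "doc", "content": []},
    }
    serializer = PageBinaryUpdateSerializer(page, data=payload, partial=True)
    if serializer.is_valid():
        page = serializer.save()     # description_binary is stored as decoded bytes
    else:
        print(serializer.errors)     # per-field messages from the validate_* hooks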

View File

@@ -13,6 +13,11 @@ from plane.db.models import (
DeployBoard,
ProjectPublicMember,
)
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
validate_binary_data,
)
class ProjectSerializer(BaseSerializer):
@@ -24,55 +29,77 @@ class ProjectSerializer(BaseSerializer):
fields = "__all__"
read_only_fields = ["workspace", "deleted_at"]
def validate_name(self, name):
project_id = self.instance.id if self.instance else None
workspace_id = self.context["workspace_id"]
project = Project.objects.filter(name=name, workspace_id=workspace_id)
if project_id:
project = project.exclude(id=project_id)
if project.exists():
raise serializers.ValidationError(
detail="PROJECT_NAME_ALREADY_EXIST",
)
return name
def validate_identifier(self, identifier):
project_id = self.instance.id if self.instance else None
workspace_id = self.context["workspace_id"]
project = Project.objects.filter(
identifier=identifier, workspace_id=workspace_id
)
if project_id:
project = project.exclude(id=project_id)
if project.exists():
raise serializers.ValidationError(
detail="PROJECT_IDENTIFIER_ALREADY_EXIST",
)
return identifier
def validate(self, data):
# Validate description content for security
if "description" in data and data["description"]:
# For Project, description might be text field, not JSON
if isinstance(data["description"], dict):
is_valid, error_msg = validate_json_content(data["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if "description_text" in data and data["description_text"]:
is_valid, error_msg = validate_json_content(data["description_text"])
if not is_valid:
raise serializers.ValidationError({"description_text": error_msg})
if "description_html" in data and data["description_html"]:
if isinstance(data["description_html"], dict):
is_valid, error_msg = validate_json_content(data["description_html"])
else:
is_valid, error_msg = validate_html_content(
str(data["description_html"])
)
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
return data
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
raise serializers.ValidationError(detail="Project Identifier is required")
workspace_id = self.context["workspace_id"]
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
raise serializers.ValidationError(detail="Project Identifier is taken")
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
)
_ = ProjectIdentifier.objects.create(
name=project.identifier,
project=project,
workspace_id=self.context["workspace_id"],
project = Project.objects.create(**validated_data, workspace_id=workspace_id)
ProjectIdentifier.objects.create(
name=project.identifier, project=project, workspace_id=workspace_id
)
return project
def update(self, instance, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
# If no identifier is passed, update the project and return
if identifier == "":
project = super().update(instance, validated_data)
return project
# If the new identifier is not yet taken, update the project and rename its identifier record
project_identifier = ProjectIdentifier.objects.filter(
name=identifier, workspace_id=instance.workspace_id
).first()
if project_identifier is None:
project = super().update(instance, validated_data)
project_identifier = ProjectIdentifier.objects.filter(
project=project
).first()
if project_identifier is not None:
project_identifier.name = identifier
project_identifier.save()
return project
# If found, check whether the identifier already belongs to the project being updated
if project_identifier.project_id == instance.id:
# If it is the same project, allow the update
project = super().update(instance, validated_data)
return project
# Otherwise the identifier is taken by another project, so fail the update
raise serializers.ValidationError(detail="Project Identifier is already taken")
class ProjectLiteSerializer(BaseSerializer):
class Meta:

View File

@@ -24,6 +24,11 @@ from plane.db.models import (
)
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
from plane.utils.url import contains_url
from plane.utils.content_validator import (
validate_html_content,
validate_json_content,
validate_binary_data,
)
# Django imports
from django.core.validators import URLValidator
@@ -76,7 +81,6 @@ class WorkspaceLiteSerializer(BaseSerializer):
class WorkSpaceMemberSerializer(DynamicBaseSerializer):
member = UserLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
@@ -93,7 +97,6 @@ class WorkspaceMemberMeSerializer(BaseSerializer):
class WorkspaceMemberAdminSerializer(DynamicBaseSerializer):
member = UserAdminLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
class Meta:
model = WorkspaceMember
@@ -314,6 +317,25 @@ class StickySerializer(BaseSerializer):
read_only_fields = ["workspace", "owner"]
extra_kwargs = {"name": {"required": False}}
def validate(self, data):
# Validate description content for security
if "description" in data and data["description"]:
is_valid, error_msg = validate_json_content(data["description"])
if not is_valid:
raise serializers.ValidationError({"description": error_msg})
if "description_html" in data and data["description_html"]:
is_valid, error_msg = validate_html_content(data["description_html"])
if not is_valid:
raise serializers.ValidationError({"description_html": error_msg})
if "description_binary" in data and data["description_binary"]:
is_valid, error_msg = validate_binary_data(data["description_binary"])
if not is_valid:
raise serializers.ValidationError({"description_binary": error_msg})
return data
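Every call site in this changeset treats the content_validator helpers as returning an (is_valid, error_msg) tuple, and the pattern is uniform enough to factor out. A hypothetical helper layered on that contract (the tuple shape is taken from the call sites; the helper itself is not in the diff):

    from rest_framework import serializers

    def run_content_checks(data, checks):
        # `checks` pairs a field name with a validator returning (is_valid, error_msg),
        # matching how validate_json_content, validate_html_content, and
        # validate_binary_data are invoked throughout these serializers.
        for field, validator in checks:
            value = data.get(field)
            if value:
                is_valid, error_msg = validator(value)
                if not is_valid:
                    raise serializers.ValidationError({field: error_msg})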
class WorkspaceUserPreferenceSerializer(BaseSerializer):
class Meta:

View File

@@ -16,8 +16,6 @@ from plane.db.models import (
IssueView,
ProjectPage,
Workspace,
CycleIssue,
ModuleIssue,
ProjectMember,
)
from plane.utils.build_chart import build_analytics_chart

View File

@@ -740,7 +740,8 @@ class WorkspaceAssetDownloadEndpoint(BaseAPIView):
storage = S3Storage(request=request)
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition=f"attachment; filename={asset.asset.name}",
disposition="attachment",
filename=asset.attributes.get("name", uuid.uuid4().hex),
)
return HttpResponseRedirect(signed_url)
@@ -767,7 +768,8 @@ class ProjectAssetDownloadEndpoint(BaseAPIView):
storage = S3Storage(request=request)
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition=f"attachment; filename={asset.asset.name}",
disposition="attachment",
filename=asset.attributes.get("name", uuid.uuid4().hex),
)
return HttpResponseRedirect(signed_url)
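Passing disposition and filename separately lets the storage layer compose the Content-Disposition header itself instead of interpolating a raw, user-supplied asset key into a header string. One way such a helper might do it with boto3; the real S3Storage API is not shown in this diff:

    import boto3

    def generate_presigned_url(bucket, object_name, disposition, filename, expires=3600):
        # Strip quotes so user-supplied names cannot break out of the header value.
        safe_name = filename.replace('"', "")
        s3 = boto3.client("s3")
        return s3.generate_presigned_url(
            "get_object",
            Params={
                "Bucket": bucket,
                "Key": object_name,
                "ResponseContentDisposition": f'{disposition}; filename="{safe_name}"',
            },
            ExpiresIn=expires,
        )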

View File

@@ -24,6 +24,7 @@ from rest_framework.viewsets import ModelViewSet
from plane.authentication.session import BaseSessionAuthentication
from plane.utils.exception_logger import log_exception
from plane.utils.paginator import BasePaginator
from plane.utils.core.mixins import ReadReplicaControlMixin
class TimezoneMixin:
@@ -40,7 +41,7 @@ class TimezoneMixin:
timezone.deactivate()
-class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
+class BaseViewSet(TimezoneMixin, ReadReplicaControlMixin, ModelViewSet, BasePaginator):
model = None
permission_classes = [IsAuthenticated]
@@ -53,6 +54,8 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
search_fields = []
use_read_replica = False
def get_queryset(self):
try:
return self.model.objects.all()
@@ -149,7 +152,7 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
return expand if expand else None
-class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
+class BaseAPIView(TimezoneMixin, ReadReplicaControlMixin, APIView, BasePaginator):
permission_classes = [IsAuthenticated]
filter_backends = (DjangoFilterBackend, SearchFilter)
@@ -160,6 +163,8 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
search_fields = []
use_read_replica = False
def filter_queryset(self, queryset):
for backend in list(self.filter_backends):
queryset = backend().filter_queryset(self.request, queryset, self)

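ReadReplicaControlMixin's implementation lives in plane.utils.core.mixins and is not included in this diff; the views only toggle the new use_read_replica flag. A minimal sketch of what such a mixin could look like, assuming Django's multi-database support with a "replica" alias:

class ReadReplicaControlMixin:
    """Route read queries to a replica database when the view opts in."""

    use_read_replica = False

    def get_queryset(self):
        queryset = super().get_queryset()
        if self.use_read_replica:
            # .using() re-points the queryset at the replica alias
            return queryset.using("replica")
        return queryset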
View File

@@ -19,6 +19,7 @@ from plane.db.models import IssueActivity, IssueComment, CommentReaction, Intake
class IssueActivityEndpoint(BaseAPIView):
permission_classes = [ProjectEntityPermission]
use_read_replica = True
@method_decorator(gzip_page)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])

View File

@@ -232,6 +232,8 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
class UnreadNotificationEndpoint(BaseAPIView):
use_read_replica = True
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE"
)

View File

@@ -25,6 +25,7 @@ from plane.app.serializers import (
PageSerializer,
SubPageSerializer,
PageDetailSerializer,
PageBinaryUpdateSerializer,
)
from plane.db.models import (
Page,
@@ -40,7 +41,7 @@ from ..base import BaseAPIView, BaseViewSet
from plane.bgtasks.page_transaction_task import page_transaction
from plane.bgtasks.page_version_task import page_version
from plane.bgtasks.recent_visited_task import recent_visited_task
-from plane.bgtasks.copy_s3_object import copy_s3_objects
+from plane.bgtasks.copy_s3_object import copy_s3_objects_of_description_and_assets
def unarchive_archive_page_and_descendants(page_id, archived_at):
@@ -538,32 +539,27 @@ class PagesDescriptionViewSet(BaseViewSet):
{"description_html": page.description_html}, cls=DjangoJSONEncoder
)
-        # Get the base64 data from the request
-        base64_data = request.data.get("description_binary")
-        # If base64 data is provided
-        if base64_data:
-            # Decode the base64 data to bytes
-            new_binary_data = base64.b64decode(base64_data)
-            # capture the page transaction
+        # Use serializer for validation and update
+        serializer = PageBinaryUpdateSerializer(page, data=request.data, partial=True)
+        if serializer.is_valid():
+            # Capture the page transaction
            if request.data.get("description_html"):
                page_transaction.delay(
                    new_value=request.data, old_value=existing_instance, page_id=pk
                )
-            # Store the updated binary data
-            page.description_binary = new_binary_data
-            page.description_html = request.data.get("description_html")
-            page.description = request.data.get("description")
-            page.save()
-            # Return a success response
+            # Update the page using serializer
+            updated_page = serializer.save()
+            # Run background tasks
            page_version.delay(
-                page_id=page.id,
+                page_id=updated_page.id,
                existing_instance=existing_instance,
                user_id=request.user.id,
            )
            return Response({"message": "Updated successfully"})
-        else:
-            return Response({"error": "No binary data provided"})
+        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
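PageBinaryUpdateSerializer is imported at the top of this file but defined elsewhere; from its use here it must validate the base64 payload that the view previously decoded by hand. A plausible sketch, assuming a DRF ModelSerializer that decodes description_binary during validation (field names are taken from the removed code; everything else is illustrative):

import base64
import binascii

from rest_framework import serializers

from plane.db.models import Page

class PageBinaryUpdateSerializer(serializers.ModelSerializer):
    description_binary = serializers.CharField(write_only=True)

    class Meta:
        model = Page
        fields = ["description_binary", "description_html", "description"]

    def validate_description_binary(self, value):
        # Reject payloads that are not valid base64 instead of failing later
        try:
            return base64.b64decode(value, validate=True)
        except (ValueError, binascii.Error):
            raise serializers.ValidationError("description_binary must be valid base64")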
class PageDuplicateEndpoint(BaseAPIView):
@@ -606,7 +602,7 @@ class PageDuplicateEndpoint(BaseAPIView):
)
# Copy the s3 objects uploaded in the page
-        copy_s3_objects.delay(
+        copy_s3_objects_of_description_and_assets.delay(
entity_name="PAGE",
entity_identifier=page.id,
project_id=project_id,

View File

@@ -46,6 +46,7 @@ class ProjectViewSet(BaseViewSet):
serializer_class = ProjectListSerializer
model = Project
webhook_event = "project"
use_read_replica = True
def get_queryset(self):
sort_order = ProjectMember.objects.filter(
@@ -239,205 +240,165 @@ class ProjectViewSet(BaseViewSet):
    @allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
    def create(self, request, slug):
-        try:
        workspace = Workspace.objects.get(slug=slug)
        serializer = ProjectSerializer(
            data={**request.data}, context={"workspace_id": workspace.id}
        )
        if serializer.is_valid():
            serializer.save()
            # Add the user as Administrator to the project
            _ = ProjectMember.objects.create(
                project_id=serializer.data["id"], member=request.user, role=20
            )
            # Also create the issue property for the user
            _ = IssueUserProperty.objects.create(
                project_id=serializer.data["id"], user=request.user
            )
            if serializer.data["project_lead"] is not None and str(
                serializer.data["project_lead"]
            ) != str(request.user.id):
                ProjectMember.objects.create(
                    project_id=serializer.data["id"],
                    member_id=serializer.data["project_lead"],
                    role=20,
                )
                # Also create the issue property for the user
                IssueUserProperty.objects.create(
                    project_id=serializer.data["id"],
                    user_id=serializer.data["project_lead"],
                )
            # Default states
            states = [
                {
                    "name": "Backlog",
                    "color": "#60646C",
                    "sequence": 15000,
                    "group": "backlog",
                    "default": True,
                },
                {
                    "name": "Todo",
                    "color": "#60646C",
                    "sequence": 25000,
                    "group": "unstarted",
                },
                {
                    "name": "In Progress",
                    "color": "#F59E0B",
                    "sequence": 35000,
                    "group": "started",
                },
                {
                    "name": "Done",
                    "color": "#46A758",
                    "sequence": 45000,
                    "group": "completed",
                },
                {
                    "name": "Cancelled",
                    "color": "#9AA4BC",
                    "sequence": 55000,
                    "group": "cancelled",
                },
            ]
            State.objects.bulk_create(
                [
                    State(
                        name=state["name"],
                        color=state["color"],
                        project=serializer.instance,
                        sequence=state["sequence"],
                        workspace=serializer.instance.workspace,
                        group=state["group"],
                        default=state.get("default", False),
                        created_by=request.user,
                    )
                    for state in states
                ]
            )
            project = self.get_queryset().filter(pk=serializer.data["id"]).first()
            # Create the model activity
            model_activity.delay(
                model_name="project",
                model_id=str(project.id),
                requested_data=request.data,
                current_instance=None,
                actor_id=request.user.id,
                slug=slug,
                origin=base_host(request=request, is_app=True),
            )
            serializer = ProjectListSerializer(project)
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
-        except IntegrityError as e:
-            if "already exists" in str(e):
-                return Response(
-                    {
-                        "name": "The project name is already taken",
-                        "code": "PROJECT_NAME_ALREADY_EXIST",
-                    },
-                    status=status.HTTP_409_CONFLICT,
-                )
-        except Workspace.DoesNotExist:
-            return Response(
-                {"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
-            )
-        except serializers.ValidationError:
-            return Response(
-                {
-                    "identifier": "The project identifier is already taken",
-                    "code": "PROJECT_IDENTIFIER_ALREADY_EXIST",
-                },
-                status=status.HTTP_409_CONFLICT,
-            )

    def partial_update(self, request, slug, pk=None):
-        try:
+        # try:
        if not ProjectMember.objects.filter(
            member=request.user,
            workspace__slug=slug,
            project_id=pk,
            role=20,
            is_active=True,
        ).exists():
            return Response(
                {"error": "You don't have the required permissions."},
                status=status.HTTP_403_FORBIDDEN,
            )
        workspace = Workspace.objects.get(slug=slug)
        project = Project.objects.get(pk=pk)
        intake_view = request.data.get("inbox_view", project.intake_view)
        current_instance = json.dumps(
            ProjectSerializer(project).data, cls=DjangoJSONEncoder
        )
        if project.archived_at:
            return Response(
                {"error": "Archived projects cannot be updated"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        serializer = ProjectSerializer(
            project,
            data={**request.data, "intake_view": intake_view},
            context={"workspace_id": workspace.id},
            partial=True,
        )
        if serializer.is_valid():
            serializer.save()
            if intake_view:
-                intake = Intake.objects.filter(
-                    project=project, is_default=True
-                ).first()
+                intake = Intake.objects.filter(project=project, is_default=True).first()
                if not intake:
                    Intake.objects.create(
                        name=f"{project.name} Intake",
                        project=project,
                        is_default=True,
                    )
            project = self.get_queryset().filter(pk=serializer.data["id"]).first()
            model_activity.delay(
                model_name="project",
                model_id=str(project.id),
                requested_data=request.data,
                current_instance=current_instance,
                actor_id=request.user.id,
                slug=slug,
                origin=base_host(request=request, is_app=True),
            )
            serializer = ProjectListSerializer(project)
            return Response(serializer.data, status=status.HTTP_200_OK)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
-        except IntegrityError as e:
-            if "already exists" in str(e):
-                return Response(
-                    {"name": "The project name is already taken"},
-                    status=status.HTTP_409_CONFLICT,
-                )
-        except (Project.DoesNotExist, Workspace.DoesNotExist):
-            return Response(
-                {"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
-            )
-        except serializers.ValidationError:
-            return Response(
-                {"identifier": "The project identifier is already taken"},
-                status=status.HTTP_409_CONFLICT,
-            )
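With the try/except blocks gone, the serializers.ValidationError raised by ProjectSerializer.update (the "Project Identifier is already taken" path shown at the top of this diff) propagates to DRF, which turns it into a 400 response by default. One common way to recover the removed 409 behaviour is a custom exception handler; the following is an illustrative sketch, not code from this changeset:

from django.db import IntegrityError
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import exception_handler

def project_exception_handler(exc, context):
    # Fall back to DRF's default handling first
    response = exception_handler(exc, context)
    if response is None and isinstance(exc, IntegrityError):
        if "already exists" in str(exc):
            return Response(
                {"name": "The project name is already taken"},
                status=status.HTTP_409_CONFLICT,
            )
    return response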
def destroy(self, request, slug, pk):
if (
WorkspaceMember.objects.filter(

View File

@@ -312,6 +312,7 @@ class ProjectMemberUserEndpoint(BaseAPIView):
class UserProjectRolesEndpoint(BaseAPIView):
permission_classes = [WorkspaceUserPermission]
use_read_replica = True
def get(self, request, slug):
project_members = ProjectMember.objects.filter(

View File

@@ -116,6 +116,7 @@ class TimezoneEndpoint(APIView):
("Astrakhan", "Europe/Astrakhan"), # UTC+04:00
("Tbilisi", "Asia/Tbilisi"), # UTC+04:00
("Mauritius", "Indian/Mauritius"), # UTC+04:00
("Kabul", "Asia/Kabul"), # UTC+04:30
("Islamabad", "Asia/Karachi"), # UTC+05:00
("Karachi", "Asia/Karachi"), # UTC+05:00
("Tashkent", "Asia/Tashkent"), # UTC+05:00

View File

@@ -44,6 +44,7 @@ from django.views.decorators.vary import vary_on_cookie
class UserEndpoint(BaseViewSet):
serializer_class = UserSerializer
model = User
use_read_replica = True
def get_object(self):
return self.request.user

View File

@@ -177,6 +177,7 @@ class WorkSpaceViewSet(BaseViewSet):
class UserWorkSpacesEndpoint(BaseAPIView):
search_fields = ["name"]
filterset_fields = ["owner"]
use_read_replica = True
def get(self, request):
fields = [field for field in request.GET.get("fields", "").split(",") if field]

View File

@@ -10,7 +10,6 @@ from plane.app.views.base import BaseAPIView
from plane.db.models import Cycle
from plane.app.permissions import WorkspaceViewerPermission
from plane.app.serializers.cycle import CycleSerializer
-from plane.utils.timezone_converter import user_timezone_converter
class WorkspaceCyclesEndpoint(BaseAPIView):

View File

@@ -12,6 +12,7 @@ from plane.utils.cache import cache_response
class WorkspaceEstimatesEndpoint(BaseAPIView):
permission_classes = [WorkspaceEntityPermission]
use_read_replica = True
@cache_response(60 * 60 * 2)
def get(self, request, slug):

View File

@@ -14,6 +14,8 @@ from plane.app.permissions import allow_permission, ROLE
class WorkspaceFavoriteEndpoint(BaseAPIView):
use_read_replica = True
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request, slug):
# the second filter is to check if the user is a member of the project

View File

@@ -80,7 +80,7 @@ class WorkspaceInvitationsViewset(BaseViewSet):
workspace_id=workspace.id,
member__email__in=[email.get("email") for email in emails],
is_active=True,
).select_related("member", "workspace", "workspace__owner")
).select_related("member", "member__avatar_asset")
if workspace_members:
return Response(

View File

@@ -12,6 +12,7 @@ from plane.utils.cache import cache_response
class WorkspaceLabelsEndpoint(BaseAPIView):
permission_classes = [WorkspaceViewerPermission]
use_read_replica = True
@cache_response(60 * 60 * 2)
def get(self, request, slug):

View File

@@ -28,15 +28,14 @@ class WorkSpaceMemberViewSet(BaseViewSet):
    model = WorkspaceMember
    search_fields = ["member__display_name", "member__first_name"]
+    use_read_replica = True

    def get_queryset(self):
        return self.filter_queryset(
            super()
            .get_queryset()
-            .filter(workspace__slug=self.kwargs.get("slug"), is_active=True)
-            .select_related("workspace", "workspace__owner")
-            .select_related("member")
-            .prefetch_related("member__avatar_asset", "workspace__logo_asset")
+            .filter(workspace__slug=self.kwargs.get("slug"))
+            .select_related("member", "member__avatar_asset")
        )
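The queryset swap trades the old prefetch_related (one extra query per relation) for a single JOINed query; this works because member and member__avatar_asset sit on a foreign-key chain. Illustratively:

from plane.db.models import WorkspaceMember

# One SQL query with two JOINs, instead of 1 + N prefetch queries
members = WorkspaceMember.objects.filter(
    workspace__slug="my-workspace"  # hypothetical slug
).select_related("member", "member__avatar_asset")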
@allow_permission(
@@ -216,6 +215,8 @@ class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView):
use_read_replica = True
def get(self, request, slug):
draft_issue_count = (
DraftIssue.objects.filter(

View File

@@ -11,6 +11,7 @@ from plane.app.permissions import allow_permission, ROLE
class QuickLinkViewSet(BaseViewSet):
model = WorkspaceUserLink
use_read_replica = True
def get_serializer_class(self):
return WorkspaceUserLinkSerializer

View File

@@ -12,6 +12,7 @@ from plane.app.permissions import allow_permission, ROLE
class UserRecentVisitViewSet(BaseViewSet):
model = UserRecentVisit
use_read_replica = True
def get_serializer_class(self):
return WorkspaceRecentVisitSerializer

View File

@@ -13,6 +13,7 @@ from collections import defaultdict
class WorkspaceStatesEndpoint(BaseAPIView):
permission_classes = [WorkspaceEntityPermission]
use_read_replica = True
@cache_response(60 * 60 * 2)
def get(self, request, slug):

View File

@@ -12,6 +12,7 @@ from plane.app.serializers import StickySerializer
class WorkspaceStickyViewSet(BaseViewSet):
serializer_class = StickySerializer
model = Sticky
use_read_replica = True
def get_queryset(self):
return self.filter_queryset(

View File

@@ -13,6 +13,7 @@ from rest_framework import status
class WorkspaceUserPreferenceViewSet(BaseAPIView):
model = WorkspaceUserPreference
use_read_replica = True
def get_serializer_class(self):
return WorkspaceUserPreferenceSerializer

View File

@@ -18,7 +18,7 @@ from plane.authentication.adapter.error import (
class GitHubOAuthProvider(OauthAdapter):
token_url = "https://github.com/login/oauth/access_token"
userinfo_url = "https://api.github.com/user"
org_membership_url = f"https://api.github.com/orgs"
org_membership_url = "https://api.github.com/orgs"
provider = "github"
scope = "read:user user:email"

View File

@@ -83,8 +83,52 @@ def sync_with_external_service(entity_name, description_html):
return {}
def copy_assets(entity, entity_identifier, project_id, asset_ids, user_id):
    duplicated_assets = []
    workspace = entity.workspace
    storage = S3Storage()

    original_assets = FileAsset.objects.filter(
        workspace=workspace, project_id=project_id, id__in=asset_ids
    )
    for original_asset in original_assets:
        destination_key = (
            f"{workspace.id}/{uuid.uuid4().hex}-{original_asset.attributes.get('name')}"
        )
        duplicated_asset = FileAsset.objects.create(
            attributes={
                "name": original_asset.attributes.get("name"),
                "type": original_asset.attributes.get("type"),
                "size": original_asset.attributes.get("size"),
            },
            asset=destination_key,
            size=original_asset.size,
            workspace=workspace,
            created_by_id=user_id,
            entity_type=original_asset.entity_type,
            project_id=project_id,
            storage_metadata=original_asset.storage_metadata,
            **get_entity_id_field(original_asset.entity_type, entity_identifier),
        )
        storage.copy_object(original_asset.asset, destination_key)
        duplicated_assets.append(
            {
                "new_asset_id": str(duplicated_asset.id),
                "old_asset_id": str(original_asset.id),
            }
        )

    if duplicated_assets:
        FileAsset.objects.filter(
            pk__in=[item["new_asset_id"] for item in duplicated_assets]
        ).update(is_uploaded=True)

    return duplicated_assets
@shared_task
-def copy_s3_objects(entity_name, entity_identifier, project_id, slug, user_id):
+def copy_s3_objects_of_description_and_assets(
+    entity_name, entity_identifier, project_id, slug, user_id
+):
"""
Step 1: Extract asset ids from the description_html of the entity
Step 2: Duplicate the assets
@@ -100,53 +144,20 @@ def copy_s3_objects(entity_name, entity_identifier, project_id, slug, user_id):
entity = model_class.objects.get(id=entity_identifier)
asset_ids = extract_asset_ids(entity.description_html, "image-component")
-        duplicated_assets = []
-        workspace = entity.workspace
-        storage = S3Storage()
-        original_assets = FileAsset.objects.filter(
-            workspace=workspace, project_id=project_id, id__in=asset_ids
-        )
-        for original_asset in original_assets:
-            destination_key = f"{workspace.id}/{uuid.uuid4().hex}-{original_asset.attributes.get('name')}"
-            duplicated_asset = FileAsset.objects.create(
-                attributes={
-                    "name": original_asset.attributes.get("name"),
-                    "type": original_asset.attributes.get("type"),
-                    "size": original_asset.attributes.get("size"),
-                },
-                asset=destination_key,
-                size=original_asset.size,
-                workspace=workspace,
-                created_by_id=user_id,
-                entity_type=original_asset.entity_type,
-                project_id=project_id,
-                storage_metadata=original_asset.storage_metadata,
-                **get_entity_id_field(original_asset.entity_type, entity_identifier),
-            )
-            storage.copy_object(original_asset.asset, destination_key)
-            duplicated_assets.append(
-                {
-                    "new_asset_id": str(duplicated_asset.id),
-                    "old_asset_id": str(original_asset.id),
-                }
-            )
-        if duplicated_assets:
-            FileAsset.objects.filter(
-                pk__in=[item["new_asset_id"] for item in duplicated_assets]
-            ).update(is_uploaded=True)
-            updated_html = update_description(
-                entity, duplicated_assets, "image-component"
-            )
-            external_data = sync_with_external_service(entity_name, updated_html)
-            if external_data:
-                entity.description = external_data.get("description")
-                entity.description_binary = base64.b64decode(
-                    external_data.get("description_binary")
-                )
-                entity.save()
+        duplicated_assets = copy_assets(
+            entity, entity_identifier, project_id, asset_ids, user_id
+        )
+        updated_html = update_description(entity, duplicated_assets, "image-component")
+        external_data = sync_with_external_service(entity_name, updated_html)
+        if external_data:
+            entity.description = external_data.get("description")
+            entity.description_binary = base64.b64decode(
+                external_data.get("description_binary")
+            )
+            entity.save()
        return
except Exception as e:

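extract_asset_ids is the only piece of this pipeline not shown in the diff. A rough sketch of what it presumably does, assuming asset ids are carried in the src attribute of image-component nodes in the stored HTML (the attribute name is an assumption):

import re

def extract_asset_ids(description_html, component):
    # Collect the src value of every <component ...> node in the HTML
    pattern = rf'<{component}[^>]*\bsrc="([^"]+)"'
    return re.findall(pattern, description_html or "")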
View File

@@ -21,8 +21,8 @@ from plane.utils.exception_logger import log_exception
def remove_unwanted_characters(input_text):
-    # Keep only alphanumeric characters, spaces, and dashes.
-    processed_text = re.sub(r"[^a-zA-Z0-9 \-]", "", input_text)
+    # Remove only control characters and potentially problematic characters for email subjects
+    processed_text = re.sub(r"[\x00-\x1F\x7F-\x9F]", "", input_text)
return processed_text

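The effect of the regex swap on a realistic subject line, illustratively:

import re

subject = "Re: [Plane] issue #123 \x00updated"
# Old pattern stripped everything outside [a-zA-Z0-9 -]:
re.sub(r"[^a-zA-Z0-9 \-]", "", subject)       # 'Re Plane issue 123 updated'
# New pattern removes only C0/C1 control characters:
re.sub(r"[\x00-\x1F\x7F-\x9F]", "", subject)  # 'Re: [Plane] issue #123 updated'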
View File

@@ -30,7 +30,6 @@ from plane.db.models import (
)
from plane.settings.redis import redis_instance
from plane.utils.exception_logger import log_exception
from plane.bgtasks.webhook_task import webhook_activity
from plane.utils.issue_relation_mapper import get_inverse_relation
from plane.utils.uuid import is_valid_uuid

Some files were not shown because too many files have changed in this diff Show More