Compare commits

...

74 Commits

Author SHA1 Message Date
Prateek Shourya
5546bd2305 feat: sync mobx issue store with local db. 2024-12-30 21:51:51 +05:30
Aaryan Khandelwal
94f421f27d chore: add live server prettier config (#6287) 2024-12-27 21:03:20 +05:30
Aaryan Khandelwal
8d7425a3b7 [PE-182] refactor: pages' components and store for scalability (#6283)
* refactor: created a generic base page instance

* refactor: project store hooks

* chore: add missing prop declaration

* refactor: editor page root and body

* refactor: issue embed hook

* chore: update search entity types

* fix: version editor component

* fix: add page to favorites action

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
2024-12-27 20:41:38 +05:30
Anmol Singh Bhatia
211d5e1cd0 chore: code refactor and build fix (#6285)
* chore: code refactor and build fix

* chore: code refactor

* chore: code refactor
2024-12-27 18:18:45 +05:30
Vamsi Krishna
3c6bbaef3c fix: modified link behaviour to improve accessibility (#6284) 2024-12-27 17:46:40 +05:30
Prateek Shourya
4159d12959 [WEB-2889] fix: global views sorting when hyper model is enabled. (#6280) 2024-12-27 13:03:26 +05:30
Anmol Singh Bhatia
2f2f8dc5f4 [WEB-2880] chore: project detail response updated (#6281)
* chore: project detail response updated

* chore: code refactor
2024-12-27 09:17:35 +05:30
Anmol Singh Bhatia
756a71ca78 [WEB-2907] chore: issue store updated and code refactor (#6279)
* chore: issue and epic store updated and code refactor

* chore: layout ux copy updated
2024-12-26 20:01:32 +05:30
Vamsi Krishna
36b3328c5e fix: user role not updating in user profile (#6278) 2024-12-26 17:19:43 +05:30
Prateek Shourya
a5c1282e52 [WEB-2896] fix: mutation problem with issue properties while accepting an intake issue. (#6277) 2024-12-26 16:46:52 +05:30
devin-ai-integration[bot]
ed64168ca7 chore(utils): copy helper functions from web/helpers (#6264)
* chore(utils): copy helper functions from web/helpers

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* chore(utils): bump version to 0.24.2

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* chore: bump root package version to 0.24.2

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* fix: remove duplicate function and simplify auth utils

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* fix: improve HTML entity escaping in sanitizeHTML

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* fix: version changes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: sriram@plane.so <sriram@plane.so>
2024-12-26 15:27:40 +05:30
Bavisetti Narayan
f54f3a6091 chore: workspace entity search endpoint (#6272)
* chore: workspace entity search endpoint

* fix: editor entity search endpoint

* chore: restrict guest users

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-12-26 15:00:32 +05:30
Bavisetti Narayan
2d9464e841 chore: create unique constraints for webhook (#6257)
* chore: create unique constraints for webhook

* chore: updated the migration file
2024-12-24 21:00:50 +05:30
Vamsi Krishna
70f72a2b0f [WEB-2699]chore: added issue count for upcoming cycles (#6273)
* chore: added issue count for upcoming cycles

* chore: memoized show issue count
2024-12-24 20:53:06 +05:30
Vamsi Krishna
c0b5e0e766 fix: label creation (#6271) 2024-12-24 20:52:31 +05:30
Anmol Singh Bhatia
fedcdf0c84 [WEB-2879] chore sub issue analytics improvements (#6275)
* chore: epics type added to package

* chore: epic analytics helper function added

* chore: sub issue analytics mutation improvement
2024-12-24 20:52:03 +05:30
Bavisetti Narayan
ff936887d2 chore: quick link migration (#6274)
* chore: added workspace link queryset

* chore: added workspace in sort order
2024-12-24 20:51:15 +05:30
Anmol Singh Bhatia
ea78c2bceb fix: active cycle update payload (#6270) 2024-12-24 14:01:47 +05:30
Vamsi Krishna
ba1a314608 [WEB-1412]fix: split labels in kanban board (#6253)
* fix: split labels in kanban board

* chore: increased labels max render and moved labels to end of properties
chore: refactored label render component
2024-12-23 20:28:17 +05:30
Vamsi Krishna
3a6a8e3a97 fix: create view - layout drop down close (#6267) 2024-12-23 20:27:54 +05:30
Bavisetti Narayan
1735561ffd chore: remove the default intake state (#6252)
* chore: remove the default intake state

* chore: changed the payload
2024-12-23 20:26:48 +05:30
Prateek Shourya
b80a904bbf [WEB-2863] chore: minor improvements and bug fixes (#6222)
* fix: remove deprecated icons from logo picker

* improvement: minor empty states updates
2024-12-23 20:26:07 +05:30
M. Palanikannan
20260f0720 [PE-101] feat: smooth scrolling in editor while dragging and dropping nodes (#6233)
* fix: smoother drag scrolling

* fix: refactoring out common fns

* fix: moved to mouse events instead of drag

* fix: improving the drag preview

* fix: added better selection logic

* fix: drag handle new way almost working

* fix: drag-handle old behaviour with better scrolling

* fix: remove experiments

* fix: better scroll thresholds

* fix: transition to drop cursor added

* fix: drag handling speed

* fix: cleaning up listeners

* fix: common out selection and dragging logic

* fix: scroll threshold logic fixed
2024-12-23 20:04:34 +05:30
Prateek Shourya
6070ed4d36 improvement: enhance activity components and types modularity (#6262) 2024-12-23 20:03:42 +05:30
M. Palanikannan
ac47cc62ee [PE-102] fix: zooming for fullscreen images (#6266)
* fix: added magnification properly and also moving around the zoomed image

* fix: zoom via trackpad pinch

* fix: update imports

* fix: initial magnification is reset
2024-12-23 20:03:10 +05:30
sriram veeraghanta
1059fbbebf fix: build errors while upgrading date-fns 2024-12-23 19:05:52 +05:30
sriram veeraghanta
61478d1b6b fix: build errors in utils package 2024-12-23 18:45:22 +05:30
Aaryan Khandelwal
88737b1072 fix: issue mentions (#6265) 2024-12-23 17:42:39 +05:30
Anmol Singh Bhatia
34d114a4c5 fix: sub-issue collapsible visibility (#6259) 2024-12-23 15:42:03 +05:30
Aaryan Khandelwal
d54c1bae03 [PE-93] regression: mention users highlight color, remove bot users from search list (#6258)
* chore: remove bot users in mention

* fix: user highlight color

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-12-23 15:25:40 +05:30
devin-ai-integration[bot]
9f5def3a6a chore: copy helper functions from admin and space into @plane/utils (#6256)
* chore: copy helper functions from space to @plane/utils

Co-Authored-By: sriram@plane.so <sriram@plane.so>

* refactor: move enums from utils/auth.ts to @plane/constants/auth.ts

Co-Authored-By: sriram@plane.so <sriram@plane.so>

---------

Co-authored-by: Devin AI <158243242+devin-ai-integration[bot]@users.noreply.github.com>
Co-authored-by: sriram@plane.so <sriram@plane.so>
2024-12-23 14:30:13 +05:30
sriram veeraghanta
043f4eaa5e chore: common services package (#6255)
* fix: initial services package setup

* fix: services packages updates

* fix: services changes

* fix: merge conflicts

* chore: file structuring

* fix: import fixes
2024-12-23 01:51:30 +05:30
sriram veeraghanta
1ee0661ac1 fix: missing packages in utils and live 2024-12-22 22:08:04 +05:30
sriram veeraghanta
60f7edcef8 fix: moving space constants to package 2024-12-21 17:17:43 +05:30
sriram veeraghanta
23849789f9 chore: admin imports refactor (#6251)
* chore: admin package refactoring

* chore: build errors

* fix: removing duplicates
2024-12-20 20:44:46 +05:30
Aaryan Khandelwal
33acb9e8ed [PE-93] regression: mentions in space app, entity search (#6250)
* fix: mentions in space app

* fix: user entity filter
2024-12-20 16:55:57 +05:30
Aaryan Khandelwal
d2c0940f04 refactor: accept generic function to search mentions (#6249) 2024-12-20 15:51:36 +05:30
Nikhil
00624eafbd fix: issue serializer to remove deleted labels and assignees (#6241) 2024-12-20 14:44:38 +05:30
Prateek Shourya
e6bf57aa18 [WEB-2885] fix: retain issue description when creating an issue copy (#6243) 2024-12-20 14:17:41 +05:30
Aaryan Khandelwal
3c8c657ee0 fix: cn helper function import error (#6244) 2024-12-20 14:17:22 +05:30
Aaryan Khandelwal
119d343d5f [PE-93] refactor: editor mentions extension (#6178)
* refactor: editor mentions

* fix: build errors

* fix: build errors

* chore: add cycle status to search endpoint response

* fix: build errors

* fix: dynamic mention content in markdown

* chore: update entity search endpoint

* style: user mention popover

* chore: edition specific mention content handler

* chore: show deactivated user for old mentions

* chore: update search entity keys

* refactor: use editor mention hook
2024-12-20 13:41:25 +05:30
Aaryan Khandelwal
c10b875e2a fix: page title fixed height (#6242) 2024-12-20 13:24:19 +05:30
Vamsi Krishna
f10f9cbd41 [WEB-2859]chore: sub issue list optimization (#6232)
* chore: optimized api calls for sub-issue widget

* chore: added api call for on sub issues widget click
2024-12-19 22:45:08 +05:30
guru_sainath
9b71a702c7 [WEB-2884] chore: Update timezone list, add new endpoint, and update timezone dropdowns (#6231)
* dev: updated timezones list

* chore: added rate limiting
2024-12-19 20:15:55 +05:30
Vamsi Krishna
0a320a8540 * fix: avoided unnecessary api call while creating issue draft (#6230)
* fix: fixed import order in module header
2024-12-19 16:26:35 +05:30
Anmol Singh Bhatia
44d8de1169 chore: remove workspace toggle from issue parent modal (#6227) 2024-12-19 13:59:44 +05:30
Prateek Shourya
6214c09170 refactor: move all issue related enums to constants package (#6229) 2024-12-19 13:58:54 +05:30
sriram veeraghanta
51ca353577 Merge branch 'preview' of github.com:makeplane/plane into preview 2024-12-18 14:58:13 +05:30
sriram veeraghanta
881c744eb9 fix: build errors 2024-12-18 14:57:59 +05:30
Bavisetti Narayan
ec41ae61b4 chore: removed the deleted votes and reaction (#6218) 2024-12-18 14:54:03 +05:30
Aaryan Khandelwal
5773c2bde3 chore: gif support for editor (#6219) 2024-12-18 13:17:05 +05:30
M. Palanikannan
e33bae2125 [PE-92] fix: removing readonly collaborative document editor (#6209)
* fix: removing readonly editor

* fix: sync state

* fix: indexeddb sync loader added

* fix: remove node error fixed

* style: page title and checkbox

* chore: removing the syncing logic

* revert: is editable check removed in display message

* fix: editable field optional

* fix: editable removed as optional prop

* fix: extra options import fix

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-12-18 12:58:18 +05:30
Aaryan Khandelwal
580c4b1930 refactor: remove cn helper function from the editor package (#6217) 2024-12-18 12:22:14 +05:30
Vamsi Krishna
ddd4b51b4e fix: labels empty state for drop down (#6216) 2024-12-17 19:14:10 +05:30
Satish Gandham
ede4aad55b - Do not clear temp files that are locked. (#6214)
- Handle edge cases in sync workspace
2024-12-17 17:46:24 +05:30
Akshita Goyal
1a715c98b2 chore: added common component for project activity (#6212)
* chore: added common component for project activity

* fix: added enum

* fix: added enum for initiatives
2024-12-17 17:02:59 +05:30
Vamsi Krishna
8e6d885731 [WEB-2678]feat: added functionality to add labels directly from dropdown (#6211)
* enhancement: added functionality to add features directly from dropdown

* fix: fixed import order

* fix: fixed lint errors
2024-12-17 14:29:56 +05:30
Prateek Shourya
4507802aba refactor: enhance workspace and project wrapper modularity (#6207) 2024-12-16 19:01:37 +05:30
Anmol Singh Bhatia
438cc33046 code refactor and improvement (#6203)
* chore: package code refactoring

* chore: component restructuring and refactor

* chore: comment create improvement
2024-12-16 17:24:50 +05:30
Vamsi Krishna
442b0fd7e5 fix: added project sync after transfer issues (#6200) 2024-12-16 15:15:48 +05:30
Dancia
1119b9dc36 Updated README.md (#6182)
* Updated README.md

* minor fixes

* minor fixes
2024-12-16 14:33:08 +05:30
Manish Gupta
47a76f48b4 fix: separated docker compose environment variables (#5575)
* Separated environment variables for specific app containers.

* updated env

* cleanup

* Separated environment variables for specific app containers.

* updated env

* cleanup

---------

Co-authored-by: Akshat Jain <akshatjain9782@gmail.com>
2024-12-16 13:23:33 +05:30
Manish Gupta
a0f03d07f3 chore: Check github releases for upgrades (#6162)
* modified action and install.sh for selfhost

* updated selfhost readme and install.sh

* fixes

* changes suggested by code-rabbit

* chore: updated powered by (#6160)

* improvement: update fetch map during workspace-level module fetch to reduce redundant API calls (#6159)

* fix: remove unwanted states fetching logic to avoid multiple API calls. (#6158)

* chore remove unnecessary CTA (#6161)

* fix: build branch workflow upload artifacts

* fixes

* changes suggested by code-rabbit

* modified action and install.sh for selfhost

* updated selfhost readme and install.sh

* fix: build branch workflow upload artifacts

* fixes

* changes suggested by code-rabbit

---------

Co-authored-by: guru_sainath <gurusainath007@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-16 13:22:23 +05:30
Nikhil
74b2ec03ff feat: add language support (#6205) 2024-12-15 11:04:03 +05:30
guru_sainath
5908998127 [WEB-2854] chore: trigger issue_description_version task on issue create and update (#6202)
* chore: issue description version task trigger from issue create and update

* chore: add default value in prop
2024-12-13 22:30:29 +05:30
guru_sainath
df6a80e7ae chore: add sync jobs for issue_version and issue_description_version tables (#6199)
* chore: added fields in issue_version and profile tables and created a new sticky table

* chore: removed point in issue version

* chore: add imports in init

* chore: added sync jobs for issue_version and issue_description_version

* chore: removed logs

* chore: updated logging

---------

Co-authored-by: sainath <sainath@sainaths-MacBook-Pro.local>
2024-12-13 17:48:55 +05:30
guru_sainath
6ff258ceca chore: Add fields to issue_version and profile tables, and create new sticky table (#6198)
* chore: added fields in issue_version and profile tables and created a new sticky table

* chore: removed point in issue version

* chore: add imports in init

---------

Co-authored-by: sainath <sainath@sainaths-MacBook-Pro.local>
2024-12-13 17:30:25 +05:30
Saurabhkmr98
a8140a5f08 chore: Add logger package for node server side apps (#6188)
* chore: Add logger as a package

* chore: Add logger package for node server side apps

* remove plane logger import in web

* resolve pr reviews and add client logger with readme update

* fix: transformation and added middleware for logging requests

* chore: update readme

* fix: env configurable max file size

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-13 14:32:56 +05:30
Prateek Shourya
9234f21f26 [WEB-2848] improvement: enhanced components modularity (#6196)
* improvement: enhanced components modularity

* fix: lint errors resolved
2024-12-13 14:26:26 +05:30
Bavisetti Narayan
ab11e83535 [WEB-2843] chore: updated the cycle end date logic (#6194)
* chore: updated the cycle end date logic

* chore: changed the key for timezone
2024-12-13 13:34:07 +05:30
Akshita Goyal
b4112358ac [WEB-2688] chore: added icons and split issue header (#6195)
* chore: added icons and split issue header

* fix: added ee filler component

* fix: component name fixed

* fix: removed dupes

* fix: casing
2024-12-13 13:31:13 +05:30
Aaryan Khandelwal
77239ebcd4 fix: GitHub casing across the platform (#6193) 2024-12-13 02:22:46 +05:30
Prateek Shourya
54f828cbfa refactor: enhance components modularity and introduce new UI components (#6192)
* feat: add navigation dropdown component

* chore: enhance title/description loader and component modularity

* chore: issue store filter update

* chore: added few icons to ui package

* chore: improvements for tabs component

* chore: enhance sidebar modularity

* chore: update issue and router store to add support for additional issue layouts

* chore: enhanced cycle components modularity

* feat: added project grouping header for cycles list

* chore: enhanced project dropdown component by adding multiple selection functionality

* chore: enhanced rich text editor modularity by taking members ids as props for mentions

* chore: added functionality to filter disabled layouts in issue-layout dropdown

* chore: added support to pass project ids as props in project card list

* feat: multi select project modal

* chore: separate out project component for reusability

* chore: command pallete store improvements

* fix: build errors
2024-12-12 21:40:57 +05:30
Bavisetti Narayan
9ad8b43408 chore: handled the cycle date time using project timezone (#6187)
* chore: handled the cycle date time using project timezone

* chore: reverted the frontend commit
2024-12-12 14:11:12 +05:30
732 changed files with 16485 additions and 6635 deletions

README.md
View File

@@ -5,9 +5,7 @@
<img src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_logo_.webp" alt="Plane Logo" width="70">
</a>
</p>
<h3 align="center"><b>Plane</b></h3>
<p align="center"><b>Open-source project management that unlocks customer value</b></p>
<h1 align="center"><b>Plane</b></h1>
<p align="center">
<a href="https://discord.com/invite/A92xrEGCge">
@@ -44,79 +42,85 @@ Meet [Plane](https://dub.sh/plane-website-readme), an open-source project manage
> Plane is evolving every day. Your suggestions, ideas, and reported bugs help us immensely. Do not hesitate to join in the conversation on [Discord](https://discord.com/invite/A92xrEGCge) or raise a GitHub issue. We read everything and respond to most.
## Installation
## 🚀 Installation
The easiest way to get started with Plane is by creating a [Plane Cloud](https://app.plane.so) account.
Getting started with Plane is simple. Choose the setup that works best for you:
If you would like to self-host Plane, please see our [deployment guide](https://docs.plane.so/docker-compose).
- **Plane Cloud**
Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
- **Self-host Plane**
Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
| Installation methods | Docs link |
| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://docs.plane.so/self-hosting/methods/docker-compose) |
| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://docs.plane.so/kubernetes) |
| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://developers.plane.so/self-hosting/methods/docker-compose) |
| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://developers.plane.so/self-hosting/methods/kubernetes) |
`Instance admins` can configure instance settings with [God-mode](https://docs.plane.so/instance-admin).
`Instance admins` can manage and customize settings using [God mode](https://developers.plane.so/self-hosting/govern/instance-admin).
## 🚀 Features
## 🌟 Features
- **Issues**: Quickly create issues and add details using a powerful rich text editor that supports file uploads. Add sub-properties and references to problems for better organization and tracking.
- **Issues**
Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
- **Cycles**:
Keep up your team's momentum with Cycles. Gain insights into your project's progress with burn-down charts and other valuable features.
- **Cycles**
Maintain your team's momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
- **Modules**: Break down your large projects into smaller, more manageable modules. Assign modules between teams to track and plan your project's progress easily.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Views**: Create custom filters to display only the issues that matter to you. Save and share your filters in just a few clicks.
- **Views**
Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
- **Pages**: Plane pages, equipped with AI and a rich text editor, let you jot down your thoughts on the fly. Format your text, upload images, hyperlink, or sync your existing ideas into an actionable item or issue.
- **Pages**
Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
- **Analytics**: Get insights into all your Plane data in real-time. Visualize issue data to spot trends, remove blockers, and progress your work.
- **Analytics**
Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
## 🛠️ Quick start for contributors
> Development system must have docker engine installed and running.
## 🛠️ Local development
Setting up local environment is extremely easy and straight forward. Follow the below step and you will be ready to contribute -
### Pre-requisites
- Ensure Docker Engine is installed and running.
1. Clone the code locally using:
### Development setup
Setting up your local environment is simple and straightforward. Follow these steps to get started:
1. Clone the repository:
```
git clone https://github.com/makeplane/plane.git
```
2. Switch to the code folder:
2. Navigate to the project folder:
```
cd plane
```
3. Create your feature or fix branch you plan to work on using:
3. Create a new branch for your feature or fix:
```
git checkout -b <feature-branch-name>
```
4. Open terminal and run:
4. Run the setup script in the terminal:
```
./setup.sh
```
5. Open the code on VSCode or similar equivalent IDE.
6. Review the `.env` files available in various folders.
Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system.
7. Run the docker command to initiate services:
5. Open the project in an IDE such as VS Code.
6. Review the `.env` files in the relevant folders. Refer to [Environment Setup](./ENV_SETUP.md) for details on the environment variables used.
7. Start the services using Docker:
```
docker compose -f docker-compose-local.yml up -d
```
You are ready to make changes to the code. Do not forget to refresh the browser (in case it does not auto-reload).
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
That's it!
## ❤️ Community
The Plane community can be found on [GitHub Discussions](https://github.com/orgs/makeplane/discussions), and our [Discord server](https://discord.com/invite/A92xrEGCge). Our [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community channels.
Ask questions, report bugs, join discussions, voice ideas, make feature requests, or share your projects.
### Repo Activity
![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2523c6ed2f77c082b7908c33e2ab208981d76c39.svg "Repobeats analytics image")
## Built with
[![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)<br/>
[![Django](https://img.shields.io/badge/Django-092E20?style=for-the-badge&logo=django&logoColor=green)](https://www.djangoproject.com/)<br/>
[![Node JS](https://img.shields.io/badge/node.js-339933?style=for-the-badge&logo=Node.js&logoColor=white)](https://nodejs.org/en)
## 📸 Screenshots
@@ -165,7 +169,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
</a>
</p>
</p>
<p>
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://ik.imagekit.io/w2okwbtu2/Drive_LlfeY4xn3.png?updatedAt=1709298837917"
@@ -176,23 +180,42 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
</p>
</p>
## ⛓️ Security
## 📝 Documentation
Explore Plane's [product documentation](https://docs.plane.so/) and [developer documentation](https://developers.plane.so/) to learn about features, setup, and usage.
If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports.
## ❤️ Community
Email squawk@plane.so to disclose any security vulnerabilities.
Join the Plane community on [GitHub Discussions](https://github.com/orgs/makeplane/discussions) and our [Discord server](https://discord.com/invite/A92xrEGCge). We follow a [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) in all our community channels.
## ❤️ Contribute
Feel free to ask questions, report bugs, participate in discussions, share ideas, request features, or showcase your projects. We'd love to hear from you!
There are many ways to contribute to Plane, including:
## 🛡️ Security
- Submitting [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) and [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+) for various components.
- Reviewing [the documentation](https://docs.plane.so/) and submitting [pull requests](https://github.com/makeplane/plane), from fixing typos to adding new features.
- Speaking or writing about Plane or any other ecosystem integration and [letting us know](https://discord.com/invite/A92xrEGCge)!
- Upvoting [popular feature requests](https://github.com/makeplane/plane/issues) to show your support.
If you discover a security vulnerability in Plane, please report it responsibly instead of opening a public issue. We take all legitimate reports seriously and will investigate them promptly. See [Security policy](https://github.com/makeplane/plane/blob/master/SECURITY.md) for more info.
To disclose any security issues, please email us at security@plane.so.
## 🤝 Contributing
There are many ways you can contribute to Plane:
- Report [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) or submit [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+).
- Review the [documentation](https://docs.plane.so/) and submit [pull requests](https://github.com/makeplane/docs) to improve it—whether it's fixing typos or adding new content.
- Talk or write about Plane or any other ecosystem integration and [let us know](https://discord.com/invite/A92xrEGCge)!
- Show your support by upvoting [popular feature requests](https://github.com/makeplane/plane/issues).
Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CONTRIBUTING.md) for details on the process for submitting pull requests to us.
### Repo activity
![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2523c6ed2f77c082b7908c33e2ab208981d76c39.svg "Repobeats analytics image")
### We couldn't have done this without you.
<a href="https://github.com/makeplane/plane/graphs/contributors">
<img src="https://contrib.rocks/image?repo=makeplane/plane" />
</a>
## License
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).

View File

@@ -4,10 +4,11 @@ import { FC, useState } from "react";
import isEmpty from "lodash/isEmpty";
import Link from "next/link";
import { useForm } from "react-hook-form";
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import { IFormattedInstanceConfiguration, TInstanceGithubAuthenticationConfigurationKeys } from "@plane/types";
// ui
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
import { cn } from "@plane/utils";
// components
import {
CodeBlock,
@@ -17,8 +18,6 @@ import {
TControllerInputFormField,
TCopyField,
} from "@/components/common";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
// hooks
import { useInstance } from "@/hooks/store";
@@ -103,8 +102,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
url: originURL,
description: (
<>
We will auto-generate this. Paste this into the{" "}
<CodeBlock darkerShade>Authorized origin URL</CodeBlock> field{" "}
We will auto-generate this. Paste this into the <CodeBlock darkerShade>Authorized origin URL</CodeBlock> field{" "}
<a
tabIndex={-1}
href="https://github.com/settings/applications/new"
@@ -123,8 +121,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
url: `${originURL}/auth/github/callback/`,
description: (
<>
We will auto-generate this. Paste this into your{" "}
<CodeBlock darkerShade>Authorized Callback URI</CodeBlock> field{" "}
We will auto-generate this. Paste this into your <CodeBlock darkerShade>Authorized Callback URI</CodeBlock>{" "}
field{" "}
<a
tabIndex={-1}
href="https://github.com/settings/applications/new"

View File

@@ -5,12 +5,12 @@ import { observer } from "mobx-react";
import Image from "next/image";
import { useTheme } from "next-themes";
import useSWR from "swr";
// plane internal packages
import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
import { resolveGeneralTheme } from "@plane/utils";
// components
import { AuthenticationMethodCard } from "@/components/authentication";
import { PageHeader } from "@/components/common";
// helpers
import { resolveGeneralTheme } from "@/helpers/common.helper";
// hooks
import { useInstance } from "@/hooks/store";
// icons
@@ -44,7 +44,7 @@ const InstanceGithubAuthenticationPage = observer(() => {
loading: "Saving Configuration...",
success: {
title: "Configuration saved",
message: () => `Github authentication is now ${value ? "active" : "disabled"}.`,
message: () => `GitHub authentication is now ${value ? "active" : "disabled"}.`,
},
error: {
title: "Error",
@@ -67,8 +67,8 @@ const InstanceGithubAuthenticationPage = observer(() => {
<div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
<div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<AuthenticationMethodCard
name="Github"
description="Allow members to login or sign up to plane with their Github accounts."
name="GitHub"
description="Allow members to login or sign up to plane with their GitHub accounts."
icon={
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}

View File

@@ -2,10 +2,11 @@ import { FC, useState } from "react";
import isEmpty from "lodash/isEmpty";
import Link from "next/link";
import { useForm } from "react-hook-form";
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import { IFormattedInstanceConfiguration, TInstanceGitlabAuthenticationConfigurationKeys } from "@plane/types";
// ui
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
import { cn } from "@plane/utils";
// components
import {
CodeBlock,
@@ -15,8 +16,6 @@ import {
TControllerInputFormField,
TCopyField,
} from "@/components/common";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
// hooks
import { useInstance } from "@/hooks/store";
@@ -117,8 +116,7 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
url: `${originURL}/auth/gitlab/callback/`,
description: (
<>
We will auto-generate this. Paste this into the{" "}
<CodeBlock darkerShade>Redirect URI</CodeBlock> field of your{" "}
We will auto-generate this. Paste this into the <CodeBlock darkerShade>Redirect URI</CodeBlock> field of your{" "}
<a
tabIndex={-1}
href="https://docs.gitlab.com/ee/integration/oauth_provider.html"

View File

@@ -3,10 +3,11 @@ import { FC, useState } from "react";
import isEmpty from "lodash/isEmpty";
import Link from "next/link";
import { useForm } from "react-hook-form";
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import { IFormattedInstanceConfiguration, TInstanceGoogleAuthenticationConfigurationKeys } from "@plane/types";
// ui
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
import { cn } from "@plane/utils";
// components
import {
CodeBlock,
@@ -16,8 +17,6 @@ import {
TControllerInputFormField,
TCopyField,
} from "@/components/common";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
// hooks
import { useInstance } from "@/hooks/store";

View File

@@ -3,10 +3,10 @@
import { useState } from "react";
import { observer } from "mobx-react";
import useSWR from "swr";
// plane internal packages
import { TInstanceConfigurationKeys } from "@plane/types";
import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
// hooks
import { useInstance } from "@/hooks/store";
// plane admin components

View File

@@ -4,11 +4,11 @@ import { ReactNode } from "react";
import { ThemeProvider, useTheme } from "next-themes";
import { SWRConfig } from "swr";
// ui
import { ADMIN_BASE_PATH, DEFAULT_SWR_CONFIG } from "@plane/constants";
import { Toast } from "@plane/ui";
import { resolveGeneralTheme } from "@plane/utils";
// constants
import { SWR_CONFIG } from "@/constants/swr-config";
// helpers
import { ASSET_PREFIX, resolveGeneralTheme } from "@/helpers/common.helper";
// lib
import { InstanceProvider } from "@/lib/instance-provider";
import { StoreProvider } from "@/lib/store-provider";
@@ -22,6 +22,7 @@ const ToastWithTheme = () => {
};
export default function RootLayout({ children }: { children: ReactNode }) {
const ASSET_PREFIX = ADMIN_BASE_PATH;
return (
<html lang="en">
<head>
@@ -34,7 +35,7 @@ export default function RootLayout({ children }: { children: ReactNode }) {
<body className={`antialiased`}>
<ThemeProvider themes={["light", "dark"]} defaultTheme="system" enableSystem>
<ToastWithTheme />
<SWRConfig value={SWR_CONFIG}>
<SWRConfig value={DEFAULT_SWR_CONFIG}>
<StoreProvider>
<InstanceProvider>
<UserProvider>{children}</UserProvider>

View File

@@ -3,13 +3,11 @@ import Link from "next/link";
import { useRouter } from "next/navigation";
import { Controller, useForm } from "react-hook-form";
// constants
import { ORGANIZATION_SIZE, RESTRICTED_URLS } from "@plane/constants";
import { WEB_BASE_URL, ORGANIZATION_SIZE, RESTRICTED_URLS } from "@plane/constants";
// types
import { IWorkspace } from "@plane/types";
// components
import { Button, CustomSelect, getButtonStyling, Input, setToast, TOAST_TYPE } from "@plane/ui";
// helpers
import { WEB_BASE_URL } from "@/helpers/common.helper";
// hooks
import { useWorkspace } from "@/hooks/store";
// services

View File

@@ -7,12 +7,10 @@ import useSWR from "swr";
import { Loader as LoaderIcon } from "lucide-react";
// types
import { TInstanceConfigurationKeys } from "@plane/types";
// ui
import { Button, getButtonStyling, Loader, setPromiseToast, ToggleSwitch } from "@plane/ui";
import { cn } from "@plane/utils";
// components
import { WorkspaceListItem } from "@/components/workspace";
// helpers
import { cn } from "@/helpers/common.helper";
// hooks
import { useInstance, useWorkspace } from "@/hooks/store";

View File

@@ -10,7 +10,7 @@ import {
// components
import { AuthenticationMethodCard } from "@/components/authentication";
// helpers
import { getBaseAuthenticationModes } from "@/helpers/authentication.helper";
import { getBaseAuthenticationModes } from "@/lib/auth-helpers";
// plane admin components
import { UpgradeButton } from "@/plane-admin/components/common";
// images

View File

@@ -3,10 +3,9 @@
import React from "react";
// icons
import { SquareArrowOutUpRight } from "lucide-react";
// ui
// plane internal packages
import { getButtonStyling } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
export const UpgradeButton: React.FC = () => (
<a href="https://plane.so/pricing?mode=self-hosted" target="_blank" className={cn(getButtonStyling("primary", "sm"))}>

View File

@@ -5,13 +5,14 @@ import { observer } from "mobx-react";
import Link from "next/link";
import { ExternalLink, FileText, HelpCircle, MoveLeft } from "lucide-react";
import { Transition } from "@headlessui/react";
// ui
// plane internal packages
import { WEB_BASE_URL } from "@plane/constants";
import { DiscordIcon, GithubIcon, Tooltip } from "@plane/ui";
// helpers
import { WEB_BASE_URL, cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
// hooks
import { useTheme } from "@/hooks/store";
// assets
// eslint-disable-next-line import/order
import packageJson from "package.json";
const helpOptions = [

View File

@@ -5,11 +5,10 @@ import { observer } from "mobx-react";
import { useTheme as useNextTheme } from "next-themes";
import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane ui
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import { Avatar } from "@plane/ui";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
import { getFileURL, cn } from "@plane/utils";
// hooks
import { useTheme, useUser } from "@/hooks/store";
// services

View File

@@ -4,11 +4,11 @@ import { observer } from "mobx-react";
import Link from "next/link";
import { usePathname } from "next/navigation";
import { Image, BrainCog, Cog, Lock, Mail } from "lucide-react";
// plane internal packages
import { Tooltip, WorkspaceIcon } from "@plane/ui";
import { cn } from "@plane/utils";
// hooks
import { cn } from "@/helpers/common.helper";
import { useTheme } from "@/hooks/store";
// helpers
const INSTANCE_ADMIN_LINKS = [
{

View File

@@ -30,7 +30,7 @@ export const InstanceHeader: FC = observer(() => {
case "google":
return "Google";
case "github":
return "Github";
return "GitHub";
case "gitlab":
return "GitLab";
case "workspace":

View File

@@ -1,7 +1,7 @@
import { FC } from "react";
import { Info, X } from "lucide-react";
// helpers
import { TAuthErrorInfo } from "@/helpers/authentication.helper";
// plane constants
import { TAuthErrorInfo } from "@plane/constants";
type TAuthBanner = {
bannerData: TAuthErrorInfo | undefined;

View File

@@ -2,7 +2,7 @@
import { FC } from "react";
// helpers
import { cn } from "helpers/common.helper";
import { cn } from "@plane/utils";
type Props = {
name: string;

View File

@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
import Link from "next/link";
// icons
import { Settings2 } from "lucide-react";
// types
// plane internal packages
import { TInstanceAuthenticationMethodKeys } from "@plane/types";
// ui
import { ToggleSwitch, getButtonStyling } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
// hooks
import { useInstance } from "@/hooks/store";

View File

@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
import Link from "next/link";
// icons
import { Settings2 } from "lucide-react";
// types
// plane internal packages
import { TInstanceAuthenticationMethodKeys } from "@plane/types";
// ui
import { ToggleSwitch, getButtonStyling } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
// hooks
import { useInstance } from "@/hooks/store";

View File

@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
import Link from "next/link";
// icons
import { Settings2 } from "lucide-react";
// types
// plane internal packages
import { TInstanceAuthenticationMethodKeys } from "@plane/types";
// ui
import { ToggleSwitch, getButtonStyling } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
// hooks
import { useInstance } from "@/hooks/store";

View File

@@ -1,4 +1,4 @@
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
type TProps = {
children: React.ReactNode;

View File

@@ -4,10 +4,9 @@ import React, { useState } from "react";
import { Controller, Control } from "react-hook-form";
// icons
import { Eye, EyeOff } from "lucide-react";
// ui
// plane internal packages
import { Input } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
import { cn } from "@plane/utils";
type Props = {
control: Control<any>;
@@ -37,9 +36,7 @@ export const ControllerInput: React.FC<Props> = (props) => {
return (
<div className="flex flex-col gap-1">
<h4 className="text-sm text-custom-text-300">
{label}
</h4>
<h4 className="text-sm text-custom-text-300">{label}</h4>
<div className="relative">
<Controller
control={control}

View File

@@ -1,14 +1,9 @@
"use client";
import { FC, useMemo } from "react";
// import { CircleCheck } from "lucide-react";
// helpers
import { cn } from "@/helpers/common.helper";
import {
E_PASSWORD_STRENGTH,
// PASSWORD_CRITERIA,
getPasswordStrength,
} from "@/helpers/password.helper";
// plane internal packages
import { E_PASSWORD_STRENGTH } from "@plane/constants";
import { cn, getPasswordStrength } from "@plane/utils";
type TPasswordStrengthMeter = {
password: string;

View File

@@ -4,13 +4,12 @@ import { FC, useEffect, useMemo, useState } from "react";
import { useSearchParams } from "next/navigation";
// icons
import { Eye, EyeOff } from "lucide-react";
// ui
// plane internal packages
import { API_BASE_URL, E_PASSWORD_STRENGTH } from "@plane/constants";
import { Button, Checkbox, Input, Spinner } from "@plane/ui";
import { getPasswordStrength } from "@plane/utils";
// components
import { Banner, PasswordStrengthMeter } from "@/components/common";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { E_PASSWORD_STRENGTH, getPasswordStrength } from "@/helpers/password.helper";
// services
import { AuthService } from "@/services/auth.service";

View File

@@ -2,24 +2,18 @@
import { FC, useEffect, useMemo, useState } from "react";
import { useSearchParams } from "next/navigation";
// services
import { Eye, EyeOff } from "lucide-react";
// plane internal packages
import { API_BASE_URL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
import { Button, Input, Spinner } from "@plane/ui";
// components
import { Banner } from "@/components/common";
// helpers
import {
authErrorHandler,
EAuthenticationErrorCodes,
EErrorAlertType,
TAuthErrorInfo,
} from "@/helpers/authentication.helper";
import { API_BASE_URL } from "@/helpers/common.helper";
import { authErrorHandler } from "@/lib/auth-helpers";
// services
import { AuthService } from "@/services/auth.service";
// local components
import { AuthBanner } from "../authentication";
// ui
// icons
// service initialization
const authService = new AuthService();
@@ -102,7 +96,7 @@ export const InstanceSignInForm: FC = (props) => {
useEffect(() => {
if (errorCode) {
const errorDetail = authErrorHandler(errorCode?.toString() as EAuthenticationErrorCodes);
const errorDetail = authErrorHandler(errorCode?.toString() as EAdminAuthErrorCodes);
if (errorDetail) {
setErrorInfo(errorDetail);
}

View File

@@ -1,13 +1,13 @@
"use client";
import React from "react";
import { resolveGeneralTheme } from "helpers/common.helper";
import { observer } from "mobx-react";
import Image from "next/image";
import Link from "next/link";
import { useTheme as nextUseTheme } from "next-themes";
// ui
import { Button, getButtonStyling } from "@plane/ui";
import { resolveGeneralTheme } from "@plane/utils";
// hooks
import { useTheme } from "@/hooks/store";
// icons

View File

@@ -1,9 +1,9 @@
import { observer } from "mobx-react";
import { ExternalLink } from "lucide-react";
// helpers
// plane internal packages
import { WEB_BASE_URL } from "@plane/constants";
import { Tooltip } from "@plane/ui";
import { WEB_BASE_URL } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
import { getFileURL } from "@plane/utils";
// hooks
import { useWorkspace } from "@/hooks/store";

View File

@@ -1,8 +0,0 @@
export const SITE_NAME = "Plane | Simple, extensible, open-source project management tool.";
export const SITE_TITLE = "Plane | Simple, extensible, open-source project management tool.";
export const SITE_DESCRIPTION =
"Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.";
export const SITE_KEYWORDS =
"software development, plan, ship, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
export const SITE_URL = "https://app.plane.so/";
export const TWITTER_USER_NAME = "Plane | Simple, extensible, open-source project management tool.";

View File

@@ -0,0 +1,164 @@
import { ReactNode } from "react";
import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// plane packages
import { SUPPORT_EMAIL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
import { resolveGeneralTheme } from "@plane/utils";
// components
import {
EmailCodesConfiguration,
GithubConfiguration,
GitlabConfiguration,
GoogleConfiguration,
PasswordLoginConfiguration,
} from "@/components/authentication";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";
export enum EErrorAlertType {
BANNER_ALERT = "BANNER_ALERT",
INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
INLINE_EMAIL = "INLINE_EMAIL",
INLINE_PASSWORD = "INLINE_PASSWORD",
INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
}
const errorCodeMessages: {
[key in EAdminAuthErrorCodes]: { title: string; message: (email?: string | undefined) => ReactNode };
} = {
// admin
[EAdminAuthErrorCodes.ADMIN_ALREADY_EXIST]: {
title: `Admin already exists`,
message: () => `Admin already exists. Please try again.`,
},
[EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME]: {
title: `Email, password and first name required`,
message: () => `Email, password and first name required. Please try again.`,
},
[EAdminAuthErrorCodes.INVALID_ADMIN_EMAIL]: {
title: `Invalid admin email`,
message: () => `Invalid admin email. Please try again.`,
},
[EAdminAuthErrorCodes.INVALID_ADMIN_PASSWORD]: {
title: `Invalid admin password`,
message: () => `Invalid admin password. Please try again.`,
},
[EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD]: {
title: `Email and password required`,
message: () => `Email and password required. Please try again.`,
},
[EAdminAuthErrorCodes.ADMIN_AUTHENTICATION_FAILED]: {
title: `Authentication failed`,
message: () => `Authentication failed. Please try again.`,
},
[EAdminAuthErrorCodes.ADMIN_USER_ALREADY_EXIST]: {
title: `Admin user already exists`,
message: () => (
<div>
Admin user already exists.&nbsp;
<Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
Sign In
</Link>
&nbsp;now.
</div>
),
},
[EAdminAuthErrorCodes.ADMIN_USER_DOES_NOT_EXIST]: {
title: `Admin user does not exist`,
message: () => (
<div>
Admin user does not exist.&nbsp;
<Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
Sign In
</Link>
&nbsp;now.
</div>
),
},
[EAdminAuthErrorCodes.ADMIN_USER_DEACTIVATED]: {
title: `User account deactivated`,
message: () => `User account deactivated. Please contact ${!!SUPPORT_EMAIL ? SUPPORT_EMAIL : "administrator"}.`,
},
};
export const authErrorHandler = (
errorCode: EAdminAuthErrorCodes,
email?: string | undefined
): TAuthErrorInfo | undefined => {
const bannerAlertErrorCodes = [
EAdminAuthErrorCodes.ADMIN_ALREADY_EXIST,
EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,
EAdminAuthErrorCodes.INVALID_ADMIN_EMAIL,
EAdminAuthErrorCodes.INVALID_ADMIN_PASSWORD,
EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD,
EAdminAuthErrorCodes.ADMIN_AUTHENTICATION_FAILED,
EAdminAuthErrorCodes.ADMIN_USER_ALREADY_EXIST,
EAdminAuthErrorCodes.ADMIN_USER_DOES_NOT_EXIST,
EAdminAuthErrorCodes.ADMIN_USER_DEACTIVATED,
];
if (bannerAlertErrorCodes.includes(errorCode))
return {
type: EErrorAlertType.BANNER_ALERT,
code: errorCode,
title: errorCodeMessages[errorCode]?.title || "Error",
message: errorCodeMessages[errorCode]?.message(email) || "Something went wrong. Please try again.",
};
return undefined;
};
export const getBaseAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
disabled,
updateConfig,
resolvedTheme,
}) => [
{
key: "unique-codes",
name: "Unique codes",
description:
"Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "passwords-login",
name: "Passwords",
description: "Allow members to create accounts with passwords and use it with their email addresses to sign in.",
icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "google",
name: "Google",
description: "Allow members to log in or sign up for Plane with their Google accounts.",
icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "github",
name: "GitHub",
description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
icon: (
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
height={20}
width={20}
alt="GitHub Logo"
/>
),
config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "gitlab",
name: "GitLab",
description: "Allow members to log in or sign up to plane with their GitLab accounts.",
icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
];

View File

@@ -1,5 +1,4 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { API_BASE_URL } from "@plane/constants";
// services
import { APIService } from "@/services/api.service";

View File

@@ -1,4 +1,5 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type {
IFormattedInstanceConfiguration,
IInstance,
@@ -7,7 +8,6 @@ import type {
IInstanceInfo,
} from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { APIService } from "@/services/api.service";
export class InstanceService extends APIService {

View File

@@ -1,7 +1,6 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type { IUser } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

View File

@@ -1,7 +1,6 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type { IWorkspace, TWorkspacePaginationInfo } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

View File

@@ -1,5 +1,7 @@
import set from "lodash/set";
import { observable, action, computed, makeObservable, runInAction } from "mobx";
// plane internal packages
import { EInstanceStatus, TInstanceStatus } from "@plane/constants";
import {
IInstance,
IInstanceAdmin,
@@ -8,8 +10,6 @@ import {
IInstanceInfo,
IInstanceConfig,
} from "@plane/types";
// helpers
import { EInstanceStatus, TInstanceStatus } from "@/helpers/instance.helper";
// services
import { InstanceService } from "@/services/instance.service";
// root store

View File

@@ -1,7 +1,7 @@
import { action, observable, runInAction, makeObservable } from "mobx";
// plane internal packages
import { EUserStatus, TUserStatus } from "@plane/constants";
import { IUser } from "@plane/types";
// helpers
import { EUserStatus, TUserStatus } from "@/helpers/user.helper";
// services
import { AuthService } from "@/services/auth.service";
import { UserService } from "@/services/user.service";

View File

@@ -1,203 +0,0 @@
import { ReactNode } from "react";
import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// types
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
// components
import {
EmailCodesConfiguration,
GithubConfiguration,
GitlabConfiguration,
GoogleConfiguration,
PasswordLoginConfiguration,
} from "@/components/authentication";
// helpers
import { SUPPORT_EMAIL, resolveGeneralTheme } from "@/helpers/common.helper";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";
export enum EPageTypes {
PUBLIC = "PUBLIC",
NON_AUTHENTICATED = "NON_AUTHENTICATED",
SET_PASSWORD = "SET_PASSWORD",
ONBOARDING = "ONBOARDING",
AUTHENTICATED = "AUTHENTICATED",
}
export enum EAuthModes {
SIGN_IN = "SIGN_IN",
SIGN_UP = "SIGN_UP",
}
export enum EAuthSteps {
EMAIL = "EMAIL",
PASSWORD = "PASSWORD",
UNIQUE_CODE = "UNIQUE_CODE",
}
export enum EErrorAlertType {
BANNER_ALERT = "BANNER_ALERT",
INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
INLINE_EMAIL = "INLINE_EMAIL",
INLINE_PASSWORD = "INLINE_PASSWORD",
INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
}
export enum EAuthenticationErrorCodes {
// Admin
ADMIN_ALREADY_EXIST = "5150",
REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME = "5155",
INVALID_ADMIN_EMAIL = "5160",
INVALID_ADMIN_PASSWORD = "5165",
REQUIRED_ADMIN_EMAIL_PASSWORD = "5170",
ADMIN_AUTHENTICATION_FAILED = "5175",
ADMIN_USER_ALREADY_EXIST = "5180",
ADMIN_USER_DOES_NOT_EXIST = "5185",
ADMIN_USER_DEACTIVATED = "5190",
}
export type TAuthErrorInfo = {
type: EErrorAlertType;
code: EAuthenticationErrorCodes;
title: string;
message: ReactNode;
};
const errorCodeMessages: {
[key in EAuthenticationErrorCodes]: { title: string; message: (email?: string | undefined) => ReactNode };
} = {
// admin
[EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST]: {
title: `Admin already exists`,
message: () => `Admin already exists. Please try again.`,
},
[EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME]: {
title: `Email, password and first name required`,
message: () => `Email, password and first name required. Please try again.`,
},
[EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL]: {
title: `Invalid admin email`,
message: () => `Invalid admin email. Please try again.`,
},
[EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD]: {
title: `Invalid admin password`,
message: () => `Invalid admin password. Please try again.`,
},
[EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD]: {
title: `Email and password required`,
message: () => `Email and password required. Please try again.`,
},
[EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED]: {
title: `Authentication failed`,
message: () => `Authentication failed. Please try again.`,
},
[EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST]: {
title: `Admin user already exists`,
message: () => (
<div>
Admin user already exists.&nbsp;
<Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
Sign In
</Link>
&nbsp;now.
</div>
),
},
[EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST]: {
title: `Admin user does not exist`,
message: () => (
<div>
Admin user does not exist.&nbsp;
<Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
Sign In
</Link>
&nbsp;now.
</div>
),
},
[EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED]: {
title: `User account deactivated`,
message: () => `User account deactivated. Please contact ${!!SUPPORT_EMAIL ? SUPPORT_EMAIL : "administrator"}.`,
},
};
export const authErrorHandler = (
errorCode: EAuthenticationErrorCodes,
email?: string | undefined
): TAuthErrorInfo | undefined => {
const bannerAlertErrorCodes = [
EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST,
EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,
EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL,
EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD,
EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD,
EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED,
EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST,
EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST,
EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED,
];
if (bannerAlertErrorCodes.includes(errorCode))
return {
type: EErrorAlertType.BANNER_ALERT,
code: errorCode,
title: errorCodeMessages[errorCode]?.title || "Error",
message: errorCodeMessages[errorCode]?.message(email) || "Something went wrong. Please try again.",
};
return undefined;
};
export const getBaseAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
disabled,
updateConfig,
resolvedTheme,
}) => [
{
key: "unique-codes",
name: "Unique codes",
description:
"Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "passwords-login",
name: "Passwords",
description: "Allow members to create accounts with passwords and use them with their email addresses to sign in.",
icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "google",
name: "Google",
description: "Allow members to log in or sign up for Plane with their Google accounts.",
icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "github",
name: "GitHub",
description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
icon: (
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
height={20}
width={20}
alt="GitHub Logo"
/>
),
config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "gitlab",
name: "GitLab",
description: "Allow members to log in or sign up for Plane with their GitLab accounts.",
icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
];

View File

@@ -1,20 +0,0 @@
import { clsx, type ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";
export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL || "";
export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "";
export const SPACE_BASE_URL = process.env.NEXT_PUBLIC_SPACE_BASE_URL || "";
export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "";
export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL || "";
export const SUPPORT_EMAIL = process.env.NEXT_PUBLIC_SUPPORT_EMAIL || "";
export const ASSET_PREFIX = ADMIN_BASE_PATH;
export const cn = (...inputs: ClassValue[]) => twMerge(clsx(inputs));
export const resolveGeneralTheme = (resolvedTheme: string | undefined) =>
resolvedTheme?.includes("light") ? "light" : resolvedTheme?.includes("dark") ? "dark" : "system";

View File

@@ -1,14 +0,0 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
/**
* @description combine the file path with the base URL
* @param {string} path
* @returns {string} final URL with the base URL
*/
export const getFileURL = (path: string): string | undefined => {
if (!path) return undefined;
const isValidURL = path.startsWith("http");
if (isValidURL) return path;
return `${API_BASE_URL}${path}`;
};

View File

@@ -1,2 +0,0 @@
export * from "./instance.helper";
export * from "./user.helper";

View File

@@ -1,67 +0,0 @@
import zxcvbn from "zxcvbn";
export enum E_PASSWORD_STRENGTH {
EMPTY = "empty",
LENGTH_NOT_VALID = "length_not_valid",
STRENGTH_NOT_VALID = "strength_not_valid",
STRENGTH_VALID = "strength_valid",
}
const PASSWORD_MIN_LENGTH = 8;
// const PASSWORD_NUMBER_REGEX = /\d/;
// const PASSWORD_CHAR_CAPS_REGEX = /[A-Z]/;
// const PASSWORD_SPECIAL_CHAR_REGEX = /[`!@#$%^&*()_\-+=\[\]{};':"\\|,.<>\/?~ ]/;
export const PASSWORD_CRITERIA = [
{
key: "min_8_char",
label: "Min 8 characters",
isCriteriaValid: (password: string) => password.length >= PASSWORD_MIN_LENGTH,
},
// {
// key: "min_1_upper_case",
// label: "Min 1 upper-case letter",
// isCriteriaValid: (password: string) => PASSWORD_CHAR_CAPS_REGEX.test(password),
// },
// {
// key: "min_1_number",
// label: "Min 1 number",
// isCriteriaValid: (password: string) => PASSWORD_NUMBER_REGEX.test(password),
// },
// {
// key: "min_1_special_char",
// label: "Min 1 special character",
// isCriteriaValid: (password: string) => PASSWORD_SPECIAL_CHAR_REGEX.test(password),
// },
];
export const getPasswordStrength = (password: string): E_PASSWORD_STRENGTH => {
let passwordStrength: E_PASSWORD_STRENGTH = E_PASSWORD_STRENGTH.EMPTY;
if (!password || password === "" || password.length <= 0) {
return passwordStrength;
}
if (password.length >= PASSWORD_MIN_LENGTH) {
passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_NOT_VALID;
} else {
passwordStrength = E_PASSWORD_STRENGTH.LENGTH_NOT_VALID;
return passwordStrength;
}
const passwordCriteriaValidation = PASSWORD_CRITERIA.map((criteria) => criteria.isCriteriaValid(password)).every(
(criterion) => criterion
);
const passwordStrengthScore = zxcvbn(password).score;
if (passwordCriteriaValidation === false || passwordStrengthScore <= 2) {
passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_NOT_VALID;
return passwordStrength;
}
if (passwordCriteriaValidation === true && passwordStrengthScore >= 3) {
passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_VALID;
}
return passwordStrength;
};

View File

@@ -1,21 +0,0 @@
/**
* @description
* This function test whether a URL is valid or not.
*
* It accepts URLs with or without the protocol.
* @param {string} url
* @returns {boolean}
* @example
* checkURLValidity("https://example.com") => true
* checkURLValidity("example.com") => true
* checkURLValidity("example") => false
*/
export const checkURLValidity = (url: string): boolean => {
if (!url) return false;
// regex to support complex query parameters and fragments
const urlPattern =
/^(https?:\/\/)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))(:\d+)?(\/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$/i;
return urlPattern.test(url);
};

View File

@@ -5,7 +5,6 @@
"baseUrl": ".",
"paths": {
"@/*": ["core/*"],
"@/helpers/*": ["helpers/*"],
"@/public/*": ["public/*"],
"@/plane-admin/*": ["ce/*"]
}

View File

@@ -4,7 +4,7 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import Cycle, CycleIssue
from plane.utils.timezone_converter import convert_to_utc
class CycleSerializer(BaseSerializer):
total_issues = serializers.IntegerField(read_only=True)
@@ -24,6 +24,18 @@ class CycleSerializer(BaseSerializer):
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
):
project_id = self.initial_data.get("project_id") or self.instance.project_id
data["start_date"] = convert_to_utc(
str(data.get("start_date").date()), project_id, is_start_date=True
)
data["end_date"] = convert_to_utc(
str(data.get("end_date", None).date()), project_id
)
return data
class Meta:
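
The validator now routes cycle dates through convert_to_utc before they are stored. The helper's implementation is not part of this diff; the snippet below is a minimal, hypothetical sketch of what such a conversion could look like, assuming it anchors start dates to the beginning and end dates to the end of the day in the project's configured timezone and returns an aware UTC datetime.

# Hypothetical sketch only — the real convert_to_utc lives in
# plane/utils/timezone_converter.py and may differ in its details.
import pytz
from datetime import datetime, time

from plane.db.models import Project


def convert_to_utc(date_str, project_id, is_start_date=False):
    # Resolve the project's configured timezone (e.g. "Asia/Kolkata").
    project = Project.objects.get(id=project_id)
    local_tz = pytz.timezone(project.timezone)

    # Anchor start dates to 00:00 and end dates to 23:59:59 local time.
    parsed = datetime.strptime(date_str, "%Y-%m-%d").date()
    local_dt = datetime.combine(parsed, time.min if is_start_date else time.max)

    # Attach the project timezone, then convert to UTC for storage.
    return local_tz.localize(local_dt).astimezone(pytz.utc)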

View File

@@ -237,17 +237,37 @@ class IssueSerializer(BaseSerializer):
from .user import UserLiteSerializer
data["assignees"] = UserLiteSerializer(
instance.assignees.all(), many=True
User.objects.filter(
pk__in=IssueAssignee.objects.filter(issue=instance).values_list(
"assignee_id", flat=True
)
),
many=True,
).data
else:
data["assignees"] = [
str(assignee.id) for assignee in instance.assignees.all()
str(assignee)
for assignee in IssueAssignee.objects.filter(
issue=instance
).values_list("assignee_id", flat=True)
]
if "labels" in self.fields:
if "labels" in self.expand:
data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
data["labels"] = LabelSerializer(
Label.objects.filter(
pk__in=IssueLabel.objects.filter(issue=instance).values_list(
"label_id", flat=True
)
),
many=True,
).data
else:
data["labels"] = [str(label.id) for label in instance.labels.all()]
data["labels"] = [
str(label)
for label in IssueLabel.objects.filter(issue=instance).values_list(
"label_id", flat=True
)
]
return data

View File

@@ -109,16 +109,6 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
{"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
)
# Create or get state
state, _ = State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Intake Issues",
project_id=project_id,
color="#ff7700",
is_triage=True,
)
# create an issue
issue = Issue.objects.create(
name=request.data.get("issue", {}).get("name"),
@@ -128,7 +118,6 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
),
priority=request.data.get("issue", {}).get("priority", "none"),
project_id=project_id,
state=state,
)
# create an intake issue

View File

@@ -258,7 +258,9 @@ class ProjectAPIEndpoint(BaseAPIView):
ProjectSerializer(project).data, cls=DjangoJSONEncoder
)
intake_view = request.data.get("inbox_view", project.intake_view)
intake_view = request.data.get(
"inbox_view", request.data.get("intake_view", project.intake_view)
)
if project.archived_at:
return Response(

View File

@@ -5,6 +5,7 @@ from rest_framework import serializers
from .base import BaseSerializer
from .issue import IssueStateSerializer
from plane.db.models import Cycle, CycleIssue, CycleUserProperties
from plane.utils.timezone_converter import convert_to_utc
class CycleWriteSerializer(BaseSerializer):
@@ -15,6 +16,17 @@ class CycleWriteSerializer(BaseSerializer):
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
):
project_id = self.initial_data.get("project_id") or self.instance.project_id
data["start_date"] = convert_to_utc(
str(data.get("start_date").date()), project_id, is_start_date=True
)
data["end_date"] = convert_to_utc(
str(data.get("end_date", None).date()), project_id
)
return data
class Meta:

View File

@@ -116,7 +116,7 @@ class WebhookSerializer(DynamicBaseSerializer):
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = ["workspace", "secret_key"]
read_only_fields = ["workspace", "secret_key", "deleted_at"]
class WebhookLogSerializer(DynamicBaseSerializer):

View File

@@ -17,6 +17,7 @@ from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .webhook import urlpatterns as webhook_urls
from .workspace import urlpatterns as workspace_urls
from .timezone import urlpatterns as timezone_urls
urlpatterns = [
*analytic_urls,
@@ -38,4 +39,5 @@ urlpatterns = [
*workspace_urls,
*api_urls,
*webhook_urls,
*timezone_urls,
]

View File

@@ -1,7 +1,7 @@
from django.urls import path
from plane.app.views import GlobalSearchEndpoint, IssueSearchEndpoint
from plane.app.views import GlobalSearchEndpoint, IssueSearchEndpoint, SearchEndpoint
urlpatterns = [
@@ -15,4 +15,9 @@ urlpatterns = [
IssueSearchEndpoint.as_view(),
name="project-issue-search",
),
path(
"workspaces/<str:slug>/entity-search/",
SearchEndpoint.as_view(),
name="entity-search",
),
]

View File

@@ -0,0 +1,8 @@
from django.urls import path
from plane.app.views import TimezoneEndpoint
urlpatterns = [
# timezone endpoint
path("timezones/", TimezoneEndpoint.as_view(), name="timezone-list")
]

View File

@@ -158,7 +158,7 @@ from .page.base import (
)
from .page.version import PageVersionEndpoint
from .search.base import GlobalSearchEndpoint
from .search.base import GlobalSearchEndpoint, SearchEndpoint
from .search.issue import IssueSearchEndpoint
@@ -204,3 +204,5 @@ from .error_404 import custom_404_view
from .notification.base import MarkAllReadNotificationViewSet
from .user.base import AccountEndpoint, ProfileEndpoint, UserSessionEndpoint
from .timezone.base import TimezoneEndpoint

View File

@@ -126,7 +126,13 @@ class UserAssetsV2Endpoint(BaseAPIView):
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
allowed_types = [
"image/jpeg",
"image/png",
"image/webp",
"image/jpg",
"image/gif",
]
if type not in allowed_types:
return Response(
{

View File

@@ -1,5 +1,7 @@
# Python imports
import json
import pytz
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
@@ -52,6 +54,11 @@ from plane.bgtasks.recent_visited_task import recent_visited_task
# Module imports
from .. import BaseAPIView, BaseViewSet
from plane.bgtasks.webhook_task import model_activity
from plane.utils.timezone_converter import (
convert_utc_to_project_timezone,
convert_to_utc,
user_timezone_converter,
)
class CycleViewSet(BaseViewSet):
@@ -67,6 +74,19 @@ class CycleViewSet(BaseViewSet):
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
)
project = Project.objects.get(id=self.kwargs.get("project_id"))
# Fetch project for the specific record or pass project_id dynamically
project_timezone = project.timezone
# Convert the current time (timezone.now()) to the project's timezone
local_tz = pytz.timezone(project_timezone)
current_time_in_project_tz = timezone.now().astimezone(local_tz)
# Convert project local time back to UTC for comparison (start_date is stored in UTC)
current_time_in_utc = current_time_in_project_tz.astimezone(pytz.utc)
return self.filter_queryset(
super()
.get_queryset()
@@ -119,12 +139,15 @@ class CycleViewSet(BaseViewSet):
.annotate(
status=Case(
When(
Q(start_date__lte=timezone.now())
& Q(end_date__gte=timezone.now()),
Q(start_date__lte=current_time_in_utc)
& Q(end_date__gte=current_time_in_utc),
then=Value("CURRENT"),
),
When(start_date__gt=timezone.now(), then=Value("UPCOMING")),
When(end_date__lt=timezone.now(), then=Value("COMPLETED")),
When(
start_date__gt=current_time_in_utc,
then=Value("UPCOMING"),
),
When(end_date__lt=current_time_in_utc, then=Value("COMPLETED")),
When(
Q(start_date__isnull=True) & Q(end_date__isnull=True),
then=Value("DRAFT"),
@@ -160,10 +183,22 @@ class CycleViewSet(BaseViewSet):
# Update the order by
queryset = queryset.order_by("-is_favorite", "-created_at")
project = Project.objects.get(id=self.kwargs.get("project_id"))
# Fetch project for the specific record or pass project_id dynamically
project_timezone = project.timezone
# Convert the current time (timezone.now()) to the project's timezone
local_tz = pytz.timezone(project_timezone)
current_time_in_project_tz = timezone.now().astimezone(local_tz)
# Convert project local time back to UTC for comparison (start_date is stored in UTC)
current_time_in_utc = current_time_in_project_tz.astimezone(pytz.utc)
# Current Cycle
if cycle_view == "current":
queryset = queryset.filter(
start_date__lte=timezone.now(), end_date__gte=timezone.now()
start_date__lte=current_time_in_utc, end_date__gte=current_time_in_utc
)
data = queryset.values(
@@ -191,6 +226,8 @@ class CycleViewSet(BaseViewSet):
"version",
"created_by",
)
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(data, datetime_fields, project_timezone)
if data:
return Response(data, status=status.HTTP_200_OK)
@@ -221,6 +258,8 @@ class CycleViewSet(BaseViewSet):
"version",
"created_by",
)
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(data, datetime_fields, request.user.user_timezone)
return Response(data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
@@ -417,6 +456,8 @@ class CycleViewSet(BaseViewSet):
)
queryset = queryset.first()
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(data, datetime_fields, request.user.user_timezone)
recent_visited_task.delay(
slug=slug,
@@ -492,6 +533,9 @@ class CycleDateCheckEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
start_date = convert_to_utc(str(start_date), project_id, is_start_date=True)
end_date = convert_to_utc(str(end_date), project_id)
# Check if any cycle intersects in the given interval
cycles = Cycle.objects.filter(
Q(workspace__slug=slug)

View File

@@ -54,10 +54,11 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPaginator
from .. import BaseAPIView, BaseViewSet
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.timezone_converter import user_timezone_converter
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.global_paginator import paginate
from plane.bgtasks.webhook_task import model_activity
from plane.bgtasks.issue_description_version_task import issue_description_version_task
class IssueListEndpoint(BaseAPIView):
@@ -428,6 +429,13 @@ class IssueViewSet(BaseViewSet):
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
)
# updated issue description version
issue_description_version_task.delay(
updated_issue=json.dumps(request.data, cls=DjangoJSONEncoder),
issue_id=str(serializer.data["id"]),
user_id=request.user.id,
is_creating=True,
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -649,6 +657,12 @@ class IssueViewSet(BaseViewSet):
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
)
# updated issue description version
issue_description_version_task.delay(
updated_issue=current_instance,
issue_id=str(serializer.data.get("id", None)),
user_id=request.user.id,
)
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

View File

@@ -20,7 +20,7 @@ from plane.app.serializers import IssueSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import Issue, IssueLink, FileAsset, CycleIssue
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.timezone_converter import user_timezone_converter
from collections import defaultdict

View File

@@ -28,7 +28,7 @@ from plane.app.permissions import ProjectEntityPermission
from plane.app.serializers import ModuleDetailSerializer
from plane.db.models import Issue, Module, ModuleLink, UserFavorite, Project
from plane.utils.analytics_plot import burndown_plot
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.timezone_converter import user_timezone_converter
# Module imports

View File

@@ -56,7 +56,7 @@ from plane.db.models import (
Project,
)
from plane.utils.analytics_plot import burndown_plot
from plane.utils.user_timezone_converter import user_timezone_converter
from plane.utils.timezone_converter import user_timezone_converter
from plane.bgtasks.webhook_task import model_activity
from .. import BaseAPIView, BaseViewSet
from plane.bgtasks.recent_visited_task import recent_visited_task

View File

@@ -2,10 +2,21 @@
import re
# Django imports
from django.db.models import Q, OuterRef, Subquery, Value, UUIDField, CharField
from django.db import models
from django.db.models import (
Q,
OuterRef,
Subquery,
Value,
UUIDField,
CharField,
When,
Case,
)
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models.functions import Coalesce
from django.db.models.functions import Coalesce, Concat
from django.utils import timezone
# Third party imports
from rest_framework import status
@@ -21,7 +32,9 @@ from plane.db.models import (
Module,
Page,
IssueView,
ProjectMember,
ProjectPage,
WorkspaceMember,
)
@@ -237,3 +250,459 @@ class GlobalSearchEndpoint(BaseAPIView):
func = MODELS_MAPPER.get(model, None)
results[model] = func(query, slug, project_id, workspace_search)
return Response({"results": results}, status=status.HTTP_200_OK)
class SearchEndpoint(BaseAPIView):
def get(self, request, slug):
query = request.query_params.get("query", False)
query_types = request.query_params.get("query_type", "user_mention").split(",")
query_types = [qt.strip() for qt in query_types]
count = int(request.query_params.get("count", 5))
project_id = request.query_params.get("project_id", None)
issue_id = request.query_params.get("issue_id", None)
response_data = {}
if project_id:
for query_type in query_types:
if query_type == "user_mention":
fields = [
"member__first_name",
"member__last_name",
"member__display_name",
]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
base_filters = Q(
q,
is_active=True,
workspace__slug=slug,
member__is_bot=False,
project_id=project_id,
role__gt=10,
)
if issue_id:
issue_created_by = (
Issue.objects.filter(id=issue_id)
.values_list("created_by_id", flat=True)
.first()
)
# Add condition to include `issue_created_by` in the query
filters = Q(member_id=issue_created_by) | base_filters
else:
filters = base_filters
# Query to fetch users
users = (
ProjectMember.objects.filter(filters)
.annotate(
member__avatar_url=Case(
When(
member__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"member__avatar_asset",
Value("/"),
),
),
When(
member__avatar_asset__isnull=True,
then="member__avatar",
),
default=Value(None),
output_field=CharField(),
)
)
.order_by("-created_at")
.values(
"member__avatar_url",
"member__display_name",
"member__id",
)[:count]
)
response_data["user_mention"] = list(users)
elif query_type == "project":
fields = ["name", "identifier"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
projects = (
Project.objects.filter(
q,
Q(project_projectmember__member=self.request.user)
| Q(network=2),
workspace__slug=slug,
)
.order_by("-created_at")
.distinct()
.values(
"name", "id", "identifier", "logo_props", "workspace__slug"
)[:count]
)
response_data["project"] = list(projects)
elif query_type == "issue":
fields = ["name", "sequence_id", "project__identifier"]
q = Q()
if query:
for field in fields:
if field == "sequence_id":
sequences = re.findall(r"\b\d+\b", query)
for sequence_id in sequences:
q |= Q(**{"sequence_id": sequence_id})
else:
q |= Q(**{f"{field}__icontains": query})
issues = (
Issue.issue_objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
project_id=project_id,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"sequence_id",
"project__identifier",
"project_id",
"priority",
"state_id",
"type_id",
)[:count]
)
response_data["issue"] = list(issues)
elif query_type == "cycle":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
cycles = (
Cycle.objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(
status=Case(
When(
Q(start_date__lte=timezone.now())
& Q(end_date__gte=timezone.now()),
then=Value("CURRENT"),
),
When(
start_date__gt=timezone.now(),
then=Value("UPCOMING"),
),
When(
end_date__lt=timezone.now(), then=Value("COMPLETED")
),
When(
Q(start_date__isnull=True)
& Q(end_date__isnull=True),
then=Value("DRAFT"),
),
default=Value("DRAFT"),
output_field=CharField(),
)
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"project_id",
"project__identifier",
"status",
"workspace__slug",
)[:count]
)
response_data["cycle"] = list(cycles)
elif query_type == "module":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
modules = (
Module.objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
project_id=project_id,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"project_id",
"project__identifier",
"status",
"workspace__slug",
)[:count]
)
response_data["module"] = list(modules)
elif query_type == "page":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
pages = (
Page.objects.filter(
q,
projects__project_projectmember__member=self.request.user,
projects__project_projectmember__is_active=True,
projects__id=project_id,
workspace__slug=slug,
access=0,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"logo_props",
"projects__id",
"workspace__slug",
)[:count]
)
response_data["page"] = list(pages)
return Response(response_data, status=status.HTTP_200_OK)
else:
for query_type in query_types:
if query_type == "user_mention":
fields = [
"member__first_name",
"member__last_name",
"member__display_name",
]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
users = (
WorkspaceMember.objects.filter(
q,
is_active=True,
workspace__slug=slug,
member__is_bot=False,
)
.annotate(
member__avatar_url=Case(
When(
member__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"member__avatar_asset",
Value("/"),
),
),
When(
member__avatar_asset__isnull=True,
then="member__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("-created_at")
.values(
"member__avatar_url", "member__display_name", "member__id"
)[:count]
)
response_data["user_mention"] = list(users)
elif query_type == "project":
fields = ["name", "identifier"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
projects = (
Project.objects.filter(
q,
Q(project_projectmember__member=self.request.user)
| Q(network=2),
workspace__slug=slug,
)
.order_by("-created_at")
.distinct()
.values(
"name", "id", "identifier", "logo_props", "workspace__slug"
)[:count]
)
response_data["project"] = list(projects)
elif query_type == "issue":
fields = ["name", "sequence_id", "project__identifier"]
q = Q()
if query:
for field in fields:
if field == "sequence_id":
sequences = re.findall(r"\b\d+\b", query)
for sequence_id in sequences:
q |= Q(**{"sequence_id": sequence_id})
else:
q |= Q(**{f"{field}__icontains": query})
issues = (
Issue.issue_objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"sequence_id",
"project__identifier",
"project_id",
"priority",
"state_id",
"type_id",
)[:count]
)
response_data["issue"] = list(issues)
elif query_type == "cycle":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
cycles = (
Cycle.objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
)
.annotate(
status=Case(
When(
Q(start_date__lte=timezone.now())
& Q(end_date__gte=timezone.now()),
then=Value("CURRENT"),
),
When(
start_date__gt=timezone.now(),
then=Value("UPCOMING"),
),
When(
end_date__lt=timezone.now(), then=Value("COMPLETED")
),
When(
Q(start_date__isnull=True)
& Q(end_date__isnull=True),
then=Value("DRAFT"),
),
default=Value("DRAFT"),
output_field=CharField(),
)
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"project_id",
"project__identifier",
"status",
"workspace__slug",
)[:count]
)
response_data["cycle"] = list(cycles)
elif query_type == "module":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
modules = (
Module.objects.filter(
q,
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
workspace__slug=slug,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"project_id",
"project__identifier",
"status",
"workspace__slug",
)[:count]
)
response_data["module"] = list(modules)
elif query_type == "page":
fields = ["name"]
q = Q()
if query:
for field in fields:
q |= Q(**{f"{field}__icontains": query})
pages = (
Page.objects.filter(
q,
projects__project_projectmember__member=self.request.user,
projects__project_projectmember__is_active=True,
workspace__slug=slug,
access=0,
is_global=True,
)
.order_by("-created_at")
.distinct()
.values(
"name",
"id",
"logo_props",
"projects__id",
"workspace__slug",
)[:count]
)
response_data["page"] = list(pages)
return Response(response_data, status=status.HTTP_200_OK)
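
For reference, the new SearchEndpoint is mounted at workspaces/<slug>/entity-search/ (see the URL diff above) and accepts query, query_type (comma-separated), count, project_id, and issue_id as query parameters. A rough client-side sketch, assuming a locally running API and placeholder authentication:

# Sketch of calling the entity-search endpoint; the base URL and auth
# header below are placeholders, not part of this change.
import requests

BASE_URL = "http://localhost:8000/api"  # placeholder
session = requests.Session()
session.headers["Authorization"] = "Bearer <token>"  # placeholder auth

response = session.get(
    f"{BASE_URL}/workspaces/my-workspace/entity-search/",
    params={
        "query": "onboarding",
        "query_type": "issue,page",      # comma-separated entity types
        "count": 5,                      # max results per entity type
        "project_id": "<project-uuid>",  # optional; scopes the search to a project
    },
)

# Expected shape: one key per requested type, e.g. {"issue": [...], "page": [...]}.
print(response.json())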

View File

@@ -0,0 +1,247 @@
# Python imports
import pytz
from datetime import datetime
# Django imports
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework.views import APIView
# Module imports
from plane.authentication.rate_limit import AuthenticationThrottle
class TimezoneEndpoint(APIView):
permission_classes = [AllowAny]
throttle_classes = [AuthenticationThrottle]
@method_decorator(cache_page(60 * 60 * 24))
def get(self, request):
timezone_mapping = {
"-1100": [
("Midway Island", "Pacific/Midway"),
("American Samoa", "Pacific/Pago_Pago"),
],
"-1000": [
("Hawaii", "Pacific/Honolulu"),
("Aleutian Islands", "America/Adak"),
],
"-0930": [("Marquesas Islands", "Pacific/Marquesas")],
"-0900": [
("Alaska", "America/Anchorage"),
("Gambier Islands", "Pacific/Gambier"),
],
"-0800": [
("Pacific Time (US and Canada)", "America/Los_Angeles"),
("Baja California", "America/Tijuana"),
],
"-0700": [
("Mountain Time (US and Canada)", "America/Denver"),
("Arizona", "America/Phoenix"),
("Chihuahua, Mazatlan", "America/Chihuahua"),
],
"-0600": [
("Central Time (US and Canada)", "America/Chicago"),
("Saskatchewan", "America/Regina"),
("Guadalajara, Mexico City, Monterrey", "America/Mexico_City"),
("Tegucigalpa, Honduras", "America/Tegucigalpa"),
("Costa Rica", "America/Costa_Rica"),
],
"-0500": [
("Eastern Time (US and Canada)", "America/New_York"),
("Lima", "America/Lima"),
("Bogota", "America/Bogota"),
("Quito", "America/Guayaquil"),
("Chetumal", "America/Cancun"),
],
"-0430": [("Caracas (Old Venezuela Time)", "America/Caracas")],
"-0400": [
("Atlantic Time (Canada)", "America/Halifax"),
("Caracas", "America/Caracas"),
("Santiago", "America/Santiago"),
("La Paz", "America/La_Paz"),
("Manaus", "America/Manaus"),
("Georgetown", "America/Guyana"),
("Bermuda", "Atlantic/Bermuda"),
],
"-0330": [("Newfoundland Time (Canada)", "America/St_Johns")],
"-0300": [
("Buenos Aires", "America/Argentina/Buenos_Aires"),
("Brasilia", "America/Sao_Paulo"),
("Greenland", "America/Godthab"),
("Montevideo", "America/Montevideo"),
("Falkland Islands", "Atlantic/Stanley"),
],
"-0200": [
(
"South Georgia and the South Sandwich Islands",
"Atlantic/South_Georgia",
)
],
"-0100": [
("Azores", "Atlantic/Azores"),
("Cape Verde Islands", "Atlantic/Cape_Verde"),
],
"+0000": [
("Dublin", "Europe/Dublin"),
("Reykjavik", "Atlantic/Reykjavik"),
("Lisbon", "Europe/Lisbon"),
("Monrovia", "Africa/Monrovia"),
("Casablanca", "Africa/Casablanca"),
],
"+0100": [
("Central European Time (Berlin, Rome, Paris)", "Europe/Paris"),
("West Central Africa", "Africa/Lagos"),
("Algiers", "Africa/Algiers"),
("Lagos", "Africa/Lagos"),
("Tunis", "Africa/Tunis"),
],
"+0200": [
("Eastern European Time (Cairo, Helsinki, Kyiv)", "Europe/Kiev"),
("Athens", "Europe/Athens"),
("Jerusalem", "Asia/Jerusalem"),
("Johannesburg", "Africa/Johannesburg"),
("Harare, Pretoria", "Africa/Harare"),
],
"+0300": [
("Moscow Time", "Europe/Moscow"),
("Baghdad", "Asia/Baghdad"),
("Nairobi", "Africa/Nairobi"),
("Kuwait, Riyadh", "Asia/Riyadh"),
],
"+0330": [("Tehran", "Asia/Tehran")],
"+0400": [
("Abu Dhabi", "Asia/Dubai"),
("Baku", "Asia/Baku"),
("Yerevan", "Asia/Yerevan"),
("Astrakhan", "Europe/Astrakhan"),
("Tbilisi", "Asia/Tbilisi"),
("Mauritius", "Indian/Mauritius"),
],
"+0500": [
("Islamabad", "Asia/Karachi"),
("Karachi", "Asia/Karachi"),
("Tashkent", "Asia/Tashkent"),
("Yekaterinburg", "Asia/Yekaterinburg"),
("Maldives", "Indian/Maldives"),
("Chagos", "Indian/Chagos"),
],
"+0530": [
("Chennai", "Asia/Kolkata"),
("Kolkata", "Asia/Kolkata"),
("Mumbai", "Asia/Kolkata"),
("New Delhi", "Asia/Kolkata"),
("Sri Jayawardenepura", "Asia/Colombo"),
],
"+0545": [("Kathmandu", "Asia/Kathmandu")],
"+0600": [
("Dhaka", "Asia/Dhaka"),
("Almaty", "Asia/Almaty"),
("Bishkek", "Asia/Bishkek"),
("Thimphu", "Asia/Thimphu"),
],
"+0630": [
("Yangon (Rangoon)", "Asia/Yangon"),
("Cocos Islands", "Indian/Cocos"),
],
"+0700": [
("Bangkok", "Asia/Bangkok"),
("Hanoi", "Asia/Ho_Chi_Minh"),
("Jakarta", "Asia/Jakarta"),
("Novosibirsk", "Asia/Novosibirsk"),
("Krasnoyarsk", "Asia/Krasnoyarsk"),
],
"+0800": [
("Beijing", "Asia/Shanghai"),
("Singapore", "Asia/Singapore"),
("Perth", "Australia/Perth"),
("Hong Kong", "Asia/Hong_Kong"),
("Ulaanbaatar", "Asia/Ulaanbaatar"),
("Palau", "Pacific/Palau"),
],
"+0845": [("Eucla", "Australia/Eucla")],
"+0900": [
("Tokyo", "Asia/Tokyo"),
("Seoul", "Asia/Seoul"),
("Yakutsk", "Asia/Yakutsk"),
],
"+0930": [
("Adelaide", "Australia/Adelaide"),
("Darwin", "Australia/Darwin"),
],
"+1000": [
("Sydney", "Australia/Sydney"),
("Brisbane", "Australia/Brisbane"),
("Guam", "Pacific/Guam"),
("Vladivostok", "Asia/Vladivostok"),
("Tahiti", "Pacific/Tahiti"),
],
"+1030": [("Lord Howe Island", "Australia/Lord_Howe")],
"+1100": [
("Solomon Islands", "Pacific/Guadalcanal"),
("Magadan", "Asia/Magadan"),
("Norfolk Island", "Pacific/Norfolk"),
("Bougainville Island", "Pacific/Bougainville"),
("Chokurdakh", "Asia/Srednekolymsk"),
],
"+1200": [
("Auckland", "Pacific/Auckland"),
("Wellington", "Pacific/Auckland"),
("Fiji Islands", "Pacific/Fiji"),
("Anadyr", "Asia/Anadyr"),
],
"+1245": [("Chatham Islands", "Pacific/Chatham")],
"+1300": [("Nuku'alofa", "Pacific/Tongatapu"), ("Samoa", "Pacific/Apia")],
"+1400": [("Kiritimati Island", "Pacific/Kiritimati")],
}
timezone_list = []
now = datetime.now()
# Process timezone mapping
for offset, locations in timezone_mapping.items():
sign = "-" if offset.startswith("-") else "+"
hours = offset[1:3]
minutes = offset[3:] if len(offset) > 3 else "00"
for friendly_name, tz_identifier in locations:
try:
tz = pytz.timezone(tz_identifier)
current_offset = now.astimezone(tz).strftime("%z")
# converting and formatting UTC offset to GMT offset
current_utc_offset = now.astimezone(tz).utcoffset()
total_seconds = int(current_utc_offset.total_seconds())
# work from the absolute offset so negative half-hour zones format correctly
sign_prefix = "+" if total_seconds >= 0 else "-"
abs_offset_seconds = abs(total_seconds)
hours_offset = abs_offset_seconds // 3600
minutes_offset = (abs_offset_seconds % 3600) // 60
gmt_offset = (
f"GMT{sign_prefix}"
f"{hours_offset:02}:{minutes_offset:02}"
)
timezone_value = {
"offset": int(current_offset),
"utc_offset": f"UTC{sign}{hours}:{minutes}",
"gmt_offset": gmt_offset,
"value": tz_identifier,
"label": f"{friendly_name}",
}
timezone_list.append(timezone_value)
except pytz.exceptions.UnknownTimeZoneError:
continue
# Sort by offset and then by label
timezone_list.sort(key=lambda x: (x["offset"], x["label"]))
# Remove offset from final output
for tz in timezone_list:
del tz["offset"]
return Response({"timezones": timezone_list}, status=status.HTTP_200_OK)
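
The gmt_offset label is computed from each zone's live UTC offset rather than the bucket key, so daylight saving is reflected automatically; working from the absolute offset keeps negative half-hour zones such as Newfoundland (GMT-03:30) correct. A standalone sketch of the same formatting for a single zone:

# Reproduces the GMT label formatting above for one zone.
from datetime import datetime
import pytz

tz = pytz.timezone("Asia/Kolkata")
total_seconds = int(datetime.now().astimezone(tz).utcoffset().total_seconds())

sign_prefix = "+" if total_seconds >= 0 else "-"
abs_seconds = abs(total_seconds)
gmt_offset = f"GMT{sign_prefix}{abs_seconds // 3600:02}:{(abs_seconds % 3600) // 60:02}"

print(gmt_offset)  # GMT+05:30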

View File

@@ -10,7 +10,7 @@ from plane.app.views.base import BaseAPIView
from plane.db.models import Cycle
from plane.app.permissions import WorkspaceViewerPermission
from plane.app.serializers.cycle import CycleSerializer
from plane.utils.timezone_converter import user_timezone_converter
class WorkspaceCyclesEndpoint(BaseAPIView):
permission_classes = [WorkspaceViewerPermission]

View File

@@ -0,0 +1,125 @@
# Python imports
from typing import Optional
import logging
# Django imports
from django.utils import timezone
from django.db import transaction
# Third party imports
from celery import shared_task
# Module imports
from plane.db.models import Issue, IssueDescriptionVersion, ProjectMember
from plane.utils.exception_logger import log_exception
def get_owner_id(issue: Issue) -> Optional[int]:
"""Get the owner ID of the issue"""
if issue.updated_by_id:
return issue.updated_by_id
if issue.created_by_id:
return issue.created_by_id
# Find project admin as fallback
project_member = ProjectMember.objects.filter(
project_id=issue.project_id,
role=20, # Admin role
).first()
return project_member.member_id if project_member else None
@shared_task
def sync_issue_description_version(batch_size=5000, offset=0, countdown=300):
"""Task to create IssueDescriptionVersion records for existing Issues in batches"""
try:
with transaction.atomic():
base_query = Issue.objects
total_issues_count = base_query.count()
if total_issues_count == 0:
return
# Calculate batch range
end_offset = min(offset + batch_size, total_issues_count)
# Fetch issues with related data
issues_batch = (
base_query.order_by("created_at")
.select_related("workspace", "project")
.only(
"id",
"workspace_id",
"project_id",
"created_by_id",
"updated_by_id",
"description_binary",
"description_html",
"description_stripped",
"description",
)[offset:end_offset]
)
if not issues_batch:
return
version_objects = []
for issue in issues_batch:
# Validate required fields
if not issue.workspace_id or not issue.project_id:
logging.warning(
f"Skipping {issue.id} - missing workspace_id or project_id"
)
continue
# Determine owned_by_id
owned_by_id = get_owner_id(issue)
if owned_by_id is None:
logging.warning(f"Skipping issue {issue.id} - missing owned_by")
continue
# Create version object
version_objects.append(
IssueDescriptionVersion(
workspace_id=issue.workspace_id,
project_id=issue.project_id,
created_by_id=issue.created_by_id,
updated_by_id=issue.updated_by_id,
owned_by_id=owned_by_id,
last_saved_at=timezone.now(),
issue_id=issue.id,
description_binary=issue.description_binary,
description_html=issue.description_html,
description_stripped=issue.description_stripped,
description_json=issue.description,
)
)
# Bulk create version objects
if version_objects:
IssueDescriptionVersion.objects.bulk_create(version_objects)
# Schedule next batch if needed
if end_offset < total_issues_count:
sync_issue_description_version.apply_async(
kwargs={
"batch_size": batch_size,
"offset": end_offset,
"countdown": countdown,
},
countdown=countdown,
)
return
except Exception as e:
log_exception(e)
return
@shared_task
def schedule_issue_description_version(batch_size=5000, countdown=300):
sync_issue_description_version.delay(
batch_size=int(batch_size), countdown=countdown
)
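
The backfill task is enqueued once and then re-schedules itself batch by batch until every existing issue has a description version. Assuming a Celery worker is running, it can be kicked off from a Django shell roughly like this (the management command added further below wraps the same call):

# Enqueue the batched backfill from a Django shell; values are illustrative.
from plane.bgtasks.issue_description_version_sync import (
    schedule_issue_description_version,
)

# Process 1,000 issues per batch, waiting 60 seconds between batches.
schedule_issue_description_version.delay(batch_size=1000, countdown=60)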

View File

@@ -0,0 +1,84 @@
from celery import shared_task
from django.db import transaction
from django.utils import timezone
from typing import Optional, Dict
import json
from plane.db.models import Issue, IssueDescriptionVersion
from plane.utils.exception_logger import log_exception
def should_update_existing_version(
version: IssueDescriptionVersion, user_id: str, max_time_difference: int = 600
) -> bool:
if not version:
return False
time_difference = (timezone.now() - version.last_saved_at).total_seconds()
return (
str(version.owned_by_id) == str(user_id)
and time_difference <= max_time_difference
)
def update_existing_version(version: IssueDescriptionVersion, issue) -> None:
version.description_json = issue.description
version.description_html = issue.description_html
version.description_binary = issue.description_binary
version.description_stripped = issue.description_stripped
version.last_saved_at = timezone.now()
version.save(
update_fields=[
"description_json",
"description_html",
"description_binary",
"description_stripped",
"last_saved_at",
]
)
@shared_task
def issue_description_version_task(
updated_issue, issue_id, user_id, is_creating=False
) -> Optional[bool]:
try:
# Parse updated issue data
current_issue: Dict = json.loads(updated_issue) if updated_issue else {}
# Get current issue
issue = Issue.objects.get(id=issue_id)
# Check if description has changed
if (
current_issue.get("description_html") == issue.description_html
and not is_creating
):
return
with transaction.atomic():
# Get latest version
latest_version = (
IssueDescriptionVersion.objects.filter(issue_id=issue_id)
.order_by("-last_saved_at")
.first()
)
# Determine whether to update existing or create new version
if should_update_existing_version(version=latest_version, user_id=user_id):
update_existing_version(latest_version, issue)
else:
IssueDescriptionVersion.log_issue_description_version(issue, user_id)
return
except Issue.DoesNotExist:
# Issue no longer exists, skip processing
return
except json.JSONDecodeError as e:
log_exception(f"Invalid JSON for updated_issue: {e}")
return
except Exception as e:
log_exception(f"Error processing issue description version: {e}")
return
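
The effect of should_update_existing_version is a ten-minute write window: consecutive description saves by the same author collapse into the latest version row, while a different author or a longer pause produces a new row. A minimal illustration, intended for a Django shell; SimpleNamespace is a stand-in that only mimics the two fields the check reads:

# Illustration of the 600-second debounce window used above.
from datetime import timedelta
from types import SimpleNamespace

from django.utils import timezone
from plane.bgtasks.issue_description_version_task import should_update_existing_version

recent = SimpleNamespace(owned_by_id="user-1", last_saved_at=timezone.now() - timedelta(minutes=5))
stale = SimpleNamespace(owned_by_id="user-1", last_saved_at=timezone.now() - timedelta(minutes=20))

should_update_existing_version(recent, "user-1")  # True  -> update the latest version in place
should_update_existing_version(stale, "user-1")   # False -> a new version row is created
should_update_existing_version(recent, "user-2")  # False -> different author, new version row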

View File

@@ -0,0 +1,254 @@
# Python imports
import json
from typing import Optional, List, Dict
from uuid import UUID
from itertools import groupby
import logging
# Django imports
from django.utils import timezone
from django.db import transaction
# Third party imports
from celery import shared_task
# Module imports
from plane.db.models import (
Issue,
IssueVersion,
ProjectMember,
CycleIssue,
ModuleIssue,
IssueActivity,
IssueAssignee,
IssueLabel,
)
from plane.utils.exception_logger import log_exception
@shared_task
def issue_task(updated_issue, issue_id, user_id):
try:
current_issue = json.loads(updated_issue) if updated_issue else {}
issue = Issue.objects.get(id=issue_id)
updated_current_issue = {}
for key, value in current_issue.items():
if getattr(issue, key) != value:
updated_current_issue[key] = value
if updated_current_issue:
issue_version = (
IssueVersion.objects.filter(issue_id=issue_id)
.order_by("-last_saved_at")
.first()
)
if (
issue_version
and str(issue_version.owned_by) == str(user_id)
and (timezone.now() - issue_version.last_saved_at).total_seconds()
<= 600
):
for key, value in updated_current_issue.items():
setattr(issue_version, key, value)
issue_version.last_saved_at = timezone.now()
issue_version.save(
update_fields=list(updated_current_issue.keys()) + ["last_saved_at"]
)
else:
IssueVersion.log_issue_version(issue, user_id)
return
except Issue.DoesNotExist:
return
except Exception as e:
log_exception(e)
return
def get_owner_id(issue: Issue) -> Optional[int]:
"""Get the owner ID of the issue"""
if issue.updated_by_id:
return issue.updated_by_id
if issue.created_by_id:
return issue.created_by_id
# Find project admin as fallback
project_member = ProjectMember.objects.filter(
project_id=issue.project_id,
role=20, # Admin role
).first()
return project_member.member_id if project_member else None
def get_related_data(issue_ids: List[UUID]) -> Dict:
"""Get related data for the given issue IDs"""
cycle_issues = {
ci.issue_id: ci.cycle_id
for ci in CycleIssue.objects.filter(issue_id__in=issue_ids)
}
# Get assignees with proper grouping
assignee_records = list(
IssueAssignee.objects.filter(issue_id__in=issue_ids)
.values_list("issue_id", "assignee_id")
.order_by("issue_id")
)
assignees = {}
for issue_id, group in groupby(assignee_records, key=lambda x: x[0]):
assignees[issue_id] = [str(g[1]) for g in group]
# Get labels with proper grouping
label_records = list(
IssueLabel.objects.filter(issue_id__in=issue_ids)
.values_list("issue_id", "label_id")
.order_by("issue_id")
)
labels = {}
for issue_id, group in groupby(label_records, key=lambda x: x[0]):
labels[issue_id] = [str(g[1]) for g in group]
# Get modules with proper grouping
module_records = list(
ModuleIssue.objects.filter(issue_id__in=issue_ids)
.values_list("issue_id", "module_id")
.order_by("issue_id")
)
modules = {}
for issue_id, group in groupby(module_records, key=lambda x: x[0]):
modules[issue_id] = [str(g[1]) for g in group]
# Get latest activities
latest_activities = {}
activities = IssueActivity.objects.filter(issue_id__in=issue_ids).order_by(
"issue_id", "-created_at"
)
for issue_id, activities_group in groupby(activities, key=lambda x: x.issue_id):
first_activity = next(activities_group, None)
if first_activity:
latest_activities[issue_id] = first_activity.id
return {
"cycle_issues": cycle_issues,
"assignees": assignees,
"labels": labels,
"modules": modules,
"activities": latest_activities,
}
def create_issue_version(issue: Issue, related_data: Dict) -> Optional[IssueVersion]:
"""Create IssueVersion object from the given issue and related data"""
try:
if not issue.workspace_id or not issue.project_id:
logging.warning(
f"Skipping issue {issue.id} - missing workspace_id or project_id"
)
return None
owned_by_id = get_owner_id(issue)
if owned_by_id is None:
logging.warning(f"Skipping issue {issue.id} - missing owned_by")
return None
return IssueVersion(
workspace_id=issue.workspace_id,
project_id=issue.project_id,
created_by_id=issue.created_by_id,
updated_by_id=issue.updated_by_id,
owned_by_id=owned_by_id,
last_saved_at=timezone.now(),
activity_id=related_data["activities"].get(issue.id),
properties=getattr(issue, "properties", {}),
meta=getattr(issue, "meta", {}),
issue_id=issue.id,
parent=issue.parent_id,
state=issue.state_id,
estimate_point=issue.estimate_point_id,
name=issue.name,
priority=issue.priority,
start_date=issue.start_date,
target_date=issue.target_date,
assignees=related_data["assignees"].get(issue.id, []),
sequence_id=issue.sequence_id,
labels=related_data["labels"].get(issue.id, []),
sort_order=issue.sort_order,
completed_at=issue.completed_at,
archived_at=issue.archived_at,
is_draft=issue.is_draft,
external_source=issue.external_source,
external_id=issue.external_id,
type=issue.type_id,
cycle=related_data["cycle_issues"].get(issue.id),
modules=related_data["modules"].get(issue.id, []),
)
except Exception as e:
log_exception(e)
return None
@shared_task
def sync_issue_version(batch_size=5000, offset=0, countdown=300):
"""Task to create IssueVersion records for existing Issues in batches"""
try:
with transaction.atomic():
base_query = Issue.objects
total_issues_count = base_query.count()
if total_issues_count == 0:
return
end_offset = min(offset + batch_size, total_issues_count)
# Get issues batch with optimized queries
issues_batch = list(
base_query.order_by("created_at")
.select_related("workspace", "project")
.all()[offset:end_offset]
)
if not issues_batch:
return
# Get all related data in bulk
issue_ids = [issue.id for issue in issues_batch]
related_data = get_related_data(issue_ids)
issue_versions = []
for issue in issues_batch:
version = create_issue_version(issue, related_data)
if version:
issue_versions.append(version)
# Bulk create versions
if issue_versions:
IssueVersion.objects.bulk_create(issue_versions, batch_size=1000)
# Schedule the next batch if there are more workspaces to process
if end_offset < total_issues_count:
sync_issue_version.apply_async(
kwargs={
"batch_size": batch_size,
"offset": end_offset,
"countdown": countdown,
},
countdown=countdown,
)
logging.info(f"Processed Issues: {end_offset}")
return
except Exception as e:
log_exception(e)
return
@shared_task
def schedule_issue_version(batch_size=5000, countdown=300):
sync_issue_version.delay(batch_size=int(batch_size), countdown=countdown)

View File

@@ -32,7 +32,6 @@ from bs4 import BeautifulSoup
def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
aggregated_issue_mentions = []
for mention_id in new_mentions:
aggregated_issue_mentions.append(
IssueMention(
@@ -125,7 +124,9 @@ def extract_mentions(issue_instance):
data = json.loads(issue_instance)
html = data.get("description_html")
soup = BeautifulSoup(html, "html.parser")
mention_tags = soup.find_all("mention-component", attrs={"target": "users"})
mention_tags = soup.find_all(
"mention-component", attrs={"entity_name": "user_mention"}
)
mentions = [mention_tag["entity_identifier"] for mention_tag in mention_tags]
@@ -139,7 +140,9 @@ def extract_comment_mentions(comment_value):
try:
mentions = []
soup = BeautifulSoup(comment_value, "html.parser")
mentions_tags = soup.find_all("mention-component", attrs={"target": "users"})
mentions_tags = soup.find_all(
"mention-component", attrs={"entity_name": "user_mention"}
)
for mention_tag in mentions_tags:
mentions.append(mention_tag["entity_identifier"])
return list(set(mentions))
@@ -255,12 +258,9 @@ def notifications(
new_mentions = get_new_mentions(
requested_instance=requested_data, current_instance=current_instance
)
new_mentions = [
str(mention)
for mention in new_mentions
if mention in set(project_members)
]
new_mentions = list(
set(new_mentions) & {str(member) for member in project_members}
)
removed_mention = get_removed_mentions(
requested_instance=requested_data, current_instance=current_instance
)
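
Mentions are now emitted by the editor as <mention-component entity_name="user_mention" entity_identifier="..."> tags, so extraction matches on entity_name instead of the old target="users" attribute. A self-contained sketch of the lookup:

# Minimal reproduction of the new mention extraction with BeautifulSoup.
from bs4 import BeautifulSoup

description_html = (
    '<p>Please review '
    '<mention-component entity_name="user_mention" '
    'entity_identifier="user-uuid-1"></mention-component></p>'
)

soup = BeautifulSoup(description_html, "html.parser")
mention_tags = soup.find_all("mention-component", attrs={"entity_name": "user_mention"})
mentions = list({tag["entity_identifier"] for tag in mention_tags})

print(mentions)  # ['user-uuid-1']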

View File

@@ -0,0 +1,23 @@
# Django imports
from django.core.management.base import BaseCommand
# Module imports
from plane.bgtasks.issue_description_version_sync import (
schedule_issue_description_version,
)
class Command(BaseCommand):
help = "Creates IssueDescriptionVersion records for existing Issues in batches"
def handle(self, *args, **options):
batch_size = input("Enter the batch size: ")
batch_countdown = input("Enter the batch countdown: ")
schedule_issue_description_version.delay(
batch_size=batch_size, countdown=int(batch_countdown)
)
self.stdout.write(
self.style.SUCCESS("Successfully created issue description version task")
)

View File

@@ -0,0 +1,19 @@
# Django imports
from django.core.management.base import BaseCommand
# Module imports
from plane.bgtasks.issue_version_sync import schedule_issue_version
class Command(BaseCommand):
help = "Creates IssueVersion records for existing Issues in batches"
def handle(self, *args, **options):
batch_size = input("Enter the batch size: ")
batch_countdown = input("Enter the batch countdown: ")
schedule_issue_version.delay(
batch_size=batch_size, countdown=int(batch_countdown)
)
self.stdout.write(self.style.SUCCESS("Successfully created issue version task"))

View File

@@ -0,0 +1,117 @@
# Generated by Django 4.2.17 on 2024-12-13 10:09
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import plane.db.models.user
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0086_issueversion_alter_teampage_unique_together_and_more'),
]
operations = [
migrations.RemoveField(
model_name='issueversion',
name='description',
),
migrations.RemoveField(
model_name='issueversion',
name='description_binary',
),
migrations.RemoveField(
model_name='issueversion',
name='description_html',
),
migrations.RemoveField(
model_name='issueversion',
name='description_stripped',
),
migrations.AddField(
model_name='issueversion',
name='activity',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='versions', to='db.issueactivity'),
),
migrations.AddField(
model_name='profile',
name='is_mobile_onboarded',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='profile',
name='mobile_onboarding_step',
field=models.JSONField(default=plane.db.models.user.get_mobile_default_onboarding),
),
migrations.AddField(
model_name='profile',
name='mobile_timezone_auto_set',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='profile',
name='language',
field=models.CharField(default='en', max_length=255),
),
migrations.AlterField(
model_name='issueversion',
name='owned_by',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_versions', to=settings.AUTH_USER_MODEL),
),
migrations.CreateModel(
name='Sticky',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('deleted_at', models.DateTimeField(blank=True, null=True, verbose_name='Deleted At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('name', models.TextField()),
('description', models.JSONField(blank=True, default=dict)),
('description_html', models.TextField(blank=True, default='<p></p>')),
('description_stripped', models.TextField(blank=True, null=True)),
('description_binary', models.BinaryField(null=True)),
('logo_props', models.JSONField(default=dict)),
('color', models.CharField(blank=True, max_length=255, null=True)),
('background_color', models.CharField(blank=True, max_length=255, null=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('owner', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='stickies', to=settings.AUTH_USER_MODEL)),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='stickies', to='db.workspace')),
],
options={
'verbose_name': 'Sticky',
'verbose_name_plural': 'Stickies',
'db_table': 'stickies',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='IssueDescriptionVersion',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('deleted_at', models.DateTimeField(blank=True, null=True, verbose_name='Deleted At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('description_binary', models.BinaryField(null=True)),
('description_html', models.TextField(blank=True, default='<p></p>')),
('description_stripped', models.TextField(blank=True, null=True)),
('description_json', models.JSONField(blank=True, default=dict)),
('last_saved_at', models.DateTimeField(default=django.utils.timezone.now)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='description_versions', to='db.issue')),
('owned_by', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_description_versions', to=settings.AUTH_USER_MODEL)),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Description Version',
'verbose_name_plural': 'Issue Description Versions',
'db_table': 'issue_description_versions',
},
),
]

View File

@@ -0,0 +1,124 @@
# Generated by Django 4.2.15 on 2024-12-24 14:57
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0087_remove_issueversion_description_and_more'),
]
operations = [
migrations.AddField(
model_name="sticky",
name="sort_order",
field=models.FloatField(default=65535),
),
migrations.CreateModel(
name="WorkspaceUserLink",
fields=[
(
"created_at",
models.DateTimeField(auto_now_add=True, verbose_name="Created At"),
),
(
"updated_at",
models.DateTimeField(
auto_now=True, verbose_name="Last Modified At"
),
),
(
"deleted_at",
models.DateTimeField(
blank=True, null=True, verbose_name="Deleted At"
),
),
(
"id",
models.UUIDField(
db_index=True,
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
unique=True,
),
),
("title", models.CharField(blank=True, max_length=255, null=True)),
("url", models.TextField()),
("metadata", models.JSONField(default=dict)),
(
"created_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"owner",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="owner_workspace_user_link",
to=settings.AUTH_USER_MODEL,
),
),
(
"project",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="project_%(class)s",
to="db.project",
),
),
(
"updated_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Modified By",
),
),
(
"workspace",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="workspace_%(class)s",
to="db.workspace",
),
),
],
options={
"verbose_name": "Workspace User Link",
"verbose_name_plural": "Workspace User Links",
"db_table": "workspace_user_links",
"ordering": ("-created_at",),
},
),
migrations.AlterField(
model_name="pagelog",
name="entity_name",
field=models.CharField(max_length=30, verbose_name="Transaction Type"),
),
migrations.AlterUniqueTogether(
name="webhook",
unique_together={("workspace", "url", "deleted_at")},
),
migrations.AddConstraint(
model_name="webhook",
constraint=models.UniqueConstraint(
condition=models.Q(("deleted_at__isnull", True)),
fields=("workspace", "url"),
name="webhook_url_unique_url_when_deleted_at_null",
),
),
]

View File

@@ -41,6 +41,8 @@ from .issue import (
IssueSequence,
IssueSubscriber,
IssueVote,
IssueVersion,
IssueDescriptionVersion,
)
from .module import Module, ModuleIssue, ModuleLink, ModuleMember, ModuleUserProperties
from .notification import EmailNotificationLog, Notification, UserNotificationPreference
@@ -66,17 +68,9 @@ from .workspace import (
WorkspaceMemberInvite,
WorkspaceTheme,
WorkspaceUserProperties,
WorkspaceUserLink,
)
from .favorite import UserFavorite
from .issue_type import IssueType
@@ -86,3 +80,5 @@ from .recent_visit import UserRecentVisit
from .label import Label
from .device import Device, DeviceSession
from .sticky import Sticky

View File

@@ -15,6 +15,7 @@ from django import apps
from plane.utils.html_processor import strip_tags
from plane.db.mixins import SoftDeletionManager
from plane.utils.exception_logger import log_exception
from .base import BaseModel
from .project import ProjectBaseModel
@@ -660,9 +661,6 @@ class IssueVote(ProjectBaseModel):
class IssueVersion(ProjectBaseModel):
issue = models.ForeignKey(
"db.Issue", on_delete=models.CASCADE, related_name="versions"
)
PRIORITY_CHOICES = (
("urgent", "Urgent"),
("high", "High"),
@@ -670,14 +668,11 @@ class IssueVersion(ProjectBaseModel):
("low", "Low"),
("none", "None"),
)
parent = models.UUIDField(blank=True, null=True)
state = models.UUIDField(blank=True, null=True)
estimate_point = models.UUIDField(blank=True, null=True)
name = models.CharField(max_length=255, verbose_name="Issue Name")
description = models.JSONField(blank=True, default=dict)
description_html = models.TextField(blank=True, default="<p></p>")
description_stripped = models.TextField(blank=True, null=True)
description_binary = models.BinaryField(null=True)
priority = models.CharField(
max_length=30,
choices=PRIORITY_CHOICES,
@@ -686,7 +681,9 @@ class IssueVersion(ProjectBaseModel):
)
start_date = models.DateField(null=True, blank=True)
target_date = models.DateField(null=True, blank=True)
assignees = ArrayField(models.UUIDField(), blank=True, default=list)
sequence_id = models.IntegerField(default=1, verbose_name="Issue Sequence ID")
labels = ArrayField(models.UUIDField(), blank=True, default=list)
sort_order = models.FloatField(default=65535)
completed_at = models.DateTimeField(null=True)
archived_at = models.DateField(null=True)
@@ -694,14 +691,26 @@ class IssueVersion(ProjectBaseModel):
external_source = models.CharField(max_length=255, null=True, blank=True)
external_id = models.CharField(max_length=255, blank=True, null=True)
type = models.UUIDField(blank=True, null=True)
last_saved_at = models.DateTimeField(default=timezone.now)
owned_by = models.UUIDField()
assignees = ArrayField(models.UUIDField(), blank=True, default=list)
labels = ArrayField(models.UUIDField(), blank=True, default=list)
cycle = models.UUIDField(null=True, blank=True)
modules = ArrayField(models.UUIDField(), blank=True, default=list)
properties = models.JSONField(default=dict)
meta = models.JSONField(default=dict)
properties = models.JSONField(default=dict) # issue properties
meta = models.JSONField(default=dict) # issue meta
last_saved_at = models.DateTimeField(default=timezone.now)
issue = models.ForeignKey(
"db.Issue", on_delete=models.CASCADE, related_name="versions"
)
activity = models.ForeignKey(
"db.IssueActivity",
on_delete=models.SET_NULL,
null=True,
related_name="versions",
)
owned_by = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="issue_versions",
)
class Meta:
verbose_name = "Issue Version"
@@ -721,39 +730,93 @@ class IssueVersion(ProjectBaseModel):
Module = apps.get_model("db.Module")
CycleIssue = apps.get_model("db.CycleIssue")
IssueAssignee = apps.get_model("db.IssueAssignee")
IssueLabel = apps.get_model("db.IssueLabel")
cycle_issue = CycleIssue.objects.filter(issue=issue).first()
cls.objects.create(
issue=issue,
parent=issue.parent,
state=issue.state,
point=issue.point,
estimate_point=issue.estimate_point,
parent=issue.parent_id,
state=issue.state_id,
estimate_point=issue.estimate_point_id,
name=issue.name,
description=issue.description,
description_html=issue.description_html,
description_stripped=issue.description_stripped,
description_binary=issue.description_binary,
priority=issue.priority,
start_date=issue.start_date,
target_date=issue.target_date,
assignees=list(
IssueAssignee.objects.filter(issue=issue).values_list(
"assignee_id", flat=True
)
),
sequence_id=issue.sequence_id,
labels=list(
IssueLabel.objects.filter(issue=issue).values_list(
"label_id", flat=True
)
),
sort_order=issue.sort_order,
completed_at=issue.completed_at,
archived_at=issue.archived_at,
is_draft=issue.is_draft,
external_source=issue.external_source,
external_id=issue.external_id,
type=issue.type,
last_saved_at=issue.last_saved_at,
assignees=issue.assignees,
labels=issue.labels,
cycle=cycle_issue.cycle if cycle_issue else None,
modules=Module.objects.filter(issue=issue).values_list("id", flat=True),
type=issue.type_id,
cycle=cycle_issue.cycle_id if cycle_issue else None,
modules=list(
Module.objects.filter(issue=issue).values_list("id", flat=True)
),
properties={},
meta={},
last_saved_at=timezone.now(),
owned_by=user,
)
return True
except Exception as e:
log_exception(e)
return False
class IssueDescriptionVersion(ProjectBaseModel):
issue = models.ForeignKey(
"db.Issue", on_delete=models.CASCADE, related_name="description_versions"
)
description_binary = models.BinaryField(null=True)
description_html = models.TextField(blank=True, default="<p></p>")
description_stripped = models.TextField(blank=True, null=True)
description_json = models.JSONField(default=dict, blank=True)
last_saved_at = models.DateTimeField(default=timezone.now)
owned_by = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="issue_description_versions",
)
class Meta:
verbose_name = "Issue Description Version"
verbose_name_plural = "Issue Description Versions"
db_table = "issue_description_versions"
@classmethod
def log_issue_description_version(cls, issue, user):
try:
"""
Log the issue description version
"""
cls.objects.create(
workspace_id=issue.workspace_id,
project_id=issue.project_id,
created_by_id=issue.created_by_id,
updated_by_id=issue.updated_by_id,
owned_by_id=user,
last_saved_at=timezone.now(),
issue_id=issue.id,
description_binary=issue.description_binary,
description_html=issue.description_html,
description_stripped=issue.description_stripped,
description_json=issue.description,
)
return True
except Exception as e:
log_exception(e)
return False
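
The CELERY_IMPORTS entries added later in this diff (plane.bgtasks.issue_version_sync and plane.bgtasks.issue_description_version_sync) suggest these snapshot classmethods are driven from background tasks. A minimal sketch of such a caller, assuming a Celery shared task; the task name, module, and arguments are illustrative and not the repository's actual task code:
```python
from celery import shared_task

from plane.db.models import Issue, IssueDescriptionVersion


@shared_task
def sync_issue_description_version(issue_id, user_id):
    # Snapshot the issue's current description fields into a new version row.
    issue = Issue.objects.get(pk=issue_id)
    # The classmethod returns True on success and False after logging the
    # exception, so the task simply propagates that result.
    return IssueDescriptionVersion.log_issue_description_version(issue, user_id)
```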

View File

@@ -90,7 +90,7 @@ class PageLog(BaseModel):
page = models.ForeignKey(Page, related_name="page_log", on_delete=models.CASCADE)
entity_identifier = models.UUIDField(null=True)
entity_name = models.CharField(
max_length=30, choices=TYPE_CHOICES, verbose_name="Transaction Type"
max_length=30, verbose_name="Transaction Type"
)
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="workspace_page_log"

View File

@@ -0,0 +1,48 @@
# Django imports
from django.conf import settings
from django.db import models
# Module imports
from .base import BaseModel
class Sticky(BaseModel):
name = models.TextField()
description = models.JSONField(blank=True, default=dict)
description_html = models.TextField(blank=True, default="<p></p>")
description_stripped = models.TextField(blank=True, null=True)
description_binary = models.BinaryField(null=True)
logo_props = models.JSONField(default=dict)
color = models.CharField(max_length=255, blank=True, null=True)
background_color = models.CharField(max_length=255, blank=True, null=True)
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="stickies"
)
owner = models.ForeignKey(
settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="stickies"
)
sort_order = models.FloatField(default=65535)
class Meta:
verbose_name = "Sticky"
verbose_name_plural = "Stickies"
db_table = "stickies"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
if self._state.adding:
# Get the maximum sort_order value for this workspace from the database
last_id = Sticky.objects.filter(workspace=self.workspace).aggregate(
largest=models.Max("sort_order")
)["largest"]
# if last_id is not None
if last_id is not None:
self.sort_order = last_id + 10000
super(Sticky, self).save(*args, **kwargs)
def __str__(self):
return str(self.name)
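
A short illustrative sketch of the sort_order behaviour introduced in Sticky.save(): the first sticky in a workspace keeps the 65535 default, and each subsequent sticky lands 10000 past the workspace's current maximum. The workspace and user objects are assumed to already exist:
```python
from plane.db.models import Sticky

# Assumes an existing Workspace instance `workspace` and User instance `user`.
first = Sticky.objects.create(name="Ideas", workspace=workspace, owner=user)
second = Sticky.objects.create(name="Follow-ups", workspace=workspace, owner=user)

first.sort_order   # 65535.0 (default; no earlier sticky in the workspace)
second.sort_order  # 75535.0 (current maximum + 10000)
```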

View File

@@ -26,6 +26,14 @@ def get_default_onboarding():
}
def get_mobile_default_onboarding():
return {
"profile_complete": False,
"workspace_create": False,
"workspace_join": False,
}
class User(AbstractBaseUser, PermissionsMixin):
id = models.UUIDField(
default=uuid.uuid4, unique=True, editable=False, db_index=True, primary_key=True
@@ -178,6 +186,12 @@ class Profile(TimeAuditModel):
billing_address = models.JSONField(null=True)
has_billing_address = models.BooleanField(default=False)
company_name = models.CharField(max_length=255, blank=True)
# mobile
is_mobile_onboarded = models.BooleanField(default=False)
mobile_onboarding_step = models.JSONField(default=get_mobile_default_onboarding)
mobile_timezone_auto_set = models.BooleanField(default=False)
# language
language = models.CharField(max_length=255, default="en")
class Meta:
verbose_name = "Profile"

View File

@@ -47,11 +47,18 @@ class Webhook(BaseModel):
return f"{self.workspace.slug} {self.url}"
class Meta:
unique_together = ["workspace", "url"]
unique_together = ["workspace", "url", "deleted_at"]
verbose_name = "Webhook"
verbose_name_plural = "Webhooks"
db_table = "webhooks"
ordering = ("-created_at",)
constraints = [
models.UniqueConstraint(
fields=["workspace", "url"],
condition=models.Q(deleted_at__isnull=True),
name="webhook_url_unique_url_when_deleted_at_null",
)
]
class WebhookLog(BaseModel):
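
Replacing the plain unique_together with a UniqueConstraint filtered on deleted_at__isnull=True means deduplication now applies only to live webhooks, so a soft-deleted webhook no longer blocks re-registering the same URL. A hedged sketch of that behaviour (the workspace object, and that no other required fields are missing, are assumptions):
```python
from django.db import IntegrityError
from django.utils import timezone

from plane.db.models import Webhook

# One live webhook per (workspace, url) is allowed.
Webhook.objects.create(workspace=workspace, url="https://example.com/hook")

try:
    # A second live webhook with the same workspace + url now violates
    # webhook_url_unique_url_when_deleted_at_null.
    Webhook.objects.create(workspace=workspace, url="https://example.com/hook")
except IntegrityError:
    pass

# After a soft delete, the same URL can be registered again.
Webhook.objects.filter(url="https://example.com/hook").update(deleted_at=timezone.now())
Webhook.objects.create(workspace=workspace, url="https://example.com/hook")
```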

View File

@@ -322,3 +322,23 @@ class WorkspaceUserProperties(BaseModel):
def __str__(self):
return f"{self.workspace.name} {self.user.email}"
class WorkspaceUserLink(WorkspaceBaseModel):
title = models.CharField(max_length=255, null=True, blank=True)
url = models.TextField()
metadata = models.JSONField(default=dict)
owner = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="owner_workspace_user_link",
)
class Meta:
verbose_name = "Workspace User Link"
verbose_name_plural = "Workspace User Links"
db_table = "workspace_user_links"
ordering = ("-created_at",)
def __str__(self):
return f"{self.workspace.id} {self.url}"

View File

@@ -262,6 +262,9 @@ CELERY_IMPORTS = (
"plane.license.bgtasks.tracer",
# management tasks
"plane.bgtasks.dummy_data_task",
# issue version tasks
"plane.bgtasks.issue_version_sync",
"plane.bgtasks.issue_description_version_sync",
)
# Sentry Settings

View File

@@ -91,6 +91,7 @@ def issue_on_results(issues, group_by, sub_group_by):
Case(
When(
votes__isnull=False,
votes__deleted_at__isnull=True,
then=JSONObject(
vote=F("votes__vote"),
actor_details=JSONObject(
@@ -117,13 +118,14 @@ def issue_on_results(issues, group_by, sub_group_by):
default=None,
output_field=JSONField(),
),
filter=Q(votes__isnull=False),
filter=Q(votes__isnull=False, votes__deleted_at__isnull=True),
distinct=True,
),
reaction_items=ArrayAgg(
Case(
When(
issue_reactions__isnull=False,
issue_reactions__deleted_at__isnull=True,
then=JSONObject(
reaction=F("issue_reactions__reaction"),
actor_details=JSONObject(
@@ -150,7 +152,7 @@ def issue_on_results(issues, group_by, sub_group_by):
default=None,
output_field=JSONField(),
),
filter=Q(issue_reactions__isnull=False),
filter=Q(issue_reactions__isnull=False, issue_reactions__deleted_at__isnull=True),
distinct=True,
),
).values(*required_fields, "vote_items", "reaction_items")

View File

@@ -86,7 +86,13 @@ class EntityAssetEndpoint(BaseAPIView):
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp"]
allowed_types = [
"image/jpeg",
"image/png",
"image/webp",
"image/jpg",
"image/gif",
]
if type not in allowed_types:
return Response(
{

View File

@@ -701,6 +701,7 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
Case(
When(
votes__isnull=False,
votes__deleted_at__isnull=True,
then=JSONObject(
vote=F("votes__vote"),
actor_details=JSONObject(
@@ -732,7 +733,11 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
output_field=JSONField(),
),
filter=Case(
When(votes__isnull=False, then=True),
When(
votes__isnull=False,
votes__deleted_at__isnull=True,
then=True,
),
default=False,
output_field=JSONField(),
),
@@ -742,6 +747,7 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
Case(
When(
issue_reactions__isnull=False,
issue_reactions__deleted_at__isnull=True,
then=JSONObject(
reaction=F("issue_reactions__reaction"),
actor_details=JSONObject(
@@ -775,7 +781,11 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
output_field=JSONField(),
),
filter=Case(
When(issue_reactions__isnull=False, then=True),
When(
issue_reactions__isnull=False,
issue_reactions__deleted_at__isnull=True,
then=True,
),
default=False,
output_field=JSONField(),
),

View File

@@ -0,0 +1,100 @@
import pytz
from plane.db.models import Project
from datetime import datetime, time
from datetime import timedelta
def user_timezone_converter(queryset, datetime_fields, user_timezone):
# Create a timezone object for the user's timezone
user_tz = pytz.timezone(user_timezone)
# Check if queryset is a dictionary (single item) or a list of dictionaries
if isinstance(queryset, dict):
queryset_values = [queryset]
else:
queryset_values = list(queryset)
# Iterate over the dictionaries in the list
for item in queryset_values:
# Iterate over the datetime fields
for field in datetime_fields:
# Convert the datetime field to the user's timezone
if field in item and item[field]:
item[field] = item[field].astimezone(user_tz)
# If queryset was a single item, return a single item
if isinstance(queryset, dict):
return queryset_values[0]
else:
return queryset_values
def convert_to_utc(date, project_id, is_start_date=False):
"""
Converts a start date string to the project's local timezone at 12:00 AM
and then converts it to UTC for storage.
Args:
date (str): The date string in "YYYY-MM-DD" format.
project_id (int): The project's ID to fetch the associated timezone.
Returns:
datetime: The UTC datetime.
"""
# Retrieve the project's timezone using the project ID
project = Project.objects.get(id=project_id)
project_timezone = project.timezone
if not date or not project_timezone:
raise ValueError("Both date and timezone must be provided.")
# Parse the string into a date object
start_date = datetime.strptime(date, "%Y-%m-%d").date()
# Get the project's timezone
local_tz = pytz.timezone(project_timezone)
# Combine the date with 12:00 AM time
local_datetime = datetime.combine(start_date, time.min)
# Localize the datetime to the project's timezone
localized_datetime = local_tz.localize(local_datetime)
# If it's a start date, add one minute
if is_start_date:
localized_datetime += timedelta(minutes=1)
# Convert the localized datetime to UTC
utc_datetime = localized_datetime.astimezone(pytz.utc)
# Return the UTC datetime for storage
return utc_datetime
def convert_utc_to_project_timezone(utc_datetime, project_id):
"""
Converts a UTC datetime (stored in the database) to the project's local timezone.
Args:
utc_datetime (datetime): The UTC datetime to be converted.
project_id (int): The project's ID to fetch the associated timezone.
Returns:
datetime: The datetime in the project's local timezone.
"""
# Retrieve the project's timezone using the project ID
project = Project.objects.get(id=project_id)
project_timezone = project.timezone
if not project_timezone:
raise ValueError("Project timezone must be provided.")
# Get the timezone object for the project's timezone
local_tz = pytz.timezone(project_timezone)
# Convert the UTC datetime to the project's local timezone
if utc_datetime.tzinfo is None:
# Localize UTC datetime if it's naive (i.e., without timezone info)
utc_datetime = pytz.utc.localize(utc_datetime)
# Convert to the project's local timezone
local_datetime = utc_datetime.astimezone(local_tz)
return local_datetime
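
A hedged usage sketch for the new date helpers; the module path plane.utils.timezone_converter and a project timezone of Asia/Kolkata (UTC+05:30) are assumptions. A "2024-12-24" start date becomes 2024-12-23 18:31 UTC (midnight local plus the one-minute start-date offset), and converting back yields the project-local datetime:
```python
from plane.utils.timezone_converter import (
    convert_to_utc,
    convert_utc_to_project_timezone,
)

# `project_id` is assumed to reference a project whose timezone is Asia/Kolkata.
utc_start = convert_to_utc("2024-12-24", project_id, is_start_date=True)
print(utc_start)  # 2024-12-23 18:31:00+00:00

local_start = convert_utc_to_project_timezone(utc_start, project_id)
print(local_start)  # 2024-12-24 00:01:00+05:30
```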

View File

@@ -1,26 +0,0 @@
import pytz
def user_timezone_converter(queryset, datetime_fields, user_timezone):
# Create a timezone object for the user's timezone
user_tz = pytz.timezone(user_timezone)
# Check if queryset is a dictionary (single item) or a list of dictionaries
if isinstance(queryset, dict):
queryset_values = [queryset]
else:
queryset_values = list(queryset)
# Iterate over the dictionaries in the list
for item in queryset_values:
# Iterate over the datetime fields
for field in datetime_fields:
# Convert the datetime field to the user's timezone
if field in item and item[field]:
item[field] = item[field].astimezone(user_tz)
# If queryset was a single item, return a single item
if isinstance(queryset, dict):
return queryset_values[0]
else:
return queryset_values

View File

@@ -70,7 +70,7 @@
"value": ""
},
"GITHUB_CLIENT_SECRET": {
"description": "Github Client Secret",
"description": "GitHub Client Secret",
"value": ""
},
"NEXT_PUBLIC_API_BASE_URL": {

View File

@@ -62,7 +62,7 @@ mkdir plane-selfhost
cd plane-selfhost
curl -fsSL -o setup.sh https://raw.githubusercontent.com/makeplane/plane/master/deploy/selfhost/install.sh
curl -fsSL -o setup.sh https://github.com/makeplane/plane/releases/latest/download/setup.sh
chmod +x setup.sh
```

View File

@@ -1,54 +1,63 @@
x-app-env: &app-env
environment:
- NGINX_PORT=${NGINX_PORT:-80}
- WEB_URL=${WEB_URL:-http://localhost}
- DEBUG=${DEBUG:-0}
- SENTRY_DSN=${SENTRY_DSN:-""}
- SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT:-"production"}
- CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS:-}
# Gunicorn Workers
- GUNICORN_WORKERS=${GUNICORN_WORKERS:-1}
#DB SETTINGS
- PGHOST=${PGHOST:-plane-db}
- PGDATABASE=${PGDATABASE:-plane}
- POSTGRES_USER=${POSTGRES_USER:-plane}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-plane}
- POSTGRES_DB=${POSTGRES_DB:-plane}
- POSTGRES_PORT=${POSTGRES_PORT:-5432}
- PGDATA=${PGDATA:-/var/lib/postgresql/data}
- DATABASE_URL=${DATABASE_URL:-postgresql://plane:plane@plane-db/plane}
# REDIS SETTINGS
- REDIS_HOST=${REDIS_HOST:-plane-redis}
- REDIS_PORT=${REDIS_PORT:-6379}
- REDIS_URL=${REDIS_URL:-redis://plane-redis:6379/}
x-db-env: &db-env
PGHOST: ${PGHOST:-plane-db}
PGDATABASE: ${PGDATABASE:-plane}
POSTGRES_USER: ${POSTGRES_USER:-plane}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-plane}
POSTGRES_DB: ${POSTGRES_DB:-plane}
POSTGRES_PORT: ${POSTGRES_PORT:-5432}
PGDATA: ${PGDATA:-/var/lib/postgresql/data}
x-redis-env: &redis-env
REDIS_HOST: ${REDIS_HOST:-plane-redis}
REDIS_PORT: ${REDIS_PORT:-6379}
REDIS_URL: ${REDIS_URL:-redis://plane-redis:6379/}
x-minio-env: &minio-env
MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID:-access-key}
MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY:-secret-key}
x-aws-s3-env: &aws-s3-env
AWS_REGION: ${AWS_REGION:-}
AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID:-access-key}
AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY:-secret-key}
AWS_S3_ENDPOINT_URL: ${AWS_S3_ENDPOINT_URL:-http://plane-minio:9000}
AWS_S3_BUCKET_NAME: ${AWS_S3_BUCKET_NAME:-uploads}
x-proxy-env: &proxy-env
NGINX_PORT: ${NGINX_PORT:-80}
BUCKET_NAME: ${AWS_S3_BUCKET_NAME:-uploads}
FILE_SIZE_LIMIT: ${FILE_SIZE_LIMIT:-5242880}
x-mq-env: &mq-env
# RabbitMQ Settings
RABBITMQ_HOST: ${RABBITMQ_HOST:-plane-mq}
RABBITMQ_PORT: ${RABBITMQ_PORT:-5672}
RABBITMQ_DEFAULT_USER: ${RABBITMQ_USER:-plane}
RABBITMQ_DEFAULT_PASS: ${RABBITMQ_PASSWORD:-plane}
RABBITMQ_DEFAULT_VHOST: ${RABBITMQ_VHOST:-plane}
RABBITMQ_VHOST: ${RABBITMQ_VHOST:-plane}
x-live-env: &live-env
API_BASE_URL: ${API_BASE_URL:-http://api:8000}
x-app-env: &app-env
WEB_URL: ${WEB_URL:-http://localhost}
DEBUG: ${DEBUG:-0}
SENTRY_DSN: ${SENTRY_DSN}
SENTRY_ENVIRONMENT: ${SENTRY_ENVIRONMENT:-production}
CORS_ALLOWED_ORIGINS: ${CORS_ALLOWED_ORIGINS}
GUNICORN_WORKERS: 1
USE_MINIO: ${USE_MINIO:-1}
DATABASE_URL: ${DATABASE_URL:-postgresql://plane:plane@plane-db/plane}
SECRET_KEY: ${SECRET_KEY:-60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5}
ADMIN_BASE_URL: ${ADMIN_BASE_URL}
SPACE_BASE_URL: ${SPACE_BASE_URL}
APP_BASE_URL: ${APP_BASE_URL}
AMQP_URL: ${AMQP_URL:-amqp://plane:plane@plane-mq:5672/plane}
# RabbitMQ Settings
- RABBITMQ_HOST=${RABBITMQ_HOST:-plane-mq}
- RABBITMQ_PORT=${RABBITMQ_PORT:-5672}
- RABBITMQ_DEFAULT_USER=${RABBITMQ_USER:-plane}
- RABBITMQ_DEFAULT_PASS=${RABBITMQ_PASSWORD:-plane}
- RABBITMQ_DEFAULT_VHOST=${RABBITMQ_VHOST:-plane}
- RABBITMQ_VHOST=${RABBITMQ_VHOST:-plane}
- AMQP_URL=${AMQP_URL:-amqp://plane:plane@plane-mq:5672/plane}
# Application secret
- SECRET_KEY=${SECRET_KEY:-60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5}
# DATA STORE SETTINGS
- USE_MINIO=${USE_MINIO:-1}
- AWS_REGION=${AWS_REGION:-}
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID:-"access-key"}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY:-"secret-key"}
- AWS_S3_ENDPOINT_URL=${AWS_S3_ENDPOINT_URL:-http://plane-minio:9000}
- AWS_S3_BUCKET_NAME=${AWS_S3_BUCKET_NAME:-uploads}
- MINIO_ROOT_USER=${MINIO_ROOT_USER:-"access-key"}
- MINIO_ROOT_PASSWORD=${MINIO_ROOT_PASSWORD:-"secret-key"}
- BUCKET_NAME=${BUCKET_NAME:-uploads}
- FILE_SIZE_LIMIT=${FILE_SIZE_LIMIT:-5242880}
# Live server env
- API_BASE_URL=${API_BASE_URL:-http://api:8000}
services:
web:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-frontend:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -61,7 +70,6 @@ services:
- worker
space:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-space:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -75,7 +83,6 @@ services:
- web
admin:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-admin:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -88,12 +95,13 @@ services:
- web
live:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-live:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
restart: unless-stopped
command: node live/dist/server.js live
environment:
<<: [ *live-env ]
deploy:
replicas: ${LIVE_REPLICAS:-1}
depends_on:
@@ -101,7 +109,6 @@ services:
- web
api:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -111,14 +118,14 @@ services:
replicas: ${API_REPLICAS:-1}
volumes:
- logs_api:/code/plane/logs
environment:
<<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
depends_on:
- plane-db
- plane-redis
- plane-mq
worker:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -126,6 +133,8 @@ services:
command: ./bin/docker-entrypoint-worker.sh
volumes:
- logs_worker:/code/plane/logs
environment:
<<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
depends_on:
- api
- plane-db
@@ -133,7 +142,6 @@ services:
- plane-mq
beat-worker:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -141,6 +149,8 @@ services:
command: ./bin/docker-entrypoint-beat.sh
volumes:
- logs_beat-worker:/code/plane/logs
environment:
<<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
depends_on:
- api
- plane-db
@@ -148,7 +158,6 @@ services:
- plane-mq
migrator:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
@@ -156,21 +165,23 @@ services:
command: ./bin/docker-entrypoint-migrator.sh
volumes:
- logs_migrator:/code/plane/logs
environment:
<<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
depends_on:
- plane-db
- plane-redis
plane-db:
<<: *app-env
image: postgres:15.7-alpine
pull_policy: if_not_present
restart: unless-stopped
command: postgres -c 'max_connections=1000'
environment:
<<: *db-env
volumes:
- pgdata:/var/lib/postgresql/data
plane-redis:
<<: *app-env
image: valkey/valkey:7.2.5-alpine
pull_policy: if_not_present
restart: unless-stopped
@@ -178,30 +189,33 @@ services:
- redisdata:/data
plane-mq:
<<: *app-env
image: rabbitmq:3.13.6-management-alpine
restart: always
environment:
<<: *mq-env
volumes:
- rabbitmq_data:/var/lib/rabbitmq
plane-minio:
<<: *app-env
image: minio/minio:latest
pull_policy: if_not_present
restart: unless-stopped
command: server /export --console-address ":9090"
environment:
<<: *minio-env
volumes:
- uploads:/export
# Comment this if you already have a reverse proxy running
proxy:
<<: *app-env
image: ${DOCKERHUB_USER:-makeplane}/plane-proxy:${APP_RELEASE:-stable}
platform: ${DOCKER_PLATFORM:-}
pull_policy: if_not_present
restart: unless-stopped
ports:
- ${NGINX_PORT}:80
environment:
<<: *proxy-env
depends_on:
- web
- api

View File

@@ -4,9 +4,12 @@ BRANCH=${BRANCH:-master}
SCRIPT_DIR=$PWD
SERVICE_FOLDER=plane-app
PLANE_INSTALL_DIR=$PWD/$SERVICE_FOLDER
export APP_RELEASE="stable"
export APP_RELEASE=stable
export DOCKERHUB_USER=makeplane
export PULL_POLICY=${PULL_POLICY:-if_not_present}
export GH_REPO=makeplane/plane
export RELEASE_DOWNLOAD_URL="https://github.com/$GH_REPO/releases/download"
export FALLBACK_DOWNLOAD_URL="https://raw.githubusercontent.com/$GH_REPO/$BRANCH/deploy/selfhost"
CPU_ARCH=$(uname -m)
OS_NAME=$(uname)
@@ -16,13 +19,6 @@ mkdir -p $PLANE_INSTALL_DIR/archive
DOCKER_FILE_PATH=$PLANE_INSTALL_DIR/docker-compose.yaml
DOCKER_ENV_PATH=$PLANE_INSTALL_DIR/plane.env
SED_PREFIX=()
if [ "$OS_NAME" == "Darwin" ]; then
SED_PREFIX=("-i" "")
else
SED_PREFIX=("-i")
fi
function print_header() {
clear
@@ -59,6 +55,17 @@ function spinner() {
printf " \b\b\b\b" >&2
}
function checkLatestRelease(){
echo "Checking for the latest release..." >&2
local latest_release=$(curl -s https://api.github.com/repos/$GH_REPO/releases/latest | grep -o '"tag_name": "[^"]*"' | sed 's/"tag_name": "//;s/"//g')
if [ -z "$latest_release" ]; then
echo "Failed to check for the latest release. Exiting..." >&2
exit 1
fi
echo $latest_release
}
function initialize(){
printf "Please wait while we check the availability of Docker images for the selected release ($APP_RELEASE) with ${UPPER_CPU_ARCH} support." >&2
@@ -130,8 +137,12 @@ function updateEnvFile() {
echo "$key=$value" >> "$file"
return
else
# if key exists, update the value
sed "${SED_PREFIX[@]}" "s/^$key=.*/$key=$value/g" "$file"
if [ "$OS_NAME" == "Darwin" ]; then
value=$(echo "$value" | sed 's/|/\\|/g')
sed -i '' "s|^$key=.*|$key=$value|g" "$file"
else
sed -i "s/^$key=.*/$key=$value/g" "$file"
fi
fi
else
echo "File not found: $file"
@@ -182,7 +193,7 @@ function buildYourOwnImage(){
local PLANE_TEMP_CODE_DIR=~/tmp/plane
rm -rf $PLANE_TEMP_CODE_DIR
mkdir -p $PLANE_TEMP_CODE_DIR
REPO=https://github.com/makeplane/plane.git
REPO=https://github.com/$GH_REPO.git
git clone "$REPO" "$PLANE_TEMP_CODE_DIR" --branch "$BRANCH" --single-branch --depth 1
cp "$PLANE_TEMP_CODE_DIR/deploy/selfhost/build.yml" "$PLANE_TEMP_CODE_DIR/build.yml"
@@ -204,6 +215,10 @@ function install() {
echo "Begin Installing Plane"
echo ""
if [ "$APP_RELEASE" == "stable" ]; then
export APP_RELEASE=$(checkLatestRelease)
fi
local build_image=$(initialize)
if [ "$build_image" == "build" ]; then
@@ -232,8 +247,49 @@ function download() {
mv $PLANE_INSTALL_DIR/docker-compose.yaml $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml
fi
curl -H 'Cache-Control: no-cache, no-store' -s -o $PLANE_INSTALL_DIR/docker-compose.yaml https://raw.githubusercontent.com/makeplane/plane/$BRANCH/deploy/selfhost/docker-compose.yml?$(date +%s)
curl -H 'Cache-Control: no-cache, no-store' -s -o $PLANE_INSTALL_DIR/variables-upgrade.env https://raw.githubusercontent.com/makeplane/plane/$BRANCH/deploy/selfhost/variables.env?$(date +%s)
RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$RELEASE_DOWNLOAD_URL/$APP_RELEASE/docker-compose.yml?$(date +%s)")
BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
if [ "$STATUS" -eq 200 ]; then
echo "$BODY" > $PLANE_INSTALL_DIR/docker-compose.yaml
else
# Fallback to download from the raw github url
RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$FALLBACK_DOWNLOAD_URL/docker-compose.yml?$(date +%s)")
BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
if [ "$STATUS" -eq 200 ]; then
echo "$BODY" > $PLANE_INSTALL_DIR/docker-compose.yaml
else
echo "Failed to download docker-compose.yml. HTTP Status: $STATUS"
echo "URL: $RELEASE_DOWNLOAD_URL/$APP_RELEASE/docker-compose.yml"
mv $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml $PLANE_INSTALL_DIR/docker-compose.yaml
exit 1
fi
fi
RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$RELEASE_DOWNLOAD_URL/$APP_RELEASE/variables.env?$(date +%s)")
BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
if [ "$STATUS" -eq 200 ]; then
echo "$BODY" > $PLANE_INSTALL_DIR/variables-upgrade.env
else
# Fallback to download from the raw github url
RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$FALLBACK_DOWNLOAD_URL/variables.env?$(date +%s)")
BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
if [ "$STATUS" -eq 200 ]; then
echo "$BODY" > $PLANE_INSTALL_DIR/variables-upgrade.env
else
echo "Failed to download variables.env. HTTP Status: $STATUS"
echo "URL: $RELEASE_DOWNLOAD_URL/$APP_RELEASE/variables.env"
mv $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml $PLANE_INSTALL_DIR/docker-compose.yaml
exit 1
fi
fi
if [ -f "$DOCKER_ENV_PATH" ];
then
@@ -335,6 +391,34 @@ function restartServices() {
startServices
}
function upgrade() {
local latest_release=$(checkLatestRelease)
echo ""
echo "Current release: $APP_RELEASE"
if [ "$latest_release" == "$APP_RELEASE" ]; then
echo ""
echo "You are already using the latest release"
exit 0
fi
echo "Latest release: $latest_release"
echo ""
# Check for confirmation to upgrade
echo "Do you want to upgrade to the latest release ($latest_release)?"
read -p "Continue? [y/N]: " confirm
if [[ ! "$confirm" =~ ^[Yy]$ ]]; then
echo "Exiting..."
exit 0
fi
export APP_RELEASE=$latest_release
echo "Upgrading Plane to the latest release..."
echo ""
echo "***** STOPPING SERVICES ****"
stopServices

View File

@@ -47,9 +47,6 @@ AWS_ACCESS_KEY_ID=access-key
AWS_SECRET_ACCESS_KEY=secret-key
AWS_S3_ENDPOINT_URL=http://plane-minio:9000
AWS_S3_BUCKET_NAME=uploads
MINIO_ROOT_USER=access-key
MINIO_ROOT_PASSWORD=secret-key
BUCKET_NAME=uploads
FILE_SIZE_LIMIT=5242880
# Gunicorn Workers

live/.prettierignore
View File

@@ -0,0 +1,6 @@
.next
.turbo
out/
dist/
build/
node_modules/

live/.prettierrc
View File

@@ -0,0 +1,5 @@
{
"printWidth": 120,
"tabWidth": 2,
"trailingComma": "es5"
}

View File

@@ -20,6 +20,7 @@
"@hocuspocus/extension-logger": "^2.11.3",
"@hocuspocus/extension-redis": "^2.13.5",
"@hocuspocus/server": "^2.11.3",
"@plane/constants": "*",
"@plane/editor": "*",
"@plane/types": "*",
"@sentry/node": "^8.28.0",

View File

@@ -0,0 +1,3 @@
export enum AI_EDITOR_TASKS {
ASK_ANYTHING = "ASK_ANYTHING",
}

View File

@@ -1,3 +1,36 @@
export enum E_PASSWORD_STRENGTH {
EMPTY = "empty",
LENGTH_NOT_VALID = "length_not_valid",
STRENGTH_NOT_VALID = "strength_not_valid",
STRENGTH_VALID = "strength_valid",
}
export const PASSWORD_MIN_LENGTH = 8;
export const SPACE_PASSWORD_CRITERIA = [
{
key: "min_8_char",
label: "Min 8 characters",
isCriteriaValid: (password: string) =>
password.length >= PASSWORD_MIN_LENGTH,
},
// {
// key: "min_1_upper_case",
// label: "Min 1 upper-case letter",
// isCriteriaValid: (password: string) => PASSWORD_NUMBER_REGEX.test(password),
// },
// {
// key: "min_1_number",
// label: "Min 1 number",
// isCriteriaValid: (password: string) => PASSWORD_CHAR_CAPS_REGEX.test(password),
// },
// {
// key: "min_1_special_char",
// label: "Min 1 special character",
// isCriteriaValid: (password: string) => PASSWORD_SPECIAL_CHAR_REGEX.test(password),
// },
];
export enum EAuthPageTypes {
PUBLIC = "PUBLIC",
NON_AUTHENTICATED = "NON_AUTHENTICATED",
@@ -6,6 +39,14 @@ export enum EAuthPageTypes {
AUTHENTICATED = "AUTHENTICATED",
}
export enum EPageTypes {
INIT = "INIT",
PUBLIC = "PUBLIC",
NON_AUTHENTICATED = "NON_AUTHENTICATED",
ONBOARDING = "ONBOARDING",
AUTHENTICATED = "AUTHENTICATED",
}
export enum EAuthModes {
SIGN_IN = "SIGN_IN",
SIGN_UP = "SIGN_UP",
@@ -17,15 +58,35 @@ export enum EAuthSteps {
UNIQUE_CODE = "UNIQUE_CODE",
}
// TODO: remove this
export enum EErrorAlertType {
BANNER_ALERT = "BANNER_ALERT",
TOAST_ALERT = "TOAST_ALERT",
INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
INLINE_EMAIL = "INLINE_EMAIL",
INLINE_PASSWORD = "INLINE_PASSWORD",
INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
}
export type TAuthErrorInfo = {
type: EErrorAlertType;
code: EAdminAuthErrorCodes;
title: string;
message: any;
};
export enum EAdminAuthErrorCodes {
// Admin
ADMIN_ALREADY_EXIST = "5150",
REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME = "5155",
INVALID_ADMIN_EMAIL = "5160",
INVALID_ADMIN_PASSWORD = "5165",
REQUIRED_ADMIN_EMAIL_PASSWORD = "5170",
ADMIN_AUTHENTICATION_FAILED = "5175",
ADMIN_USER_ALREADY_EXIST = "5180",
ADMIN_USER_DOES_NOT_EXIST = "5185",
ADMIN_USER_DEACTIVATED = "5190",
}
export enum EAuthErrorCodes {
// Global
INSTANCE_NOT_CONFIGURED = "5000",
@@ -74,7 +135,7 @@ export enum EAuthErrorCodes {
INCORRECT_OLD_PASSWORD = "5135",
MISSING_PASSWORD = "5138",
INVALID_NEW_PASSWORD = "5140",
// set passowrd
// set password
PASSWORD_ALREADY_SET = "5145",
// Admin
ADMIN_ALREADY_EXIST = "5150",

View File

@@ -1,18 +1,25 @@
export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL || "";
// PI Base Url
export const PI_BASE_URL = process.env.NEXT_PUBLIC_PI_BASE_URL || "";
export const API_BASE_PATH = process.env.NEXT_PUBLIC_API_BASE_PATH || "/";
export const API_URL = encodeURI(`${API_BASE_URL}${API_BASE_PATH}`);
// God Mode Admin App Base Url
export const ADMIN_BASE_URL = process.env.NEXT_PUBLIC_ADMIN_BASE_URL || "";
export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "";
export const GOD_MODE_URL = encodeURI(`${ADMIN_BASE_URL}${ADMIN_BASE_PATH}/`);
export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "/";
export const GOD_MODE_URL = encodeURI(`${ADMIN_BASE_URL}${ADMIN_BASE_PATH}`);
// Publish App Base Url
export const SPACE_BASE_URL = process.env.NEXT_PUBLIC_SPACE_BASE_URL || "";
export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "";
export const SITES_URL = encodeURI(`${SPACE_BASE_URL}${SPACE_BASE_PATH}/`);
export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "/";
export const SITES_URL = encodeURI(`${SPACE_BASE_URL}${SPACE_BASE_PATH}`);
// Live App Base Url
export const LIVE_BASE_URL = process.env.NEXT_PUBLIC_LIVE_BASE_URL || "";
export const LIVE_BASE_PATH = process.env.NEXT_PUBLIC_LIVE_BASE_PATH || "";
export const LIVE_URL = encodeURI(`${LIVE_BASE_URL}${LIVE_BASE_PATH}/`);
export const LIVE_BASE_PATH = process.env.NEXT_PUBLIC_LIVE_BASE_PATH || "/";
export const LIVE_URL = encodeURI(`${LIVE_BASE_URL}${LIVE_BASE_PATH}`);
// Web App Base Url
export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL || "";
export const WEB_BASE_PATH = process.env.NEXT_PUBLIC_WEB_BASE_PATH || "/";
export const WEB_URL = encodeURI(`${WEB_BASE_URL}${WEB_BASE_PATH}`);
// plane website url
export const WEBSITE_URL =
process.env.NEXT_PUBLIC_WEBSITE_URL || "https://plane.so";
// support email
export const SUPPORT_EMAIL =
process.env.NEXT_PUBLIC_SUPPORT_EMAIL || "support@plane.so";

View File

@@ -1,4 +1,11 @@
export * from "./ai";
export * from "./auth";
export * from "./endpoints";
export * from "./file";
export * from "./instance";
export * from "./issue";
export * from "./metadata";
export * from "./state";
export * from "./swr";
export * from "./user";
export * from "./workspace";

View File

@@ -1,5 +1,25 @@
import { List, Kanban } from "lucide-react";
export const ALL_ISSUES = "All Issues";
export type TIssuePriorities = "urgent" | "high" | "medium" | "low" | "none";
export type TIssueFilterKeys = "priority" | "state" | "labels";
export type TIssueLayout =
| "list"
| "kanban"
| "calendar"
| "spreadsheet"
| "gantt";
export type TIssueFilterPriorityObject = {
key: TIssuePriorities;
title: string;
className: string;
icon: string;
};
export enum EIssueGroupByToServerOptions {
"state" = "state_id",
"priority" = "priority",
@@ -11,6 +31,7 @@ export enum EIssueGroupByToServerOptions {
"target_date" = "target_date",
"project" = "project_id",
"created_by" = "created_by",
"team_project" = "project_id",
}
export enum EIssueGroupBYServerToProperty {
@@ -38,3 +59,127 @@ export enum EServerGroupByToFilterOptions {
"project_id" = "project",
"created_by" = "created_by",
}
export enum EIssueServiceType {
ISSUES = "issues",
EPICS = "epics",
}
export enum EIssueLayoutTypes {
LIST = "list",
KANBAN = "kanban",
CALENDAR = "calendar",
GANTT = "gantt_chart",
SPREADSHEET = "spreadsheet",
}
export enum EIssuesStoreType {
GLOBAL = "GLOBAL",
PROFILE = "PROFILE",
TEAM = "TEAM",
PROJECT = "PROJECT",
CYCLE = "CYCLE",
MODULE = "MODULE",
TEAM_VIEW = "TEAM_VIEW",
PROJECT_VIEW = "PROJECT_VIEW",
ARCHIVED = "ARCHIVED",
DRAFT = "DRAFT",
DEFAULT = "DEFAULT",
WORKSPACE_DRAFT = "WORKSPACE_DRAFT",
EPIC = "EPIC",
}
export enum EIssueFilterType {
FILTERS = "filters",
DISPLAY_FILTERS = "display_filters",
DISPLAY_PROPERTIES = "display_properties",
KANBAN_FILTERS = "kanban_filters",
}
export enum EIssueCommentAccessSpecifier {
EXTERNAL = "EXTERNAL",
INTERNAL = "INTERNAL",
}
export enum EIssueListRow {
HEADER = "HEADER",
ISSUE = "ISSUE",
NO_ISSUES = "NO_ISSUES",
QUICK_ADD = "QUICK_ADD",
}
export const ISSUE_DISPLAY_FILTERS_BY_LAYOUT: {
[key in TIssueLayout]: Record<"filters", TIssueFilterKeys[]>;
} = {
list: {
filters: ["priority", "state", "labels"],
},
kanban: {
filters: ["priority", "state", "labels"],
},
calendar: {
filters: ["priority", "state", "labels"],
},
spreadsheet: {
filters: ["priority", "state", "labels"],
},
gantt: {
filters: ["priority", "state", "labels"],
},
};
export const ISSUE_PRIORITIES: {
key: TIssuePriorities;
title: string;
}[] = [
{ key: "urgent", title: "Urgent" },
{ key: "high", title: "High" },
{ key: "medium", title: "Medium" },
{ key: "low", title: "Low" },
{ key: "none", title: "None" },
];
export const ISSUE_PRIORITY_FILTERS: TIssueFilterPriorityObject[] = [
{
key: "urgent",
title: "Urgent",
className: "bg-red-500 border-red-500 text-white",
icon: "error",
},
{
key: "high",
title: "High",
className: "text-orange-500 border-custom-border-300",
icon: "signal_cellular_alt",
},
{
key: "medium",
title: "Medium",
className: "text-yellow-500 border-custom-border-300",
icon: "signal_cellular_alt_2_bar",
},
{
key: "low",
title: "Low",
className: "text-green-500 border-custom-border-300",
icon: "signal_cellular_alt_1_bar",
},
{
key: "none",
title: "None",
className: "text-gray-500 border-custom-border-300",
icon: "block",
},
];
export const SITES_ISSUE_LAYOUTS: {
key: TIssueLayout;
title: string;
icon: any;
}[] = [
{ key: "list", title: "List", icon: List },
{ key: "kanban", title: "Kanban", icon: Kanban },
// { key: "calendar", title: "Calendar", icon: Calendar },
// { key: "spreadsheet", title: "Spreadsheet", icon: Sheet },
// { key: "gantt", title: "Gantt chart", icon: GanttChartSquare },
];

View File

@@ -0,0 +1,23 @@
export const SITE_NAME =
"Plane | Simple, extensible, open-source project management tool.";
export const SITE_TITLE =
"Plane | Simple, extensible, open-source project management tool.";
export const SITE_DESCRIPTION =
"Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.";
export const SITE_KEYWORDS =
"software development, plan, ship, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
export const SITE_URL = "https://app.plane.so/";
export const TWITTER_USER_NAME =
"Plane | Simple, extensible, open-source project management tool.";
// Plane Sites Metadata
export const SPACE_SITE_NAME =
"Plane Publish | Make your Plane boards and roadmaps pubic with just one-click. ";
export const SPACE_SITE_TITLE =
"Plane Publish | Make your Plane boards public with one-click";
export const SPACE_SITE_DESCRIPTION =
"Plane Publish is a customer feedback management tool built on top of plane.so";
export const SPACE_SITE_KEYWORDS =
"software development, customer feedback, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
export const SPACE_SITE_URL = "https://app.plane.so/";
export const SPACE_TWITTER_USER_NAME = "planepowers";

View File

@@ -1,4 +1,9 @@
import { TStateGroups } from "@plane/types";
export type TStateGroups =
| "backlog"
| "unstarted"
| "started"
| "completed"
| "cancelled";
export const STATE_GROUPS: {
[key in TStateGroups]: {
@@ -34,4 +39,7 @@ export const STATE_GROUPS: {
},
};
export const ARCHIVABLE_STATE_GROUPS = [STATE_GROUPS.completed.key, STATE_GROUPS.cancelled.key];
export const ARCHIVABLE_STATE_GROUPS = [
STATE_GROUPS.completed.key,
STATE_GROUPS.cancelled.key,
];

View File

@@ -1,4 +1,4 @@
export const SWR_CONFIG = {
export const DEFAULT_SWR_CONFIG = {
refreshWhenHidden: false,
revalidateIfStale: false,
revalidateOnFocus: false,

Some files were not shown because too many files have changed in this diff.