Compare commits

...

172 Commits

Author SHA1 Message Date
pablohashescobar
1bef55ca88 dev: remove append / setting in common 2023-11-18 19:05:59 +05:30
pablohashescobar
c6d6c29e51 dev: update docker environment variable 2023-11-18 18:53:12 +05:30
pablohashescobar
0bb13148b7 dev: fix access token variable error 2023-11-17 12:27:44 +05:30
pablohashescobar
6531e420fc dev: environment configuration for backend 2023-11-17 12:14:39 +05:30
pablohashescobar
93aeb3795d Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-17 11:52:18 +05:30
Manish Gupta
0a88db975a dev: Self Hosting with private repo fixes (#2787)
* fixes to self hosting

* self hosting fixes

* removed .temp

* wip

* wip

* self install private repo

* folder change

* fix

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-17 11:51:54 +05:30
pablohashescobar
5afb71ada1 dev: removed DJANGO_SETTINGS_MODULE from environment files and deleted auto bucket create script 2023-11-16 18:33:58 +05:30
pablohashescobar
a65da91969 dev: create instance activation route if the instance is not activated during startup 2023-11-16 17:52:55 +05:30
pablohashescobar
63d5951e36 Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 16:55:58 +05:30
pablohashescobar
cc158ae6da dev: fix email sending for ssl 2023-11-16 16:49:12 +05:30
pablohashescobar
3f2b890a48 dev: check magic link login 2023-11-16 16:40:16 +05:30
pablohashescobar
936535fef2 dev: instance admin unique constraints 2023-11-16 16:27:27 +05:30
pablohashescobar
29798fa08a Merge branch 'dev/settings_configuration' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 15:07:42 +05:30
Manish Gupta
dd60dec887 Dev/mg selfhosting fix (#2782)
* fixes to self hosting

* self hosting fixes

* removed .temp

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-16 14:38:55 +05:30
Bavisetti Narayan
0c1097592e fix: pages revamping (#2760)
* fix: page transaction model

* fix: page transaction model

* fix: migration and optimisation

* fix: back migration of page blocks

* fix: added issue embed

* fix: migration fixes

* fix: resolved changes
2023-11-16 14:38:12 +05:30
Anmol Singh Bhatia
bed66235f2 style: workspace sidebar dropdown improvement (#2783) 2023-11-16 14:11:33 +05:30
pablohashescobar
af5534ffe3 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 13:42:36 +05:30
pablohashescobar
46e733a2c5 dev: admin api to check if the current user is admin 2023-11-16 13:42:03 +05:30
sriram veeraghanta
c4d5ccdae4 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 12:35:23 +05:30
sriram veeraghanta
78d841d67e dev: instance admin routes and initial set of screens 2023-11-16 12:34:41 +05:30
sriram veeraghanta
95c35cde4a Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 12:33:10 +05:30
pablohashescobar
6045e659bc Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 06:46:48 +00:00
pablohashescobar
ca79739ead Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-16 06:45:57 +00:00
pablohashescobar
44a3097ed1 dev: default bucket creation script 2023-11-16 12:13:45 +05:30
pablohashescobar
55869ee994 dev: update script to use manage command 2023-11-15 20:22:14 +05:30
pablohashescobar
6102de37de dev: check email validity 2023-11-15 20:19:55 +05:30
pablohashescobar
deefdac8b4 dev: migrate instance registration and configuration to manage commands 2023-11-15 20:07:33 +05:30
pablohashescobar
1e9f4dd938 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-15 19:25:04 +05:30
pablohashescobar
d9a06a5fe3 Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-15 19:24:39 +05:30
pablohashescobar
89a5088f31 dev: update cors setup and local settings 2023-11-15 19:22:56 +05:30
pablohashescobar
9789757d2d dev: fix current site domain registration 2023-11-15 18:18:00 +05:30
pablohashescobar
2aee627616 dev: remove deprecated variables 2023-11-15 17:54:33 +05:30
Nikhil
26b1e9d5f1 dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and moved dj-database-url to local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
sriram veeraghanta
5b2964dc48 fix: updating env config in case of db values are missing 2023-11-15 16:54:56 +05:30
Bavisetti Narayan
79347ec62b feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted webhook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integration and rendered webhook list and create

* chore: handled webhook update and regenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Web hooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structuring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8 dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589 chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic user workspace and project member create when user sign in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5 fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207 Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676 Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
pablohashescobar
095f64793f dev: fix email senders 2023-11-14 20:31:29 +05:30
pablohashescobar
ccfd1c703e dev: fix instance permissions and serializer 2023-11-14 20:30:54 +05:30
pablohashescobar
d05b63fc51 dev: instance configuration values 2023-11-14 14:14:11 +00:00
Prateek Shourya
1fe09d369f style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: disclosure not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1 style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7 chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8 chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249 fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48 style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
pablohashescobar
fea812f1b1 dev: instance owner fix 2023-11-14 11:03:09 +00:00
pablohashescobar
46f2663854 dev: enable instance configuration and instance admin roles 2023-11-14 15:56:48 +05:30
pablohashescobar
1eac8ead86 dev: instance admin 2023-11-14 09:08:24 +00:00
pablohashescobar
aca0f5eadb Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-14 07:26:26 +00:00
pablohashescobar
32a8133a9f dev: auto instance registration script 2023-11-14 07:23:18 +00:00
pablohashescobar
efbf834e77 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-13 20:29:44 +05:30
pablohashescobar
4fdd6a0e26 dev: instance configuration on instance bootup 2023-11-13 20:29:09 +05:30
Nikhil
b31041726b dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90 style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171 fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed un-used variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
Anmol Singh Bhatia
00e61a8753 fix: peek overview comment ordering and comment icon alignment fix (#2753) 2023-11-10 18:45:41 +05:30
Anmol Singh Bhatia
733fed76cc fix: cycle card title responsiveness added (#2752) 2023-11-10 18:43:48 +05:30
Anmol Singh Bhatia
e78dd4b1c0 fix: app sidebar dropdown fix (#2751) 2023-11-10 18:43:16 +05:30
Anmol Singh Bhatia
d479781fce style: header consistency (#2750) 2023-11-10 18:30:43 +05:30
Bavisetti Narayan
c449b46bf4 fix: added external folder in urls (#2749) 2023-11-10 16:00:55 +05:30
Prateek Shourya
fd6430c3e3 style: Update modal appearance for UI consistency (#2747) 2023-11-10 15:48:34 +05:30
Ramesh Kumar Chandra
6f580ce2d9 fix: project settings layout render in export (#2746) 2023-11-10 13:07:18 +05:30
Ankush Deshmukh
2748133bd0 Fix: Show Priority icon in custom analytics table. (#2744) 2023-11-10 13:06:23 +05:30
Aaryan Khandelwal
884b219508 refactor: cycles store (#2716)
* refactor: cycles store

* refactor: active cycle details
2023-11-09 18:37:45 +05:30
Anmol Singh Bhatia
162faf8339 fix: date select tooltip fix (#2740) 2023-11-09 18:29:45 +05:30
Anmol Singh Bhatia
c291ff05ee fix: filter list item clear button alignment fix (#2741) 2023-11-09 18:27:19 +05:30
Nikhil
446981422e feat: issues v2 endpoint (#2713)
* feat: issue v2 listing endpoint

* dev: issues v3 endpoint

* dev: add permission in the grouped endpoint

* dev: update grouped endpoint
2023-11-09 18:24:26 +05:30
Bavisetti Narayan
630e21b954 fix: favourite cycle and modules displayed at top (#2719) 2023-11-09 18:22:38 +05:30
Bavisetti Narayan
894ffb6c21 fix: mention notification (#2670)
* fix: mention notification

* feat: updated mentions for comments in the notification background task

* feat: added subscription for issue_comment_mentions as well

* fix: removed the print statement

* fix: double notification popup for mentioned assignees

* fix: added issue subscriber

* fix: removed creator for subscribed

* fix: creator will not be subscribed to issue

* fix: double notification removed

---------

Co-authored-by: Henit Chobisa <chobisa.henit@gmail.com>
2023-11-09 18:22:13 +05:30
Nikhil
515dba02d3 chore: configuration add tracker variables (#2709)
* chore: configuration add tracker variables

* dev: unsplash configuration
2023-11-09 18:20:03 +05:30
Dakshesh Jain
34bccd7e06 refactor: gantt sidebar (#2705)
* refactor: gantt sidebar

* fix: exception error

fix: file placement

* refactor: not passing sidebar block as props
2023-11-09 17:57:41 +05:30
sriram veeraghanta
79df59f618 fix: spliting out the project members from project store and service (#2739) 2023-11-09 17:56:55 +05:30
Prateek Shourya
7676aab773 fix: UI improvements. (#2734)
* style: update check icon colors to match our design.

* fix: automatically focus input box in pages `add label` modal.
2023-11-09 17:37:45 +05:30
Prateek Shourya
8832d8e00e style: sidebar UI improvements (#2735)
* updated font weight and color as per designs.
* removed background color from workspace with logo.
* updated dropdown design.
2023-11-09 17:01:48 +05:30
rahulramesha
d733a53ea6 fix: Add horizontal scroll bar to views (#2736)
* add errors for duplicate labels

* adding horizontal scroll bar to views

---------

Co-authored-by: rahulramesha <rahul@appsmith.com>
2023-11-09 15:12:00 +05:30
Lakhan Baheti
96862e06ef fix: custom analytics bar graph index alignment (#2737) 2023-11-09 14:36:22 +05:30
pablohashescobar
081e42ced0 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-09 07:03:20 +00:00
pablohashescobar
216f07f9d8 dev: email configuration settings moved to database 2023-11-09 12:32:34 +05:30
pablohashescobar
589e29ee45 Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-09 06:16:05 +00:00
sriram veeraghanta
a6567bbce4 fix: workspace members store added and implemented across the app (#2732)
* fix: minor changes

* fix: workspace members store added and implemented across the app
2023-11-09 00:35:12 +05:30
Nikhil
556b2d2617 feat: state list endpoint (#2717)
* feat: state list endpoint

* dev: update states endpoint

* dev: mark default state endpoint
2023-11-08 22:38:53 +05:30
Lakhan Baheti
10037222b6 fix: Tooltip content on assignee hover in all layouts (#2724)
* fix: Tooltip content on assignee hover in all layouts

* chore: comments added
2023-11-08 22:35:30 +05:30
sabith-tu
931f9d288a fix: toast alert inconsistency (#2730) 2023-11-08 20:37:47 +05:30
sriram veeraghanta
20fb79567f fix: project states fixes (#2731)
* fix: project states fixes

* fix: states fixes

* fix: formatting all files
2023-11-08 20:31:46 +05:30
pablohashescobar
b90218580d dev: update instance configuration load 2023-11-08 14:21:26 +00:00
pablohashescobar
f37bae9ad9 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-08 14:13:50 +00:00
pablohashescobar
7eaf52bae5 dev: set default values 2023-11-08 19:43:09 +05:30
pablohashescobar
3f78c1bbc0 dev: instance configuration variables 2023-11-08 13:56:52 +00:00
pablohashescobar
66cb297587 dev: update instance configuration model to take null and empty values 2023-11-08 19:01:45 +05:30
Ramesh Kumar Chandra
bd1a850f35 style: kanban card label overflow (#2722)
* chore: kanban card label dropdown items overflow

* style: kanban card label text overflow, tooltip, hover cursor

* style: label overflow in list layout
2023-11-08 18:12:36 +05:30
M. Palanikannan
206f5744a3 [fix]: Error Handling for Images and Table Fix for Form Submissions in Editor (#2710)
* cancellable uploads and image limits with better error handling

* fixed table row/column picker behaviour on modals

* Merge branch 'rerender-debounce-editor-fix' into editor-draggable-nodes

* fix: added mention suggestions and highlights in `create-issue-modal`

* removed unnecessary files

* solved lint error of trailing spaces

* added plane/ui dependency for tooltips

---------

Co-authored-by: Henit Chobisa <chobisa.henit@gmail.com>
2023-11-08 18:00:53 +05:30
sabith-tu
faaba45e59 fix: all issues values not changeable and assignee image not rendering (#2707)
* fix: all issues values are not changeable and assignee image not rendering

* chore: removed console log
2023-11-08 17:56:09 +05:30
sabith-tu
f8002852e0 fix: issue property height and peek view date picker border radius (#2726) 2023-11-08 17:55:28 +05:30
Lakhan Baheti
2d71377722 fix: added empty project state when no project exists. (#2727)
* fix: added empty project state when no project exists

* fix: duplicate import
2023-11-08 17:54:59 +05:30
Lakhan Baheti
6ebee05951 fix: unwanted go back button in onboarding step 2 (#2714) 2023-11-08 17:53:06 +05:30
Lakhan Baheti
5a3bac998e fix: bug fixes and ui improvements (#2703)
* fix: gantt chart duration in decimal

* fix: Loading text instead Spinner in peek view

* fix: cycle more popover typo & icon overlapping

* fix: list layout properties alignment

* fix: project search empty state

* fix: calendar layout issue text overflow & redirection inconsistency

* style: urgent priority hover background color

* fix: Cycle issues kanban layout empty state missing

* style: custom snooze modal placeholder text color

* refactor: replaced unwanted anchor tag with div

* chore: removed empty state for cycle kanban layout
2023-11-08 17:52:34 +05:30
Anmol Singh Bhatia
4096136b44 style: ui consistency and improvement (#2725)
* style: create/update issue modal properties ui improvement

* style: create update issue modal improvement

* style: modal ui consistency
2023-11-08 17:51:32 +05:30
Aaryan Khandelwal
83e0c4ebbd chore: remove active ids from the MobX stores if not present in the route (#2681)
* chore: remove active ids if not present in the route

* refactor: set active id logic
2023-11-08 17:35:45 +05:30
Aaryan Khandelwal
df8bdfd5b9 fix: project automation settings flickering (#2680)
* fix: cycle and module sidebar z-index

* fix: project automation settings flickering
2023-11-08 17:34:42 +05:30
Ankush Deshmukh
da799b5a63 Fix: Render bar chart axis labels in lighter color when dark theme applied (#2721) 2023-11-08 17:34:09 +05:30
Dakshesh Jain
621d551c4a fix: project select validation (#2723) 2023-11-08 17:33:26 +05:30
pablohashescobar
a4f28b32f5 dev: instance configuration migration 2023-11-08 16:45:41 +05:30
pablohashescobar
39f69b7ed2 Merge branch 'feat/self_hosted_instance' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-08 11:14:26 +00:00
pablohashescobar
485e3e6a0f dev: instance configuration 2023-11-08 11:13:04 +00:00
Aaryan Khandelwal
5a84ed279d refactor: replace keyboard events with command palette store (#2688) 2023-11-08 13:51:55 +05:30
Dakshesh Jain
53e7da08e4 fix: quick add not working for labels & assignee (#2689)
* fix: quick add not working for labels & assignee

* fix: build error on spreadsheet view
2023-11-08 12:28:02 +05:30
pablohashescobar
ea54bbca49 dev: reset migrations 2023-11-08 12:20:00 +05:30
pablohashescobar
f34d508a48 dev: add default properties and issue mention migration 2023-11-08 12:16:01 +05:30
pablohashescobar
02a8caaabc Merge branch 'develop' of github.com:makeplane/plane into feat/self_hosted_instance 2023-11-08 12:12:50 +05:30
Anmol Singh Bhatia
9f206331bc style: spinner component improvement (#2708) 2023-11-07 18:35:05 +05:30
rahulramesha
b56d188a83 add errors for duplicate labels (#2706)
Co-authored-by: rahulramesha <rahul@appsmith.com>
2023-11-07 18:22:52 +05:30
Bavisetti Narayan
8d3853b129 fix: issue draft delete functionality (#2696) 2023-11-07 18:19:32 +05:30
Bavisetti Narayan
30d6235108 fix: add issues to cycles and modules (#2659)
* fix: able to add issue in cycle and module

* fix: issue activity message
2023-11-07 18:18:51 +05:30
Prateek Shourya
6e461dd8c3 style: update user profile button alignment. (#2695) 2023-11-07 18:17:44 +05:30
Nikhil
86379c51b7 dev: worker count (#2573)
* dev: workers count

* dev: update the worker count variable to GUNICORN_WORKERS
2023-11-07 18:05:38 +05:30
Manish Gupta
f7cc2eca36 dev: Modified the branch-build action yaml (#2704)
* cherrypicked code

* removed PUSH event
2023-11-07 17:41:26 +05:30
Prateek Shourya
63d1ad286b style: update font weight in project general setting section. (#2697) 2023-11-07 17:39:12 +05:30
Manish Gupta
1412c1c94a dev: modified the branch wise build (#2702)
* cherrypicked branch build code

* trigger on pull request

* branch filter

* checking branch filter

* checking push

* checking push again

* code cleanup before PR
2023-11-07 17:23:32 +05:30
sriram veeraghanta
26de35bd8d fix: environment config changes in the API are replicated in web and space app (#2699)
* fix: envconfig type changes

* chore: configuration variables  (#2692)

* chore: update avatar group logic (#2672)

* chore: configuration variables

---------

* fix: replacing slack client id with env config

---------

Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
2023-11-07 17:17:10 +05:30
Aaryan Khandelwal
1986c0dfd4 fix: state icon (#2678) 2023-11-07 16:15:34 +05:30
Anmol Singh Bhatia
25f3a5b2e4 chore: peek overview improvement and bug fixes (#2683)
* style: issue peek overview improvement

* style: peek overview improvement

* fix: subscribe issue from peek overview fix and validation added

* fix: build error
2023-11-07 15:58:42 +05:30
Anmol Singh Bhatia
f30b16e9d8 chore: user profile issue improvement (#2679)
* fix: user profile filters z-index

* chore: user profile issue state group heading fix

* fix: build error
2023-11-07 15:58:19 +05:30
Anmol Singh Bhatia
93d03f82b4 chore: spreadsheet layout improvement (#2677)
* style: spreadsheet column width fix

* style: spreadsheet label column styling

* chore: spreadsheet layout issue properties improvement

* fix: build error
2023-11-07 15:58:00 +05:30
Anmol Singh Bhatia
98974fdc50 fix: cycle and module bug fixes and improvement (#2691)
* fix: cycle and module card issue count fix

* fix: cycle and module list progress icon fix

* fix: module card progress fix

* style: cycle & module empty date label updated

* fix: build error
2023-11-07 15:14:47 +05:30
sriram veeraghanta
040563d148 fix: replacing jira importer image (#2685) 2023-11-07 14:35:04 +05:30
Nikhil
4de64f112f fix: slack project integration (#2684) 2023-11-07 14:34:30 +05:30
Ramesh Kumar Chandra
0afb900678 fix: kanban card state name and drop down items text overflow (#2686) 2023-11-07 14:31:29 +05:30
Prateek Shourya
baf17a109b style: update project description as per design. (#2682) 2023-11-07 13:13:14 +05:30
Prateek Shourya
37bf465fcd style: update border across workspace and project settings. (#2669)
* style: update border across workspace and project settings.

* update border width
2023-11-07 13:12:05 +05:30
Anmol Singh Bhatia
d8c96536f0 fix: bug fixes and ui improvement (#2674)
* chore: peekoverview edit permission updated

* chore: tab index added in create project modal

* chore: project card improvement

* style: avatar component improvement

* chore: create issue modal improvement

* style: global style sidebar border variable name fix
2023-11-06 21:08:01 +05:30
Nikhil
b372ccfdb3 fix: slack integration workflow (#2675)
* fix: slack integration workflow

* dev: add slack client id as configuration

* fix: clean up

* fix: added env to turbo

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-06 21:00:49 +05:30
guru_sainath
984b36f45a fix: in kanban, issues can be shifted between the columns in order_by (#2676) 2023-11-06 21:00:36 +05:30
Aaryan Khandelwal
46f307fed5 chore: update avatar group logic (#2672) 2023-11-06 20:43:54 +05:30
Aaryan Khandelwal
1dce72cb3c style: updated layouts UI in the space app (#2671)
* style: updated layouts UI in space

* fix: build error
2023-11-06 20:43:34 +05:30
Aaryan Khandelwal
a6dea3af23 fix: render the estimate select if estimate is enabled for the project (#2663) 2023-11-06 20:43:10 +05:30
Henit Chobisa
6eb0bf4785 Fix/mentions spaces fix (#2667)
* feat: add mentions store to the space project

* fix: added mentions highlights in read only comment cards

* feat: added mention highlights in richtexteditor in space app
2023-11-06 20:42:24 +05:30
Manish Gupta
13389d1b2b dev: On Demand Code Build for any branch (#2668)
* wip

* wip

* testing

* wip

* wip

* wip

* wip

* image push fix

* wip

* wip

* dynamic branch name and tag

* workflow_dispatch modified

* job splitting

* file sharing

* wip

* checking

* wip

* wip

* wip

* wip

* build fixes

* code upload download fixes

* image name change

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-06 19:05:20 +05:30
Aaryan Khandelwal
742143766f fix: existing issues modal for cycle and module (#2664)
* fix: existing issues modal for cycle and module

* refactor: existing issues modal code

* fix: build errors
2023-11-06 16:30:09 +05:30
sriram veeraghanta
1ed72c51df fix: package version fixes and mentions build error fixes (#2665) 2023-11-06 16:28:15 +05:30
Aaryan Khandelwal
a03e0c788f fix: notifications option in the sidebar menu not collapsing (#2662) 2023-11-06 14:53:26 +05:30
guru_sainath
0c8a867565 fix: handled drag and drop issue, gantt hover issue for issue peek overview (#2660) 2023-11-06 13:52:33 +05:30
Aaryan Khandelwal
3a07bb6060 refactor: removed unused packages (#2658) 2023-11-06 13:17:02 +05:30
Aaryan Khandelwal
bf48d93a25 fix: product tour modal bugs (#2657)
* fix: product tour

* style: product tour navigation buttons

* refactor: step logic
2023-11-06 13:06:00 +05:30
M. Palanikannan
14ac885e55 [feat]: Extended Tables Support (#2596)
* migrated table to new project structure

* fixed range errors while deleting table nodes with no nodes below and removed console logs

* fixed css for rendering table menu

* removed old table menu

* added support for read only editors as well

* text-black removed

* added design colors

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-05 18:54:00 +05:30
Henit Chobisa
f0335751b3 fix: mentions enter error (#2646)
* fix: fixed readonly lite text editor not rendering highlights

* fix: removed enter extension in lite text editor
2023-11-04 01:57:57 +05:30
Anmol Singh Bhatia
52395d0563 chore: dropdown loading state added and project card avatar fix (#2643)
* chore: project card avatar rendering fix

* chore: state, assignee and label dropdown loading state added
2023-11-04 01:56:23 +05:30
Dakshesh Jain
ad558833af refactor: archive issue (#2628)
* dev: archived issue store

* dev: archived issue layouts and store binding

* dev: archived issue detail store

* dev: is read only

* fix: archived issue delete

* fix: build error
2023-11-03 20:20:05 +05:30
sriram veeraghanta
ff258c60fd fix: open changelog in new tab (#2645) 2023-11-03 19:49:02 +05:30
Bavisetti Narayan
db2a1b8033 fix: custom analytics graph display issue (#2637)
* chore: fixed custom analytics

* chore: typo changes
2023-11-03 19:23:35 +05:30
Dakshesh Jain
91878fb3dd fix: notification read and snooze bugs (#2639)
* fix: marking notification as read doesn't remove it from unread list

* refactor: arranged imports

* fix: past snoozed notifications appearing in the snooze tab
2023-11-03 19:21:35 +05:30
Lakhan Baheti
c233e6e3b6 style: custom analytics bar graph label overlapping (#2636)
* style: custom analytics bar graph label overlapping

fix: Bar graph avatar image rendering & tooltip

* fix: import
2023-11-03 19:18:24 +05:30
Lakhan Baheti
0d2c399555 style: quick add issue UI improvements in all the layouts (#2615)
* style: quick add issue UI improvements in all the layouts

* style: ui improvements

* style: quick add icon size

* chore: static sizes to tailwind classes

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-03 19:17:50 +05:30
Aaryan Khandelwal
79cad16aba chore: update all layout selections (#2641) 2023-11-03 19:17:13 +05:30
Aaryan Khandelwal
41e9d5d7e3 chore: added missing columns to the spreadsheet layout (#2640) 2023-11-03 19:15:09 +05:30
Aaryan Khandelwal
992cf79031 chore: peek overview authorization (#2632)
* chore: peek overview authorization

* chore: comment access specifier validation
2023-11-03 19:13:10 +05:30
Aaryan Khandelwal
d48f13416f Update readme content (#2635) 2023-11-03 19:12:46 +05:30
Aaryan Khandelwal
cf19afa707 fix: string helper function (#2633) 2023-11-03 19:11:28 +05:30
sriram veeraghanta
cc26f604aa fix: next image fixes for selfhosted instances (#2642) 2023-11-03 19:09:40 +05:30
pablohashescobar
97b7a27695 dev: change license response structure 2023-10-30 20:17:30 +05:30
pablohashescobar
088615f9d6 dev: instance licenses 2023-10-30 17:38:18 +05:30
pablohashescobar
b3868423f6 feat: self hosted licensing initialize 2023-10-30 17:01:30 +05:30
pablohashescobar
1fde71c633 dev: remove migration file 0046 2023-10-30 16:33:17 +05:30
pablohashescobar
5b2a1dda72 dev: initiate licensing 2023-10-30 16:32:51 +05:30
pablohashescobar
3bfbc3a132 dev: remove default user 2023-10-30 16:28:32 +05:30
778 changed files with 19396 additions and 12437 deletions


@@ -21,15 +21,15 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80

.github/workflows/build-branch.yml (new file)

@@ -0,0 +1,213 @@
name: Branch Build
on:
pull_request:
types:
- closed
branches:
- master
- release
- qa
- develop
env:
TARGET_BRANCH: ${{ github.event.pull_request.base.ref }}
jobs:
branch_build_and_push:
if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) }}
name: Build-Push Web/Space/API/Proxy Docker Image
runs-on: ubuntu-20.04
steps:
- name: Check out the repo
uses: actions/checkout@v3.3.0
# - name: Set Target Branch Name on PR close
# if: ${{ github.event_name == 'pull_request' && github.event.action =='closed' }}
# run: echo "TARGET_BRANCH=${{ github.event.pull_request.base.ref }}" >> $GITHUB_ENV
# - name: Set Target Branch Name on other than PR close
# if: ${{ github.event_name == 'push' }}
# run: echo "TARGET_BRANCH=${{ github.ref_name }}" >> $GITHUB_ENV
- uses: ASzc/change-string-case-action@v2
id: gh_branch_upper_lower
with:
string: ${{env.TARGET_BRANCH}}
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_slash
with:
source: ${{ steps.gh_branch_upper_lower.outputs.lowercase }}
find: '/'
replace: '-'
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_dot
with:
source: ${{ steps.gh_branch_replace_slash.outputs.value }}
find: '.'
replace: ''
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_clean
with:
source: ${{ steps.gh_branch_replace_dot.outputs.value }}
find: '_'
replace: ''
- name: Uploading Proxy Source
uses: actions/upload-artifact@v3
with:
name: proxy-src-code
path: ./nginx
- name: Uploading Backend Source
uses: actions/upload-artifact@v3
with:
name: backend-src-code
path: ./apiserver
- name: Uploading Web Source
uses: actions/upload-artifact@v3
with:
name: web-src-code
path: |
./
!./apiserver
!./nginx
!./deploy
!./space
- name: Uploading Space Source
uses: actions/upload-artifact@v3
with:
name: space-src-code
path: |
./
!./apiserver
!./nginx
!./deploy
!./web
outputs:
gh_branch_name: ${{ steps.gh_branch_clean.outputs.value }}
branch_build_push_frontend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
- name: Login to Docker Hub
uses: docker/login-action@v2.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Downloading Web Source Code
uses: actions/download-artifact@v3
with:
name: web-src-code
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./web/Dockerfile.web
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_space:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
- name: Login to Docker Hub
uses: docker/login-action@v2.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Downloading Space Source Code
uses: actions/download-artifact@v3
with:
name: space-src-code
- name: Build and Push Space to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./space/Dockerfile.space
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_backend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
- name: Login to Docker Hub
uses: docker/login-action@v2.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Downloading Backend Source Code
uses: actions/download-artifact@v3
with:
name: backend-src-code
- name: Build and Push Backend to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./Dockerfile.api
platforms: linux/amd64
push: true
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_proxy:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
- name: Login to Docker Hub
uses: docker/login-action@v2.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Downloading Proxy Source Code
uses: actions/download-artifact@v3
with:
name: proxy-src-code
- name: Build and Push Plane-Proxy to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./Dockerfile
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

.gitignore

@@ -75,7 +75,8 @@ pnpm-lock.yaml
pnpm-workspace.yaml
.npmrc
.secrets
tmp/
## packages
dist
.temp/


@@ -8,8 +8,8 @@ Before submitting a new issue, please search the [issues](https://github.com/mak
While we want to fix all the [issues](https://github.com/makeplane/plane/issues), before fixing a bug we need to be able to reproduce and confirm it. Please provide us with a minimal reproduction scenario using a repository or [Gist](https://gist.github.com/). Having a live, reproducible scenario gives us the information without asking questions back & forth with additional questions like:
- 3rd-party libraries being used and their versions
- a use-case that fails
- 3rd-party libraries being used and their versions
- a use-case that fails
Without said minimal reproduction, we won't be able to investigate all [issues](https://github.com/makeplane/plane/issues), and the issue might not be resolved.
@@ -19,10 +19,10 @@ You can open a new issue with this [issue form](https://github.com/makeplane/pla
### Requirements
- Node.js version v16.18.0
- Python version 3.8+
- Postgres version v14
- Redis version v6.2.7
- Node.js version v16.18.0
- Python version 3.8+
- Postgres version v14
- Redis version v6.2.7
### Setup the project
@@ -81,8 +81,8 @@ If you would like to _implement_ it, an issue with your proposal must be submitt
To ensure consistency throughout the source code, please keep these rules in mind as you are working:
- All features or bug fixes must be tested by one or more specs (unit-tests).
- We use [Eslint default rule guide](https://eslint.org/docs/rules/), with minor changes. An automated formatter is available using prettier.
- All features or bug fixes must be tested by one or more specs (unit-tests).
- We use [Eslint default rule guide](https://eslint.org/docs/rules/), with minor changes. An automated formatter is available using prettier.
## Need help? Questions and suggestions
@@ -90,11 +90,11 @@ Questions, suggestions, and thoughts are most welcome. We can also be reached in
## Ways to contribute
- Try Plane Cloud and the self hosting platform and give feedback
- Add new integrations
- Help with open [issues](https://github.com/makeplane/plane/issues) or [create your own](https://github.com/makeplane/plane/issues/new/choose)
- Share your thoughts and suggestions with us
- Help create tutorials and blog posts
- Request a feature by submitting a proposal
- Report a bug
- **Improve documentation** - fix incomplete or missing [docs](https://docs.plane.so/), bad wording, examples or explanations.
- Try Plane Cloud and the self hosting platform and give feedback
- Add new integrations
- Help with open [issues](https://github.com/makeplane/plane/issues) or [create your own](https://github.com/makeplane/plane/issues/new/choose)
- Share your thoughts and suggestions with us
- Help create tutorials and blog posts
- Request a feature by submitting a proposal
- Report a bug
- **Improve documentation** - fix incomplete or missing [docs](https://docs.plane.so/), bad wording, examples or explanations.


@@ -43,8 +43,6 @@ FROM python:3.11.1-alpine3.17 AS backend
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV DJANGO_SETTINGS_MODULE plane.settings.production
ENV DOCKERIZED 1
WORKDIR /code


@@ -1,8 +1,10 @@
# Environment Variables
Environment variables are distributed in various files. Please refer them carefully.
Environment variables are distributed in various files. Please refer them carefully.
## {PROJECT_FOLDER}/.env
File is available in the project root folder
```
@@ -29,42 +31,52 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
```
## {PROJECT_FOLDER}/web/.env.example
```
# Enable/Disable OAUTH - default 0 for selfhosted instance
NEXT_PUBLIC_ENABLE_OAUTH=0
# Public boards deploy URL
NEXT_PUBLIC_DEPLOY_URL="http://localhost/spaces"
```
## {PROJECT_FOLDER}/spaces/.env.example
```
# Flag to toggle OAuth
NEXT_PUBLIC_ENABLE_OAUTH=0
```
## {PROJECT_FOLDER}/apiserver/.env
```
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.selfhosted"
DJANGO_SETTINGS_MODULE="plane.settings.selfhosted" # deprecated
# Error logs
SENTRY_DSN=""
@@ -101,24 +113,22 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1 # Deprecated
# Github
GITHUB_CLIENT_SECRET="" # For fetching release notes
# Settings related to Docker
DOCKERIZED=1
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
# Default Creds
DEFAULT_EMAIL="captain@plane.so"
DEFAULT_PASSWORD="password123"
# SignUps
ENABLE_SIGNUP="1"
@@ -126,7 +136,9 @@ ENABLE_SIGNUP="1"
# Email Redirection URL
WEB_URL="http://localhost"
```
## Updates
- The environment variable NEXT_PUBLIC_API_BASE_URL has been removed from both the web and space projects.
- The naming convention for containers and images has been updated.
- The plane-worker image will no longer be maintained, as it has been merged with plane-backend.


@@ -7,7 +7,7 @@
</p>
<h3 align="center"><b>Plane</b></h3>
<p align="center"><b>Open-source, self-hosted project planning tool</b></p>
<p align="center"><b>Flexible, extensible open-source project management</b></p>
<p align="center">
<a href="https://discord.com/invite/A92xrEGCge">


@@ -1,10 +1,11 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.production"
CORS_ALLOWED_ORIGINS=""
# Error logs
SENTRY_DSN=""
SENTRY_ENVIRONMENT="development"
# Database Settings
PGUSER="plane"
@@ -18,15 +19,6 @@ REDIS_HOST="plane-redis"
REDIS_PORT="6379"
REDIS_URL="redis://${REDIS_HOST}:6379/"
# Email Settings
EMAIL_HOST=""
EMAIL_HOST_USER=""
EMAIL_HOST_PASSWORD=""
EMAIL_PORT=587
EMAIL_FROM="Team Plane <team@mailer.plane.so>"
EMAIL_USE_TLS="1"
EMAIL_USE_SSL="0"
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
@@ -38,9 +30,9 @@ AWS_S3_BUCKET_NAME="uploads"
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # change if using a custom endpoint
OPENAI_API_KEY="sk-" # add your openai key here
GPT_ENGINE="gpt-3.5-turbo" # use "gpt-4" if you have access
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Github
GITHUB_CLIENT_SECRET="" # For fetching release notes
@@ -53,9 +45,6 @@ USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
# Default Creds
DEFAULT_EMAIL="captain@plane.so"
DEFAULT_PASSWORD="password123"
# SignUps
ENABLE_SIGNUP="1"
@@ -70,3 +59,6 @@ ENABLE_MAGIC_LINK_LOGIN="0"
# Email redirections and minio domain settings
WEB_URL="http://localhost"
# Gunicorn Workers
GUNICORN_WORKERS=2


@@ -43,7 +43,7 @@ USER captain
COPY manage.py manage.py
COPY plane plane/
COPY templates templates/
COPY package.json package.json
COPY gunicorn.config.py ./
USER root
RUN apk --no-cache add "bash~=5.2"


@@ -3,7 +3,11 @@ set -e
python manage.py wait_for_db
python manage.py migrate
# Create a Default User
python bin/user_script.py
# Register instance
python manage.py register_instance
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python bin/bucket_script.py
exec gunicorn -w 8 -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
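The `manage.py register_instance` and `manage.py configure_instance` steps added above are Django management commands shipped with the backend; their actual bodies are not part of this diff. As a rough, hypothetical sketch of the general shape such a command takes (the class body below is a placeholder, not Plane's implementation):

# Hypothetical sketch of a Django management command invoked from the entrypoint.
from django.core.management.base import BaseCommand, CommandError

class Command(BaseCommand):
    help = "Register this instance during container start-up"

    def handle(self, *args, **options):
        # Placeholder for the real registration/licensing logic.
        registered = True
        if not registered:
            raise CommandError("Instance registration failed")
        self.stdout.write(self.style.SUCCESS("Instance registered"))

Placing such a file under an app's management/commands/register_instance.py is what makes `python manage.py register_instance` callable from the script above.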


@@ -1,28 +0,0 @@
import os, sys
import uuid
sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
import django
django.setup()
from plane.db.models import User
def populate():
default_email = os.environ.get("DEFAULT_EMAIL", "captain@plane.so")
default_password = os.environ.get("DEFAULT_PASSWORD", "password123")
if not User.objects.filter(email=default_email).exists():
user = User.objects.create(email=default_email, username=uuid.uuid4().hex)
user.set_password(default_password)
user.save()
print(f"User created with an email: {default_email}")
else:
print(f"User already exists with the default email: {default_email}")
if __name__ == "__main__":
populate()

apiserver/package.json (new file)

@@ -0,0 +1,4 @@
{
"name": "plane-api",
"version": "0.13.2"
}


@@ -1,2 +1,17 @@
from .workspace import WorkSpaceBasePermission, WorkSpaceAdminPermission, WorkspaceEntityPermission, WorkspaceViewerPermission
from .project import ProjectBasePermission, ProjectEntityPermission, ProjectMemberPermission, ProjectLitePermission
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
WorkSpaceAdminPermission,
WorkspaceEntityPermission,
WorkspaceViewerPermission,
WorkspaceUserPermission,
)
from .project import (
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
ProjectLitePermission,
)


@@ -13,14 +13,15 @@ Guest = 5
class ProjectBasePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
@@ -29,6 +30,7 @@ class ProjectBasePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@@ -37,19 +39,21 @@ class ProjectBasePermission(BasePermission):
member=request.user,
role=Admin,
project_id=view.project_id,
is_active=True,
).exists()
class ProjectMemberPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
if request.method == "POST":
@@ -57,6 +61,7 @@ class ProjectMemberPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@@ -65,12 +70,12 @@ class ProjectMemberPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectEntityPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@@ -80,6 +85,7 @@ class ProjectEntityPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()
## Only project members or admins can create and edit the project attributes
@@ -88,17 +94,18 @@ class ProjectEntityPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectLitePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()


@@ -32,15 +32,31 @@ class WorkSpaceBasePermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
# allow only owner to delete the workspace
if request.method == "DELETE":
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role=Owner
member=request.user,
workspace__slug=view.workspace_slug,
role=Owner,
is_active=True,
).exists()
class WorkspaceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
role=Owner,
).exists()
class WorkSpaceAdminPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
@@ -50,6 +66,7 @@ class WorkSpaceAdminPermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@@ -63,12 +80,14 @@ class WorkspaceEntityPermission(BasePermission):
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@@ -78,5 +97,20 @@ class WorkspaceViewerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role__gte=10
member=request.user,
workspace__slug=view.workspace_slug,
role__gte=10,
is_active=True,
).exists()
class WorkspaceUserPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
is_active=True,
).exists()
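All of the permission classes in this file are standard DRF BasePermission subclasses, so views opt in through `permission_classes`. A minimal usage sketch, assuming a hypothetical endpoint and an assumed import path (neither is taken from this diff):

# Hypothetical DRF view showing how a permission class from this file is attached.
from rest_framework.response import Response
from rest_framework.views import APIView

from plane.api.permissions import WorkspaceUserPermission  # import path assumed

class WorkspaceExampleEndpoint(APIView):
    # DRF runs has_permission() before dispatching to get(); with the
    # is_active=True filters added in this diff, deactivated workspace
    # members are rejected with HTTP 403 here.
    permission_classes = [WorkspaceUserPermission]

    @property
    def workspace_slug(self):
        # The permission class reads view.workspace_slug during the check,
        # which happens before get() runs, so expose it from the URL kwargs.
        return self.kwargs.get("slug")

    def get(self, request, slug):
        return Response({"status": "ok"})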


@@ -71,7 +71,7 @@ from .module import (
ModuleFavoriteSerializer,
)
from .api_token import APITokenSerializer
from .api import APITokenSerializer, APITokenReadSerializer
from .integration import (
IntegrationSerializer,
@@ -85,7 +85,7 @@ from .integration import (
from .importer import ImporterSerializer
from .page import PageSerializer, PageBlockSerializer, PageFavoriteSerializer
from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
from .estimate import (
EstimateSerializer,
@@ -100,3 +100,5 @@ from .analytic import AnalyticViewSerializer
from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer
from .webhook import WebhookSerializer, WebhookLogSerializer


@@ -0,0 +1,31 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
read_only_fields = [
"token",
"expired_at",
"created_at",
"updated_at",
"workspace",
"user",
]
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"


@@ -1,14 +0,0 @@
from .base import BaseSerializer
from plane.db.models import APIToken
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = [
"label",
"user",
"user_type",
"workspace",
"created_at",
]


@@ -5,7 +5,7 @@ from django.utils import timezone
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
from .state import StateSerializer, StateLiteSerializer
from .project import ProjectLiteSerializer
@@ -548,7 +548,7 @@ class IssueSerializer(BaseSerializer):
]
class IssueLiteSerializer(BaseSerializer):
class IssueLiteSerializer(DynamicBaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")


@@ -6,28 +6,7 @@ from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Page, PageBlock, PageFavorite, PageLabel, Label
class PageBlockSerializer(BaseSerializer):
issue_detail = IssueFlatSerializer(source="issue", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = PageBlock
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
class PageBlockLiteSerializer(BaseSerializer):
class Meta:
model = PageBlock
fields = "__all__"
from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
class PageSerializer(BaseSerializer):
@@ -38,7 +17,6 @@ class PageSerializer(BaseSerializer):
write_only=True,
required=False,
)
blocks = PageBlockLiteSerializer(read_only=True, many=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
@@ -102,6 +80,41 @@ class PageSerializer(BaseSerializer):
return super().update(instance, validated_data)
class SubPageSerializer(BaseSerializer):
entity_details = serializers.SerializerMethodField()
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
def get_entity_details(self, obj):
entity_name = obj.entity_name
if entity_name == 'forward_link' or entity_name == 'back_link':
try:
page = Page.objects.get(pk=obj.entity_identifier)
return PageSerializer(page).data
except Page.DoesNotExist:
return None
return None
class PageLogSerializer(BaseSerializer):
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
class PageFavoriteSerializer(BaseSerializer):
page_detail = PageSerializer(source="page", read_only=True)


@@ -103,13 +103,16 @@ class ProjectListSerializer(DynamicBaseSerializer):
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(project_id=obj.id).values(
project_members = ProjectMember.objects.filter(
project_id=obj.id,
is_active=True,
).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return project_members
return list(project_members)
class Meta:
model = Project
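
The switch from returning the raw .values() queryset to list(project_members) forces evaluation inside the serializer method, so the member data comes back as plain dicts rather than a lazy queryset. A small sketch of the same pattern, mirroring get_members above:

from plane.db.models import ProjectMember

def get_active_members(project_id):
    # Only active members, then materialize the lazy .values() queryset into
    # plain, serializable dicts — the shape ProjectListSerializer returns.
    members_qs = ProjectMember.objects.filter(
        project_id=project_id,
        is_active=True,
    ).values("id", "member_id", "member__display_name", "member__avatar")
    return list(members_qs)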

View File

@@ -7,8 +7,6 @@ from plane.db.models import State
class StateSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
model = State

View File

@@ -4,6 +4,7 @@ from rest_framework import serializers
# Module import
from .base import BaseSerializer
from plane.db.models import User, Workspace, WorkspaceMemberInvite
from plane.license.models import InstanceAdmin, Instance
class UserSerializer(BaseSerializer):
@@ -86,7 +87,9 @@ class UserMeSettingsSerializer(BaseSerializer):
"last_workspace_id": obj.last_workspace_id,
"last_workspace_slug": workspace.slug if workspace is not None else "",
"fallback_workspace_id": obj.last_workspace_id,
"fallback_workspace_slug": workspace.slug if workspace is not None else "",
"fallback_workspace_slug": workspace.slug
if workspace is not None
else "",
"invites": workspace_invites,
}
else:

View File

@@ -0,0 +1,30 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = [
"workspace",
"secret_key",
]
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]
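
A short sketch of the validation path the new serializer enables, assuming validate_schema and validate_domain (imported from plane.db.models.webhook) reject unsupported schemes and disallowed hosts as their names suggest:

from plane.api.serializers import WebhookSerializer

serializer = WebhookSerializer(data={"url": "https://example.com/plane/hooks"})
if serializer.is_valid():
    # workspace and secret_key are read-only, so the view supplies them;
    # callers only control the URL and other writable fields.
    print(serializer.validated_data["url"])
else:
    print(serializer.errors)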

View File

@@ -1,10 +1,10 @@
from .analytic import urlpatterns as analytic_urls
from .asset import urlpatterns as asset_urls
from .authentication import urlpatterns as authentication_urls
from .configuration import urlpatterns as configuration_urls
from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
from .estimate import urlpatterns as estimate_urls
from .gpt import urlpatterns as gpt_urls
from .external import urlpatterns as external_urls
from .importer import urlpatterns as importer_urls
from .inbox import urlpatterns as inbox_urls
from .integration import urlpatterns as integration_urls
@@ -14,13 +14,17 @@ from .notification import urlpatterns as notification_urls
from .page import urlpatterns as page_urls
from .project import urlpatterns as project_urls
from .public_board import urlpatterns as public_board_urls
from .release_note import urlpatterns as release_note_urls
from .search import urlpatterns as search_urls
from .state import urlpatterns as state_urls
from .unsplash import urlpatterns as unsplash_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .workspace import urlpatterns as workspace_urls
from .api import urlpatterns as api_urls
from .webhook import urlpatterns as webhook_urls
# Django imports
from django.conf import settings
urlpatterns = [
@@ -30,7 +34,7 @@ urlpatterns = [
*configuration_urls,
*cycle_urls,
*estimate_urls,
*gpt_urls,
*external_urls,
*importer_urls,
*inbox_urls,
*integration_urls,
@@ -40,11 +44,11 @@ urlpatterns = [
*page_urls,
*project_urls,
*public_board_urls,
*release_note_urls,
*search_urls,
*state_urls,
*unsplash_urls,
*user_urls,
*view_urls,
*workspace_urls,
*api_urls,
*webhook_urls,
]

View File

@@ -0,0 +1,17 @@
from django.urls import path
from plane.api.views import ApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
## End API Tokens
]

View File

@@ -0,0 +1,25 @@
from django.urls import path
from plane.api.views import UnsplashEndpoint
from plane.api.views import ReleaseNotesEndpoint
from plane.api.views import GPTIntegrationEndpoint
urlpatterns = [
path(
"unsplash/",
UnsplashEndpoint.as_view(),
name="unsplash",
),
path(
"release-notes/",
ReleaseNotesEndpoint.as_view(),
name="release-notes",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/ai-assistant/",
GPTIntegrationEndpoint.as_view(),
name="importer",
),
]

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import GPTIntegrationEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/ai-assistant/",
GPTIntegrationEndpoint.as_view(),
name="importer",
),
]

View File

@@ -3,6 +3,8 @@ from django.urls import path
from plane.api.views import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
@@ -35,6 +37,16 @@ urlpatterns = [
),
name="project-issue",
),
path(
"v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListEndpoint.as_view(),
name="project-issue",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListGroupedEndpoint.as_view(),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(

View File

@@ -3,9 +3,9 @@ from django.urls import path
from plane.api.views import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
PageLogEndpoint,
SubPagesEndpoint,
)
@@ -31,27 +31,6 @@ urlpatterns = [
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
PageFavoriteViewSet.as_view(
@@ -72,8 +51,83 @@ urlpatterns = [
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(),
name="page-block-issues",
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
PageViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
),
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(),
name="sub-page",
),
]

View File

@@ -2,17 +2,16 @@ from django.urls import path
from plane.api.views import (
ProjectViewSet,
InviteProjectEndpoint,
ProjectInvitationsViewset,
ProjectMemberViewSet,
ProjectMemberInvitationsViewset,
ProjectMemberUserEndpoint,
ProjectJoinEndpoint,
AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint,
ProjectFavoritesViewSet,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
UserProjectInvitationsViewset,
)
@@ -45,13 +44,48 @@ urlpatterns = [
name="project-identifiers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invite/",
InviteProjectEndpoint.as_view(),
name="invite-project",
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view({"get": "list", "post": "create"}),
ProjectMemberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-member",
),
path(
@@ -66,30 +100,19 @@ urlpatterns = [
name="project-member",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
ProjectMemberViewSet.as_view(
{
"post": "leave",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(),
name="projects",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectMemberInvitationsViewset.as_view({"get": "list"}),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectMemberInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(),
@@ -119,11 +142,6 @@ urlpatterns = [
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
LeaveProjectEndpoint.as_view(),
name="leave-project",
),
path(
"project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(),

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import ReleaseNotesEndpoint
urlpatterns = [
path(
"release-notes/",
ReleaseNotesEndpoint.as_view(),
name="release-notes",
),
]

View File

@@ -20,11 +20,19 @@ urlpatterns = [
StateViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-state",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/mark-default/",
StateViewSet.as_view(
{
"post": "mark_as_default",
}
),
name="project-state",
),
]

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import UnsplashEndpoint
urlpatterns = [
path(
"unsplash/",
UnsplashEndpoint.as_view(),
name="unsplash",
),
]

View File

@@ -9,15 +9,10 @@ from plane.api.views import (
ChangePasswordEndpoint,
## End User
## Workspaces
UserWorkspaceInvitationsEndpoint,
UserWorkSpacesEndpoint,
JoinWorkspaceEndpoint,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserActivityGraphEndpoint,
UserIssueCompletedGraphEndpoint,
UserWorkspaceDashboardEndpoint,
UserProjectInvitationsViewset,
## End Workspaces
)
@@ -26,7 +21,11 @@ urlpatterns = [
path(
"users/me/",
UserEndpoint.as_view(
{"get": "retrieve", "patch": "partial_update", "delete": "destroy"}
{
"get": "retrieve",
"patch": "partial_update",
"delete": "deactivate",
}
),
name="users",
),
@@ -39,6 +38,15 @@ urlpatterns = [
),
name="users",
),
path(
"users/me/instance-admin/",
UserEndpoint.as_view(
{
"get": "retrieve_instance_admin",
}
),
name="users",
),
path(
"users/me/change-password/",
ChangePasswordEndpoint.as_view(),
@@ -65,23 +73,6 @@ urlpatterns = [
UserWorkSpacesEndpoint.as_view(),
name="user-workspace",
),
# user workspace invitations
path(
"users/me/invitations/workspaces/",
UserWorkspaceInvitationsEndpoint.as_view({"get": "list", "post": "create"}),
name="user-workspace-invitations",
),
# user workspace invitation
path(
"users/me/invitations/<uuid:pk>/",
UserWorkspaceInvitationEndpoint.as_view(
{
"get": "retrieve",
}
),
name="user-workspace-invitation",
),
# user join workspace
# User Graphs
path(
"users/me/workspaces/<str:slug>/activity-graph/",
@@ -99,15 +90,4 @@ urlpatterns = [
name="user-workspace-dashboard",
),
## End User Graph
path(
"users/me/invitations/workspaces/<str:slug>/<uuid:pk>/join/",
JoinWorkspaceEndpoint.as_view(),
name="user-join-workspace",
),
# user project invitations
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view({"get": "list", "post": "create"}),
name="user-project-invitations",
),
]

View File

@@ -0,0 +1,31 @@
from django.urls import path
from plane.api.views import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/webhooks/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/regenerate/",
WebhookSecretRegenerateEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhook-logs/<uuid:webhook_id>/",
WebhookLogsEndpoint.as_view(),
name="webhooks",
),
]

View File

@@ -2,8 +2,9 @@ from django.urls import path
from plane.api.views import (
UserWorkspaceInvitationsViewSet,
WorkSpaceViewSet,
InviteWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
WorkspaceInvitationsViewset,
WorkspaceMemberUserEndpoint,
@@ -17,7 +18,6 @@ from plane.api.views import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
@@ -49,14 +49,14 @@ urlpatterns = [
),
name="workspace",
),
path(
"workspaces/<str:slug>/invite/",
InviteWorkspaceEndpoint.as_view(),
name="invite-workspace",
),
path(
"workspaces/<str:slug>/invitations/",
WorkspaceInvitationsViewset.as_view({"get": "list"}),
WorkspaceInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="workspace-invitations",
),
path(
@@ -69,6 +69,23 @@ urlpatterns = [
),
name="workspace-invitations",
),
# user workspace invitations
path(
"users/me/workspaces/invitations/",
UserWorkspaceInvitationsViewSet.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-workspace-invitations",
),
path(
"workspaces/<str:slug>/invitations/<uuid:pk>/join/",
WorkspaceJoinEndpoint.as_view(),
name="workspace-join",
),
# user join workspace
path(
"workspaces/<str:slug>/members/",
WorkSpaceMemberViewSet.as_view({"get": "list"}),
@@ -85,6 +102,15 @@ urlpatterns = [
),
name="workspace-member",
),
path(
"workspaces/<str:slug>/members/leave/",
WorkSpaceMemberViewSet.as_view(
{
"post": "leave",
},
),
name="leave-workspace-members",
),
path(
"workspaces/<str:slug>/teams/",
TeamMemberViewSet.as_view(
@@ -168,9 +194,4 @@ urlpatterns = [
WorkspaceLabelsEndpoint.as_view(),
name="workspace-labels",
),
path(
"workspaces/<str:slug>/members/leave/",
LeaveWorkspaceEndpoint.as_view(),
name="leave-workspace-members",
),
]

View File

@@ -124,9 +124,10 @@ from plane.api.views import (
## End Modules
# Pages
PageViewSet,
PageBlockViewSet,
PageLogEndpoint,
SubPagesEndpoint,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
CreateIssueFromBlockEndpoint,
## End Pages
# Api Tokens
ApiTokenEndpoint,
@@ -1222,25 +1223,81 @@ urlpatterns = [
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
)
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(), name="sub-page"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/",
BulkEstimatePointEndpoint.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
name="bulk-create-estimate-points",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/<uuid:estimate_id>/",
BulkEstimatePointEndpoint.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
name="bulk-create-estimate-points",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
@@ -1263,7 +1320,7 @@ urlpatterns = [
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(),
CreateIssueFromBlockEndpoint.as_view(),
name="page-block-issues",
),
## End Pages

View File

@@ -2,10 +2,8 @@ from .project import (
ProjectViewSet,
ProjectMemberViewSet,
UserProjectInvitationsViewset,
InviteProjectEndpoint,
ProjectInvitationsViewset,
AddTeamToProjectEndpoint,
ProjectMemberInvitationsViewset,
ProjectMemberInviteDetailViewSet,
ProjectIdentifierEndpoint,
ProjectJoinEndpoint,
ProjectUserViewsEndpoint,
@@ -14,7 +12,6 @@ from .project import (
ProjectDeployBoardViewSet,
ProjectDeployBoardPublicSettingsEndpoint,
WorkspaceProjectDeployBoardEndpoint,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
)
from .user import (
@@ -26,19 +23,17 @@ from .user import (
from .oauth import OauthEndpoint
from .base import BaseAPIView, BaseViewSet
from .base import BaseAPIView, BaseViewSet, WebhookMixin
from .workspace import (
WorkSpaceViewSet,
UserWorkSpacesEndpoint,
WorkSpaceAvailabilityCheckEndpoint,
InviteWorkspaceEndpoint,
JoinWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
TeamMemberViewSet,
WorkspaceInvitationsViewset,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserWorkspaceInvitationsViewSet,
UserLastProjectWithWorkspaceEndpoint,
WorkspaceMemberUserEndpoint,
WorkspaceMemberUserViewsEndpoint,
@@ -51,10 +46,14 @@ from .workspace import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
from .state import StateViewSet
from .view import GlobalViewViewSet, GlobalViewIssuesViewSet, IssueViewViewSet, IssueViewFavoriteViewSet
from .view import (
GlobalViewViewSet,
GlobalViewIssuesViewSet,
IssueViewViewSet,
IssueViewFavoriteViewSet,
)
from .cycle import (
CycleViewSet,
CycleIssueViewSet,
@@ -65,6 +64,8 @@ from .cycle import (
from .asset import FileAssetEndpoint, UserAssetsEndpoint
from .issue import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
WorkSpaceIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
@@ -114,7 +115,7 @@ from .module import (
ModuleFavoriteViewSet,
)
from .api_token import ApiTokenEndpoint
from .api import ApiTokenEndpoint
from .integration import (
WorkspaceIntegrationViewSet,
@@ -137,9 +138,10 @@ from .importer import (
from .page import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
PageLogEndpoint,
SubPagesEndpoint,
CreateIssueFromBlockEndpoint,
)
from .search import GlobalSearchEndpoint, IssueSearchEndpoint
@@ -162,8 +164,14 @@ from .analytic import (
DefaultAnalyticsEndpoint,
)
from .notification import NotificationViewSet, UnreadNotificationEndpoint, MarkAllReadNotificationViewSet
from .notification import (
NotificationViewSet,
UnreadNotificationEndpoint,
MarkAllReadNotificationViewSet,
)
from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint

View File

@@ -0,0 +1,78 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.api.serializers import APITokenSerializer, APITokenReadSerializer
from plane.api.permissions import WorkspaceOwnerPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
label = request.data.get("label", str(uuid4().hex))
description = request.data.get("description", "")
workspace = Workspace.objects.get(slug=slug)
expired_at = request.data.get("expired_at", None)
# Check the user type
user_type = 1 if request.user.is_bot else 0
api_token = APIToken.objects.create(
label=label,
description=description,
user=request.user,
workspace=workspace,
user_type=user_type,
expired_at=expired_at,
)
serializer = APITokenSerializer(api_token)
# Token will only be visible while creating
return Response(
serializer.data,
status=status.HTTP_201_CREATED,
)
def get(self, request, slug, pk=None):
if pk is None:
api_tokens = APIToken.objects.filter(
user=request.user, workspace__slug=slug
)
serializer = APITokenReadSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
api_tokens = APIToken.objects.get(
user=request.user, workspace__slug=slug, pk=pk
)
serializer = APITokenReadSerializer(api_tokens)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
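
A hedged client-side sketch of the endpoint above; the base URL and the bearer-token auth scheme are assumptions for illustration only, and the workspace slug is a placeholder:

import requests

BASE = "https://plane.example.com/api"            # assumed host and prefix
HEADERS = {"Authorization": "Bearer <access_token>"}  # assumed auth scheme

# Create a token scoped to a workspace (POST /workspaces/<slug>/api-tokens/).
resp = requests.post(
    f"{BASE}/workspaces/my-workspace/api-tokens/",
    headers=HEADERS,
    json={"label": "ci-token", "description": "Used by CI"},
)
resp.raise_for_status()
token = resp.json()["token"]  # only returned at creation time

# Subsequent reads go through APITokenReadSerializer, which excludes the token value.
listing = requests.get(
    f"{BASE}/workspaces/my-workspace/api-tokens/",
    headers=HEADERS,
).json()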

View File

@@ -1,47 +0,0 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken
from plane.api.serializers import APITokenSerializer
class ApiTokenEndpoint(BaseAPIView):
def post(self, request):
label = request.data.get("label", str(uuid4().hex))
workspace = request.data.get("workspace", False)
if not workspace:
return Response(
{"error": "Workspace is required"}, status=status.HTTP_200_OK
)
api_token = APIToken.objects.create(
label=label, user=request.user, workspace_id=workspace
)
serializer = APITokenSerializer(api_token)
# Token will only be visible while creating
return Response(
{"api_token": serializer.data, "token": api_token.token},
status=status.HTTP_201_CREATED,
)
def get(self, request):
api_tokens = APIToken.objects.filter(user=request.user)
serializer = APITokenSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
api_token = APIToken.objects.get(pk=pk)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -33,7 +33,7 @@ from plane.bgtasks.forgot_password_task import forgot_password
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
@@ -76,7 +76,7 @@ class ForgotPasswordEndpoint(BaseAPIView):
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site

View File

@@ -4,7 +4,7 @@ import random
import string
import json
import requests
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
from django.core.exceptions import ValidationError
@@ -22,8 +22,13 @@ from sentry_sdk import capture_exception, capture_message
# Module imports
from . import BaseAPIView
from plane.db.models import User
from plane.api.serializers import UserSerializer
from plane.db.models import (
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
@@ -86,35 +91,93 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
return Response(data, status=status.HTTP_200_OK)
@@ -176,33 +239,92 @@ class SignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
@@ -287,7 +409,8 @@ class MagicSignInGenerateEndpoint(BaseAPIView):
ri.set(key, json.dumps(value), ex=expiry)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
@@ -319,27 +442,37 @@ class MagicSignInEndpoint(BaseAPIView):
if str(token) == str(user_token):
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
else:
user = User.objects.create(
email=email,
@@ -347,27 +480,30 @@ class MagicSignInEndpoint(BaseAPIView):
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
"event_type": "SIGN_UP",
},
)
)
except RequestException as e:
capture_exception(e)
user.last_active = timezone.now()
user.last_login_time = timezone.now()
@@ -376,6 +512,63 @@ class MagicSignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,

View File

@@ -1,5 +1,6 @@
# Python imports
import zoneinfo
import json
# Django imports
from django.urls import resolve
@@ -7,6 +8,7 @@ from django.conf import settings
from django.utils import timezone
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework import status
@@ -22,6 +24,7 @@ from django_filters.rest_framework import DjangoFilterBackend
# Module imports
from plane.utils.paginator import BasePaginator
from plane.bgtasks.webhook_task import send_webhook
class TimezoneMixin:
@@ -29,6 +32,7 @@ class TimezoneMixin:
This enables timezone conversion according
to the user set timezone
"""
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
if request.user.is_authenticated:
@@ -37,8 +41,28 @@ class TimezoneMixin:
timezone.deactivate()
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class WebhookMixin:
webhook_event = None
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
action=self.request.method,
slug=self.workspace_slug,
)
return response
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
model = None
permission_classes = [
@@ -60,7 +84,7 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
except Exception as e:
capture_exception(e)
raise APIException("Please check the view", status.HTTP_400_BAD_REQUEST)
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@@ -71,18 +95,30 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
capture_exception(e)
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": f"key {e} does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
print(e) if settings.DEBUG else print("Server Error")
capture_exception(e)
@@ -99,8 +135,8 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
print(
f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
)
return response
return response
except Exception as exc:
response = self.handle_exception(exc)
return exc
@@ -120,7 +156,6 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
permission_classes = [
IsAuthenticated,
]
@@ -139,7 +174,6 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
queryset = backend().filter_queryset(self.request, queryset, self)
return queryset
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@@ -150,19 +184,29 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
print(e) if settings.DEBUG else print("Server Error")
if settings.DEBUG:
print(e)
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)

View File

@@ -12,6 +12,8 @@ from sentry_sdk import capture_exception
# Module imports
from .base import BaseAPIView
from plane.license.models import Instance, InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
class ConfigurationEndpoint(BaseAPIView):
@@ -20,14 +22,75 @@ class ConfigurationEndpoint(BaseAPIView):
]
def get(self, request):
instance_configuration = InstanceConfiguration.objects.values("key", "value")
data = {}
data["google"] = os.environ.get("GOOGLE_CLIENT_ID", None)
data["github"] = os.environ.get("GITHUB_CLIENT_ID", None)
data["github_app_name"] = os.environ.get("GITHUB_APP_NAME", None)
data["magic_login"] = (
bool(settings.EMAIL_HOST_USER) and bool(settings.EMAIL_HOST_PASSWORD)
) and os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0") == "1"
data["email_password_login"] = (
os.environ.get("ENABLE_EMAIL_PASSWORD", "0") == "1"
# Authentication
data["google_client_id"] = get_configuration_value(
instance_configuration,
"GOOGLE_CLIENT_ID",
os.environ.get("GOOGLE_CLIENT_ID", None),
)
data["github_client_id"] = get_configuration_value(
instance_configuration,
"GITHUB_CLIENT_ID",
os.environ.get("GITHUB_CLIENT_ID", None),
)
data["github_app_name"] = get_configuration_value(
instance_configuration,
"GITHUB_APP_NAME",
os.environ.get("GITHUB_APP_NAME", None),
)
data["magic_login"] = (
bool(
get_configuration_value(
instance_configuration,
"EMAIL_HOST_USER",
os.environ.get("GITHUB_APP_NAME", None),
),
)
and bool(
get_configuration_value(
instance_configuration,
"EMAIL_HOST_PASSWORD",
os.environ.get("GITHUB_APP_NAME", None),
)
)
) and get_configuration_value(
instance_configuration, "ENABLE_MAGIC_LINK_LOGIN", "0"
) == "1"
data["email_password_login"] = (
get_configuration_value(
instance_configuration, "ENABLE_EMAIL_PASSWORD", "0"
)
== "1"
)
# Slack client
data["slack_client_id"] = get_configuration_value(
instance_configuration,
"SLACK_CLIENT_ID",
os.environ.get("SLACK_CLIENT_ID", None),
)
# Posthog
data["posthog_api_key"] = get_configuration_value(
instance_configuration,
"POSTHOG_API_KEY",
os.environ.get("POSTHOG_API_KEY", None),
)
data["posthog_host"] = get_configuration_value(
instance_configuration,
"POSTHOG_HOST",
os.environ.get("POSTHOG_HOST", None),
)
# Unsplash
data["has_unsplash_configured"] = bool(
get_configuration_value(
instance_configuration,
"UNSPLASH_ACCESS_KEY",
os.environ.get("UNSPLASH_ACCESS_KEY", None),
)
)
return Response(data, status=status.HTTP_200_OK)
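
The endpoint above repeatedly resolves values through get_configuration_value; a minimal sketch of the lookup behaviour implied by those call sites (instance-level configuration first, then the supplied default) — not the actual implementation in plane.license.utils.instance_value:

def get_configuration_value(instance_configuration, key, default=None):
    # instance_configuration is the .values("key", "value") queryset fetched above:
    # an iterable of {"key": ..., "value": ...} dicts.
    for item in instance_configuration:
        if item["key"] == key:
            return item["value"]
    return default

# Example: prefer the stored instance value, fall back to an env-derived default.
# google_client_id = get_configuration_value(config, "GOOGLE_CLIENT_ID", os.environ.get("GOOGLE_CLIENT_ID"))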

View File

@@ -23,7 +23,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
@@ -48,9 +48,10 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class CycleViewSet(BaseViewSet):
class CycleViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
@@ -176,9 +177,8 @@ class CycleViewSet(BaseViewSet):
def list(self, request, slug, project_id):
queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all")
order_by = request.GET.get("order_by", "sort_order")
queryset = queryset.order_by(order_by)
queryset = queryset.order_by("-is_favorite","-created_at")
# Current Cycle
if cycle_view == "current":
@@ -479,13 +479,13 @@ class CycleViewSet(BaseViewSet):
)
)
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
# Delete the cycle
cycle.delete()
issue_activity.delay(
type="cycle.activity.deleted",
requested_data=json.dumps(
{
"cycle_id": str(pk),
"cycle_name": str(cycle.name),
"issues": [str(issue_id) for issue_id in cycle_issues],
}
),
@@ -495,13 +495,15 @@ class CycleViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
# Delete the cycle
cycle.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueViewSet(BaseViewSet):
class CycleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
@@ -511,12 +513,6 @@ class CycleIssueViewSet(BaseViewSet):
"issue__assignees__id",
]
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
cycle_id=self.kwargs.get("cycle_id"),
)
def get_queryset(self):
return self.filter_queryset(
super()
@@ -669,7 +665,7 @@ class CycleIssueViewSet(BaseViewSet):
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": issues}),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{

View File

@@ -2,7 +2,7 @@
import requests
# Third party imports
import openai
from openai import OpenAI
from rest_framework.response import Response
from rest_framework import status
from rest_framework.permissions import AllowAny
@@ -17,7 +17,8 @@ from plane.api.permissions import ProjectEntityPermission
from plane.db.models import Workspace, Project
from plane.api.serializers import ProjectLiteSerializer, WorkspaceLiteSerializer
from plane.utils.integrations.github import get_release_notes
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
class GPTIntegrationEndpoint(BaseAPIView):
permission_classes = [
@@ -25,7 +26,14 @@ class GPTIntegrationEndpoint(BaseAPIView):
]
def post(self, request, slug, project_id):
if not settings.OPENAI_API_KEY or not settings.GPT_ENGINE:
# Get the configuration value
instance_configuration = InstanceConfiguration.objects.values("key", "value")
api_key = get_configuration_value(instance_configuration, "OPENAI_API_KEY")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
# Check the keys
if not api_key or not gpt_engine:
return Response(
{"error": "OpenAI API key and engine is required"},
status=status.HTTP_400_BAD_REQUEST,
@@ -41,12 +49,17 @@ class GPTIntegrationEndpoint(BaseAPIView):
final_text = task + "\n" + prompt
openai.api_key = settings.OPENAI_API_KEY
response = openai.ChatCompletion.create(
model=settings.GPT_ENGINE,
instance_configuration = InstanceConfiguration.objects.values("key", "value")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
client = OpenAI(
api_key=api_key,
)
response = client.chat.completions.create(
model=gpt_engine,
messages=[{"role": "user", "content": final_text}],
temperature=0.7,
max_tokens=1024,
)
workspace = Workspace.objects.get(slug=slug)
@@ -89,4 +102,4 @@ class UnsplashEndpoint(BaseAPIView):
}
resp = requests.get(url=url, headers=headers)
return Response(resp.json(), status=status.HTTP_200_OK)
return Response(resp.json(), status=resp.status_code)
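
The hunk above moves from the legacy module-level openai API to the client object introduced in openai>=1.0; a self-contained sketch of that call pattern, with the key and model resolved through get_configuration_value in the real view (the literals below are illustrative):

from openai import OpenAI

client = OpenAI(api_key="sk-...")        # api_key comes from instance configuration
response = client.chat.completions.create(
    model="gpt-3.5-turbo",               # gpt_engine in the view
    messages=[{"role": "user", "content": "Summarise this issue in one line."}],
    temperature=0.7,
    max_tokens=1024,
)
text = response.choices[0].message.content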

View File

@@ -64,9 +64,7 @@ class InboxViewSet(BaseViewSet):
serializer.save(project_id=self.kwargs.get("project_id"))
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
inbox = Inbox.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
# Handle default inbox delete
if inbox.is_default:
return Response(
@@ -128,9 +126,7 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -150,7 +146,6 @@ class InboxIssueViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
def create(self, request, slug, project_id, inbox_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
@@ -198,7 +193,7 @@ class InboxIssueViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@@ -216,10 +211,20 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
@@ -230,11 +235,13 @@ class InboxIssueViewSet(BaseViewSet):
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
# viewers and guests since only viewers and guests
# viewers and guests since only viewers and guests
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
@@ -256,7 +263,7 @@ class InboxIssueViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
else:
@@ -307,7 +314,9 @@ class InboxIssueViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK)
return Response(
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def retrieve(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
@@ -324,15 +333,27 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id).delete()
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -347,7 +368,10 @@ class InboxIssuePublicViewSet(BaseViewSet):
]
def get_queryset(self):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=self.kwargs.get("slug"), project_id=self.kwargs.get("project_id"))
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
)
if project_deploy_board is not None:
return self.filter_queryset(
super()
@@ -363,9 +387,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
return InboxIssue.objects.none()
def list(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
filters = issue_filters(request.query_params, "GET")
issues = (
@@ -392,9 +421,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -415,9 +442,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
)
def create(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
if not request.data.get("issue", {}).get("name", False):
return Response(
@@ -465,7 +497,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@@ -479,34 +511,41 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
# viewers and guests since only viewers and guests
# viewers and guests since only viewers and guests
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
issue, data=issue_data, partial=True
)
issue_serializer = IssueCreateSerializer(issue, data=issue_data, partial=True)
if issue_serializer.is_valid():
current_instance = issue
@@ -523,17 +562,22 @@ class InboxIssuePublicViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
return Response(issue_serializer.data, status=status.HTTP_200_OK)
return Response(issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
@@ -544,16 +588,24 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -1,6 +1,6 @@
# Python imports
import uuid
import requests
# Django imports
from django.contrib.auth.hashers import make_password
@@ -25,7 +25,7 @@ from plane.utils.integrations.github import (
delete_github_installation,
)
from plane.api.permissions import WorkSpaceAdminPermission
from plane.utils.integrations.slack import slack_oauth
class IntegrationViewSet(BaseViewSet):
serializer_class = IntegrationSerializer
@@ -98,12 +98,19 @@ class WorkspaceIntegrationViewSet(BaseViewSet):
config = {"installation_id": installation_id}
if provider == "slack":
metadata = request.data.get("metadata", {})
code = request.data.get("code", False)
if not code:
return Response({"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST)
slack_response = slack_oauth(code=code)
metadata = slack_response
access_token = metadata.get("access_token", False)
team_id = metadata.get("team", {}).get("id", False)
if not metadata or not access_token or not team_id:
return Response(
{"error": "Access token and team id is required"},
{"error": "Slack could not be installed. Please try again later"},
status=status.HTTP_400_BAD_REQUEST,
)
config = {"team_id": team_id, "access_token": access_token}

View File

@@ -11,6 +11,7 @@ from plane.api.views import BaseViewSet, BaseAPIView
from plane.db.models import SlackProjectSync, WorkspaceIntegration, ProjectMember
from plane.api.serializers import SlackProjectSyncSerializer
from plane.api.permissions import ProjectBasePermission, ProjectEntityPermission
from plane.utils.integrations.slack import slack_oauth
class SlackProjectSyncViewSet(BaseViewSet):
@@ -32,25 +33,47 @@ class SlackProjectSyncViewSet(BaseViewSet):
)
def create(self, request, slug, project_id, workspace_integration_id):
serializer = SlackProjectSyncSerializer(data=request.data)
try:
code = request.data.get("code", False)
workspace_integration = WorkspaceIntegration.objects.get(
workspace__slug=slug, pk=workspace_integration_id
)
if not code:
return Response(
{"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST
)
if serializer.is_valid():
serializer.save(
project_id=project_id,
workspace_integration_id=workspace_integration_id,
slack_response = slack_oauth(code=code)
workspace_integration = WorkspaceIntegration.objects.get(
workspace__slug=slug, pk=workspace_integration_id
)
workspace_integration = WorkspaceIntegration.objects.get(
pk=workspace_integration_id, workspace__slug=slug
)
slack_project_sync = SlackProjectSync.objects.create(
access_token=slack_response.get("access_token"),
scopes=slack_response.get("scope"),
bot_user_id=slack_response.get("bot_user_id"),
webhook_url=slack_response.get("incoming_webhook", {}).get("url"),
data=slack_response,
team_id=slack_response.get("team", {}).get("id"),
team_name=slack_response.get("team", {}).get("name"),
workspace_integration=workspace_integration,
project_id=project_id,
)
_ = ProjectMember.objects.get_or_create(
member=workspace_integration.actor, role=20, project_id=project_id
)
serializer = SlackProjectSyncSerializer(slack_project_sync)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "Slack is already installed for the project"},
status=status.HTTP_410_GONE,
)
capture_exception(e)
return Response(
{"error": "Slack could not be installed. Please try again later"},
status=status.HTTP_400_BAD_REQUEST,
)

View File

@@ -33,7 +33,7 @@ from rest_framework.permissions import AllowAny, IsAuthenticated
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
IssueCreateSerializer,
IssueActivitySerializer,
@@ -84,7 +84,7 @@ from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
class IssueViewSet(BaseViewSet):
class IssueViewSet(WebhookMixin, BaseViewSet):
def get_serializer_class(self):
return (
IssueCreateSerializer
@@ -93,6 +93,7 @@ class IssueViewSet(BaseViewSet):
)
model = Issue
webhook_event = "issue"
permission_classes = [
ProjectEntityPermission,
]
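Note: WebhookMixin and the webhook_event attribute recur across the issue, module and project viewsets in this change, but the mixin itself is defined outside these hunks. Purely as an illustrative sketch (send_webhook is a hypothetical task name, not code from this diff), such a mixin typically hooks DRF's finalize_response and fans events out to a background worker after successful writes:

class WebhookMixin:
    """Illustrative sketch only, not the actual implementation."""

    webhook_event = None

    def finalize_response(self, request, response, *args, **kwargs):
        response = super().finalize_response(request, response, *args, **kwargs)
        # Only fan out on successful mutating requests.
        if (
            self.webhook_event
            and request.method in ("POST", "PATCH", "DELETE")
            and response.status_code in (200, 201, 204)
        ):
            send_webhook.delay(  # hypothetical Celery task defined elsewhere
                event=self.webhook_event,
                payload=getattr(response, "data", None),
                action=request.method,
            )
        return response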
@@ -312,6 +313,104 @@ class IssueViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class IssueListEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
serializer = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
class IssueListGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
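Note: the link_count and attachment_count annotations above use a correlated-subquery count: the inner queryset drops its ordering, reduces to a single COUNT value, and is evaluated once per outer row through OuterRef. An equivalent, more explicit form using Subquery is shown below as a clarifying sketch only, not code from this change:

from django.db.models import Count, IntegerField, OuterRef, Subquery

link_counts = (
    IssueLink.objects.filter(issue=OuterRef("id"))
    .order_by()
    .values("issue")
    .annotate(count=Count("id"))
    .values("count")
)
issue_queryset = Issue.objects.annotate(
    link_count=Subquery(link_counts, output_field=IntegerField())
)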
class UserWorkSpaceIssues(BaseAPIView):
@method_decorator(gzip_page)
def get(self, request, slug):
@@ -496,9 +595,10 @@ class IssueActivityEndpoint(BaseAPIView):
return Response(result_list, status=status.HTTP_200_OK)
class IssueCommentViewSet(BaseViewSet):
class IssueCommentViewSet(WebhookMixin, BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
permission_classes = [
ProjectLitePermission,
]
@@ -525,6 +625,7 @@ class IssueCommentViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
@@ -655,8 +756,8 @@ class LabelViewSet(BaseViewSet):
.select_related("project")
.select_related("workspace")
.select_related("parent")
.order_by("name")
.distinct()
.order_by("sort_order")
)
@@ -1156,7 +1257,11 @@ class IssueSubscriberViewSet(BaseViewSet):
def list(self, request, slug, project_id, issue_id):
members = (
ProjectMember.objects.filter(workspace__slug=slug, project_id=project_id)
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
@@ -1400,6 +1505,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
@@ -1440,6 +1546,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1553,6 +1660,7 @@ class IssueReactionPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1646,7 +1754,9 @@ class CommentReactionPublicViewSet(BaseViewSet):
project_id=project_id, comment_id=comment_id, actor=request.user
)
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1731,7 +1841,9 @@ class IssueVotePublicViewSet(BaseViewSet):
)
# Add the user for workspace tracking
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
_ = ProjectPublicMember.objects.get_or_create(
project_id=project_id,
@@ -2230,7 +2342,7 @@ class IssueDraftViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
current_instance = json.dumps(
IssueSerializer(current_instance).data, cls=DjangoJSONEncoder
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
issue.delete()
issue_activity.delay(

View File

@@ -15,7 +15,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet
from . import BaseViewSet, WebhookMixin
from plane.api.serializers import (
ModuleWriteSerializer,
ModuleSerializer,
@@ -41,11 +41,12 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class ModuleViewSet(BaseViewSet):
class ModuleViewSet(WebhookMixin, BaseViewSet):
model = Module
permission_classes = [
ProjectEntityPermission,
]
webhook_event = "module"
def get_serializer_class(self):
return (
@@ -55,7 +56,6 @@ class ModuleViewSet(BaseViewSet):
)
def get_queryset(self):
order_by = self.request.GET.get("order_by", "sort_order")
subquery = ModuleFavorite.objects.filter(
user=self.request.user,
@@ -138,7 +138,7 @@ class ModuleViewSet(BaseViewSet):
),
)
)
.order_by(order_by, "name")
.order_by("-is_favorite","-created_at")
)
def create(self, request, slug, project_id):
@@ -266,12 +266,12 @@ class ModuleViewSet(BaseViewSet):
module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
)
module.delete()
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
{
"module_id": str(pk),
"module_name": str(module.name),
"issues": [str(issue_id) for issue_id in module_issues],
}
),
@@ -281,6 +281,7 @@ class ModuleViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
module.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -297,12 +298,6 @@ class ModuleIssueViewSet(BaseViewSet):
ProjectEntityPermission,
]
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
module_id=self.kwargs.get("module_id"),
)
def get_queryset(self):
return self.filter_queryset(
super()
@@ -446,7 +441,7 @@ class ModuleIssueViewSet(BaseViewSet):
type="module.activity.created",
requested_data=json.dumps({"modules_list": issues}),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{

View File

@@ -85,7 +85,10 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:
@@ -255,7 +258,10 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:

View File

@@ -2,6 +2,7 @@
import uuid
import requests
import os
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@@ -20,7 +21,14 @@ from google.oauth2 import id_token
from google.auth.transport import requests as google_auth_request
# Module imports
from plane.db.models import SocialLoginConnection, User
from plane.db.models import (
SocialLoginConnection,
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.api.serializers import UserSerializer
from .base import BaseAPIView
@@ -168,7 +176,6 @@ class OauthEndpoint(BaseAPIView):
)
## Login Case
if not user.is_active:
return Response(
{
@@ -185,12 +192,61 @@ class OauthEndpoint(BaseAPIView):
user.is_email_verified = email_verified
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -201,26 +257,36 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
"event_type": "SIGN_IN",
},
)
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
except User.DoesNotExist:
@@ -260,31 +326,85 @@ class OauthEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
"event_type": "SIGN_UP",
},
)
)
except RequestException as e:
capture_exception(e)
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -295,4 +415,10 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_201_CREATED)
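Note: get_tokens_for_user is called in both the sign-in and sign-up branches above but is not defined in this diff. Assuming the tokens come from djangorestframework-simplejwt (an assumption, not confirmed here), a conventional implementation looks like this:

from rest_framework_simplejwt.tokens import RefreshToken


def get_tokens_for_user(user):
    refresh = RefreshToken.for_user(user)
    # Matches the (access_token, refresh_token) unpacking used above.
    return str(refresh.access_token), str(refresh)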

View File

@@ -1,9 +1,19 @@
# Python imports
from datetime import timedelta, date
from datetime import timedelta, date, datetime
# Django imports
from django.db import connection
from django.db.models import Exists, OuterRef, Q, Prefetch
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.db.models import (
OuterRef,
Func,
F,
Q,
Exists,
)
# Third party imports
from rest_framework import status
@@ -15,20 +25,37 @@ from .base import BaseViewSet, BaseAPIView
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
Page,
PageBlock,
PageFavorite,
Issue,
IssueAssignee,
IssueActivity,
PageLog,
)
from plane.api.serializers import (
PageSerializer,
PageBlockSerializer,
PageFavoriteSerializer,
PageLogSerializer,
IssueLiteSerializer,
SubPageSerializer,
)
def unarchive_archive_page_and_descendants(page_id, archived_at):
# Recursive SQL to set archived_at on the page and all of its descendants
sql = """
WITH RECURSIVE descendants AS (
SELECT id FROM pages WHERE id = %s
UNION ALL
SELECT pages.id FROM pages, descendants WHERE pages.parent_id = descendants.id
)
UPDATE pages SET archived_at = %s WHERE id IN (SELECT id FROM descendants);
"""
# Execute the SQL query
with connection.cursor() as cursor:
cursor.execute(sql, [page_id, archived_at])
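Note: the recursive CTE walks parent_id links so that a single UPDATE stamps (or clears) archived_at on the page and every nested sub-page. It is used by the archive and unarchive actions further down, roughly as follows:

# Archive: stamp the whole subtree with the current time.
unarchive_archive_page_and_descendants(page_id, datetime.now())

# Unarchive: clear the stamp again by passing None.
unarchive_archive_page_and_descendants(page_id, None)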
class PageViewSet(BaseViewSet):
serializer_class = PageSerializer
model = Page
@@ -52,6 +79,7 @@ class PageViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(parent__isnull=True)
.filter(Q(owned_by=self.request.user) | Q(access=0))
.select_related("project")
.select_related("workspace")
@@ -59,15 +87,7 @@ class PageViewSet(BaseViewSet):
.annotate(is_favorite=Exists(subquery))
.order_by(self.request.GET.get("order_by", "-created_at"))
.prefetch_related("labels")
.order_by("name", "-is_favorite")
.prefetch_related(
Prefetch(
"blocks",
queryset=PageBlock.objects.select_related(
"page", "issue", "workspace", "project"
),
)
)
.order_by("-is_favorite","-created_at")
.distinct()
)
@@ -88,34 +108,90 @@ class PageViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# Only update access if the page owner is the requesting user
if (
page.access != request.data.get("access", page.access)
and page.owned_by_id != request.user.id
):
try:
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
if page.is_locked:
return Response(
{"error": "Page is locked"},
status=status.HTTP_400_BAD_REQUEST,
)
parent = request.data.get("parent", None)
if parent:
_ = Page.objects.get(
pk=parent, workspace__slug=slug, project_id=project_id
)
# Only update access if the page owner is the requesting user
if (
page.access != request.data.get("access", page.access)
and page.owned_by_id != request.user.id
):
return Response(
{
"error": "Access cannot be updated since this page is owned by someone else"
},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = PageSerializer(page, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except Page.DoesNotExist:
return Response(
{
"error": "Access cannot be updated since this page is owned by someone else"
},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = PageSerializer(page, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def lock(self, request, slug, project_id, pk):
page = Page.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
# only the owner can lock the page
if request.user.id != page.owned_by_id:
return Response(
{"error": "Only the page owner can lock the page"},
status=status.HTTP_400_BAD_REQUEST,
)
page.is_locked = True
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def unlock(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# only the owner can unlock the page
if request.user.id != page.owned_by_id:
return Response(
{"error": "Only the page owner can unlock the page"},
status=status.HTTP_400_BAD_REQUEST,
)
page.is_locked = False
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
queryset = self.get_queryset().filter(archived_at__isnull=True)
page_view = request.GET.get("page_view", False)
if not page_view:
return Response({"error": "Page View parameter is required"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Page View parameter is required"},
status=status.HTTP_400_BAD_REQUEST,
)
# All Pages
if page_view == "all":
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Recent pages
if page_view == "recent":
@@ -123,66 +199,95 @@ class PageViewSet(BaseViewSet):
day_before = current_time - timedelta(days=1)
todays_pages = queryset.filter(updated_at__date=date.today())
yesterdays_pages = queryset.filter(updated_at__date=day_before)
earlier_this_week = queryset.filter( updated_at__date__range=(
earlier_this_week = queryset.filter(
updated_at__date__range=(
(timezone.now() - timedelta(days=7)),
(timezone.now() - timedelta(days=2)),
))
)
)
return Response(
{
"today": PageSerializer(todays_pages, many=True).data,
"yesterday": PageSerializer(yesterdays_pages, many=True).data,
"earlier_this_week": PageSerializer(earlier_this_week, many=True).data,
},
status=status.HTTP_200_OK,
)
{
"today": PageSerializer(todays_pages, many=True).data,
"yesterday": PageSerializer(yesterdays_pages, many=True).data,
"earlier_this_week": PageSerializer(
earlier_this_week, many=True
).data,
},
status=status.HTTP_200_OK,
)
# Favorite Pages
if page_view == "favorite":
queryset = queryset.filter(is_favorite=True)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# My pages
if page_view == "created_by_me":
queryset = queryset.filter(owned_by=request.user)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Created by other Pages
if page_view == "created_by_other":
queryset = queryset.filter(~Q(owned_by=request.user), access=0)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
queryset = queryset.filter(~Q(owned_by=request.user), access=0)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
return Response({"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST)
class PageBlockViewSet(BaseViewSet):
serializer_class = PageBlockSerializer
model = PageBlock
permission_classes = [
ProjectEntityPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(page_id=self.kwargs.get("page_id"))
.filter(project__project_projectmember__member=self.request.user)
.select_related("project")
.select_related("workspace")
.select_related("page")
.select_related("issue")
.order_by("sort_order")
.distinct()
return Response(
{"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST
)
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
page_id=self.kwargs.get("page_id"),
def archive(self, request, slug, project_id, page_id):
_ = Page.objects.get(
project_id=project_id,
owned_by_id=request.user.id,
workspace__slug=slug,
pk=page_id,
)
unarchive_archive_page_and_descendants(page_id, datetime.now())
return Response(status=status.HTTP_204_NO_CONTENT)
def unarchive(self, request, slug, project_id, page_id):
page = Page.objects.get(
project_id=project_id,
owned_by_id=request.user.id,
workspace__slug=slug,
pk=page_id,
)
page.parent = None
page.save()
unarchive_archive_page_and_descendants(page_id, None)
return Response(status=status.HTTP_204_NO_CONTENT)
def archive_list(self, request, slug, project_id):
pages = (
Page.objects.filter(
project_id=project_id,
workspace__slug=slug,
)
.filter(archived_at__isnull=False)
.filter(parent_id__isnull=True)
)
if not pages:
return Response(
{"error": "No pages found"}, status=status.HTTP_400_BAD_REQUEST
)
return Response(
PageSerializer(pages, many=True).data, status=status.HTTP_200_OK
)
class PageFavoriteViewSet(BaseViewSet):
permission_classes = [
@@ -196,6 +301,7 @@ class PageFavoriteViewSet(BaseViewSet):
return self.filter_queryset(
super()
.get_queryset()
.filter(archived_at__isnull=True)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related("page", "page__owned_by")
@@ -218,24 +324,62 @@ class PageFavoriteViewSet(BaseViewSet):
page_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromPageBlockEndpoint(BaseAPIView):
class PageLogEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id, page_block_id):
page_block = PageBlock.objects.get(
pk=page_block_id,
serializer_class = PageLogSerializer
model = PageLog
def post(self, request, slug, project_id, page_id):
serializer = PageLogSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id, page_id=page_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, page_id, transaction):
page_transaction = PageLog.objects.get(
workspace__slug=slug,
project_id=project_id,
page_id=page_id,
transaction=transaction,
)
serializer = PageLogSerializer(
page_transaction, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, page_id, transaction):
transaction = PageLog.objects.get(
workspace__slug=slug,
project_id=project_id,
page_id=page_id,
transaction=transaction,
)
# Delete the transaction object
transaction.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromBlockEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id):
page = Page.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=page_id,
)
issue = Issue.objects.create(
name=page_block.name,
name=request.data.get("name"),
project_id=project_id,
description=page_block.description,
description_html=page_block.description_html,
description_stripped=page_block.description_stripped,
)
_ = IssueAssignee.objects.create(
issue=issue, assignee=request.user, project_id=project_id
@@ -245,11 +389,32 @@ class CreateIssueFromPageBlockEndpoint(BaseAPIView):
issue=issue,
actor=request.user,
project_id=project_id,
comment=f"created the issue from {page_block.name} block",
comment=f"created the issue from {page.name} block",
verb="created",
)
page_block.issue = issue
page_block.save()
return Response(IssueLiteSerializer(issue).data, status=status.HTTP_200_OK)
class SubPagesEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@method_decorator(gzip_page)
def get(self, request, slug, project_id, page_id):
pages = (
PageLog.objects.filter(
page_id=page_id,
project_id=project_id,
workspace__slug=slug,
entity_name__in=["forward_link", "back_link"],
)
.filter(archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
)
return Response(
SubPageSerializer(pages, many=True).data, status=status.HTTP_200_OK
)

View File

@@ -17,16 +17,16 @@ from django.db.models import (
)
from django.core.validators import validate_email
from django.conf import settings
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework import serializers
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
# Module imports
from .base import BaseViewSet, BaseAPIView
from .base import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
ProjectSerializer,
ProjectListSerializer,
@@ -39,6 +39,7 @@ from plane.api.serializers import (
)
from plane.api.permissions import (
WorkspaceUserPermission,
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
@@ -58,13 +59,6 @@ from plane.db.models import (
ProjectIdentifier,
Module,
Cycle,
CycleFavorite,
ModuleFavorite,
PageFavorite,
IssueViewFavorite,
Page,
IssueAssignee,
ModuleMember,
Inbox,
ProjectDeployBoard,
IssueProperty,
@@ -73,9 +67,10 @@ from plane.db.models import (
from plane.bgtasks.project_invitation_task import project_invitation
class ProjectViewSet(BaseViewSet):
class ProjectViewSet(WebhookMixin, BaseViewSet):
serializer_class = ProjectSerializer
model = Project
webhook_event = "project"
permission_classes = [
ProjectBasePermission,
@@ -110,12 +105,15 @@ class ProjectViewSet(BaseViewSet):
member=self.request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
)
)
)
.annotate(
total_members=ProjectMember.objects.filter(
project_id=OuterRef("id"), member__is_bot=False
project_id=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -137,6 +135,7 @@ class ProjectViewSet(BaseViewSet):
member_role=ProjectMember.objects.filter(
project_id=OuterRef("pk"),
member_id=self.request.user.id,
is_active=True,
).values("role")
)
.annotate(
@@ -157,6 +156,7 @@ class ProjectViewSet(BaseViewSet):
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
@@ -166,6 +166,7 @@ class ProjectViewSet(BaseViewSet):
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug,
is_active=True,
).select_related("member"),
)
)
@@ -345,66 +346,104 @@ class ProjectViewSet(BaseViewSet):
)
class InviteProjectEndpoint(BaseAPIView):
class ProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
email = request.data.get("email", False)
role = request.data.get("role", False)
# Check if email is provided
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
)
validate_email(email)
# Check if user is already a member of workspace
if ProjectMember.objects.filter(
project_id=project_id,
member__email=email,
member__is_bot=False,
).exists():
return Response(
{"error": "User is already member of workspace"},
status=status.HTTP_400_BAD_REQUEST,
)
user = User.objects.filter(email=email).first()
if user is None:
token = jwt.encode(
{"email": email, "timestamp": datetime.now().timestamp()},
settings.SECRET_KEY,
algorithm="HS256",
)
project_invitation_obj = ProjectMemberInvite.objects.create(
email=email.strip().lower(),
project_id=project_id,
token=token,
role=role,
)
domain = settings.WEB_URL
project_invitation.delay(email, project_id, token, domain)
return Response(
{
"message": "Email sent successfully",
"id": project_invitation_obj.id,
},
status=status.HTTP_200_OK,
)
project_member = ProjectMember.objects.create(
member=user, project_id=project_id, role=role
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
_ = IssueProperty.objects.create(user=user, project_id=project_id)
def create(self, request, slug, project_id):
emails = request.data.get("emails", [])
# Check if emails are provided
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
requesting_user = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, member_id=request.user.id
)
# Check if any invited user has a higher role
if len(
[
email
for email in emails
if int(email.get("role", 10)) > requesting_user.role
]
):
return Response(
{"error": "You cannot invite a user with higher role"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.get(slug=slug)
project_invitations = []
for email in emails:
try:
validate_email(email.get("email"))
project_invitations.append(
ProjectMemberInvite(
email=email.get("email").strip().lower(),
project_id=project_id,
workspace_id=workspace.id,
token=jwt.encode(
{
"email": email,
"timestamp": datetime.now().timestamp(),
},
settings.SECRET_KEY,
algorithm="HS256",
),
role=email.get("role", 10),
created_by=request.user,
)
)
except ValidationError:
return Response(
{
"error": f"Invalid email - {email} provided a valid email address is required to send the invite"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Create workspace member invite
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get('HTTP_ORIGIN')
# Send invitations
for invitation in project_invitations:
project_invitation.delay(
invitation.email,
project_id,
invitation.token,
current_site,
request.user.email,
)
return Response(
ProjectMemberSerializer(project_member).data, status=status.HTTP_200_OK
{
"message": "Email sent successfully",
},
status=status.HTTP_200_OK,
)
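Note: the invite token above is a plain HS256 JWT carrying the invitee's email and a timestamp. For illustration only (the verifying endpoint is not shown in this diff), such a token can be checked with PyJWT like so:

import jwt
from django.conf import settings


def verify_invite_token(token):
    # Raises jwt.InvalidTokenError if the token was tampered with.
    payload = jwt.decode(token, settings.SECRET_KEY, algorithms=["HS256"])
    return payload["email"]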
@@ -420,40 +459,134 @@ class UserProjectInvitationsViewset(BaseViewSet):
.select_related("workspace", "workspace__owner", "project")
)
def create(self, request):
invitations = request.data.get("invitations")
project_invitations = ProjectMemberInvite.objects.filter(
pk__in=invitations, accepted=True
def create(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user,
workspace__slug=slug,
is_active=True,
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
project_id=project_id,
member=request.user,
role=invitation.role,
role=15 if workspace_role >= 15 else 10,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
# Delete joined project invites
project_invitations.delete()
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request, slug, project_id, pk):
project_invite = ProjectMemberInvite.objects.get(
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
email = request.data.get("email", "")
if email == "" or project_invite.email != email:
return Response(
{"error": "You do not have permission to join the project"},
status=status.HTTP_403_FORBIDDEN,
)
if project_invite.responded_at is None:
project_invite.accepted = request.data.get("accepted", False)
project_invite.responded_at = timezone.now()
project_invite.save()
if project_invite.accepted:
# Check if the user account exists
user = User.objects.filter(email=email).first()
# Check if user is a part of workspace
workspace_member = WorkspaceMember.objects.filter(
workspace__slug=slug, member=user
).first()
# Add them to the workspace
if workspace_member is None:
_ = WorkspaceMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=15 if project_invite.role >= 15 else project_invite.role,
)
else:
# Otherwise reactivate the existing workspace membership
workspace_member.is_active = True
workspace_member.save()
# Check if the user was already a member of project then activate the user
project_member = ProjectMember.objects.filter(
workspace_id=project_invite.workspace_id, member=user
).first()
if project_member is None:
# Create a Project Member
_ = ProjectMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=project_invite.role,
)
else:
project_member.is_active = True
project_member.role = project_invite.role
project_member.save()
return Response(
{"message": "Project Invitation Accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"message": "Project Invitation was not accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"error": "You have already responded to the invitation request"},
status=status.HTTP_400_BAD_REQUEST,
)
def get(self, request, slug, project_id, pk):
project_invitation = ProjectMemberInvite.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
serializer = ProjectMemberInviteSerializer(project_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMemberViewSet(BaseViewSet):
@@ -475,6 +608,7 @@ class ProjectMemberViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(member__is_bot=False)
.filter()
.select_related("project")
.select_related("member")
.select_related("workspace", "workspace__owner")
@@ -542,13 +676,17 @@ class ProjectMemberViewSet(BaseViewSet):
def list(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
member=request.user, workspace__slug=slug, project_id=project_id
member=request.user,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
).select_related("project", "member", "workspace")
if project_member.role > 10:
@@ -559,7 +697,10 @@ class ProjectMemberViewSet(BaseViewSet):
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
pk=pk,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
if request.user.id == project_member.member_id:
return Response(
@@ -568,7 +709,10 @@ class ProjectMemberViewSet(BaseViewSet):
)
# Check while updating user roles
requested_project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
if (
"role" in request.data
@@ -591,54 +735,66 @@ class ProjectMemberViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
workspace__slug=slug,
project_id=project_id,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_project_member = ProjectMember.objects.get(
workspace__slug=slug, member=request.user, project_id=project_id
workspace__slug=slug,
member=request.user,
project_id=project_id,
is_active=True,
)
# User cannot remove himself
if str(project_member.id) == str(requesting_project_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# User cannot deactivate higher role
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than yourself"},
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug,
project_id=project_id,
assignee=project_member.member,
).delete()
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
# Remove if module member
ModuleMember.objects.filter(
def leave(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=project_member.member,
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug,
project_id=project_id,
owned_by=project_member.member,
).delete()
project_member.delete()
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the project
if (
project_member.role == 20
and not ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the project as your the only admin of the project you will have to either delete the project or create an another admin",
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -691,46 +847,6 @@ class AddTeamToProjectEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
class ProjectMemberInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectMemberInviteDetailViewSet(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectIdentifierEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
@@ -774,59 +890,14 @@ class ProjectIdentifierEndpoint(BaseAPIView):
)
class ProjectJoinEndpoint(BaseAPIView):
def post(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project_id=project_id,
member=request.user,
role=20
if workspace_role >= 15
else (15 if workspace_role == 10 else workspace_role),
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
class ProjectUserViewsEndpoint(BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
member=request.user, project=project
member=request.user,
project=project,
is_active=True,
).first()
if project_member is None:
@@ -850,7 +921,10 @@ class ProjectUserViewsEndpoint(BaseAPIView):
class ProjectMemberUserEndpoint(BaseAPIView):
def get(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
serializer = ProjectMemberSerializer(project_member)
@@ -983,39 +1057,6 @@ class WorkspaceProjectDeployBoardEndpoint(BaseAPIView):
return Response(projects, status=status.HTTP_200_OK)
class LeaveProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def delete(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
member=request.user,
project_id=project_id,
)
# Only Admin case
if (
project_member.role == 20
and ProjectMember.objects.filter(
workspace__slug=slug,
role=20,
project_id=project_id,
).count()
== 1
):
return Response(
{
"error": "You cannot leave the project since you are the only admin of the project you should delete the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectPublicCoverImagesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,

View File

@@ -47,36 +47,45 @@ class StateViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def list(self, request, slug, project_id):
state_dict = dict()
states = StateSerializer(self.get_queryset(), many=True).data
grouped = request.GET.get("grouped", False)
if grouped == "true":
state_dict = {}
for key, value in groupby(
sorted(states, key=lambda state: state["group"]),
lambda state: state.get("group"),
):
state_dict[str(key)] = list(value)
return Response(state_dict, status=status.HTTP_200_OK)
return Response(states, status=status.HTTP_200_OK)
for key, value in groupby(
sorted(states, key=lambda state: state["group"]),
lambda state: state.get("group"),
):
state_dict[str(key)] = list(value)
return Response(state_dict, status=status.HTTP_200_OK)
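Note: itertools.groupby only merges consecutive items, which is why the serialized states are sorted by their group key before grouping. A standalone illustration of the same idiom (sample data is illustrative only):

from itertools import groupby

states = [
    {"name": "Todo", "group": "unstarted"},
    {"name": "Done", "group": "completed"},
    {"name": "Backlog", "group": "unstarted"},
]
grouped = {
    key: list(items)
    for key, items in groupby(
        sorted(states, key=lambda state: state["group"]),
        lambda state: state["group"],
    )
}
# {"completed": [Done], "unstarted": [Todo, Backlog]}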
def mark_as_default(self, request, slug, project_id, pk):
# Select all the states which are marked as default
_ = State.objects.filter(
workspace__slug=slug, project_id=project_id, default=True
).update(default=False)
_ = State.objects.filter(
workspace__slug=slug, project_id=project_id, pk=pk
).update(default=True)
return Response(status=status.HTTP_204_NO_CONTENT)
def destroy(self, request, slug, project_id, pk):
state = State.objects.get(
~Q(name="Triage"),
pk=pk, project_id=project_id, workspace__slug=slug,
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
if state.default:
return Response(
{"error": "Default state cannot be deleted"}, status=False
)
return Response({"error": "Default state cannot be deleted"}, status=False)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()
if issue_exist:
return Response(
{
"error": "The state is not empty, only empty states can be deleted"
},
{"error": "The state is not empty, only empty states can be deleted"},
status=status.HTTP_400_BAD_REQUEST,
)

View File

@@ -13,13 +13,8 @@ from plane.api.serializers import (
)
from plane.api.views.base import BaseViewSet, BaseAPIView
from plane.db.models import (
User,
Workspace,
WorkspaceMemberInvite,
Issue,
IssueActivity,
)
from plane.db.models import User, IssueActivity, WorkspaceMember
from plane.license.models import Instance, InstanceAdmin
from plane.utils.paginator import BasePaginator
@@ -41,10 +36,33 @@ class UserEndpoint(BaseViewSet):
serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK)
def retrieve_instance_admin(self, request):
instance = Instance.objects.first()
is_admin = InstanceAdmin.objects.filter(
instance=instance, user=request.user
).exists()
return Response({"is_instance_admin": is_admin}, status=status.HTTP_200_OK)
def deactivate(self, request):
# Check whether the user is still active in any workspace
user = self.get_object()
if WorkspaceMember.objects.filter(member=request.user, is_active=True).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
user.is_active = False
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class UpdateUserOnBoardedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_onboarded = request.data.get("is_onboarded", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
@@ -52,7 +70,7 @@ class UpdateUserOnBoardedEndpoint(BaseAPIView):
class UpdateUserTourCompletedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)

View File

@@ -0,0 +1,130 @@
# Django imports
from django.db import IntegrityError
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import Webhook, WebhookLog, Workspace
from plane.db.models.webhook import generate_token
from .base import BaseAPIView
from plane.api.permissions import WorkspaceOwnerPermission
from plane.api.serializers import WebhookSerializer, WebhookLogSerializer
class WebhookEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "URL already exists for the workspace"},
status=status.HTTP_410_GONE,
)
raise
def get(self, request, slug, pk=None):
if pk is None:
webhooks = Webhook.objects.filter(workspace__slug=slug)
serializer = WebhookSerializer(
webhooks,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
data=request.data,
partial=True,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, pk):
webhook = Webhook.objects.get(pk=pk, workspace__slug=slug)
webhook.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class WebhookSecretRegenerateEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
webhook.secret_key = generate_token()
webhook.save()
serializer = WebhookSerializer(webhook)
return Response(serializer.data, status=status.HTTP_200_OK)
class WebhookLogsEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def get(self, request, slug, webhook_id):
webhook_logs = WebhookLog.objects.filter(
workspace__slug=slug, webhook_id=webhook_id
)
serializer = WebhookLogSerializer(webhook_logs, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
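Note: the repeated fields=(...) keyword passed to WebhookSerializer above implies the project's serializers support dynamic field selection, presumably along the lines of the "dynamically modifying fields" pattern from the DRF documentation. A minimal sketch of such a mixin, assuming the actual base serializer may differ:

from rest_framework import serializers

class DynamicFieldsModelSerializer(serializers.ModelSerializer):
    """Accepts an optional `fields` kwarg and drops every field not listed."""

    def __init__(self, *args, **kwargs):
        # Pop the custom kwarg before the parent constructor sees it.
        fields = kwargs.pop("fields", None)
        super().__init__(*args, **kwargs)
        if fields is not None:
            allowed = set(fields)
            for field_name in set(self.fields) - allowed:
                self.fields.pop(field_name)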

View File

@@ -2,7 +2,6 @@
import jwt
from datetime import date, datetime
from dateutil.relativedelta import relativedelta
from uuid import uuid4
# Django imports
from django.db import IntegrityError
@@ -26,13 +25,11 @@ from django.db.models import (
)
from django.db.models.functions import ExtractWeek, Cast, ExtractDay
from django.db.models.fields import DateField
from django.contrib.auth.hashers import make_password
# Third party modules
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
from rest_framework.permissions import AllowAny, IsAuthenticated
# Module imports
from plane.api.serializers import (
@@ -59,14 +56,6 @@ from plane.db.models import (
IssueActivity,
Issue,
WorkspaceTheme,
IssueAssignee,
ProjectFavorite,
CycleFavorite,
ModuleMember,
ModuleFavorite,
PageFavorite,
Page,
IssueViewFavorite,
IssueLink,
IssueAttachment,
IssueSubscriber,
@@ -106,7 +95,9 @@ class WorkSpaceViewSet(BaseViewSet):
def get_queryset(self):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -181,7 +172,9 @@ class UserWorkSpacesEndpoint(BaseAPIView):
def get(self, request):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -227,23 +220,40 @@ class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
return Response({"status": not workspace}, status=status.HTTP_200_OK)
class InviteWorkspaceEndpoint(BaseAPIView):
class WorkspaceInvitationsViewset(BaseViewSet):
"""Endpoint for creating, listing and deleting workspaces"""
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def post(self, request, slug):
emails = request.data.get("emails", False)
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def create(self, request, slug):
emails = request.data.get("emails", [])
# Check if email is provided
if not emails or not len(emails):
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
# check for role level
# check for role level of the requesting user
requesting_user = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if any invited user has a higher role
if len(
[
email
@@ -256,15 +266,17 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace object
workspace = Workspace.objects.get(slug=slug)
# Check if user is already a member of workspace
workspace_members = WorkspaceMember.objects.filter(
workspace_id=workspace.id,
member__email__in=[email.get("email") for email in emails],
is_active=True,
).select_related("member", "workspace", "workspace__owner")
if len(workspace_members):
if workspace_members:
return Response(
{
"error": "Some users are already member of workspace",
@@ -302,35 +314,20 @@ class InviteWorkspaceEndpoint(BaseAPIView):
},
status=status.HTTP_400_BAD_REQUEST,
)
WorkspaceMemberInvite.objects.bulk_create(
# Create workspace member invite
workspace_invitations = WorkspaceMemberInvite.objects.bulk_create(
workspace_invitations, batch_size=10, ignore_conflicts=True
)
workspace_invitations = WorkspaceMemberInvite.objects.filter(
email__in=[email.get("email") for email in emails]
).select_related("workspace")
# create the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
_ = User.objects.bulk_create(
[
User(
username=str(uuid4().hex),
email=invitation.email,
password=make_password(uuid4().hex),
is_password_autoset=True,
)
for invitation in workspace_invitations
],
batch_size=100,
)
current_site = request.META.get('HTTP_ORIGIN')
# Send invitations
for invitation in workspace_invitations:
workspace_invitation.delay(
invitation.email,
workspace.id,
invitation.token,
settings.WEB_URL,
current_site,
request.user.email,
)
@@ -341,11 +338,19 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class JoinWorkspaceEndpoint(BaseAPIView):
class WorkspaceJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
"""Invitation response endpoint the user can respond to the invitation"""
def post(self, request, slug, pk):
workspace_invite = WorkspaceMemberInvite.objects.get(
@@ -354,12 +359,14 @@ class JoinWorkspaceEndpoint(BaseAPIView):
email = request.data.get("email", "")
# Check the email
if email == "" or workspace_invite.email != email:
return Response(
{"error": "You do not have permission to join the workspace"},
status=status.HTTP_403_FORBIDDEN,
)
# If already responded then return error
if workspace_invite.responded_at is None:
workspace_invite.accepted = request.data.get("accepted", False)
workspace_invite.responded_at = timezone.now()
@@ -371,12 +378,23 @@ class JoinWorkspaceEndpoint(BaseAPIView):
# If the user is present then create the workspace member
if user is not None:
WorkspaceMember.objects.create(
workspace=workspace_invite.workspace,
member=user,
role=workspace_invite.role,
)
# If the user was already a member of the workspace, reactivate the membership
workspace_member = WorkspaceMember.objects.filter(
workspace=workspace_invite.workspace, member=user
).first()
if workspace_member is not None:
workspace_member.is_active = True
workspace_member.role = workspace_invite.role
workspace_member.save()
else:
# Otherwise create a new workspace member
_ = WorkspaceMember.objects.create(
workspace=workspace_invite.workspace,
member=user,
role=workspace_invite.role,
)
# Set the user last_workspace_id to the accepted workspace
user.last_workspace_id = workspace_invite.workspace.id
user.save()
@@ -388,6 +406,7 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
# Workspace invitation rejected
return Response(
{"message": "Workspace Invitation was not accepted"},
status=status.HTTP_200_OK,
@@ -398,37 +417,13 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
class WorkspaceInvitationsViewset(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
# delete the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
user = User.objects.filter(email=workspace_member_invite.email).first()
if user is not None:
user.delete()
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, pk):
workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class UserWorkspaceInvitationsEndpoint(BaseViewSet):
class UserWorkspaceInvitationsViewSet(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
@@ -442,9 +437,19 @@ class UserWorkspaceInvitationsEndpoint(BaseViewSet):
)
def create(self, request):
invitations = request.data.get("invitations")
workspace_invitations = WorkspaceMemberInvite.objects.filter(pk__in=invitations)
invitations = request.data.get("invitations", [])
workspace_invitations = WorkspaceMemberInvite.objects.filter(
pk__in=invitations, email=request.user.email
).order_by("-created_at")
# If the user was already a member of the workspace and was deactivated, reactivate them
for invitation in workspace_invitations:
# Update the WorkspaceMember for this specific invitation
WorkspaceMember.objects.filter(
workspace_id=invitation.workspace_id, member=request.user
).update(is_active=True, role=invitation.role)
# Bulk create the workspace members for all the invited workspaces
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
@@ -481,20 +486,24 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"), member__is_bot=False)
.filter(
workspace__slug=self.kwargs.get("slug"),
member__is_bot=False,
is_active=True,
)
.select_related("workspace", "workspace__owner")
.select_related("member")
)
def list(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
member=request.user,
workspace__slug=slug,
is_active=True,
)
workspace_members = WorkspaceMember.objects.filter(
workspace__slug=slug,
member__is_bot=False,
).select_related("workspace", "member")
# Get all active workspace members
workspace_members = self.get_queryset()
if workspace_member.role > 10:
serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True)
@@ -506,7 +515,12 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, pk):
workspace_member = WorkspaceMember.objects.get(pk=pk, workspace__slug=slug)
workspace_member = WorkspaceMember.objects.get(
pk=pk,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
)
if request.user.id == workspace_member.member_id:
return Response(
{"error": "You cannot update your own role"},
@@ -515,7 +529,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Get the requested user role
requested_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if role is being updated
# One cannot set a role higher than their own role
@@ -540,68 +556,121 @@ class WorkSpaceMemberViewSet(BaseViewSet):
def destroy(self, request, slug, pk):
# Check the role of the user performing the deletion
workspace_member = WorkspaceMember.objects.get(workspace__slug=slug, pk=pk)
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
if str(workspace_member.id) == str(requesting_workspace_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
if requesting_workspace_member.role < workspace_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for the only member in the workspace
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
member__is_bot=False,
).count()
== 1
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{"error": "Cannot delete the only Admin for the workspace"},
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the user also from all the projects
ProjectMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug, assignee=workspace_member.member
).delete()
# Deactivate the user's memberships in the projects they are part of
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Remove if module member
ModuleMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug, owned_by=workspace_member.member
).delete()
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
workspace_member.delete()
def leave(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the workspace
if (
workspace_member.role == 20
and not WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the workspace as your the only admin of the workspace you will have to either delete the workspace or create an another admin"
},
status=status.HTTP_400_BAD_REQUEST,
)
if (
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user's memberships in the projects they are part of
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Deactivate the user
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -629,7 +698,9 @@ class TeamMemberViewSet(BaseViewSet):
def create(self, request, slug):
members = list(
WorkspaceMember.objects.filter(
workspace__slug=slug, member__id__in=request.data.get("members", [])
workspace__slug=slug,
member__id__in=request.data.get("members", []),
is_active=True,
)
.annotate(member_str_id=Cast("member", output_field=CharField()))
.distinct()
@@ -658,23 +729,6 @@ class TeamMemberViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class UserWorkspaceInvitationEndpoint(BaseViewSet):
model = WorkspaceMemberInvite
serializer_class = WorkSpaceMemberInviteSerializer
permission_classes = [
AllowAny,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(pk=self.kwargs.get("pk"))
.select_related("workspace")
)
class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
def get(self, request):
user = User.objects.get(pk=request.user.id)
@@ -711,7 +765,9 @@ class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView):
def get(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
member=request.user,
workspace__slug=slug,
is_active=True,
)
serializer = WorkspaceMemberMeSerializer(workspace_member)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -720,7 +776,9 @@ class WorkspaceMemberUserEndpoint(BaseAPIView):
class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
def post(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
workspace_member.view_props = request.data.get("view_props", {})
workspace_member.save()
@@ -1046,7 +1104,9 @@ class WorkspaceUserProfileEndpoint(BaseAPIView):
user_data = User.objects.get(pk=user_id)
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
projects = []
if requesting_workspace_member.role >= 10:
@@ -1250,9 +1310,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):
@@ -1266,30 +1324,3 @@ class WorkspaceLabelsEndpoint(BaseAPIView):
project__project_projectmember__member=request.user,
).values("parent", "name", "color", "id", "project_id", "workspace__slug")
return Response(labels, status=status.HTTP_200_OK)
class LeaveWorkspaceEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
def delete(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
)
# Only Admin case
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(workspace__slug=slug, role=20).count()
== 1
):
return Response(
{
"error": "You cannot leave the workspace since you are the only admin of the workspace you should delete the workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
workspace_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token
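For reference, clients authenticate against this class by sending the token in the X-Api-Key header; a minimal sketch with requests (the URL below is a placeholder, the real API routes are not part of this diff):

import requests

# Hypothetical endpoint; substitute a real route exposed by the API.
response = requests.get(
    "https://plane.example.com/api/v1/workspaces/",
    headers={"X-Api-Key": "<api-token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())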

View File

@@ -0,0 +1,5 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "plane.authentication"

View File

@@ -3,7 +3,7 @@ import csv
import io
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -16,6 +16,8 @@ from sentry_sdk import capture_exception
from plane.db.models import Issue
from plane.utils.analytics_plot import build_graph_plot
from plane.utils.issue_filters import issue_filters
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
row_mapping = {
"state__name": "State",
@@ -47,7 +49,19 @@ def send_export_email(email, slug, csv_buffer):
text_content = strip_tags(html_content)
csv_buffer.seek(0)
msg = EmailMultiAlternatives(subject, text_content, settings.EMAIL_FROM, [email])
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
use_ssl=bool(get_configuration_value(instance_configuration, "EMAIL_USE_SSL", "0")),
)
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach(f"{slug}-analytics.csv", csv_buffer.getvalue())
msg.send(fail_silently=False)
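get_configuration_value is imported from plane.license.utils.instance_value but its body is not part of this diff; from the call sites it looks up a key in the (key, value) pairs fetched above and falls back to a default. A minimal sketch consistent with that usage (an assumption, not the actual implementation):

def get_configuration_value(query, key, default=None):
    # "query" is an iterable of {"key": ..., "value": ...} dicts, e.g. the
    # .values("key", "value") queryset built from InstanceConfiguration above.
    for item in query:
        if item["key"] == key:
            return item["value"]
    return default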
@@ -408,7 +422,6 @@ def analytic_export_task(email, data, slug):
distribution,
x_axis,
y_axis,
segment,
key,
assignee_details,
label_details,

View File

@@ -1,5 +1,5 @@
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -11,8 +11,8 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.db.models import User
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def email_verification(first_name, email, token, current_site):
@@ -34,7 +34,18 @@ def email_verification(first_name, email, token, current_site):
text_content = strip_tags(html_content)
msg = EmailMultiAlternatives(subject, text_content, from_email_string, [email])
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -1,5 +1,5 @@
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -8,7 +8,9 @@ from django.conf import settings
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def forgot_password(first_name, email, uidb64, token, current_site):
@@ -30,7 +32,16 @@ def forgot_password(first_name, email, uidb64, token, current_site):
text_content = strip_tags(html_content)
msg = EmailMultiAlternatives(subject, text_content, from_email_string, [email])
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -73,6 +73,12 @@ def service_importer(service, importer_id):
]
)
# Reactivate any imported users who are already members of the workspace
_ = WorkspaceMember.objects.filter(
member__in=[user for user in workspace_users],
workspace_id=importer.workspace_id,
).update(is_active=True)
# Add new users to Workspace and project automatically
WorkspaceMember.objects.bulk_create(
[

View File

@@ -691,6 +691,10 @@ def create_cycle_issue_activity(
new_cycle = Cycle.objects.filter(
pk=updated_record.get("new_cycle_id", None)
).first()
issue = Issue.objects.filter(pk=updated_record.get("issue_id")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -713,6 +717,10 @@ def create_cycle_issue_activity(
cycle = Cycle.objects.filter(
pk=created_record.get("fields").get("cycle")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -747,22 +755,27 @@ def delete_cycle_issue_activity(
)
cycle_id = requested_data.get("cycle_id", "")
cycle_name = requested_data.get("cycle_name", "")
cycle = Cycle.objects.filter(pk=cycle_id).first()
issues = requested_data.get("issues")
for issue in issues:
current_issue = Issue.objects.filter(pk=issue).first()
if current_issue:
current_issue.updated_at = timezone.now()
current_issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=issue,
actor_id=actor_id,
verb="deleted",
old_value=cycle.name if cycle is not None else "",
old_value=cycle.name if cycle is not None else cycle_name,
new_value="",
field="cycles",
project_id=project_id,
workspace_id=workspace_id,
comment=f"removed this issue from {cycle.name if cycle is not None else None}",
old_identifier=cycle.id if cycle is not None else None,
comment=f"removed this issue from {cycle.name if cycle is not None else cycle_name}",
old_identifier=cycle_id if cycle_id is not None else None,
epoch=epoch,
)
)
@@ -794,6 +807,10 @@ def create_module_issue_activity(
new_module = Module.objects.filter(
pk=updated_record.get("new_module_id", None)
).first()
issue = Issue.objects.filter(pk=updated_record.get("issue_id")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -816,6 +833,10 @@ def create_module_issue_activity(
module = Module.objects.filter(
pk=created_record.get("fields").get("module")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=created_record.get("fields").get("issue"),
@@ -849,22 +870,27 @@ def delete_module_issue_activity(
)
module_id = requested_data.get("module_id", "")
module_name = requested_data.get("module_name", "")
module = Module.objects.filter(pk=module_id).first()
issues = requested_data.get("issues")
for issue in issues:
current_issue = Issue.objects.filter(pk=issue).first()
if current_issue:
current_issue.updated_at = timezone.now()
current_issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=issue,
actor_id=actor_id,
verb="deleted",
old_value=module.name if module is not None else "",
old_value=module.name if module is not None else module_name,
new_value="",
field="modules",
project_id=project_id,
workspace_id=workspace_id,
comment=f"removed this issue from ",
old_identifier=module.id if module is not None else None,
comment=f"removed this issue from {module.name if module is not None else module_name}",
old_identifier=module_id if module_id is not None else None,
epoch=epoch,
)
)
@@ -1452,15 +1478,16 @@ def issue_activity(
issue_activities = []
project = Project.objects.get(pk=project_id)
issue = Issue.objects.filter(pk=issue_id).first()
workspace_id = project.workspace_id
if issue is not None:
try:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
except Exception as e:
pass
if issue_id is not None:
issue = Issue.objects.filter(pk=issue_id).first()
if issue:
try:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
except Exception as e:
pass
ACTIVITY_MAPPER = {
"issue.activity.created": create_issue_activity,

View File

@@ -12,7 +12,7 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.db.models import Issue, Project, State
from plane.db.models import Issue, Project, State, Page
from plane.bgtasks.issue_activites_task import issue_activity
@@ -20,6 +20,7 @@ from plane.bgtasks.issue_activites_task import issue_activity
def archive_and_close_old_issues():
archive_old_issues()
close_old_issues()
delete_archived_pages()
def archive_old_issues():
@@ -67,7 +68,7 @@ def archive_old_issues():
issues_to_update.append(issue)
# Bulk Update the issues and log the activity
if issues_to_update:
if issues_to_update:
Issue.objects.bulk_update(
issues_to_update, ["archived_at"], batch_size=100
)
@@ -80,7 +81,7 @@ def archive_old_issues():
project_id=project_id,
current_instance=json.dumps({"archived_at": None}),
subscriber=False,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
for issue in issues_to_update
]
@@ -142,17 +143,21 @@ def close_old_issues():
# Bulk Update the issues and log the activity
if issues_to_update:
Issue.objects.bulk_update(issues_to_update, ["state"], batch_size=100)
Issue.objects.bulk_update(
issues_to_update, ["state"], batch_size=100
)
[
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"closed_to": str(issue.state_id)}),
requested_data=json.dumps(
{"closed_to": str(issue.state_id)}
),
actor_id=str(project.created_by_id),
issue_id=issue.id,
project_id=project_id,
current_instance=None,
subscriber=False,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
for issue in issues_to_update
]
@@ -162,3 +167,20 @@ def close_old_issues():
print(e)
capture_exception(e)
return
def delete_archived_pages():
try:
pages_to_delete = Page.objects.filter(
archived_at__isnull=False,
archived_at__lte=(timezone.now() - timedelta(days=30)),
)
pages_to_delete._raw_delete(pages_to_delete.db)
return
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return

View File

@@ -1,5 +1,5 @@
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -8,6 +8,9 @@ from django.conf import settings
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def magic_link(email, key, token, current_site):
@@ -15,8 +18,6 @@ def magic_link(email, key, token, current_site):
realtivelink = f"/magic-sign-in/?password={token}&key={key}"
abs_url = current_site + realtivelink
from_email_string = settings.EMAIL_FROM
subject = "Login for Plane"
context = {"magic_url": abs_url, "code": token}
@@ -25,7 +26,17 @@ def magic_link(email, key, token, current_site):
text_content = strip_tags(html_content)
msg = EmailMultiAlternatives(subject, text_content, from_email_string, [email])
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -1,8 +1,6 @@
# Python imports
import json
# Django imports
from django.utils import timezone
import uuid
# Module imports
from plane.db.models import (
@@ -14,6 +12,7 @@ from plane.db.models import (
Issue,
Notification,
IssueComment,
IssueActivity
)
# Third Party imports
@@ -21,12 +20,35 @@ from celery import shared_task
from bs4 import BeautifulSoup
# =========== Issue Description Html Parsing and Notification Functions ======================
def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
aggregated_issue_mentions = []
for mention_id in new_mentions:
aggregated_issue_mentions.append(
IssueMention(
mention_id=mention_id,
issue=issue,
project=project,
workspace_id=project.workspace_id
)
)
IssueMention.objects.bulk_create(
aggregated_issue_mentions, batch_size=100)
IssueMention.objects.filter(
issue=issue, mention__in=removed_mention).delete()
def get_new_mentions(requested_instance, current_instance):
# requested_data is the newer instance of the current issue
# current_instance is the older instance of the current issue, saved in the database
# extract mentions from both the instance of data
mentions_older = extract_mentions(current_instance)
mentions_newer = extract_mentions(requested_instance)
# Getting Set Difference from mentions_newer
@@ -64,25 +86,26 @@ def extract_mentions_as_subscribers(project_id, issue_id, mentions):
# If the mentioned user is not already a subscriber, an assignee, or the issue creator, send them the mention notification
if not IssueSubscriber.objects.filter(
issue_id=issue_id,
subscriber=mention_id,
project=project_id,
subscriber_id=mention_id,
project_id=project_id,
).exists() and not IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id,
assignee_id=mention_id
).exists() and not Issue.objects.filter(
project_id=project_id, pk=issue_id, created_by_id=mention_id
).exists():
mentioned_user = User.objects.get(pk=mention_id)
project = Project.objects.get(pk=project_id)
issue = Issue.objects.get(pk=issue_id)
bulk_mention_subscribers.append(IssueSubscriber(
workspace=project.workspace,
project=project,
issue=issue,
subscriber=mentioned_user,
workspace_id=project.workspace_id,
project_id=project_id,
issue_id=issue_id,
subscriber_id=mention_id,
))
return bulk_mention_subscribers
# Parse Issue Description & extracts mentions
def extract_mentions(issue_instance):
try:
# issue_instance must be a dictionary containing description_html and the rest of the activity data.
@@ -99,6 +122,65 @@ def extract_mentions(issue_instance):
return list(set(mentions))
except Exception as e:
return []
# =========== Comment Parsing and Notification Functions ======================
def extract_comment_mentions(comment_value):
try:
mentions = []
soup = BeautifulSoup(comment_value, 'html.parser')
mentions_tags = soup.find_all(
'mention-component', attrs={'target': 'users'}
)
for mention_tag in mentions_tags:
mentions.append(mention_tag['id'])
return list(set(mentions))
except Exception as e:
return []
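The extraction above assumes mentions are embedded in the comment HTML as <mention-component> tags carrying target="users" and the mentioned user's id; a small standalone example of the same parsing:

from bs4 import BeautifulSoup

# Sample comment HTML shaped like the markup the parser above expects.
html = '<p>Ping <mention-component target="users" id="user-123">@jane</mention-component></p>'
soup = BeautifulSoup(html, "html.parser")
mention_ids = [
    tag["id"] for tag in soup.find_all("mention-component", attrs={"target": "users"})
]
print(mention_ids)  # ['user-123']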
def get_new_comment_mentions(new_value, old_value):
mentions_newer = extract_comment_mentions(new_value)
if old_value is None:
return mentions_newer
mentions_older = extract_comment_mentions(old_value)
# Getting Set Difference from mentions_newer
new_mentions = [
mention for mention in mentions_newer if mention not in mentions_older]
return new_mentions
def createMentionNotification(project, notification_comment, issue, actor_id, mention_id, issue_id, activity):
return Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mentioned",
triggered_by_id=actor_id,
receiver_id=mention_id,
entity_identifier=issue_id,
entity_name="issue",
project=project,
message=notification_comment,
data={
"issue": {
"id": str(issue_id),
"name": str(issue.name),
"identifier": str(issue.project.identifier),
"sequence_id": issue.sequence_id,
"state_name": issue.state.name,
"state_group": issue.state.group,
},
"issue_activity": {
"id": str(activity.get("id")),
"verb": str(activity.get("verb")),
"field": str(activity.get("field")),
"actor": str(activity.get("actor_id")),
"new_value": str(activity.get("new_value")),
"old_value": str(activity.get("old_value")),
}
},
)
@shared_task
@@ -126,61 +208,97 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
bulk_notifications = []
"""
Mention Tasks
1. Perform Diffing and Extract the mentions, that mention notification needs to be sent
2. From the latest set of mentions, extract the users which are not a subscribers & make them subscribers
"""
Mention Tasks
1. Perform a diff and extract the mentions that need a mention notification
2. From the latest set of mentions, extract the users who are not yet subscribers and make them subscribers
"""
# Get new mentions from the newer instance
new_mentions = get_new_mentions(
requested_instance=requested_data, current_instance=current_instance)
removed_mention = get_removed_mentions(
requested_instance=requested_data, current_instance=current_instance)
comment_mentions = []
all_comment_mentions = []
# Get New Subscribers from the mentions of the newer instance
requested_mentions = extract_mentions(
issue_instance=requested_data)
mention_subscribers = extract_mentions_as_subscribers(
project_id=project_id, issue_id=issue_id, mentions=requested_mentions)
issue_subscribers = list(
IssueSubscriber.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(subscriber_id__in=list(new_mentions + [actor_id]))
.values_list("subscriber", flat=True)
)
for issue_activity in issue_activities_created:
issue_comment = issue_activity.get("issue_comment")
issue_comment_new_value = issue_activity.get("new_value")
issue_comment_old_value = issue_activity.get("old_value")
if issue_comment is not None:
# TODO: Maybe save the comment mentions, so that in the future we can also filter issues based on comment mentions.
all_comment_mentions = all_comment_mentions + extract_comment_mentions(issue_comment_new_value)
new_comment_mentions = get_new_comment_mentions(old_value=issue_comment_old_value, new_value=issue_comment_new_value)
comment_mentions = comment_mentions + new_comment_mentions
comment_mention_subscribers = extract_mentions_as_subscribers( project_id=project_id, issue_id=issue_id, mentions=all_comment_mentions)
"""
We will not send the subscription activity notification to the following sets of users:
- Users who have been newly mentioned in the issue description; they receive a mention notification instead.
- When the activity is a comment_created and the comment contains a mention, the "mention_in_comment" notification has to be sent.
- When the activity is a comment_updated and the mentions have changed, the "mention_in_comment" notification has to be sent as well.
"""
issue_assignees = list(
IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(assignee_id=actor_id)
.exclude(assignee_id__in=list(new_mentions + comment_mentions))
.values_list("assignee", flat=True)
)
issue_subscribers = issue_subscribers + issue_assignees
issue_subscribers = list(
IssueSubscriber.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(subscriber_id__in=list(new_mentions + comment_mentions + [actor_id]))
.values_list("subscriber", flat=True)
)
issue = Issue.objects.filter(pk=issue_id).first()
if (issue.created_by_id is not None and str(issue.created_by_id) != str(actor_id)):
issue_subscribers = issue_subscribers + [issue.created_by_id]
if subscriber:
# add the user to issue subscriber
try:
_ = IssueSubscriber.objects.get_or_create(
issue_id=issue_id, subscriber_id=actor_id
)
if str(issue.created_by_id) != str(actor_id) and uuid.UUID(actor_id) not in issue_assignees:
_ = IssueSubscriber.objects.get_or_create(
project_id=project_id, issue_id=issue_id, subscriber_id=actor_id
)
except Exception as e:
pass
project = Project.objects.get(pk=project_id)
for subscriber in list(set(issue_subscribers)):
issue_subscribers = list(set(issue_subscribers + issue_assignees) - {uuid.UUID(actor_id)})
for subscriber in issue_subscribers:
if subscriber in issue_subscribers:
sender = "in_app:issue_activities:subscribed"
if issue.created_by_id is not None and subscriber == issue.created_by_id:
sender = "in_app:issue_activities:created"
if subscriber in issue_assignees:
sender = "in_app:issue_activities:assigned"
for issue_activity in issue_activities_created:
issue_comment = issue_activity.get("issue_comment")
if issue_comment is not None:
issue_comment = IssueComment.objects.get(id=issue_comment, issue_id=issue_id, project_id=project_id, workspace_id=project.workspace_id)
issue_comment = IssueComment.objects.get(
id=issue_comment, issue_id=issue_id, project_id=project_id, workspace_id=project.workspace_id)
bulk_notifications.append(
Notification(
workspace=project.workspace,
sender="in_app:issue_activities",
sender=sender,
triggered_by_id=actor_id,
receiver_id=subscriber,
entity_identifier=issue_id,
@@ -215,15 +333,42 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
# Add Mentioned as Issue Subscribers
IssueSubscriber.objects.bulk_create(
mention_subscribers, batch_size=100)
mention_subscribers + comment_mention_subscribers, batch_size=100)
last_activity = (
IssueActivity.objects.filter(issue_id=issue_id)
.order_by("-created_at")
.first()
)
actor = User.objects.get(pk=actor_id)
for mention_id in comment_mentions:
if (mention_id != actor_id):
for issue_activity in issue_activities_created:
notification = createMentionNotification(
project=project,
issue=issue,
notification_comment=f"{actor.display_name} has mentioned you in a comment in issue {issue.name}",
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
)
bulk_notifications.append(notification)
for mention_id in new_mentions:
if (mention_id != actor_id):
for issue_activity in issue_activities_created:
if (
last_activity is not None
and last_activity.field == "description"
and actor_id == str(last_activity.actor_id)
):
bulk_notifications.append(
Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mention",
Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mentioned",
triggered_by_id=actor_id,
receiver_id=mention_id,
entity_identifier=issue_id,
@@ -237,38 +382,37 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
"identifier": str(issue.project.identifier),
"sequence_id": issue.sequence_id,
"state_name": issue.state.name,
"state_group": issue.state.group,
},
"issue_activity": {
"id": str(issue_activity.get("id")),
"verb": str(issue_activity.get("verb")),
"field": str(issue_activity.get("field")),
"actor": str(issue_activity.get("actor_id")),
"new_value": str(issue_activity.get("new_value")),
"old_value": str(issue_activity.get("old_value")),
},
},
"state_group": issue.state.group,
},
"issue_activity": {
"id": str(last_activity.id),
"verb": str(last_activity.verb),
"field": str(last_activity.field),
"actor": str(last_activity.actor_id),
"new_value": str(last_activity.new_value),
"old_value": str(last_activity.old_value),
},
},
)
)
else:
for issue_activity in issue_activities_created:
notification = createMentionNotification(
project=project,
issue=issue,
notification_comment=f"You have been mentioned in the issue {issue.name}",
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
)
)
# Create New Mentions Here
aggregated_issue_mentions = []
for mention_id in new_mentions:
mentioned_user = User.objects.get(pk=mention_id)
aggregated_issue_mentions.append(
IssueMention(
mention=mentioned_user,
issue=issue,
project=project,
workspace=project.workspace
)
)
IssueMention.objects.bulk_create(
aggregated_issue_mentions, batch_size=100)
IssueMention.objects.filter(
issue=issue, mention__in=removed_mention).delete()
bulk_notifications.append(notification)
# save new mentions for the particular issue and remove the mentions that have been deleted from the description
update_mentions_for_issue(issue=issue, project=project, new_mentions=new_mentions,
removed_mention=removed_mention)
# Bulk create notifications
Notification.objects.bulk_create(bulk_notifications, batch_size=100)

View File

@@ -1,5 +1,5 @@
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -10,26 +10,28 @@ from sentry_sdk import capture_exception
# Module imports
from plane.db.models import Project, User, ProjectMemberInvite
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def project_invitation(email, project_id, token, current_site):
def project_invitation(email, project_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
project = Project.objects.get(pk=project_id)
project_member_invite = ProjectMemberInvite.objects.get(
token=token, email=email
)
relativelink = f"/project-member-invitation/{project_member_invite.id}"
relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
abs_url = current_site + relativelink
from_email_string = settings.EMAIL_FROM
subject = f"{project.created_by.first_name or project.created_by.email} invited you to join {project.name} on Plane"
subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"
context = {
"email": email,
"first_name": project.created_by.first_name,
"first_name": user.first_name,
"project_name": project.name,
"invitation_url": abs_url,
}
@@ -43,7 +45,17 @@ def project_invitation(email, project_id, token, current_site):
project_member_invite.message = text_content
project_member_invite.save()
msg = EmailMultiAlternatives(subject, text_content, from_email_string, [email])
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -0,0 +1,139 @@
import requests
import uuid
import hashlib
import json
# Django imports
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from plane.db.models import Webhook, WebhookLog
@shared_task(
bind=True,
autoretry_for=(requests.RequestException,),
retry_backoff=600,
max_retries=5,
retry_jitter=True,
)
def webhook_task(self, webhook, slug, event, event_data, action):
try:
webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)
headers = {
"Content-Type": "application/json",
"User-Agent": "Autopilot",
"X-Plane-Delivery": str(uuid.uuid4()),
"X-Plane-Event": event,
}
# Sign the payload with the webhook secret key, if one is configured
if webhook.secret_key:
# Concatenate the data and the secret key
message = event_data + webhook.secret_key
# Create a SHA-256 hash of the message
sha256 = hashlib.sha256()
sha256.update(message.encode("utf-8"))
signature = sha256.hexdigest()
headers["X-Plane-Signature"] = signature
event_data = json.loads(event_data) if event_data is not None else None
action = {
"POST": "create",
"PATCH": "update",
"PUT": "update",
"DELETE": "delete",
}.get(action, action)
payload = {
"event": event,
"action": action,
"webhook_id": str(webhook.id),
"workspace_id": str(webhook.workspace_id),
"data": event_data,
}
# Send the webhook event
response = requests.post(
webhook.url,
headers=headers,
json=payload,
timeout=30,
)
# Log the webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=str(response.status_code),
response_headers=str(response.headers),
response_body=str(response.text),
retry_count=str(self.request.retries),
)
except requests.RequestException as e:
# Log the failed webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=500,
response_headers="",
response_body=str(e),
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
return
raise requests.RequestException()
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
@shared_task()
def send_webhook(event, event_data, action, slug):
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
if event == "project":
webhooks = webhooks.filter(project=True)
if event == "issue":
webhooks = webhooks.filter(issue=True)
if event == "module":
webhooks = webhooks.filter(module=True)
if event == "cycle":
webhooks = webhooks.filter(cycle=True)
if event == "issue-comment":
webhooks = webhooks.filter(issue_comment=True)
for webhook in webhooks:
webhook_task.delay(webhook.id, slug, event, event_data, action)
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
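webhook_task signs each delivery by hashing the event payload string concatenated with the webhook secret (SHA-256) and sends the digest in the X-Plane-Signature header. A receiver holding the same secret can verify a delivery by recomputing the digest over the same payload string; a minimal sketch (reconstructing the exact payload string on the receiving side is left to the caller):

import hashlib
import hmac

def compute_plane_signature(event_data: str, secret_key: str) -> str:
    # Mirrors the sender: SHA-256 over the payload string + secret.
    return hashlib.sha256((event_data + secret_key).encode("utf-8")).hexdigest()

def verify_delivery(event_data: str, secret_key: str, received_signature: str) -> bool:
    # Constant-time comparison against the X-Plane-Signature header value.
    expected = compute_plane_signature(event_data, secret_key)
    return hmac.compare_digest(expected, received_signature)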

View File

@@ -1,5 +1,5 @@
# Django imports
from django.core.mail import EmailMultiAlternatives
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
@@ -11,25 +11,32 @@ from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError
# Module imports
from plane.db.models import Workspace, WorkspaceMemberInvite
from plane.db.models import Workspace, WorkspaceMemberInvite, User
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def workspace_invitation(email, workspace_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
workspace = Workspace.objects.get(pk=workspace_id)
workspace_member_invite = WorkspaceMemberInvite.objects.get(
token=token, email=email
)
realtivelink = (
f"/workspace-member-invitation/?invitation_id={workspace_member_invite.id}&email={email}"
)
abs_url = current_site + realtivelink
# Relative link
relative_link = f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
# The complete url including the domain
abs_url = current_site + relative_link
# The email from
from_email_string = settings.EMAIL_FROM
subject = f"{invitor or email} invited you to join {workspace.name} on Plane"
# Subject of the email
subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
context = {
"email": email,
@@ -47,7 +54,30 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
workspace_member_invite.message = text_content
workspace_member_invite.save()
msg = EmailMultiAlternatives(subject, text_content, from_email_string, [email])
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(
get_configuration_value(instance_configuration, "EMAIL_PORT", "587")
),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(
instance_configuration, "EMAIL_HOST_PASSWORD"
),
use_tls=bool(
get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")
),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"),
to=[email],
connection=connection,
)
msg.attach_alternative(html_content, "text/html")
msg.send()

View File

@@ -0,0 +1,71 @@
# Python imports
import boto3
import json
from botocore.exceptions import ClientError
# Django imports
from django.core.management import BaseCommand
from django.conf import settings
class Command(BaseCommand):
help = "Create the default bucket for the instance"
def set_bucket_public_policy(self, s3_client, bucket_name):
public_policy = {
"Version": "2012-10-17",
"Statement": [{
"Effect": "Allow",
"Principal": "*",
"Action": ["s3:GetObject"],
"Resource": [f"arn:aws:s3:::{bucket_name}/*"]
}]
}
try:
s3_client.put_bucket_policy(
Bucket=bucket_name,
Policy=json.dumps(public_policy)
)
self.stdout.write(self.style.SUCCESS(f"Public read access policy set for bucket '{bucket_name}'."))
except ClientError as e:
self.stdout.write(self.style.ERROR(f"Error setting public read access policy: {e}"))
def handle(self, *args, **options):
# Create a session using the credentials from Django settings
try:
session = boto3.session.Session(
aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
# Create an S3 client using the session
s3_client = session.client('s3', endpoint_url=settings.AWS_S3_ENDPOINT_URL)
bucket_name = settings.AWS_STORAGE_BUCKET_NAME
self.stdout.write(self.style.NOTICE("Checking bucket..."))
# Check if the bucket exists
s3_client.head_bucket(Bucket=bucket_name)
self.set_bucket_public_policy(s3_client, bucket_name)
except ClientError as e:
error_code = int(e.response['Error']['Code'])
bucket_name = settings.AWS_STORAGE_BUCKET_NAME
if error_code == 404:
# Bucket does not exist, create it
self.stdout.write(self.style.WARNING(f"Bucket '{bucket_name}' does not exist. Creating bucket..."))
try:
s3_client.create_bucket(Bucket=bucket_name)
self.stdout.write(self.style.SUCCESS(f"Bucket '{bucket_name}' created successfully."))
self.set_bucket_public_policy(s3_client, bucket_name)
except ClientError as create_error:
self.stdout.write(self.style.ERROR(f"Failed to create bucket: {create_error}"))
elif error_code == 403:
# Access to the bucket is forbidden
self.stdout.write(self.style.ERROR(f"Access to the bucket '{bucket_name}' is forbidden. Check permissions."))
else:
# Another ClientError occurred
self.stdout.write(self.style.ERROR(f"Failed to check bucket: {e}"))
except Exception as ex:
# Handle any other exception
self.stdout.write(self.style.ERROR(f"An error occurred: {ex}"))
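This management command would typically be run once during instance setup; a minimal invocation sketch (the command name below is hypothetical, since the module path is not visible in this diff):

from django.core.management import call_command

# Hypothetical name; use the filename this Command class is saved under.
call_command("create_bucket")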

View File

@@ -3,7 +3,7 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api_token
import plane.db.models.api
import uuid
@@ -40,8 +40,8 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token', models.CharField(default=plane.db.models.api_token.generate_token, max_length=255, unique=True)),
('label', models.CharField(default=plane.db.models.api_token.generate_label_token, max_length=255)),
('token', models.CharField(default=plane.db.models.api.generate_token, max_length=255, unique=True)),
('label', models.CharField(default=plane.db.models.api.generate_label_token, max_length=255)),
('user_type', models.PositiveSmallIntegerField(choices=[(0, 'Human'), (1, 'Bot')], default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
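This edit to a historical migration is needed because callable defaults are serialized as dotted paths (visible in the lines above); once plane/db/models/api_token.py is deleted later in this compare, the old path would no longer import at migrate time. A small sketch of the failure mode (hypothetical, not part of the diff):

import plane.db.models.api_token   # ModuleNotFoundError once the module is removed
import plane.db.models.api         # new home of generate_token / generate_label_token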

View File

@@ -1,41 +0,0 @@
# Generated by Django 4.2.5 on 2023-10-18 12:04
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.CreateModel(
name="issue_mentions",
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4,editable=False, primary_key=True, serialize=False, unique=True)),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL,related_name='issuemention_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_issuemention', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='issuemention_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_issuemention', to='db.workspace')),
],
options={
'verbose_name': 'IssueMention',
'verbose_name_plural': 'IssueMentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
},
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
]

View File

@@ -0,0 +1,984 @@
# Generated by Django 4.2.5 on 2023-11-15 09:47
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
import random
def random_sort_ordering(apps, schema_editor):
Label = apps.get_model("db", "Label")
bulk_labels = []
for label in Label.objects.all():
label.sort_order = random.randint(0,65535)
bulk_labels.append(label)
Label.objects.bulk_update(bulk_labels, ["sort_order"], batch_size=1000)
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.AddField(
model_name='label',
name='sort_order',
field=models.FloatField(default=65535),
),
migrations.AlterField(
model_name='analyticview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='analyticview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='apitoken',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='apitoken',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycle',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycle',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cycleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimate',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimate',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimate',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimate',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimatepoint',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimatepoint',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='fileasset',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='fileasset',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubissuesync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubissuesync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepository',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepository',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepository',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepository',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='importer',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='importer',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='importer',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='importer',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inbox',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inbox',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inbox',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inbox',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inboxissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inboxissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inboxissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inboxissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='integration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='integration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueactivity',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueactivity',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueactivity',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueactivity',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueassignee',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueassignee',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueassignee',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueassignee',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueattachment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueattachment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueattachment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueattachment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueblocker',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueblocker',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueblocker',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueblocker',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuecomment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuecomment',
name='issue',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_comments', to='db.issue'),
),
migrations.AlterField(
model_name='issuecomment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuecomment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuecomment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueproperty',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueproperty',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
migrations.AlterField(
model_name='issueproperty',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueproperty',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuesequence',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuesequence',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuesequence',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuesequence',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueview',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueview',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='label',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='label',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='label',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='module',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='module',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='module',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='module',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='moduleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='moduleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='moduleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='moduleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulemember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulemember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='page',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='page',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='page',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='page',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pageblock',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pageblock',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pageblock',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pageblock',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='project',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='project',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectidentifier',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectidentifier',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='state',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='state',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='team',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='teammember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='teammember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspace',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspace',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.CreateModel(
name='IssueMention',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Mention',
'verbose_name_plural': 'Issue Mentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
'unique_together': {('issue', 'mention')},
},
),
migrations.RunPython(random_sort_ordering),
]
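Most of the AlterField churn above swaps hard-coded related_name values for the %(class)s pattern, so every model carrying the shared audit fields gets a predictable reverse accessor, and the RunPython step backfills a random sort_order for existing labels. A short sketch of what the new related_name patterns resolve to at runtime, assuming user, project and workspace objects are already loaded (the accessor names follow from the patterns in this migration; the queries themselves are illustrative):

# related_name='%(class)s_created_by' resolves to '<modelname>_created_by'
user.issue_created_by.all()       # issues this user created
user.cycle_created_by.all()       # cycles this user created

# related_name='project_%(class)s' and 'workspace_%(class)s'
project.project_issue.all()       # issues that belong to the project
workspace.workspace_label.all()   # labels that belong to the workspace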

View File

@@ -0,0 +1,131 @@
# Generated by Django 4.2.5 on 2023-11-15 11:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api
import plane.db.models.webhook
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0046_label_sort_order_alter_analyticview_created_by_and_more'),
]
operations = [
migrations.CreateModel(
name='Webhook',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('url', models.URLField(validators=[plane.db.models.webhook.validate_schema, plane.db.models.webhook.validate_domain])),
('is_active', models.BooleanField(default=True)),
('secret_key', models.CharField(default=plane.db.models.webhook.generate_token, max_length=255)),
('project', models.BooleanField(default=False)),
('issue', models.BooleanField(default=False)),
('module', models.BooleanField(default=False)),
('cycle', models.BooleanField(default=False)),
('issue_comment', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_webhooks', to='db.workspace')),
],
options={
'verbose_name': 'Webhook',
'verbose_name_plural': 'Webhooks',
'db_table': 'webhooks',
'ordering': ('-created_at',),
'unique_together': {('workspace', 'url')},
},
),
migrations.AddField(
model_name='apitoken',
name='description',
field=models.TextField(blank=True),
),
migrations.AddField(
model_name='apitoken',
name='expired_at',
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name='apitoken',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='apitoken',
name='last_used',
field=models.DateTimeField(null=True),
),
migrations.AddField(
model_name='projectmember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='workspacemember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AlterField(
model_name='apitoken',
name='token',
field=models.CharField(db_index=True, default=plane.db.models.api.generate_token, max_length=255, unique=True),
),
migrations.CreateModel(
name='WebhookLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('event_type', models.CharField(blank=True, max_length=255, null=True)),
('request_method', models.CharField(blank=True, max_length=10, null=True)),
('request_headers', models.TextField(blank=True, null=True)),
('request_body', models.TextField(blank=True, null=True)),
('response_status', models.TextField(blank=True, null=True)),
('response_headers', models.TextField(blank=True, null=True)),
('response_body', models.TextField(blank=True, null=True)),
('retry_count', models.PositiveSmallIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('webhook', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='logs', to='db.webhook')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='webhook_logs', to='db.workspace')),
],
options={
'verbose_name': 'Webhook Log',
'verbose_name_plural': 'Webhook Logs',
'db_table': 'webhook_logs',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='APIActivityLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token_identifier', models.CharField(max_length=255)),
('path', models.CharField(max_length=255)),
('method', models.CharField(max_length=10)),
('query_params', models.TextField(blank=True, null=True)),
('headers', models.TextField(blank=True, null=True)),
('body', models.TextField(blank=True, null=True)),
('response_code', models.PositiveIntegerField()),
('response_body', models.TextField(blank=True, null=True)),
('ip_address', models.GenericIPAddressField(blank=True, null=True)),
('user_agent', models.CharField(blank=True, max_length=512, null=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
'verbose_name': 'API Activity Log',
'verbose_name_plural': 'API Activity Logs',
'db_table': 'api_activity_logs',
'ordering': ('-created_at',),
},
),
]
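A hedged usage sketch of the Webhook table this migration creates. The field names come from the operations above; the workspace object and URL are placeholders, and the URL still has to pass validate_schema / validate_domain, whose implementations are not part of this compare:

from plane.db.models import Webhook

webhook = Webhook.objects.create(
    workspace=workspace,                      # assumed existing Workspace instance
    url="https://hooks.example.com/plane",    # placeholder endpoint
    project=True,                             # per-event boolean toggles, not foreign keys
    issue=True,
    issue_comment=False,
)
# secret_key is filled automatically by the generate_token default on the model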

View File

@@ -0,0 +1,54 @@
# Generated by Django 4.2.5 on 2023-11-13 12:53
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0047_webhook_apitoken_description_apitoken_expired_at_and_more'),
]
operations = [
migrations.CreateModel(
name='PageLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('transaction', models.UUIDField(default=uuid.uuid4)),
('entity_identifier', models.UUIDField(null=True)),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('mention', 'Mention')], max_length=30, verbose_name='Transaction Type')),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_log', to='db.page')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Page Log',
'verbose_name_plural': 'Page Logs',
'db_table': 'page_logs',
'ordering': ('-created_at',),
'unique_together': {('page', 'transaction')}
},
),
migrations.AddField(
model_name='page',
name='archived_at',
field=models.DateField(null=True),
),
migrations.AddField(
model_name='page',
name='is_locked',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='page',
name='parent',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='child_page', to='db.page'),
),
]
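A small sketch of reading the new PageLog rows back, for example to list the issues embedded in a page (model and field names are from this migration; page is an assumed existing Page instance):

from plane.db.models import PageLog

embedded_issue_ids = PageLog.objects.filter(
    page=page, entity_name="issue"
).values_list("entity_identifier", flat=True)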

View File

@@ -0,0 +1,72 @@
# Generated by Django 4.2.5 on 2023-11-15 09:16

# Python imports
import uuid

from django.db import migrations


def update_pages(apps, schema_editor):
    try:
        Page = apps.get_model("db", "Page")
        PageBlock = apps.get_model("db", "PageBlock")
        PageLog = apps.get_model("db", "PageLog")

        updated_pages = []
        page_logs = []

        # looping through all the pages
        for page in Page.objects.all():
            page_blocks = PageBlock.objects.filter(
                page_id=page.id, project_id=page.project_id, workspace_id=page.workspace_id
            ).order_by("sort_order")

            if page_blocks:
                # looping through all the page blocks in a page
                for page_block in page_blocks:
                    if page_block.issue is not None:
                        project_identifier = page.project.identifier
                        sequence_id = page_block.issue.sequence_id
                        transaction = uuid.uuid4().hex
                        embed_component = f'<issue-embed-component id="{transaction}" entity_name="issue" entity_identifier="{page_block.issue_id}" sequence_id="{sequence_id}" project_identifier="{project_identifier}" title="{page_block.name}"></issue-embed-component>'
                        page.description_html += embed_component

                        # create the page transaction for the issue
                        page_logs.append(
                            PageLog(
                                page_id=page_block.page_id,
                                transaction=transaction,
                                entity_identifier=page_block.issue_id,
                                entity_name="issue",
                                project_id=page.project_id,
                                workspace_id=page.workspace_id,
                                created_by_id=page_block.created_by_id,
                                updated_by_id=page_block.updated_by_id,
                            )
                        )
                    else:
                        # adding the page block name and description to the page description
                        page.description_html += f"<h2>{page_block.name}</h2>"
                        page.description_html += page_block.description_html

                updated_pages.append(page)

        Page.objects.bulk_update(
            updated_pages,
            ["description_html"],
            batch_size=100,
        )
        PageLog.objects.bulk_create(page_logs, batch_size=100)
    except Exception as e:
        print(e)


class Migration(migrations.Migration):
    dependencies = [
        ("db", "0048_auto_20231116_0713"),
    ]

    operations = [
        migrations.RunPython(update_pages),
    ]
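The RunPython step above is one-way. A common variant (not part of this diff) registers an explicit no-op reverse so the data migration can also be unapplied during local development:

from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [("db", "0048_auto_20231116_0713")]

    operations = [
        # update_pages is the function defined in the migration above
        migrations.RunPython(update_pages, reverse_code=migrations.RunPython.noop),
    ]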

View File

@@ -54,7 +54,7 @@ from .view import GlobalView, IssueView, IssueViewFavorite
from .module import Module, ModuleMember, ModuleIssue, ModuleLink, ModuleFavorite
-from .api_token import APIToken
+from .api import APIToken, APIActivityLog
from .integration import (
WorkspaceIntegration,
@@ -68,7 +68,7 @@ from .integration import (
from .importer import Importer
from .page import Page, PageBlock, PageFavorite, PageLabel
from .page import Page, PageLog, PageFavorite, PageLabel
from .estimate import Estimate, EstimatePoint
@@ -79,3 +79,5 @@ from .analytic import AnalyticView
from .notification import Notification
from .exporter import ExporterHistory
from .webhook import Webhook, WebhookLog

View File

@@ -0,0 +1,80 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return "plane_api_" + uuid4().hex
class APIToken(BaseModel):
# Meta information
label = models.CharField(max_length=255, default=generate_label_token)
description = models.TextField(blank=True)
is_active = models.BooleanField(default=True)
last_used = models.DateTimeField(null=True)
# Token
token = models.CharField(
max_length=255, unique=True, default=generate_token, db_index=True
)
# User Information
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
expired_at = models.DateTimeField(blank=True, null=True)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.id)
class APIActivityLog(BaseModel):
token_identifier = models.CharField(max_length=255)
# Request Info
path = models.CharField(max_length=255)
method = models.CharField(max_length=10)
query_params = models.TextField(null=True, blank=True)
headers = models.TextField(null=True, blank=True)
body = models.TextField(null=True, blank=True)
# Response info
response_code = models.PositiveIntegerField()
response_body = models.TextField(null=True, blank=True)
# Meta information
ip_address = models.GenericIPAddressField(null=True, blank=True)
user_agent = models.CharField(max_length=512, null=True, blank=True)
class Meta:
verbose_name = "API Activity Log"
verbose_name_plural = "API Activity Logs"
db_table = "api_activity_logs"
ordering = ("-created_at",)
def __str__(self):
return str(self.token_identifier)
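
A brief usage sketch for the new token model: the default callables give every token a `plane_api_`-prefixed value and a random hex label, and `expired_at` is optional. The `some_user` and `some_workspace` names below are placeholders, and `is_expired` is an illustrative helper, not part of the model:

# Illustrative only -- assumes a user and workspace already exist.
from django.utils import timezone
from plane.db.models import APIToken

def is_expired(token):
    # Hypothetical helper: a token without expired_at never expires.
    return token.expired_at is not None and token.expired_at <= timezone.now()

token = APIToken.objects.create(user=some_user, workspace=some_workspace)
assert token.token.startswith("plane_api_")  # from generate_token()
assert not is_expired(token)                 # expired_at defaults to NULL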

View File

@@ -1,41 +0,0 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return uuid4().hex + uuid4().hex
class APIToken(BaseModel):
token = models.CharField(max_length=255, unique=True, default=generate_token)
label = models.CharField(max_length=255, default=generate_label_token)
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.name)

View File

@@ -132,25 +132,7 @@ class Issue(ProjectBaseModel):
self.state = default_state
except ImportError:
pass
else:
try:
from plane.db.models import State, PageBlock
# Check if the current issue state and completed state id are same
if self.state.group == "completed":
self.completed_at = timezone.now()
# check if there are any page blocks
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=timezone.now()
)
else:
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=None
)
self.completed_at = None
except ImportError:
pass
if self._state.adding:
# Get the maximum display_id value from the database
last_id = IssueSequence.objects.filter(project=self.project).aggregate(
@@ -431,6 +413,7 @@ class Label(ProjectBaseModel):
name = models.CharField(max_length=255)
description = models.TextField(blank=True)
color = models.CharField(max_length=255, blank=True)
sort_order = models.FloatField(default=65535)
class Meta:
unique_together = ["name", "project"]
@@ -439,6 +422,18 @@ class Label(ProjectBaseModel):
db_table = "labels"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
if self._state.adding:
# Get the maximum sequence value from the database
last_id = Label.objects.filter(project=self.project).aggregate(
largest=models.Max("sort_order")
)["largest"]
# if last_id is not None
if last_id is not None:
self.sort_order = last_id + 10000
super(Label, self).save(*args, **kwargs)
def __str__(self):
return str(self.name)
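
The save() override above spaces labels 10000 apart so they can later be reordered without renumbering. A short sketch of the resulting values, assuming a fresh project with no existing labels (`project` is a placeholder):

# Illustrative only: sort_order spacing for newly created labels.
first = Label.objects.create(name="Bug", project=project)       # Max() is None, keeps default 65535
second = Label.objects.create(name="Feature", project=project)  # 65535 + 10000
assert second.sort_order == first.sort_order + 10000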

View File

@@ -1,3 +1,5 @@
import uuid
# Django imports
from django.db import models
from django.conf import settings
@@ -22,6 +24,15 @@ class Page(ProjectBaseModel):
labels = models.ManyToManyField(
"db.Label", blank=True, related_name="pages", through="db.PageLabel"
)
parent = models.ForeignKey(
"self",
on_delete=models.CASCADE,
null=True,
blank=True,
related_name="child_page",
)
archived_at = models.DateField(null=True)
is_locked = models.BooleanField(default=False)
class Meta:
verbose_name = "Page"
@@ -34,6 +45,42 @@ class Page(ProjectBaseModel):
return f"{self.owned_by.email} <{self.name}>"
class PageLog(ProjectBaseModel):
TYPE_CHOICES = (
("to_do", "To Do"),
("issue", "issue"),
("image", "Image"),
("video", "Video"),
("file", "File"),
("link", "Link"),
("cycle","Cycle"),
("module", "Module"),
("back_link", "Back Link"),
("forward_link", "Forward Link"),
("mention", "Mention"),
)
transaction = models.UUIDField(default=uuid.uuid4)
page = models.ForeignKey(
Page, related_name="page_log", on_delete=models.CASCADE
)
entity_identifier = models.UUIDField(null=True)
entity_name = models.CharField(
max_length=30,
choices=TYPE_CHOICES,
verbose_name="Transaction Type",
)
class Meta:
unique_together = ["page", "transaction"]
verbose_name = "Page Log"
verbose_name_plural = "Page Logs"
db_table = "page_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.page.name} {self.type}"
class PageBlock(ProjectBaseModel):
page = models.ForeignKey("db.Page", on_delete=models.CASCADE, related_name="blocks")
name = models.CharField(max_length=255)
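
A minimal sketch of how a PageLog row ties an embedded entity back to its page through the transaction UUID that the data migration also writes into description_html (`page` and `issue` are placeholders):

import uuid

log = PageLog.objects.create(
    page=page,                    # placeholder Page instance
    transaction=uuid.uuid4(),     # matches the id attribute of the embed component
    entity_name="issue",
    entity_identifier=issue.id,   # placeholder Issue pk
    project=page.project,
    workspace=page.workspace,
)
# ("page", "transaction") is unique_together, so logging the same
# transaction twice for one page raises an IntegrityError.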

View File

@@ -166,6 +166,7 @@ class ProjectMember(ProjectBaseModel):
default_props = models.JSONField(default=get_default_props)
preferences = models.JSONField(default=get_default_preferences)
sort_order = models.FloatField(default=65535)
is_active = models.BooleanField(default=True)
def save(self, *args, **kwargs):
if self._state.adding:

View File

@@ -0,0 +1,90 @@
# Python imports
from uuid import uuid4
from urllib.parse import urlparse
# Django imports
from django.db import models
from django.core.exceptions import ValidationError
# Module imports
from plane.db.models import BaseModel
def generate_token():
return "plane_wh_" + uuid4().hex
def validate_schema(value):
parsed_url = urlparse(value)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")
def validate_domain(value):
parsed_url = urlparse(value)
domain = parsed_url.netloc
if domain in ["localhost", "127.0.0.1"]:
raise ValidationError("Local URLs are not allowed.")
class Webhook(BaseModel):
workspace = models.ForeignKey(
"db.Workspace",
on_delete=models.CASCADE,
related_name="workspace_webhooks",
)
url = models.URLField(
validators=[
validate_schema,
validate_domain,
]
)
is_active = models.BooleanField(default=True)
secret_key = models.CharField(max_length=255, default=generate_token)
project = models.BooleanField(default=False)
issue = models.BooleanField(default=False)
module = models.BooleanField(default=False)
cycle = models.BooleanField(default=False)
issue_comment = models.BooleanField(default=False)
def __str__(self):
return f"{self.workspace.slug} {self.url}"
class Meta:
unique_together = ["workspace", "url"]
verbose_name = "Webhook"
verbose_name_plural = "Webhooks"
db_table = "webhooks"
ordering = ("-created_at",)
class WebhookLog(BaseModel):
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="webhook_logs"
)
# Associated webhook
webhook = models.ForeignKey(Webhook, on_delete=models.CASCADE, related_name="logs")
# Basic request details
event_type = models.CharField(max_length=255, blank=True, null=True)
request_method = models.CharField(max_length=10, blank=True, null=True)
request_headers = models.TextField(blank=True, null=True)
request_body = models.TextField(blank=True, null=True)
# Response details
response_status = models.TextField(blank=True, null=True)
response_headers = models.TextField(blank=True, null=True)
response_body = models.TextField(blank=True, null=True)
# Retry Count
retry_count = models.PositiveSmallIntegerField(default=0)
class Meta:
verbose_name = "Webhook Log"
verbose_name_plural = "Webhook Logs"
db_table = "webhook_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.event_type} {str(self.webhook.url)}"

View File

@@ -99,6 +99,7 @@ class WorkspaceMember(BaseModel):
view_props = models.JSONField(default=get_default_props)
default_props = models.JSONField(default=get_default_props)
issue_props = models.JSONField(default=get_issue_props)
is_active = models.BooleanField(default=True)
class Meta:
unique_together = ["workspace", "member"]

View File

@@ -0,0 +1 @@
from .instance import InstanceOwnerPermission, InstanceAdminPermission

View File

@@ -0,0 +1,33 @@
# Third party imports
from rest_framework.permissions import BasePermission
# Module imports
from plane.license.models import Instance, InstanceAdmin
class InstanceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
instance = Instance.objects.first()
return InstanceAdmin.objects.filter(
role=20,
instance=instance,
user=request.user,
).exists()
class InstanceAdminPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
instance = Instance.objects.first()
return InstanceAdmin.objects.filter(
role__gte=15,
instance=instance,
user=request.user,
).exists()
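
Both classes resolve permissions against the single Instance row: role 20 for the owner, role 15 and above for admins. A hedged sketch of wiring one of them into a view (the view class itself is hypothetical):

from rest_framework.views import APIView
from rest_framework.response import Response
from plane.license.api.permissions import InstanceAdminPermission

class ExampleInstanceSettingsView(APIView):
    # Any InstanceAdmin with role >= 15 on the registered instance may read this.
    permission_classes = [InstanceAdminPermission]

    def get(self, request):
        return Response({"ok": True})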

View File

@@ -0,0 +1 @@
from .instance import InstanceSerializer, InstanceAdminSerializer, InstanceConfigurationSerializer

View File

@@ -0,0 +1,42 @@
# Module imports
from plane.license.models import Instance, InstanceAdmin, InstanceConfiguration
from plane.api.serializers import BaseSerializer
from plane.api.serializers import UserAdminLiteSerializer
class InstanceSerializer(BaseSerializer):
primary_owner_details = UserAdminLiteSerializer(source="primary_owner", read_only=True)
class Meta:
model = Instance
fields = "__all__"
read_only_fields = [
"id",
"primary_owner",
"primary_email",
"instance_id",
"license_key",
"api_key",
"version",
"email",
"last_checked_at",
]
class InstanceAdminSerializer(BaseSerializer):
user_detail = UserAdminLiteSerializer(source="user", read_only=True)
class Meta:
model = InstanceAdmin
fields = "__all__"
read_only_fields = [
"id",
"instance",
"user",
]
class InstanceConfigurationSerializer(BaseSerializer):
class Meta:
model = InstanceConfiguration
fields = "__all__"

View File

@@ -0,0 +1,6 @@
from .instance import (
InstanceEndpoint,
TransferPrimaryOwnerEndpoint,
InstanceAdminEndpoint,
InstanceConfigurationEndpoint,
)

View File

@@ -0,0 +1,242 @@
# Python imports
import json
import os
import requests
# Django imports
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.api.views import BaseAPIView
from plane.license.models import Instance, InstanceAdmin, InstanceConfiguration
from plane.license.api.serializers import (
InstanceSerializer,
InstanceAdminSerializer,
InstanceConfigurationSerializer,
)
from plane.license.api.permissions import (
InstanceOwnerPermission,
InstanceAdminPermission,
)
from plane.db.models import User
class InstanceEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "PATCH"]:
self.permission_classes = [
InstanceOwnerPermission,
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceEndpoint, self).get_permissions()
def post(self, request):
# Check if the instance is registered
instance = Instance.objects.first()
# If instance is None then register this instance
if instance is None:
with open("package.json", "r") as file:
# Load JSON content from the file
data = json.load(file)
license_engine_base_url = os.environ.get("LICENSE_ENGINE_BASE_URL")
if not license_engine_base_url:
return Response(
{"error": "LICENSE_ENGINE_BASE_URL is required"},
status=status.HTTP_400_BAD_REQUEST,
)
headers = {"Content-Type": "application/json"}
payload = {
"email": request.user.email,
"version": data.get("version", 0.1),
}
response = requests.post(
f"{license_engine_base_url}/api/instances",
headers=headers,
data=json.dumps(payload),
)
if response.status_code == 201:
data = response.json()
# Create instance
instance = Instance.objects.create(
instance_name="Plane Free",
instance_id=data.get("id"),
license_key=data.get("license_key"),
api_key=data.get("api_key"),
version=data.get("version"),
primary_email=data.get("email"),
primary_owner=request.user,
last_checked_at=timezone.now(),
)
# Create instance admin
_ = InstanceAdmin.objects.create(
user=request.user,
instance=instance,
role=20,
)
return Response(
{
"message": f"Instance succesfully registered with owner: {instance.primary_owner.email}"
},
status=status.HTTP_201_CREATED,
)
return Response(
{"error": "Instance could not be registered"},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return Response(
{
"message": f"Instance already registered with instance owner: {instance.primary_owner.email}"
},
status=status.HTTP_200_OK,
)
def get(self, request):
instance = Instance.objects.first()
# get the instance
if instance is None:
return Response({"activated": False}, status=status.HTTP_400_BAD_REQUEST)
# Return instance
serializer = InstanceSerializer(instance)
data = serializer.data
# serializer.data builds a fresh dict on each access, so mutate a local copy
data["activated"] = True
return Response(data, status=status.HTTP_200_OK)
def patch(self, request):
# Get the instance
instance = Instance.objects.first()
serializer = InstanceSerializer(instance, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class TransferPrimaryOwnerEndpoint(BaseAPIView):
permission_classes = [
InstanceOwnerPermission,
]
# Transfer the owner of the instance
def post(self, request):
instance = Instance.objects.first()
# Get the email of the new user
email = request.data.get("email", False)
if not email:
return Response(
{"error": "User is required"}, status=status.HTTP_400_BAD_REQUEST
)
# Get users
user = User.objects.get(email=email)
# Save the instance user
instance.primary_owner = user
instance.primary_email = user.email
instance.save(update_fields=["owner", "email"])
# Add the user to admin
_ = InstanceAdmin.objects.get_or_create(
instance=instance,
user=user,
role=20,
)
return Response(
{"message": "Owner successfully updated"}, status=status.HTTP_200_OK
)
class InstanceAdminEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "DELETE"]:
self.permission_classes = [
InstanceOwnerPermission,
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceAdminEndpoint, self).get_permissions()
# Create an instance admin
def post(self, request):
email = request.data.get("email", False)
role = request.data.get("role", 15)
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
)
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not registered yet"},
status=status.HTTP_403_FORBIDDEN,
)
# Fetch the user
user = User.objects.get(email=email)
instance_admin = InstanceAdmin.objects.create(
instance=instance,
user=user,
role=role,
)
serializer = InstanceAdminSerializer(instance_admin)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request):
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not registered yet"},
status=status.HTTP_403_FORBIDDEN,
)
instance_admins = InstanceAdmin.objects.filter(instance=instance)
serializer = InstanceAdminSerializer(instance_admins, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
instance = Instance.objects.first()
instance_admin = InstanceAdmin.objects.filter(instance=instance, pk=pk).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class InstanceConfigurationEndpoint(BaseAPIView):
permission_classes = [
InstanceAdminPermission,
]
def get(self, request):
instance_configurations = InstanceConfiguration.objects.all()
serializer = InstanceConfigurationSerializer(instance_configurations, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request):
key = request.data.get("key", False)
if not key:
return Response(
{"error": "Key is required"}, status=status.HTTP_400_BAD_REQUEST
)
configuration = InstanceConfiguration.objects.get(key=key)
configuration.value = request.data.get("value")
configuration.save()
serializer = InstanceConfigurationSerializer(configuration)
return Response(serializer.data, status=status.HTTP_200_OK)
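
For reference, the configuration endpoint expects a {"key": ..., "value": ...} body on PATCH and looks the key up before overwriting its value. A client-side sketch; the URL, key, and credentials below are placeholders, since the route is not part of this diff:

import requests

resp = requests.patch(
    "https://plane.example.com/api/licenses/instances/configurations/",  # placeholder URL
    headers={"Authorization": "Bearer <access-token>"},                   # placeholder auth
    json={"key": "EMAIL_HOST", "value": "smtp.example.com"},              # placeholder key/value
)
resp.raise_for_status()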

View File

@@ -0,0 +1,5 @@
from django.apps import AppConfig
class LicenseConfig(AppConfig):
name = "plane.license"

Some files were not shown because too many files have changed in this diff.