Compare commits

...

190 Commits

Author SHA1 Message Date
NarayanBavisetti
71de47496a fix: resolved changes 2023-11-16 14:19:06 +05:30
NarayanBavisetti
cd8f1eb952 fix: migration fixes 2023-11-16 12:45:39 +05:30
NarayanBavisetti
2028f3ede2 Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-11-16 12:39:49 +05:30
NarayanBavisetti
572be0fe60 fix: added issue embed 2023-11-16 12:36:30 +05:30
NarayanBavisetti
176e184220 fix: back migration of page blocks 2023-11-15 17:18:21 +05:30
Nikhil
26b1e9d5f1 dev: squashed migrations (#2779)
* dev: migration squash

* dev: migrations squashed for apis and webhooks

* dev: packages updated and  move dj-database-url for local settings

* dev: update package changes
2023-11-15 17:15:02 +05:30
Bavisetti Narayan
79347ec62b feat: api webhooks (#2543)
* dev: initiate external apis

* dev: external api

* dev: external public api implementation

* dev: add prefix to all api tokens

* dev: flag to enable disable api token api access

* dev: webhook model create and apis

* dev: webhook settings

* fix: webhook logs

* chore: removed drf spectacular

* dev: remove retry_count and fix api logging for get requests

* dev: refactor webhook logic

* fix: celery retry mechanism

* chore: event and action change

* chore: migrations changes

* dev: proxy setup for apis

* chore: changed retry time and cleanup

* chore: added issue comment and inbox issue api endpoints

* fix: migration files

* fix: added env variables

* fix: removed issue attachment from proxy

* fix: added new migration file

* fix: restricted webhook access

* chore: changed urls

* chore: fixed project serializer

* fix: set expire for api token

* fix: retrieve endpoint for api token

* feat: Api Token screens & api integration

* dev: webhook endpoint changes

* dev: add fields for webhook updates

* feat: Download Api secret key

* chore: removed BASE API URL

* feat: revoke token access

* dev: migration fixes

* feat: workspace webhooks (#2748)

* feat: workspace webhook store, services integration and rendered webhook list and create

* chore: handled webhook update and regenerate token in workspace webhooks

* feat: regenerate key and delete functionality

---------

Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>

* fix: url validation added

* fix: separated env for webhook and api

* Web hooks refactoring

* add show option for generated hook key

* Api token restructure

* webhook minor fixes

* fix build errors

* chore: improvements in file structuring

* dev: rate limiting the open apis

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: LAKHAN BAHETI <lakhanbaheti9@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: Ramesh Kumar <rameshkumar@rameshs-MacBook-Pro.local>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-15 15:56:57 +05:30
Nikhil
7b965179d8 dev: update bucket script to make the bucket public (#2767)
* dev: update bucket script to make the bucket public

* dev: remove auto bucket script from docker compose
2023-11-15 15:56:08 +05:30
Nikhil
fc51ffc589 chore: user workflow (#2762)
* dev: workspace member deactivation and leave endpoints and filters

* dev: deactivated for project members

* dev: project members leave

* dev: project member check on workspace deactivation

* dev: project member queryset update and remove leave project endpoint

* dev: rename is_deactivated to is_active and user deactivation apis

* dev: check if the user is already part of workspace then make them active

* dev: workspace and project save

* dev: update project members to make them active

* dev: project invitation

* dev: automatic user workspace and project member create when user sign in/up

* dev: fix member invites

* dev: rename deactivation variable

* dev: update project member invitation

* dev: additional permission layer for workspace

* dev: update the url for  workspace invitations

* dev: remove invitation urls from users

* dev: cleanup workspace invitation workflow

* dev: workspace and project invitation
2023-11-15 15:53:16 +05:30
sabith-tu
96f6e37cc5 fix: Delete estimate popup is not closing automatically (#2777) 2023-11-15 14:08:52 +05:30
Nikhil
29774ce84a dev: API settings (#2594)
* dev: update settings file structure and added extra settings for CORS

* dev: remove WEB_URL variable and add celery integration for sentry

* dev: aws and minio settings

* dev: add cors origins to env

* dev: update settings
2023-11-15 12:31:52 +05:30
Nikhil
8cbe9c26fc enhancement: label sort order (#2763)
* chore: label sort ordering

* dev: ordering

* fix: sort order

* fix: save of labels

* dev: remove ordering by name

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-15 12:25:44 +05:30
Prateek Shourya
7f42566207 Fix: Custom menu item not automatically closing, affecting delete popup behavior. (#2771) 2023-11-14 23:05:30 +05:30
Ankush Deshmukh
b60237b676 Standardizing priority icons across the platform (#2776) 2023-11-14 20:52:43 +05:30
Prateek Shourya
1fe09d369f style: text overflow fix and border color update (#2769)
* style: fix text overflow in:
* Issue activity
* Cycle and Module Select in Create Issue form
* Delete Module modal
* Join Project modal

* style: update assignee select border as per design.
2023-11-14 18:34:51 +05:30
Dakshesh Jain
b7757c6b1a fix: bugs (#2761)
* fix: semicolon on estimate settings page

* refactor: project settings automations store implementation

* fix: active cycle stuck on infinite loading

* fix: removed delete project option from sidebar

* fix: disclosure not opening when navigating to project

* fix: clear filter not working & filter appearing even if nothing is selected

* refactor: select label store implementation

* refactor: select state store implementation
2023-11-14 18:33:01 +05:30
Anmol Singh Bhatia
1a25bacce1 style: create update view modal consistency (#2775) 2023-11-14 18:30:10 +05:30
Anmol Singh Bhatia
6797df239d chore: no lead option added in lead select dropdown (#2774) 2023-11-14 18:29:39 +05:30
Anmol Singh Bhatia
43e7c10eb7 chore: spreadsheet layout column responsiveness (#2768) 2023-11-14 18:28:49 +05:30
Anmol Singh Bhatia
bdc9c9c2a8 chore: create update issue modal improvement (#2765) 2023-11-14 18:28:15 +05:30
Anmol Singh Bhatia
f0c72bf249 fix: breadcrumb project icon improvement (#2764) 2023-11-14 18:27:47 +05:30
sabith-tu
a8904bfc48 style: ui fixes for pages and views (#2770) 2023-11-14 18:26:50 +05:30
NarayanBavisetti
73c2416055 fix: migration and optimisation 2023-11-13 19:12:55 +05:30
NarayanBavisetti
598c22f91d Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-11-13 17:23:00 +05:30
Nikhil
b31041726b dev: create bucket through application (#2720) 2023-11-13 15:57:19 +05:30
Prateek Shourya
e6f947ad90 style: ui improvements and bug fixes (#2758)
* style: add transition to favorite projects dropdown.

* style: update project integration settings borders.

* style: fix text overflow issue in project views.

* fix: issue with non-functional cancel button in leave project modal.
2023-11-13 14:42:45 +05:30
Dakshesh Jain
7963993171 fix: workspace settings bugs (#2743)
* fix: double layout in exports

* fix: typo in jira email address section

* fix: workspace members not mutating

* fix: removed un-used variable

* fix: workspace members can't be filtered using email

* fix: autocomplete in workspace delete

* fix: autocomplete in project delete modal

* fix: update member function in store

* fix: sidebar link not active when in github/jira

* style: margin top & icon inconsistency

* fix: typo in create workspace

* fix: workspace leave flow

* fix: redirection to delete issue

* fix: autocomplete off in jira api token

* refactor: reduced api call, added optional chaining & removed variable with low scope
2023-11-13 13:34:05 +05:30
Anmol Singh Bhatia
00e61a8753 fix: peek overview comment ordering and comment icon alignment fix (#2753) 2023-11-10 18:45:41 +05:30
Anmol Singh Bhatia
733fed76cc fix: cycle card title responsiveness added (#2752) 2023-11-10 18:43:48 +05:30
Anmol Singh Bhatia
e78dd4b1c0 fix: app sidebar dropdown fix (#2751) 2023-11-10 18:43:16 +05:30
Anmol Singh Bhatia
d479781fce style: header consistency (#2750) 2023-11-10 18:30:43 +05:30
Bavisetti Narayan
c449b46bf4 fix: added external folder in urls (#2749) 2023-11-10 16:00:55 +05:30
Prateek Shourya
fd6430c3e3 style: Update modal appearance for UI consistency (#2747) 2023-11-10 15:48:34 +05:30
Ramesh Kumar Chandra
6f580ce2d9 fix: project settings layout render in export (#2746) 2023-11-10 13:07:18 +05:30
Ankush Deshmukh
2748133bd0 Fix: Show Priority icon in custom analytics table. (#2744) 2023-11-10 13:06:23 +05:30
Aaryan Khandelwal
884b219508 refactor: cycles store (#2716)
* refactor: cycles store

* refactor: active cycle details
2023-11-09 18:37:45 +05:30
Anmol Singh Bhatia
162faf8339 fix: date select tooltip fix (#2740) 2023-11-09 18:29:45 +05:30
Anmol Singh Bhatia
c291ff05ee fix: filter list item clear button alignment fix (#2741) 2023-11-09 18:27:19 +05:30
Nikhil
446981422e feat: issues v2 endpoint (#2713)
* feat: issue v2 listing endpoint

* dev: issues v3 endpoint

* dev: add permission in the grouped endpoint

* dev: update grouped endpoint
2023-11-09 18:24:26 +05:30
Bavisetti Narayan
630e21b954 fix: favourite cycle and modules displayed at top (#2719) 2023-11-09 18:22:38 +05:30
Bavisetti Narayan
894ffb6c21 fix: mention notification (#2670)
* fix: mention notification

* feat: updated mentions for comments in the notification background task

* feat: added subscription for issue_comment_mentions as well

* fix: removed the print statement

* fix: double notification popup for mentioned assignees

* fix: added issue subscriber

* fix: removed creator for subscribed

* fix: creator will not be subscribed to issue

* fix: double notification removed

---------

Co-authored-by: Henit Chobisa <chobisa.henit@gmail.com>
2023-11-09 18:22:13 +05:30
Nikhil
515dba02d3 chore: configuration add tracker variables (#2709)
* chore: configuration add tracker variables

* dev: unsplash configuration
2023-11-09 18:20:03 +05:30
Dakshesh Jain
34bccd7e06 refactor: gantt sidebar (#2705)
* refactor: gantt sidebar

* fix: exception error

fix: file placement

* refactor: not passing sidebar block as props
2023-11-09 17:57:41 +05:30
sriram veeraghanta
79df59f618 fix: spliting out the project members from project store and service (#2739) 2023-11-09 17:56:55 +05:30
Prateek Shourya
7676aab773 fix: UI improvements. (#2734)
* style: update check icon colors to match our design.

* fix: automatically focus input box in pages `add label` modal.
2023-11-09 17:37:45 +05:30
Prateek Shourya
8832d8e00e style: sidebar UI improvements (#2735)
* updated font weight and color as per designs.
* removed background color from workspace with logo.
* updated dropdown design.
2023-11-09 17:01:48 +05:30
rahulramesha
d733a53ea6 fix: Add horizontal scroll bar to views (#2736)
* add errors for duplicate labels

* adding horizontal scroll bar to views

---------

Co-authored-by: rahulramesha <rahul@appsmith.com>
2023-11-09 15:12:00 +05:30
Lakhan Baheti
96862e06ef fix: custom analytics bar graph index alignment (#2737) 2023-11-09 14:36:22 +05:30
sriram veeraghanta
a6567bbce4 fix: workspace members store added and implemented across the app (#2732)
* fix: minor changes

* fix: workspace members store added and implemented across the app
2023-11-09 00:35:12 +05:30
Nikhil
556b2d2617 feat: state list endpoint (#2717)
* feat: state list endpoint

* dev: update states endpoint

* dev: mark default state endpoint
2023-11-08 22:38:53 +05:30
Lakhan Baheti
10037222b6 fix: Tooltip content on assignee hover in all layouts (#2724)
* fix: Tooltip content on assignee hover in all layouts

* chore: comments added
2023-11-08 22:35:30 +05:30
sabith-tu
931f9d288a fix: toast alert inconsistency (#2730) 2023-11-08 20:37:47 +05:30
sriram veeraghanta
20fb79567f fix: project states fixes (#2731)
* fix: project states fixes

* fix: states fixes

* fix: formatting all files
2023-11-08 20:31:46 +05:30
Ramesh Kumar Chandra
bd1a850f35 style: kanban card label overflow (#2722)
* chore: kanban card label drop down items overflow

* style: kanban card label text overflow, tooltip, hover cursor

* style: label overflow in list layout
2023-11-08 18:12:36 +05:30
M. Palanikannan
206f5744a3 [fix]: Error Handling for Images and Table Fix for Form Submissions in Editor (#2710)
* cancellable uploads and image limits with better error handling

* fixed table row/column picker behaviour on modals

* Merge branch 'rerender-debounce-editor-fix' into editor-draggable-nodes

* fix: added mention suggestions and highlights in `create-issue-modal`

* removed unnecessary files

* solved lint error of trailing spaces

* added plane/ui dependency for tooltips

---------

Co-authored-by: Henit Chobisa <chobisa.henit@gmail.com>
2023-11-08 18:00:53 +05:30
sabith-tu
faaba45e59 fix: all issues values not changeable and assignee image not rendering (#2707)
* fix: all issues values are not changeable and assignee image not rendering

* chore: removed console log
2023-11-08 17:56:09 +05:30
sabith-tu
f8002852e0 fix: issue property height and peek view date picker border radius (#2726) 2023-11-08 17:55:28 +05:30
Lakhan Baheti
2d71377722 fix: added empty project state when no project exists. (#2727)
* fix: added empty project state when no project exists

* fix: duplicate import
2023-11-08 17:54:59 +05:30
Lakhan Baheti
6ebee05951 fix: unwanted go back button in onboarding step 2 (#2714) 2023-11-08 17:53:06 +05:30
Lakhan Baheti
5a3bac998e fix: bug fixes and ui improvements (#2703)
* fix: gantt chart duration in decimal

* fix: Loading text instead Spinner in peek view

* fix: cycle more popover typo & icon overlapping

* fix: list layout properties alignment

* fix: project search empty state

* fix: calendar layout issue text overflow & redirection inconsistency

* style: urgent priority hover background color

* fix: Cycle issues kanban layout empty state missing

* style: custom snooze modal placeholder text color

* refactor: replaced unwanted anchor tag with div

* chore: removed empty state for cycle kanban layout
2023-11-08 17:52:34 +05:30
Anmol Singh Bhatia
4096136b44 style: ui consistency and improvement (#2725)
* style: create/update issue modal properties ui improvement

* style: create update issue modal improvement

* style: modal ui consistency
2023-11-08 17:51:32 +05:30
Aaryan Khandelwal
83e0c4ebbd chore: remove active ids from the MobX stores if not present in the route (#2681)
* chore: remove active ids if not present in the route

* refactor: set active id logic
2023-11-08 17:35:45 +05:30
Aaryan Khandelwal
df8bdfd5b9 fix: project automation settings flickering (#2680)
* fix: cycle and module sidebar z-index

* fix: project automation settings flickering
2023-11-08 17:34:42 +05:30
Ankush Deshmukh
da799b5a63 Fix: Render bar chart axis labels in lighter color when dark theme applied (#2721) 2023-11-08 17:34:09 +05:30
Dakshesh Jain
621d551c4a fix: project select validation (#2723) 2023-11-08 17:33:26 +05:30
Aaryan Khandelwal
5a84ed279d refactor: replace keyboard events with command palette store (#2688) 2023-11-08 13:51:55 +05:30
Dakshesh Jain
53e7da08e4 fix: quick add not working for labels & assignee (#2689)
* fix: quick add not working for labels & assignee

* fix: build error on spreadsheet view
2023-11-08 12:28:02 +05:30
Anmol Singh Bhatia
9f206331bc style: spinner component improvement (#2708) 2023-11-07 18:35:05 +05:30
rahulramesha
b56d188a83 add errors for duplicate labels (#2706)
Co-authored-by: rahulramesha <rahul@appsmith.com>
2023-11-07 18:22:52 +05:30
Bavisetti Narayan
8d3853b129 fix: issue draft delete functionality (#2696) 2023-11-07 18:19:32 +05:30
Bavisetti Narayan
30d6235108 fix: add issues to cycles and modules (#2659)
* fix: able to add issue in cycle and module

* fix: issue activity message
2023-11-07 18:18:51 +05:30
Prateek Shourya
6e461dd8c3 style: update user profile button alignment. (#2695) 2023-11-07 18:17:44 +05:30
Nikhil
86379c51b7 dev: worker count (#2573)
* dev: workers count

* dev: update the worker count variable to GUNICORN_WORKERS
2023-11-07 18:05:38 +05:30
Manish Gupta
f7cc2eca36 dev: Modified the branch-build action yaml (#2704)
* cherrypicked code

* removed PUSH event
2023-11-07 17:41:26 +05:30
Prateek Shourya
63d1ad286b style: update font weight in project general setting section. (#2697) 2023-11-07 17:39:12 +05:30
Manish Gupta
1412c1c94a dev: modified the branch wise build (#2702)
* cherrypicked branch build code

* trigger on pull request

* branch filter

* checking branch filter

* checking push

* checking push again

* code cleanup before PR
2023-11-07 17:23:32 +05:30
sriram veeraghanta
26de35bd8d fix: environment config changes in the API are replicated in web and space app (#2699)
* fix: envconfig type changes

* chore: configuration variables  (#2692)

* chore: update avatar group logic (#2672)

* chore: configuration variables

---------

* fix: replacing slack client id with env config

---------

Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
2023-11-07 17:17:10 +05:30
Aaryan Khandelwal
1986c0dfd4 fix: state icon (#2678) 2023-11-07 16:15:34 +05:30
Anmol Singh Bhatia
25f3a5b2e4 chore: peek overview improvement and bug fixes (#2683)
* style: issue peek overview improvement

* style: peek overview improvement

* fix: subscribe issue from peek overview fix and validation added

* fix: build error
2023-11-07 15:58:42 +05:30
Anmol Singh Bhatia
f30b16e9d8 chore: user profile issue improvement (#2679)
* fix: user profile filters z-index

* chore: user profile issue state group heading fix

* fix: build error
2023-11-07 15:58:19 +05:30
Anmol Singh Bhatia
93d03f82b4 chore: spreadsheet layout improvement (#2677)
* style: spreadsheet column width fix

* style: spreadsheet label column styling

* chore: spreadsheet layout issue properties improvement

* fix: build error
2023-11-07 15:58:00 +05:30
Anmol Singh Bhatia
98974fdc50 fix: cycle and module bug fixes and improvement (#2691)
* fix: cycle and module card issue count fix

* fix: cycle and module list progress icon fix

* fix: module card progress fix

* style: cycle & module empty date label updated

* fix: build error
2023-11-07 15:14:47 +05:30
sriram veeraghanta
040563d148 fix: replacing jira importer image (#2685) 2023-11-07 14:35:04 +05:30
Nikhil
4de64f112f fix: slack project integration (#2684) 2023-11-07 14:34:30 +05:30
Ramesh Kumar Chandra
0afb900678 fix: kanban card state name and drop down items text overflow (#2686) 2023-11-07 14:31:29 +05:30
Prateek Shourya
baf17a109b style: update project description as per design. (#2682) 2023-11-07 13:13:14 +05:30
Prateek Shourya
37bf465fcd style: update border across workspace and project settings. (#2669)
* style: update border across workspace and project settings.

* update border width
2023-11-07 13:12:05 +05:30
Anmol Singh Bhatia
d8c96536f0 fix: bug fixes and ui improvement (#2674)
* chore: peekoverview edit permission updated

* chore: tab index added in create project modal

* chore: project card improvement

* style: avatar component improvement

* chore: create issue modal improvement

* style: global style sidebar border variable name fix
2023-11-06 21:08:01 +05:30
Nikhil
b372ccfdb3 fix: slack integration workflow (#2675)
* fix: slack integration workflow

* dev: add slack client id as configuration

* fix: clean up

* fix: added env to turbo

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-06 21:00:49 +05:30
guru_sainath
984b36f45a fix: In kanban issues can be shifted between the column in order_by (#2676) 2023-11-06 21:00:36 +05:30
Aaryan Khandelwal
46f307fed5 chore: update avatar group logic (#2672) 2023-11-06 20:43:54 +05:30
Aaryan Khandelwal
1dce72cb3c style: updated layouts UI in the space app (#2671)
* style: updated layouts UI in space

* fix: build error
2023-11-06 20:43:34 +05:30
Aaryan Khandelwal
a6dea3af23 fix: render the estimate select if estimate is enabled for the project (#2663) 2023-11-06 20:43:10 +05:30
Henit Chobisa
6eb0bf4785 Fix/mentions spaces fix (#2667)
* feat: add mentions store to the space project

* fix: added mentions highlights in read only comment cards

* feat: added mention highlights in richtexteditor in space app
2023-11-06 20:42:24 +05:30
Manish Gupta
13389d1b2b dev: On Demand Code Build for any branch (#2668)
* wip

* wip

* testing

* wip

* wip

* wip

* wip

* image push fix

* wip

* wip

* dynamic branch name and tag

* workflow_dispatch modified

* job splitting

* file sharing

* wip

* checking

* wip

* wip

* wip

* wip

* build fixes

* code upload download fixes

* image name change

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-06 19:05:20 +05:30
Aaryan Khandelwal
742143766f fix: existing issues modal for cycle and module (#2664)
* fix: existing issues modal for cycle and module

* refactor: existing issues modal code

* fix: build errors
2023-11-06 16:30:09 +05:30
sriram veeraghanta
1ed72c51df fix: package version fixes and mentions build error fixes (#2665) 2023-11-06 16:28:15 +05:30
Aaryan Khandelwal
a03e0c788f fix: notifications option in the sidebar menu not collapsing (#2662) 2023-11-06 14:53:26 +05:30
guru_sainath
0c8a867565 fix: handled drag and drop issue, gantt hover issue for issue peek overview (#2660) 2023-11-06 13:52:33 +05:30
Aaryan Khandelwal
3a07bb6060 refactor: removed unused packages (#2658) 2023-11-06 13:17:02 +05:30
Aaryan Khandelwal
bf48d93a25 fix: product tour modal bugs (#2657)
* fix: product tour

* style: product tour navigation buttons

* refactor: step logic
2023-11-06 13:06:00 +05:30
M. Palanikannan
14ac885e55 [feat]: Extended Tables Support (#2596)
* migrated table to new project structure

* fixed range errors while deleting table nodes with no nodes below and removed console logs

* fixed css for rendering table menu

* removed old table menu

* added support for read only editors as well

* text-black removed

* added design colors

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-05 18:54:00 +05:30
Henit Chobisa
f0335751b3 fix: mentions enter error (#2646)
* fix: fixed readonly lite text editor not rendering highlights

* fix: removed enter extension in lite text editor
2023-11-04 01:57:57 +05:30
Anmol Singh Bhatia
52395d0563 chore: dropdown loading state added and project card avatar fix (#2643)
* chore: project card avatar rendering fix

* chore: state, assignee and label dropdown loading state added
2023-11-04 01:56:23 +05:30
Dakshesh Jain
ad558833af refactor: archive issue (#2628)
* dev: archived issue store

* dev: archived issue layouts and store binding

* dev: archived issue detail store

* dev: is read only

* fix: archived issue delete

* fix: build error
2023-11-03 20:20:05 +05:30
sriram veeraghanta
ff258c60fd fix: open changelog in new tab (#2645) 2023-11-03 19:49:02 +05:30
Bavisetti Narayan
db2a1b8033 fix: custom analytics graph display issue (#2637)
* chore: fixed custom analytics

* chore: typo changes
2023-11-03 19:23:35 +05:30
Dakshesh Jain
91878fb3dd fix: notification read and snooze bugs (#2639)
* fix: marking notification as read doesn't remove it from un-read list

* refactor: arranged imports

* fix: past snooze notifications coming in snooze tab
2023-11-03 19:21:35 +05:30
Lakhan Baheti
c233e6e3b6 style: custom analytics bar graph label overlapping (#2636)
* style: custom analytics bar graph label overlapping

fix: Bar graph avatar image rendering & tooltip

* fix: import
2023-11-03 19:18:24 +05:30
Lakhan Baheti
0d2c399555 style: quick add issue UI improvements in all the layouts (#2615)
* style: quick add issue UI improvements in all the layouts

* style: ui improvements

* style: quick add icon size

* chore: static sizes to tailwind classes

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-03 19:17:50 +05:30
Aaryan Khandelwal
79cad16aba chore: update all layout selections (#2641) 2023-11-03 19:17:13 +05:30
Aaryan Khandelwal
41e9d5d7e3 chore: added missing columns to the spreadsheet layout (#2640) 2023-11-03 19:15:09 +05:30
Aaryan Khandelwal
992cf79031 chore: peek overview authorization (#2632)
* chore: peek overview authorization

* chore: comment access specifier validation
2023-11-03 19:13:10 +05:30
Aaryan Khandelwal
d48f13416f Update readme content (#2635) 2023-11-03 19:12:46 +05:30
Aaryan Khandelwal
cf19afa707 fix: string helper function (#2633) 2023-11-03 19:11:28 +05:30
sriram veeraghanta
cc26f604aa fix: next image fixes for selfhosted instances (#2642) 2023-11-03 19:09:40 +05:30
Anmol Singh Bhatia
8919b724c5 chore: breadcrumbs ui revamp and refactor (#2634) 2023-11-03 18:01:49 +05:30
Anmol Singh Bhatia
7eeac188d7 chore: peek overview improvement and bug fixes (#2627)
* chore: peekoverview issue properties text size fix

* chore: peekoverview icon updated and active view indicator added

* chore: peekoverview and issue sidebar improvement
2023-11-03 18:01:34 +05:30
Anmol Singh Bhatia
f639e467f8 refactor: replace ui components with plane ui components (#2626)
* refactor: replace button component with plane ui component and remove old button component

* refactor: replace dropdown component with plane ui component

* refactor: replace tooltip, input, textarea, spinner and loader component with plane ui component

* refactor: plane ui code refactor
2023-11-03 17:21:38 +05:30
Anmol Singh Bhatia
737fea28c6 chore: refactor and improve project member settings (#2625)
* fix: project member setting improvement and refactor

* fix: typo fix in automations setting
2023-11-03 17:20:49 +05:30
Anmol Singh Bhatia
4c1aee0cfc fix: resolve z-index and peek overview component bug (#2624)
* fix: resolved z-index issue on peek overview component

* fix: fix issue with peekover view in spreadsheet view

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2023-11-03 17:20:14 +05:30
guru_sainath
1352c200dd fix: Project Rendering Error in Kanban Layout and Layout Rendering Fixes in Subscribed Profile Issues (#2629)
* fix: rendering projects error in kanban layout in profile issues and resolved multiple layout rendering in subscribed profile issues

* fix: implemented spinner loader in profile issues and remove logs in kanban layout
2023-11-03 13:17:52 +05:30
Aaryan Khandelwal
dd2ba2ec6f fix: slug field not working (#2622) 2023-11-03 13:17:01 +05:30
Aaryan Khandelwal
c66d76df26 chore: set sub group by to null if group by and sub group by are same (#2621) 2023-11-03 13:15:50 +05:30
Bavisetti Narayan
7a11161cd0 dev: migrations for 0.14 (#2630)
* chore: migration files

* dev: deleted the old migration
2023-11-03 13:00:37 +05:30
Aaryan Khandelwal
260974b0de fix: user authentication on the index page (#2619)
* fix: user authentication on the index page

* fix: login redirection cleanup

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-03 12:42:43 +05:30
sriram veeraghanta
5efc6993cd fix: Update analytics page layout fixes (#2623) 2023-11-03 00:09:13 +05:30
sriram veeraghanta
3c884fd46e fix: implementing layouts using _app.tsx get layout method. (#2620)
* fix: implementing layouts in all pages

* fix: layout fixes, implementing using standard Next.js practice
2023-11-02 23:57:44 +05:30
Anmol Singh Bhatia
a582021f2c fix: active cycle fix (#2618) 2023-11-02 22:17:10 +05:30
Bavisetti Narayan
caca2bb548 chore: bug fixes (#2609)
* chore: sub issue activity task

* fix: mentions and issue comment

* chore: added string for issue

* chore: changed sub issue id
2023-11-02 19:32:44 +05:30
Henit Chobisa
da391064aa [FIX] Minor bug fixes in MentionList and MentionNode UI (#2600)
* fix: removed text color in peek view

* fix: fixed list view UI bugs and node view colors

* feat: update imports in suggestions for mentionSuggestion type

* fix: updated mention list css

* fix: updated mention node UI according to the design provided

* style: update the mentions dropdown UI

* style: mentioned users UI in the editor

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-02 19:31:25 +05:30
Aaryan Khandelwal
a9b72fa1d2 chore: implemented MobX in the onboarding screens (#2617)
* fix: wrap the onboarding route with the UserWrapper

* chore: implement mobx in the onboarding screens
2023-11-02 19:27:25 +05:30
Anmol Singh Bhatia
b0397dfd74 fix: module sidebar status select (#2614) 2023-11-02 18:48:57 +05:30
guru_sainath
02d4d32f7a chore: Improved Handling of Empty Properties for Labels and Assignees (#2616)
* fix: show empty group

* chore: handled None values for lables and assignees in list and kanban layouts

---------

Co-authored-by: dakshesh14 <dakshesh.jain14@gmail.com>
2023-11-02 18:44:02 +05:30
Anmol Singh Bhatia
43e42f1896 fix: project view item edit action fix (#2612) 2023-11-02 17:12:04 +05:30
Aaryan Khandelwal
d5fd69354e chore: removed unused hooks and components (#2611)
* chore: remove unused hooks

* chore: removed useProjectMembers hook

* chore: removed issue hooks

* fix: build errors
2023-11-02 17:11:33 +05:30
Aaryan Khandelwal
c987c6f308 fix: calendar layout not being rendered (#2610) 2023-11-02 17:08:48 +05:30
Anmol Singh Bhatia
56e4152756 fix: select label dropdown fix for list view (#2608) 2023-11-02 16:28:07 +05:30
Aaryan Khandelwal
7b5ed252ef chore: updated the contact email (#2605)
Updated the contact email to squawk@plane.so
2023-11-02 16:27:23 +05:30
Aaryan Khandelwal
c394a4f64e style: lite text editor editor toolbar (#2601)
* style: comment editor toolbar

* style: updated icon styling
2023-11-02 16:26:57 +05:30
Dakshesh Jain
5b808571e5 fix: exception error (#2606)
* fix: exception error

* fix: invitation type
2023-11-02 16:26:16 +05:30
Dakshesh Jain
2cda47dc8a refactor: user profile store setup and bug fixes (#2586)
* fix: autorun not working when filters are changed

* fix: filter/display on overview page

* refactor: store implementation & loader in 'created' & 'subscribed' page
2023-11-02 16:25:44 +05:30
Anmol Singh Bhatia
7f3dbe298c fix: bug fixes (#2607)
* fix: module card issue count fix

* fix: project kanban view add issue bug fix

* fix: draft issue modal button alignment fix
2023-11-02 16:03:03 +05:30
Anmol Singh Bhatia
0072160891 fix: peekoverview (#2603)
* fix: peekoverview mutation fix

* fix: peekoverview mutation fix

* fix: sub-issue peekoverview
2023-11-02 16:02:34 +05:30
Anmol Singh Bhatia
4512651f8b fix: spreadsheet view properties fix (#2599) 2023-11-02 16:01:49 +05:30
guru_sainath
f6b95b8d31 fix: Issue properties dropdown overflow issue for date and labels (#2604) 2023-11-02 15:59:43 +05:30
Anmol Singh Bhatia
325fb4a377 chore: fixes and improvement (#2595)
* fix: project card fix

* chore: bug fixes and ui improvement
2023-11-02 14:01:56 +05:30
guru_sainath
ba7b7d6f8b chore: implemented module and cycle select dropdown in issue create modal (#2602) 2023-11-02 13:55:45 +05:30
Nikhil
7249f84e18 dev: code improvements and minor performance upgrades (#2201)
* dev: remove len for empty comparison

* dev: using in instead of multiple ors

* dev: assign expression to empty variables

* dev: use f-string

* dev: remove list comprehension and use generators

* dev: remove assert from paginator

* dev: use is for identity comparison with singleton

* dev: remove unnecessary else statements

* dev: fix does not exists error for both project and workspace

* dev: remove reimports

* dev: iterate a dictionary

* dev: remove unused commented code

* dev: remove redefinition

* dev: remove unused imports

* dev: remove unused imports

* dev: remove unnecessary f strings

* dev: remove unused variables

* dev: use literal structure to create the data structure

* dev: add empty lines at the end of the file

* dev: remove user middleware

* dev: remove unnecessary default None
2023-11-01 20:35:06 +05:30
Aaryan Khandelwal
d63e7cf254 chore: filters view more and less buttons (#2583)
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-01 20:34:02 +05:30
Aaryan Khandelwal
36152ea2fa chore: loading state for all layouts (#2588)
* chore: add loading states to layouts

* chore: don't show count for 0 inbox issues
2023-11-01 20:24:57 +05:30
Lakhan Baheti
1a46c6c399 feat: search project & workspace members (#2590)
* feat: search project & workspace members

* chore: formatting
2023-11-01 20:23:21 +05:30
Bavisetti Narayan
4f09a89f5e chore: unarchived issue and date format changes (#2598)
* chore: unarchived issue message corrected

* chore: passing the date in archived at
2023-11-01 20:11:40 +05:30
sriram veeraghanta
8c620c4f96 fix: store level fixes (#2597) 2023-11-01 19:22:10 +05:30
Anmol Singh Bhatia
d46eb9c59a chore: add issue option added in header group (#2592)
* chore: add issue option added in list view group header

* chore: add issue option added in kanban view group header
2023-11-01 19:10:29 +05:30
Bavisetti Narayan
e9321a66e7 chore: added validation for archived issue (#2593)
* chore: added validation for archived issue

* fix: optimised code
2023-11-01 17:20:55 +05:30
sriram veeraghanta
0121a4ab51 [FED-594] fix: user change theme interface bug fixes (#2587)
* fix: user change theme interface bugfixes

* fix: handling error case
2023-11-01 17:11:29 +05:30
Anmol Singh Bhatia
548e95c7e0 fix: bug fixes (#2581)
* fix: module sidebar fix for kanban layout

* chore: cycle & module sidebar improvement

* chore: join project content updated

* chore: project empty state header fix

* chore: create project modal dropdown consistency

* chore: list view group header overlapping issue fix

* chore: popover code refactor

* chore: module sidebar fix for cycle kanban view

* chore: add existing issue option added in module empty state

* chore: add existing issue option added in cycle empty state
2023-11-01 17:11:07 +05:30
Aaryan Khandelwal
13ead7c314 fix: project wrapper (#2589)
* fix: project wrapper

* fix: project wrapper for unjoined project

* chore: update store structure
2023-11-01 17:10:10 +05:30
sriram veeraghanta
4fcc4b4a01 fix: build fixes (#2591) 2023-11-01 16:56:44 +05:30
Henit Chobisa
d511799f31 [FEATURE] Enabled User @mentions and @mention-filters in core editor package (#2544)
* feat: created custom mention component

* feat: added mention suggestions and suggestion highlights

* feat: created mention suggestion list for displaying mention suggestions

* feat: created custom mention text component, for handling click event

* feat: exposed mention component

* feat: integrated and exposed `mentions` component with `editor-core`

* feat: integrated mentions extension with the core editor package

* feat: exposed suggestion types from mentions

* feat: added `mention-suggestion` parameters in `r-t-e` and `l-t-e`

* feat: added `IssueMention` model in apiserver models

* chore: updated activities background job and added bs4 in requirements

* feat: added mention removal logic in issue_activity

* chore: exposed mention types from `r-t-e` and `l-t-e`

* feat: integrated mentions in side peek view description form

* feat: added mentions in issue modal form

* feat: created custom react-hook for editor suggestions

* feat: integrated mention suggestions block in RichTextEditor

* feat: added `mentions` integration in `lite-text-editor` instances

* fix: tailwind loading nodemodules from packages

* feat: added styles for the mention suggestion list

* fix: update module import to resolve build failure

* feat: added mentions as an issue filter

* feat: added UI Changes to Implement `mention` filters

* feat: added `mentions` as a filter option in the header

* feat: added mentions in the filter list options

* feat: added mentions in default display filter options

* feat: added filters in applied and issue params in store

* feat: modified types for adding mentions as a filter option

* feat: modified `notification-card` to display message when it exists in object

* feat: rewrote user mention management upon the changes made in develop

* chore: merged debounce PR with the current PR for tracing changes

* fix: mentions_filters updated with the new setup

* feat: updated requirements for bs4

* feat: modified `mentions-filter` to remove many to many dependency

* feat: implemented list manipulation instead of for loop

* feat: added readonly functionality in `read-only` editor core

* feat: added UI Changes for read-only mode

* feat: added mentions store in web Root Store

* chore: renamed `use-editor-suggestions` hook

* feat: UI Improvements for conditional highlights w.r.t readonly in mentionNode

* fix: removed mentions from `filter_set` parameters

* fix: minor merge fixes

* fix: package lock updates

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-01 16:36:37 +05:30
Aaryan Khandelwal
490e032ac6 style: new avatar and avatar group components (#2584)
* style: new avatar components

* chore: bug fixes

* chore: add pixel to size

* chore: add comments to helper functions

* fix: build errors
2023-11-01 15:24:11 +05:30
Bavisetti Narayan
1a24f9ec25 chore: removed start date and target date filter (#2582) 2023-11-01 14:34:03 +05:30
Bavisetti Narayan
2cb94b4105 chore: added permission for importer (#2577) 2023-11-01 14:32:33 +05:30
Bavisetti Narayan
ecde7edf09 fix: removed numbers from magic code (#2570) 2023-11-01 14:31:42 +05:30
Bavisetti Narayan
02f4916e49 chore: workspace members and project members endpoint (#2560)
* fix: removed members endpoint

* fix: changed project permisson class for project member

* fix: permission changed in workspace and project

* fix: added project filter in members
2023-11-01 14:30:45 +05:30
Anmol Singh Bhatia
1be82814fc style: issue peek overview ui improvement (#2574)
* style: issue peek overview ui improvement

* chore: implemented issue subscription in peek overview

* chore: issue properties dropdown refactor

* fix: build error

* chore: label select refactor

* chore: issue peekoverview revamp and refactor

* chore: issue peekoverview properties added and code refactor

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2023-11-01 14:22:29 +05:30
sriram veeraghanta
10e35d9a06 chore: delete unused files (#2585) 2023-11-01 13:45:04 +05:30
Dakshesh Jain
2d64caef90 refactor: project settings (#2575)
* refactor: project setting estimate

* refactor: project setting label

* refactor: project setting state

* refactor: project setting integration

* refactor: project settings member

* fix: estimate not updating

* fix: estimate not in observable

* fix: build error
2023-11-01 13:42:51 +05:30
deepsource-autofix[bot]
80e6d7e1ea refactor: remove true from boolean attribute (#2579)
When using a boolean attribute in JSX, you can set the attribute value to true or omit the value. This helps to keep consistency in code.

Co-authored-by: deepsource-autofix[bot] <62050782+deepsource-autofix[bot]@users.noreply.github.com>
2023-11-01 12:30:21 +05:30
sriram veeraghanta
b7d5a42d45 fix: added deepsource config file (#2578) 2023-10-31 19:52:13 +05:30
sriram veeraghanta
2b1e1557ca fix: upgrading to turbo new version (#2576) 2023-10-31 19:27:56 +05:30
Aaryan Khandelwal
705b33377c fix: members list endpoint authorization (#2571)
* fix: members list endpoint authorization

* chore: update user types
2023-10-31 13:11:13 +05:30
Nikhil
49fd4427c8 chore: user settings endpoint (#2557)
* chore: user settings endpoint

* dev: fix the user settings
2023-10-31 13:08:13 +05:30
Bavisetti Narayan
bdbb64f385 fix: changed assignees and labels in pages and modules (#2553) 2023-10-31 13:06:57 +05:30
Aaryan Khandelwal
98716859d5 chore: remove unused files (#2567) 2023-10-31 12:43:08 +05:30
M. Palanikannan
8072bbb559 fix: Debounce title and Editor initialization (#2530)
* fixed debounce logic and extracted the same

* fixed editor mounting with custom hook

* removed console logs and improved structure

* fixed comment editor behavior on Shift-Enter

* fixed editor initialization behaviour for new peek view

* fixed button type to avoid reload while editing comments

* fixed initialization of content in peek overview

* improved naming variables in updated title debounce logic

* added react-hook-form support to the issue detail in peek view with save states

* delete image plugin's ts support improved
2023-10-31 12:26:10 +05:30
Aaryan Khandelwal
442c83eea2 style: spreadsheet columns (#2554)
* style: spreadsheet columns

* fix: build errors
2023-10-31 12:18:04 +05:30
Aaryan Khandelwal
cb533849e8 chore: update members endpoint (#2569) 2023-10-31 12:16:40 +05:30
Aaryan Khandelwal
59c52023fb style: list layout (#2566) 2023-10-31 12:14:06 +05:30
Aaryan Khandelwal
08ca016f65 fix: custom theme form validations (#2565) 2023-10-31 12:12:24 +05:30
Aaryan Khandelwal
1c2ea6da5e fix: edit project button redirection (#2564)
* fix: redirect to project settings

* fix: 404 page button alignment
2023-10-31 12:06:55 +05:30
Aaryan Khandelwal
8b7b5c54b9 fix: global views bugs (#2563) 2023-10-31 12:06:11 +05:30
guru_sainath
52474715de chore: handled next_url redirection issue (#2562) 2023-10-31 12:04:36 +05:30
Aaryan Khandelwal
dcf81e28e4 dev: implemented MobX in workspace settings and create workspace form (#2561)
* dev: implement mobx store for workspace settings

* chore: workspace general settings mobx integration

* chore: workspace members settings mobx integration
2023-10-30 20:38:50 +05:30
Aaryan Khandelwal
050406b8a4 chore: add empty state for list and spreadsheet layouts (#2531)
* chore: add empty state for list and spreadsheet layouts

* fix: build errors
2023-10-30 20:09:04 +05:30
Anmol Singh Bhatia
8eaac60aa5 style: cycle ui revamp, and chore: code refactor (#2558)
* chore: cycle custom svg icon added and code refactor

* chore: module code refactor

* style: cycle ui revamp and code refactor

* chore: cycle card view layout fix

* chore: layout fix

* style: module and cycle title tooltip position
2023-10-30 19:22:27 +05:30
NarayanBavisetti
03026449ea fix: page transaction model 2023-10-11 15:13:05 +05:30
NarayanBavisetti
1039337c45 Merge branch 'develop' of github.com:makeplane/plane into fix/page_structuring 2023-10-09 11:16:33 +05:30
NarayanBavisetti
5216f184c1 fix: page transaction model 2023-10-06 12:00:18 +05:30
1006 changed files with 29996 additions and 29865 deletions

.deepsource.toml

@@ -0,0 +1,17 @@
version = 1
[[analyzers]]
name = "shell"
[[analyzers]]
name = "javascript"
[analyzers.meta]
plugins = ["react"]
environment = ["nodejs"]
[[analyzers]]
name = "python"
[analyzers.meta]
runtime_version = "3.x.x"


@@ -33,3 +33,8 @@ USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
# Set it to 0, to disable it
ENABLE_WEBHOOK=1
# Set it to 0, to disable it
ENABLE_API=1
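
The hunk above only adds the flags to the example env file; how the backend consumes them is not shown here. A minimal sketch of the usual pattern for reading such flags in Django settings (the variable names come from the env file above, everything else is an assumption):

```python
import os

# Hypothetical settings snippet: "1" (the default above) keeps the feature on,
# "0" switches it off.
ENABLE_WEBHOOK = os.environ.get("ENABLE_WEBHOOK", "1") == "1"
ENABLE_API = os.environ.get("ENABLE_API", "1") == "1"

if not ENABLE_API:
    print("External API access is disabled for this instance.")
```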

.github/workflows/build-branch.yml

@@ -0,0 +1,213 @@
name: Branch Build

on:
  pull_request:
    types:
      - closed
    branches:
      - master
      - release
      - qa
      - develop

env:
  TARGET_BRANCH: ${{ github.event.pull_request.base.ref }}

jobs:
  branch_build_and_push:
    if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) }}
    name: Build-Push Web/Space/API/Proxy Docker Image
    runs-on: ubuntu-20.04
    steps:
      - name: Check out the repo
        uses: actions/checkout@v3.3.0
      # - name: Set Target Branch Name on PR close
      #   if: ${{ github.event_name == 'pull_request' && github.event.action =='closed' }}
      #   run: echo "TARGET_BRANCH=${{ github.event.pull_request.base.ref }}" >> $GITHUB_ENV
      # - name: Set Target Branch Name on other than PR close
      #   if: ${{ github.event_name == 'push' }}
      #   run: echo "TARGET_BRANCH=${{ github.ref_name }}" >> $GITHUB_ENV
      - uses: ASzc/change-string-case-action@v2
        id: gh_branch_upper_lower
        with:
          string: ${{env.TARGET_BRANCH}}
      - uses: mad9000/actions-find-and-replace-string@2
        id: gh_branch_replace_slash
        with:
          source: ${{ steps.gh_branch_upper_lower.outputs.lowercase }}
          find: '/'
          replace: '-'
      - uses: mad9000/actions-find-and-replace-string@2
        id: gh_branch_replace_dot
        with:
          source: ${{ steps.gh_branch_replace_slash.outputs.value }}
          find: '.'
          replace: ''
      - uses: mad9000/actions-find-and-replace-string@2
        id: gh_branch_clean
        with:
          source: ${{ steps.gh_branch_replace_dot.outputs.value }}
          find: '_'
          replace: ''
      - name: Uploading Proxy Source
        uses: actions/upload-artifact@v3
        with:
          name: proxy-src-code
          path: ./nginx
      - name: Uploading Backend Source
        uses: actions/upload-artifact@v3
        with:
          name: backend-src-code
          path: ./apiserver
      - name: Uploading Web Source
        uses: actions/upload-artifact@v3
        with:
          name: web-src-code
          path: |
            ./
            !./apiserver
            !./nginx
            !./deploy
            !./space
      - name: Uploading Space Source
        uses: actions/upload-artifact@v3
        with:
          name: space-src-code
          path: |
            ./
            !./apiserver
            !./nginx
            !./deploy
            !./web
    outputs:
      gh_branch_name: ${{ steps.gh_branch_clean.outputs.value }}

  branch_build_push_frontend:
    runs-on: ubuntu-20.04
    needs: [ branch_build_and_push ]
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2.5.0
      - name: Login to Docker Hub
        uses: docker/login-action@v2.1.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Downloading Web Source Code
        uses: actions/download-artifact@v3
        with:
          name: web-src-code
      - name: Build and Push Frontend to Docker Container Registry
        uses: docker/build-push-action@v4.0.0
        with:
          context: .
          file: ./web/Dockerfile.web
          platforms: linux/amd64
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
          push: true
        env:
          DOCKER_BUILDKIT: 1
          DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

  branch_build_push_space:
    runs-on: ubuntu-20.04
    needs: [ branch_build_and_push ]
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2.5.0
      - name: Login to Docker Hub
        uses: docker/login-action@v2.1.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Downloading Space Source Code
        uses: actions/download-artifact@v3
        with:
          name: space-src-code
      - name: Build and Push Space to Docker Hub
        uses: docker/build-push-action@v4.0.0
        with:
          context: .
          file: ./space/Dockerfile.space
          platforms: linux/amd64
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
          push: true
        env:
          DOCKER_BUILDKIT: 1
          DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

  branch_build_push_backend:
    runs-on: ubuntu-20.04
    needs: [ branch_build_and_push ]
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2.5.0
      - name: Login to Docker Hub
        uses: docker/login-action@v2.1.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Downloading Backend Source Code
        uses: actions/download-artifact@v3
        with:
          name: backend-src-code
      - name: Build and Push Backend to Docker Hub
        uses: docker/build-push-action@v4.0.0
        with:
          context: .
          file: ./Dockerfile.api
          platforms: linux/amd64
          push: true
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
        env:
          DOCKER_BUILDKIT: 1
          DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}

  branch_build_push_proxy:
    runs-on: ubuntu-20.04
    needs: [ branch_build_and_push ]
    steps:
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2.5.0
      - name: Login to Docker Hub
        uses: docker/login-action@v2.1.0
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Downloading Proxy Source Code
        uses: actions/download-artifact@v3
        with:
          name: proxy-src-code
      - name: Build and Push Plane-Proxy to Docker Hub
        uses: docker/build-push-action@v4.0.0
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
          push: true
        env:
          DOCKER_BUILDKIT: 1
          DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
          DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
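
The three find-and-replace steps above only normalize the target branch into a Docker-tag-safe string that the later jobs reuse. A short Python equivalent, purely for illustration (the workflow itself relies on the marketplace actions shown):

```python
def docker_safe_tag(branch: str) -> str:
    # Mirror the workflow: lowercase, '/' -> '-', then drop '.' and '_'.
    return branch.lower().replace("/", "-").replace(".", "").replace("_", "")

print(docker_safe_tag("fix/page_structuring"))  # fix-pagestructuring
```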

.gitignore

@@ -75,7 +75,7 @@ pnpm-lock.yaml
pnpm-workspace.yaml
.npmrc
.secrets
tmp/
## packages
dist


@@ -60,7 +60,7 @@ representative at an online or offline event.
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
hello@plane.so.
squawk@plane.so.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the


@@ -1,8 +1,10 @@
# Environment Variables
Environment variables are distributed in various files. Please refer them carefully.
## {PROJECT_FOLDER}/.env
File is available in the project root folder
```
@@ -41,25 +43,37 @@ USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
```
## {PROJECT_FOLDER}/web/.env.example
```
# Enable/Disable OAUTH - default 0 for selfhosted instance
NEXT_PUBLIC_ENABLE_OAUTH=0
# Public boards deploy URL
NEXT_PUBLIC_DEPLOY_URL="http://localhost/spaces"
```
## {PROJECT_FOLDER}/spaces/.env.example
```
# Flag to toggle OAuth
NEXT_PUBLIC_ENABLE_OAUTH=0
```
## {PROJECT_FOLDER}/apiserver/.env
```
# Backend
# Debug value for api server use it as 0 for production use
@@ -126,7 +140,9 @@ ENABLE_SIGNUP="1"
# Email Redirection URL
WEB_URL="http://localhost"
```
## Updates
- The environment variable NEXT_PUBLIC_API_BASE_URL has been removed from both the web and space projects.
- The naming convention for containers and images has been updated.
- The plane-worker image will no longer be maintained, as it has been merged with plane-backend.


@@ -7,7 +7,7 @@
</p>
<h3 align="center"><b>Plane</b></h3>
<p align="center"><b>Open-source, self-hosted project planning tool</b></p>
<p align="center"><b>Flexible, extensible open-source project management</b></p>
<p align="center">
<a href="https://discord.com/invite/A92xrEGCge">


@@ -1,7 +1,7 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.production"
CORS_ALLOWED_ORIGINS="http://localhost"
# Error logs
SENTRY_DSN=""
@@ -70,3 +70,12 @@ ENABLE_MAGIC_LINK_LOGIN="0"
# Email redirections and minio domain settings
WEB_URL="http://localhost"
# Set it to 0, to disable it
ENABLE_WEBHOOK=1
# Set it to 0, to disable it
ENABLE_API=1
# Gunicorn Workers
GUNICORN_WORKERS=2
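
GUNICORN_WORKERS defaults to 2 in this example file. A common sizing heuristic from the Gunicorn documentation (not something this diff prescribes) is two workers per CPU core plus one:

```python
import multiprocessing

# Rule-of-thumb worker count; tune for your own workload and memory budget.
print(f"GUNICORN_WORKERS={multiprocessing.cpu_count() * 2 + 1}")
```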


@@ -0,0 +1,83 @@
import os, sys
import boto3
import json
from botocore.exceptions import ClientError

sys.path.append("/code")
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")

import django

django.setup()


def set_bucket_public_policy(s3_client, bucket_name):
    public_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject"],
            "Resource": [f"arn:aws:s3:::{bucket_name}/*"]
        }]
    }

    try:
        s3_client.put_bucket_policy(
            Bucket=bucket_name,
            Policy=json.dumps(public_policy)
        )
        print(f"Public read access policy set for bucket '{bucket_name}'.")
    except ClientError as e:
        print(f"Error setting public read access policy: {e}")


def create_bucket():
    try:
        from django.conf import settings

        # Create a session using the credentials from Django settings
        session = boto3.session.Session(
            aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
            aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
        )

        # Create an S3 client using the session
        s3_client = session.client('s3', endpoint_url=settings.AWS_S3_ENDPOINT_URL)
        bucket_name = settings.AWS_STORAGE_BUCKET_NAME

        print("Checking bucket...")

        # Check if the bucket exists
        s3_client.head_bucket(Bucket=bucket_name)

        # If head_bucket does not raise an exception, the bucket exists
        print(f"Bucket '{bucket_name}' already exists.")
        set_bucket_public_policy(s3_client, bucket_name)
    except ClientError as e:
        error_code = int(e.response['Error']['Code'])
        bucket_name = settings.AWS_STORAGE_BUCKET_NAME
        if error_code == 404:
            # Bucket does not exist, create it
            print(f"Bucket '{bucket_name}' does not exist. Creating bucket...")
            try:
                s3_client.create_bucket(Bucket=bucket_name)
                print(f"Bucket '{bucket_name}' created successfully.")
                set_bucket_public_policy(s3_client, bucket_name)
            except ClientError as create_error:
                print(f"Failed to create bucket: {create_error}")
        elif error_code == 403:
            # Access to the bucket is forbidden
            print(f"Access to the bucket '{bucket_name}' is forbidden. Check permissions.")
        else:
            # Another ClientError occurred
            print(f"Failed to check bucket: {e}")
    except Exception as ex:
        # Handle any other exception
        print(f"An error occurred: {ex}")


if __name__ == "__main__":
    create_bucket()
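
Once the script has run, the applied policy can be read back with the same boto3 client to confirm public read access. This is only a hedged sketch: the credentials, endpoint URL, and bucket name below are placeholders, whereas the real script takes them from Django settings.

```python
import json

import boto3

# Placeholder values for illustration only.
s3 = boto3.session.Session(
    aws_access_key_id="<access-key>",
    aws_secret_access_key="<secret-key>",
).client("s3", endpoint_url="<minio-or-s3-endpoint>")

policy = json.loads(s3.get_bucket_policy(Bucket="<bucket-name>")["Policy"])
print(policy["Statement"][0]["Action"])  # expected: ['s3:GetObject']
```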


@@ -5,5 +5,7 @@ python manage.py migrate
# Create a Default User
python bin/user_script.py
# Create the default bucket
python bin/bucket_script.py
exec gunicorn -w 8 -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -


@@ -1,4 +1,4 @@
import os, sys, random, string
import os, sys
import uuid
sys.path.append("/code")


@@ -1,2 +1,17 @@
from .workspace import WorkSpaceBasePermission, WorkSpaceAdminPermission, WorkspaceEntityPermission, WorkspaceViewerPermission
from .project import ProjectBasePermission, ProjectEntityPermission, ProjectMemberPermission, ProjectLitePermission
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
WorkSpaceAdminPermission,
WorkspaceEntityPermission,
WorkspaceViewerPermission,
WorkspaceUserPermission,
)
from .project import (
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
ProjectLitePermission,
)


@@ -13,14 +13,15 @@ Guest = 5
class ProjectBasePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
@@ -29,6 +30,7 @@ class ProjectBasePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@@ -37,19 +39,21 @@ class ProjectBasePermission(BasePermission):
member=request.user,
role=Admin,
project_id=view.project_id,
is_active=True,
).exists()
class ProjectMemberPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug, member=request.user
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
## Only workspace owners or admins can create the projects
if request.method == "POST":
@@ -57,6 +61,7 @@ class ProjectMemberPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
role__in=[Admin, Member],
is_active=True,
).exists()
## Only Project Admins can update project attributes
@@ -65,12 +70,12 @@ class ProjectMemberPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectEntityPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@@ -80,6 +85,7 @@ class ProjectEntityPermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()
## Only project members or admins can create and edit the project attributes
@@ -88,11 +94,11 @@ class ProjectEntityPermission(BasePermission):
member=request.user,
role__in=[Admin, Member],
project_id=view.project_id,
is_active=True,
).exists()
class ProjectLitePermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
@@ -101,4 +107,5 @@ class ProjectLitePermission(BasePermission):
workspace__slug=view.workspace_slug,
member=request.user,
project_id=view.project_id,
is_active=True,
).exists()

View File

@@ -32,12 +32,28 @@ class WorkSpaceBasePermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
# allow only owner to delete the workspace
if request.method == "DELETE":
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role=Owner
member=request.user,
workspace__slug=view.workspace_slug,
role=Owner,
is_active=True,
).exists()
class WorkspaceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
role=Owner,
).exists()
@@ -50,6 +66,7 @@ class WorkSpaceAdminPermission(BasePermission):
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@@ -63,12 +80,14 @@ class WorkspaceEntityPermission(BasePermission):
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
).exists()
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
is_active=True,
).exists()
@@ -78,5 +97,20 @@ class WorkspaceViewerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
member=request.user, workspace__slug=view.workspace_slug, role__gte=10
member=request.user,
workspace__slug=view.workspace_slug,
role__gte=10,
is_active=True,
).exists()
class WorkspaceUserPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
is_active=True,
).exists()
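Both the project-level and workspace-level permission classes now add is_active=True to every membership lookup, and the new WorkspaceOwnerPermission / WorkspaceUserPermission classes cover owner-only and any-member checks, so memberships that have been marked inactive stop granting access. A minimal sketch, assuming the usual DRF wiring, of how such a class is attached to a view; ActiveMembershipPermission here is an illustrative stand-in, not one of the real classes.

# Sketch: attaching a membership-based permission to a DRF view.
from rest_framework.permissions import BasePermission
from rest_framework.response import Response
from rest_framework.views import APIView

class ActiveMembershipPermission(BasePermission):
    # Illustrative stand-in: the real classes run
    # WorkspaceMember/ProjectMember.objects.filter(..., is_active=True).exists()
    def has_permission(self, request, view):
        if request.user.is_anonymous:
            return False
        return getattr(request.user, "is_active", False)

class WorkspaceSettingsView(APIView):
    permission_classes = [ActiveMembershipPermission]

    def get(self, request, slug):
        return Response({"slug": slug})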

View File

@@ -71,7 +71,7 @@ from .module import (
ModuleFavoriteSerializer,
)
from .api_token import APITokenSerializer
from .api import APITokenSerializer, APITokenReadSerializer
from .integration import (
IntegrationSerializer,
@@ -85,7 +85,7 @@ from .integration import (
from .importer import ImporterSerializer
from .page import PageSerializer, PageBlockSerializer, PageFavoriteSerializer
from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
from .estimate import (
EstimateSerializer,
@@ -100,3 +100,5 @@ from .analytic import AnalyticViewSerializer
from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer
from .webhook import WebhookSerializer, WebhookLogSerializer

View File

@@ -17,7 +17,7 @@ class AnalyticViewSerializer(BaseSerializer):
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:
validated_data["query"] = dict()
validated_data["query"] = {}
return AnalyticView.objects.create(**validated_data)
def update(self, instance, validated_data):
@@ -25,6 +25,6 @@ class AnalyticViewSerializer(BaseSerializer):
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:
validated_data["query"] = dict()
validated_data["query"] = {}
validated_data["query"] = issue_filters(query_params, "PATCH")
return super().update(instance, validated_data)

View File

@@ -0,0 +1,31 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
read_only_fields = [
"token",
"expired_at",
"created_at",
"updated_at",
"workspace",
"user",
]
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"
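APITokenSerializer keeps the raw token in its output but marks it read-only, while APITokenReadSerializer excludes the token column entirely, so the secret is only ever returned from the create call and never on later list/retrieve calls. A hedged sketch of that "write once, never read back" split using plain DRF serializers; the field set below is a placeholder, not the real APIToken model.

# Sketch: secret handling with a create serializer and a read serializer.
# ExampleToken fields are placeholders; the real code uses plane.db.models.APIToken.
from rest_framework import serializers

class ExampleTokenCreateSerializer(serializers.Serializer):
    label = serializers.CharField()
    token = serializers.CharField(read_only=True)   # visible only in the create response

class ExampleTokenReadSerializer(serializers.Serializer):
    label = serializers.CharField()
    # no token field at all, mirroring APITokenReadSerializer's exclude = ("token",)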

View File

@@ -1,14 +0,0 @@
from .base import BaseSerializer
from plane.db.models import APIToken
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = [
"label",
"user",
"user_type",
"workspace",
"created_at",
]

View File

@@ -1,6 +1,3 @@
# Django imports
from django.db.models.functions import TruncDate
# Third party imports
from rest_framework import serializers

View File

@@ -6,7 +6,6 @@ from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .project import ProjectLiteSerializer
from .user import UserLiteSerializer
from plane.db.models import Inbox, InboxIssue, Issue

View File

@@ -5,11 +5,10 @@ from django.utils import timezone
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
from .state import StateSerializer, StateLiteSerializer
from .user import UserLiteSerializer
from .project import ProjectSerializer, ProjectLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from plane.db.models import (
User,
@@ -232,25 +231,6 @@ class IssueActivitySerializer(BaseSerializer):
fields = "__all__"
class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = IssueComment
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssuePropertySerializer(BaseSerializer):
class Meta:
@@ -287,7 +267,6 @@ class LabelLiteSerializer(BaseSerializer):
class IssueLabelSerializer(BaseSerializer):
# label_details = LabelSerializer(read_only=True, source="label")
class Meta:
model = IssueLabel
@@ -569,7 +548,7 @@ class IssueSerializer(BaseSerializer):
]
class IssueLiteSerializer(BaseSerializer):
class IssueLiteSerializer(DynamicBaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
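IssueLiteSerializer now inherits from DynamicBaseSerializer instead of BaseSerializer. The usual point of such a base class is to let callers trim the serialized fields at instantiation time; the sketch below shows that common DRF pattern and is an assumption about DynamicBaseSerializer's behaviour, not a copy of its implementation.

# Sketch of the dynamic-fields pattern commonly behind a DynamicBaseSerializer.
# This is an assumption about its behaviour, not the actual implementation.
from rest_framework import serializers

class DynamicFieldsSerializer(serializers.Serializer):
    def __init__(self, *args, **kwargs):
        fields = kwargs.pop("fields", None)
        super().__init__(*args, **kwargs)
        if fields is not None:
            for name in set(self.fields) - set(fields):
                self.fields.pop(name)

class IssueLiteExample(DynamicFieldsSerializer):
    id = serializers.UUIDField()
    name = serializers.CharField()
    priority = serializers.CharField(allow_null=True)

# Usage: IssueLiteExample(instance, fields=("id", "name")) drops every other field.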

View File

@@ -4,9 +4,8 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .user import UserLiteSerializer
from .project import ProjectSerializer, ProjectLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .issue import IssueStateSerializer
from plane.db.models import (
User,
@@ -19,7 +18,7 @@ from plane.db.models import (
class ModuleWriteSerializer(BaseSerializer):
members_list = serializers.ListField(
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
@@ -40,13 +39,18 @@ class ModuleWriteSerializer(BaseSerializer):
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data['members'] = [str(member.id) for member in instance.members.all()]
return data
def validate(self, data):
if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None):
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
members = validated_data.pop("members_list", None)
members = validated_data.pop("members", None)
project = self.context["project"]
@@ -72,7 +76,7 @@ class ModuleWriteSerializer(BaseSerializer):
return module
def update(self, instance, validated_data):
members = validated_data.pop("members_list", None)
members = validated_data.pop("members", None)
if members is not None:
ModuleMember.objects.filter(module=instance).delete()
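ModuleWriteSerializer renames the write-only field from members_list to members, echoes the member ids back out through to_representation, and rejects payloads whose start_date falls after target_date. An illustrative request body for the renamed field; the UUIDs and dates are placeholders.

# Illustrative create/update payload for a module after the rename.
payload = {
    "name": "Release 0.14",
    "start_date": "2023-11-01",
    "target_date": "2023-11-30",   # must not be earlier than start_date
    "members": [
        "6a0d2f3e-0000-0000-0000-000000000001",
        "6a0d2f3e-0000-0000-0000-000000000002",
    ],
}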

View File

@@ -6,39 +6,17 @@ from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Page, PageBlock, PageFavorite, PageLabel, Label
class PageBlockSerializer(BaseSerializer):
issue_detail = IssueFlatSerializer(source="issue", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = PageBlock
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
class PageBlockLiteSerializer(BaseSerializer):
class Meta:
model = PageBlock
fields = "__all__"
from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
class PageSerializer(BaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
labels_list = serializers.ListField(
labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
blocks = PageBlockLiteSerializer(read_only=True, many=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
@@ -50,9 +28,13 @@ class PageSerializer(BaseSerializer):
"project",
"owned_by",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data['labels'] = [str(label.id) for label in instance.labels.all()]
return data
def create(self, validated_data):
labels = validated_data.pop("labels_list", None)
labels = validated_data.pop("labels", None)
project_id = self.context["project_id"]
owned_by_id = self.context["owned_by_id"]
page = Page.objects.create(
@@ -77,7 +59,7 @@ class PageSerializer(BaseSerializer):
return page
def update(self, instance, validated_data):
labels = validated_data.pop("labels_list", None)
labels = validated_data.pop("labels", None)
if labels is not None:
PageLabel.objects.filter(page=instance).delete()
PageLabel.objects.bulk_create(
@@ -98,6 +80,41 @@ class PageSerializer(BaseSerializer):
return super().update(instance, validated_data)
class SubPageSerializer(BaseSerializer):
entity_details = serializers.SerializerMethodField()
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
def get_entity_details(self, obj):
entity_name = obj.entity_name
if entity_name == 'forward_link' or entity_name == 'back_link':
try:
page = Page.objects.get(pk=obj.entity_identifier)
return PageSerializer(page).data
except Page.DoesNotExist:
return None
return None
class PageLogSerializer(BaseSerializer):
class Meta:
model = PageLog
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"page",
]
class PageFavoriteSerializer(BaseSerializer):
page_detail = PageSerializer(source="page", read_only=True)
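The PageBlock serializers are removed; pages now take labels as a write-only list (renamed from labels_list) and the new PageLogSerializer / SubPageSerializer cover the page transaction log, with SubPageSerializer resolving forward_link and back_link entries back to page data. A small hedged sketch of that entity-resolution idea; the lookup and data shapes below are placeholders.

# Sketch: resolving a log entry's entity, as SubPageSerializer does for
# forward_link/back_link entries. The lookup and payload here are placeholders.
def resolve_entity(entity_name, entity_identifier, page_lookup):
    if entity_name in ("forward_link", "back_link"):
        page = page_lookup(entity_identifier)   # e.g. Page.objects.get(pk=...)
        return None if page is None else {"id": str(entity_identifier), "name": page["name"]}
    return None

pages = {"p1": {"name": "Roadmap"}}
print(resolve_entity("forward_link", "p1", pages.get))   # {'id': 'p1', 'name': 'Roadmap'}
print(resolve_entity("issue", "p1", pages.get))          # None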

View File

@@ -103,13 +103,16 @@ class ProjectListSerializer(DynamicBaseSerializer):
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(project_id=obj.id).values(
project_members = ProjectMember.objects.filter(
project_id=obj.id,
is_active=True,
).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return project_members
return list(project_members)
class Meta:
model = Project

View File

@@ -7,8 +7,6 @@ from plane.db.models import State
class StateSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
model = State

View File

@@ -79,14 +79,14 @@ class UserMeSettingsSerializer(BaseSerializer):
email=obj.email
).count()
if obj.last_workspace_id is not None:
workspace = Workspace.objects.get(
workspace = Workspace.objects.filter(
pk=obj.last_workspace_id, workspace_member__member=obj.id
)
).first()
return {
"last_workspace_id": obj.last_workspace_id,
"last_workspace_slug": workspace.slug,
"last_workspace_slug": workspace.slug if workspace is not None else "",
"fallback_workspace_id": obj.last_workspace_id,
"fallback_workspace_slug": workspace.slug,
"fallback_workspace_slug": workspace.slug if workspace is not None else "",
"invites": workspace_invites,
}
else:

View File

@@ -57,7 +57,7 @@ class IssueViewSerializer(BaseSerializer):
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:
validated_data["query"] = dict()
validated_data["query"] = {}
return IssueView.objects.create(**validated_data)
def update(self, instance, validated_data):
@@ -65,7 +65,7 @@ class IssueViewSerializer(BaseSerializer):
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:
validated_data["query"] = dict()
validated_data["query"] = {}
validated_data["query"] = issue_filters(query_params, "PATCH")
return super().update(instance, validated_data)

View File

@@ -0,0 +1,30 @@
# Third party imports
from rest_framework import serializers
# Module imports
from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
class Meta:
model = Webhook
fields = "__all__"
read_only_fields = [
"workspace",
"secret_key",
]
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]
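WebhookSerializer validates the target URL with validate_schema and validate_domain imported from plane.db.models.webhook and keeps workspace and secret_key read-only. Those validators' implementations are not part of this diff; the sketch below is only an assumption about what such checks typically enforce (an http/https scheme and no obviously internal host).

# Hypothetical validators in the spirit of validate_schema / validate_domain.
# The real implementations live in plane.db.models.webhook and may differ.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"localhost", "127.0.0.1", "0.0.0.0"}

def validate_schema(value):
    if urlparse(value).scheme not in ("http", "https"):
        raise ValueError("Webhook URL must use http or https")

def validate_domain(value):
    host = urlparse(value).hostname or ""
    if host in BLOCKED_HOSTS:
        raise ValueError("Webhook URL must not point at an internal host")

validate_schema("https://example.com/hooks/plane")
validate_domain("https://example.com/hooks/plane")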

View File

@@ -110,7 +110,6 @@ class TeamSerializer(BaseSerializer):
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return team
else:
team = Team.objects.create(**validated_data)
return team
@@ -124,7 +123,6 @@ class TeamSerializer(BaseSerializer):
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
return super().update(instance, validated_data)
else:
return super().update(instance, validated_data)

View File

@@ -1,10 +1,10 @@
from .analytic import urlpatterns as analytic_urls
from .asset import urlpatterns as asset_urls
from .authentication import urlpatterns as authentication_urls
from .configuration import urlpatterns as configuration_urls
from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
from .estimate import urlpatterns as estimate_urls
from .gpt import urlpatterns as gpt_urls
from .external import urlpatterns as external_urls
from .importer import urlpatterns as importer_urls
from .inbox import urlpatterns as inbox_urls
from .integration import urlpatterns as integration_urls
@@ -14,13 +14,17 @@ from .notification import urlpatterns as notification_urls
from .page import urlpatterns as page_urls
from .project import urlpatterns as project_urls
from .public_board import urlpatterns as public_board_urls
from .release_note import urlpatterns as release_note_urls
from .search import urlpatterns as search_urls
from .state import urlpatterns as state_urls
from .unsplash import urlpatterns as unsplash_urls
from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .workspace import urlpatterns as workspace_urls
from .api import urlpatterns as api_urls
from .webhook import urlpatterns as webhook_urls
# Django imports
from django.conf import settings
urlpatterns = [
@@ -30,7 +34,7 @@ urlpatterns = [
*configuration_urls,
*cycle_urls,
*estimate_urls,
*gpt_urls,
*external_urls,
*importer_urls,
*inbox_urls,
*integration_urls,
@@ -40,11 +44,15 @@ urlpatterns = [
*page_urls,
*project_urls,
*public_board_urls,
*release_note_urls,
*search_urls,
*state_urls,
*unsplash_urls,
*user_urls,
*view_urls,
*workspace_urls,
]
if settings.ENABLE_WEBHOOK:
urlpatterns += webhook_urls
if settings.ENABLE_API:
urlpatterns += api_urls
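The release-note and gpt URL modules are folded into config and external, and the webhook and API routes are only mounted when settings.ENABLE_WEBHOOK / settings.ENABLE_API are truthy. How those flags are populated is not shown in this diff; a minimal sketch of one likely shape in settings, where the env var names are an assumption.

# Sketch: how such flags are typically derived in settings; env var names are assumptions.
import os

ENABLE_WEBHOOK = os.environ.get("ENABLE_WEBHOOK", "1") == "1"
ENABLE_API = os.environ.get("ENABLE_API", "1") == "1"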

View File

@@ -0,0 +1,17 @@
from django.urls import path
from plane.api.views import ApiTokenEndpoint
urlpatterns = [
# API Tokens
path(
"workspaces/<str:slug>/api-tokens/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
path(
"workspaces/<str:slug>/api-tokens/<uuid:pk>/",
ApiTokenEndpoint.as_view(),
name="api-tokens",
),
## End API Tokens
]

View File

@@ -0,0 +1,25 @@
from django.urls import path
from plane.api.views import UnsplashEndpoint
from plane.api.views import ReleaseNotesEndpoint
from plane.api.views import GPTIntegrationEndpoint
urlpatterns = [
path(
"unsplash/",
UnsplashEndpoint.as_view(),
name="unsplash",
),
path(
"release-notes/",
ReleaseNotesEndpoint.as_view(),
name="release-notes",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/ai-assistant/",
GPTIntegrationEndpoint.as_view(),
name="importer",
),
]

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import GPTIntegrationEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/ai-assistant/",
GPTIntegrationEndpoint.as_view(),
name="importer",
),
]

View File

@@ -3,6 +3,8 @@ from django.urls import path
from plane.api.views import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
@@ -35,6 +37,16 @@ urlpatterns = [
),
name="project-issue",
),
path(
"v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListEndpoint.as_view(),
name="project-issue",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListGroupedEndpoint.as_view(),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(

View File

@@ -3,9 +3,9 @@ from django.urls import path
from plane.api.views import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
PageLogEndpoint,
SubPagesEndpoint,
)
@@ -31,27 +31,6 @@ urlpatterns = [
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
PageFavoriteViewSet.as_view(
@@ -72,8 +51,83 @@ urlpatterns = [
name="user-favorite-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(),
name="page-block-issues",
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/",
PageViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/",
PageViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
),
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(),
name="sub-page",
),
]
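The page-block routes are gone, replaced by a page lifecycle API: archive/unarchive, an archived-pages listing, lock/unlock, transactions served by PageLogEndpoint, and sub-pages. A hedged client-side example of driving the archive route; the host, API prefix, slug, ids and auth header are all placeholders.

# Illustrative call to the new archive route; all values are placeholders.
import requests

BASE = "https://plane.example.com/api"   # placeholder host and prefix
HEADERS = {"Authorization": "Bearer <access-token>"}

resp = requests.post(
    f"{BASE}/workspaces/<slug>/projects/<project_id>/pages/<page_id>/archive/",
    headers=HEADERS,
)
print(resp.status_code)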

View File

@@ -2,19 +2,16 @@ from django.urls import path
from plane.api.views import (
ProjectViewSet,
InviteProjectEndpoint,
ProjectInvitationsViewset,
ProjectMemberViewSet,
ProjectMemberEndpoint,
ProjectMemberInvitationsViewset,
ProjectMemberUserEndpoint,
AddMemberToProjectEndpoint,
ProjectJoinEndpoint,
AddTeamToProjectEndpoint,
ProjectUserViewsEndpoint,
ProjectIdentifierEndpoint,
ProjectFavoritesViewSet,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint
ProjectPublicCoverImagesEndpoint,
UserProjectInvitationsViewset,
)
@@ -47,13 +44,48 @@ urlpatterns = [
name="project-identifiers",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invite/",
InviteProjectEndpoint.as_view(),
name="invite-project",
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/",
ProjectMemberViewSet.as_view({"get": "list"}),
ProjectMemberViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-member",
),
path(
@@ -68,40 +100,19 @@ urlpatterns = [
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-members/",
ProjectMemberEndpoint.as_view(),
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
ProjectMemberViewSet.as_view(
{
"post": "leave",
}
),
name="project-member",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/add/",
AddMemberToProjectEndpoint.as_view(),
name="project",
),
path(
"workspaces/<str:slug>/projects/join/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/team-invite/",
AddTeamToProjectEndpoint.as_view(),
name="projects",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/",
ProjectMemberInvitationsViewset.as_view({"get": "list"}),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/invitations/<uuid:pk>/",
ProjectMemberInvitationsViewset.as_view(
{
"get": "retrieve",
"delete": "destroy",
}
),
name="project-member-invite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/project-views/",
ProjectUserViewsEndpoint.as_view(),
@@ -131,11 +142,6 @@ urlpatterns = [
),
name="project-favorite",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/members/leave/",
LeaveProjectEndpoint.as_view(),
name="leave-project",
),
path(
"project-covers/",
ProjectPublicCoverImagesEndpoint.as_view(),

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import ReleaseNotesEndpoint
urlpatterns = [
path(
"release-notes/",
ReleaseNotesEndpoint.as_view(),
name="release-notes",
),
]

View File

@@ -20,11 +20,19 @@ urlpatterns = [
StateViewSet.as_view(
{
"get": "retrieve",
"put": "update",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-state",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:pk>/mark-default/",
StateViewSet.as_view(
{
"post": "mark_as_default",
}
),
name="project-state",
),
]

View File

@@ -1,13 +0,0 @@
from django.urls import path
from plane.api.views import UnsplashEndpoint
urlpatterns = [
path(
"unsplash/",
UnsplashEndpoint.as_view(),
name="unsplash",
),
]

View File

@@ -9,15 +9,10 @@ from plane.api.views import (
ChangePasswordEndpoint,
## End User
## Workspaces
UserWorkspaceInvitationsEndpoint,
UserWorkSpacesEndpoint,
JoinWorkspaceEndpoint,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserActivityGraphEndpoint,
UserIssueCompletedGraphEndpoint,
UserWorkspaceDashboardEndpoint,
UserProjectInvitationsViewset,
## End Workspaces
)
@@ -26,7 +21,11 @@ urlpatterns = [
path(
"users/me/",
UserEndpoint.as_view(
{"get": "retrieve", "patch": "partial_update", "delete": "destroy"}
{
"get": "retrieve",
"patch": "partial_update",
"delete": "deactivate",
}
),
name="users",
),
@@ -65,23 +64,6 @@ urlpatterns = [
UserWorkSpacesEndpoint.as_view(),
name="user-workspace",
),
# user workspace invitations
path(
"users/me/invitations/workspaces/",
UserWorkspaceInvitationsEndpoint.as_view({"get": "list", "post": "create"}),
name="user-workspace-invitations",
),
# user workspace invitation
path(
"users/me/invitations/<uuid:pk>/",
UserWorkspaceInvitationEndpoint.as_view(
{
"get": "retrieve",
}
),
name="user-workspace-invitation",
),
# user join workspace
# User Graphs
path(
"users/me/workspaces/<str:slug>/activity-graph/",
@@ -99,15 +81,4 @@ urlpatterns = [
name="user-workspace-dashboard",
),
## End User Graph
path(
"users/me/invitations/workspaces/<str:slug>/<uuid:pk>/join/",
JoinWorkspaceEndpoint.as_view(),
name="user-join-workspace",
),
# user project invitations
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view({"get": "list", "post": "create"}),
name="user-project-invitations",
),
]

View File

@@ -0,0 +1,31 @@
from django.urls import path
from plane.api.views import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)
urlpatterns = [
path(
"workspaces/<str:slug>/webhooks/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/",
WebhookEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhooks/<uuid:pk>/regenerate/",
WebhookSecretRegenerateEndpoint.as_view(),
name="webhooks",
),
path(
"workspaces/<str:slug>/webhook-logs/<uuid:webhook_id>/",
WebhookLogsEndpoint.as_view(),
name="webhooks",
),
]
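The new webhook routes cover listing/creating and updating workspace webhooks, regenerating a webhook's secret, and reading per-webhook delivery logs. A hedged example of the regenerate flow, assuming the endpoint accepts POST; the host, slug, id and auth header are placeholders.

# Illustrative secret rotation call; values are placeholders and POST is assumed.
import requests

resp = requests.post(
    "https://plane.example.com/api/workspaces/<slug>/webhooks/<webhook_id>/regenerate/",
    headers={"Authorization": "Bearer <access-token>"},
)
print(resp.status_code)   # a fresh secret_key is expected in the response body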

View File

@@ -2,10 +2,10 @@ from django.urls import path
from plane.api.views import (
UserWorkspaceInvitationsViewSet,
WorkSpaceViewSet,
InviteWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
WorkspaceMembersEndpoint,
WorkspaceInvitationsViewset,
WorkspaceMemberUserEndpoint,
WorkspaceMemberUserViewsEndpoint,
@@ -18,7 +18,6 @@ from plane.api.views import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
LeaveWorkspaceEndpoint,
)
@@ -50,14 +49,14 @@ urlpatterns = [
),
name="workspace",
),
path(
"workspaces/<str:slug>/invite/",
InviteWorkspaceEndpoint.as_view(),
name="invite-workspace",
),
path(
"workspaces/<str:slug>/invitations/",
WorkspaceInvitationsViewset.as_view({"get": "list"}),
WorkspaceInvitationsViewset.as_view(
{
"get": "list",
"post": "create",
},
),
name="workspace-invitations",
),
path(
@@ -70,6 +69,23 @@ urlpatterns = [
),
name="workspace-invitations",
),
# user workspace invitations
path(
"users/me/workspaces/invitations/",
UserWorkspaceInvitationsViewSet.as_view(
{
"get": "list",
"post": "create",
},
),
name="user-workspace-invitations",
),
path(
"workspaces/<str:slug>/invitations/<uuid:pk>/join/",
WorkspaceJoinEndpoint.as_view(),
name="workspace-join",
),
# user join workspace
path(
"workspaces/<str:slug>/members/",
WorkSpaceMemberViewSet.as_view({"get": "list"}),
@@ -87,9 +103,13 @@ urlpatterns = [
name="workspace-member",
),
path(
"workspaces/<str:slug>/workspace-members/",
WorkspaceMembersEndpoint.as_view(),
name="workspace-members",
"workspaces/<str:slug>/members/leave/",
WorkSpaceMemberViewSet.as_view(
{
"post": "leave",
},
),
name="leave-workspace-members",
),
path(
"workspaces/<str:slug>/teams/",
@@ -174,9 +194,4 @@ urlpatterns = [
WorkspaceLabelsEndpoint.as_view(),
name="workspace-labels",
),
path(
"workspaces/<str:slug>/members/leave/",
LeaveWorkspaceEndpoint.as_view(),
name="leave-workspace-members",
),
]

View File

@@ -28,7 +28,6 @@ from plane.api.views import (
## End User
# Workspaces
WorkSpaceViewSet,
UserWorkspaceInvitationsEndpoint,
UserWorkSpacesEndpoint,
InviteWorkspaceEndpoint,
JoinWorkspaceEndpoint,
@@ -125,9 +124,10 @@ from plane.api.views import (
## End Modules
# Pages
PageViewSet,
PageBlockViewSet,
PageLogEndpoint,
SubPagesEndpoint,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
CreateIssueFromBlockEndpoint,
## End Pages
# Api Tokens
ApiTokenEndpoint,
@@ -1223,25 +1223,81 @@ urlpatterns = [
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/",
PageBlockViewSet.as_view(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
PageViewSet.as_view(
{
"post": "archive",
}
),
name="project-page-archive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unarchive/",
PageViewSet.as_view(
{
"post": "unarchive",
}
),
name="project-page-unarchive"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
PageViewSet.as_view(
{
"get": "archive_list",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
PageViewSet.as_view(
{
"post": "lock",
}
),
name="project-pages",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/unlock/",
PageViewSet.as_view(
{
"post": "unlock",
}
)
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(), name="page-transactions"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(), name="sub-page"
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/",
BulkEstimatePointEndpoint.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-page-blocks",
name="bulk-create-estimate-points",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:pk>/",
PageBlockViewSet.as_view(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/<uuid:estimate_id>/",
BulkEstimatePointEndpoint.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-page-blocks",
name="bulk-create-estimate-points",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-pages/",
@@ -1264,7 +1320,7 @@ urlpatterns = [
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/page-blocks/<uuid:page_block_id>/issues/",
CreateIssueFromPageBlockEndpoint.as_view(),
CreateIssueFromBlockEndpoint.as_view(),
name="page-block-issues",
),
## End Pages

View File

@@ -2,21 +2,16 @@ from .project import (
ProjectViewSet,
ProjectMemberViewSet,
UserProjectInvitationsViewset,
InviteProjectEndpoint,
ProjectInvitationsViewset,
AddTeamToProjectEndpoint,
ProjectMemberInvitationsViewset,
ProjectMemberInviteDetailViewSet,
ProjectIdentifierEndpoint,
AddMemberToProjectEndpoint,
ProjectJoinEndpoint,
ProjectUserViewsEndpoint,
ProjectMemberUserEndpoint,
ProjectFavoritesViewSet,
ProjectDeployBoardViewSet,
ProjectDeployBoardPublicSettingsEndpoint,
ProjectMemberEndpoint,
WorkspaceProjectDeployBoardEndpoint,
LeaveProjectEndpoint,
ProjectPublicCoverImagesEndpoint,
)
from .user import (
@@ -28,19 +23,17 @@ from .user import (
from .oauth import OauthEndpoint
from .base import BaseAPIView, BaseViewSet
from .base import BaseAPIView, BaseViewSet, WebhookMixin
from .workspace import (
WorkSpaceViewSet,
UserWorkSpacesEndpoint,
WorkSpaceAvailabilityCheckEndpoint,
InviteWorkspaceEndpoint,
JoinWorkspaceEndpoint,
WorkspaceJoinEndpoint,
WorkSpaceMemberViewSet,
TeamMemberViewSet,
WorkspaceInvitationsViewset,
UserWorkspaceInvitationsEndpoint,
UserWorkspaceInvitationEndpoint,
UserWorkspaceInvitationsViewSet,
UserLastProjectWithWorkspaceEndpoint,
WorkspaceMemberUserEndpoint,
WorkspaceMemberUserViewsEndpoint,
@@ -53,11 +46,14 @@ from .workspace import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
WorkspaceMembersEndpoint,
LeaveWorkspaceEndpoint,
)
from .state import StateViewSet
from .view import GlobalViewViewSet, GlobalViewIssuesViewSet, IssueViewViewSet, IssueViewFavoriteViewSet
from .view import (
GlobalViewViewSet,
GlobalViewIssuesViewSet,
IssueViewViewSet,
IssueViewFavoriteViewSet,
)
from .cycle import (
CycleViewSet,
CycleIssueViewSet,
@@ -68,6 +64,8 @@ from .cycle import (
from .asset import FileAssetEndpoint, UserAssetsEndpoint
from .issue import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
WorkSpaceIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
@@ -117,7 +115,7 @@ from .module import (
ModuleFavoriteViewSet,
)
from .api_token import ApiTokenEndpoint
from .api import ApiTokenEndpoint
from .integration import (
WorkspaceIntegrationViewSet,
@@ -140,9 +138,10 @@ from .importer import (
from .page import (
PageViewSet,
PageBlockViewSet,
PageFavoriteViewSet,
CreateIssueFromPageBlockEndpoint,
PageLogEndpoint,
SubPagesEndpoint,
CreateIssueFromBlockEndpoint,
)
from .search import GlobalSearchEndpoint, IssueSearchEndpoint
@@ -165,8 +164,14 @@ from .analytic import (
DefaultAnalyticsEndpoint,
)
from .notification import NotificationViewSet, UnreadNotificationEndpoint, MarkAllReadNotificationViewSet
from .notification import (
NotificationViewSet,
UnreadNotificationEndpoint,
MarkAllReadNotificationViewSet,
)
from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint

View File

@@ -0,0 +1,78 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.api.serializers import APITokenSerializer, APITokenReadSerializer
from plane.api.permissions import WorkspaceOwnerPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
label = request.data.get("label", str(uuid4().hex))
description = request.data.get("description", "")
workspace = Workspace.objects.get(slug=slug)
expired_at = request.data.get("expired_at", None)
# Check the user type
user_type = 1 if request.user.is_bot else 0
api_token = APIToken.objects.create(
label=label,
description=description,
user=request.user,
workspace=workspace,
user_type=user_type,
expired_at=expired_at,
)
serializer = APITokenSerializer(api_token)
# Token will be only visible while creating
return Response(
serializer.data,
status=status.HTTP_201_CREATED,
)
def get(self, request, slug, pk=None):
if pk == None:
api_tokens = APIToken.objects.filter(
user=request.user, workspace__slug=slug
)
serializer = APITokenReadSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
api_tokens = APIToken.objects.get(
user=request.user, workspace__slug=slug, pk=pk
)
serializer = APITokenReadSerializer(api_tokens)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, pk):
api_token = APIToken.objects.get(
workspace__slug=slug,
user=request.user,
pk=pk,
)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
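The API-token endpoint is now workspace-scoped and guarded by WorkspaceOwnerPermission: POST creates a token (the only response that includes the secret, since GET goes through APITokenReadSerializer), GET lists or retrieves, PATCH edits mutable fields such as the label, and DELETE revokes. A hedged client-side sketch; the host, API prefix, workspace slug and auth header are placeholders.

# Illustrative use of the workspace API-token endpoint; values are placeholders.
import requests

BASE = "https://plane.example.com/api/workspaces/<slug>/api-tokens/"
HEADERS = {"Authorization": "Bearer <access-token>"}

created = requests.post(BASE, headers=HEADERS, json={"label": "ci-bot"}).json()
secret = created.get("token")          # only returned at creation time

tokens = requests.get(BASE, headers=HEADERS).json()   # token field excluded here
print(len(tokens), "tokens, secret shown once:", bool(secret))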

View File

@@ -1,47 +0,0 @@
# Python import
from uuid import uuid4
# Third party
from rest_framework.response import Response
from rest_framework import status
from sentry_sdk import capture_exception
# Module import
from .base import BaseAPIView
from plane.db.models import APIToken
from plane.api.serializers import APITokenSerializer
class ApiTokenEndpoint(BaseAPIView):
def post(self, request):
label = request.data.get("label", str(uuid4().hex))
workspace = request.data.get("workspace", False)
if not workspace:
return Response(
{"error": "Workspace is required"}, status=status.HTTP_200_OK
)
api_token = APIToken.objects.create(
label=label, user=request.user, workspace_id=workspace
)
serializer = APITokenSerializer(api_token)
# Token will be only vissible while creating
return Response(
{"api_token": serializer.data, "token": api_token.token},
status=status.HTTP_201_CREATED,
)
def get(self, request):
api_tokens = APIToken.objects.filter(user=request.user)
serializer = APITokenSerializer(api_tokens, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def delete(self, request, pk):
api_token = APIToken.objects.get(pk=pk)
api_token.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -33,7 +33,7 @@ from plane.bgtasks.forgot_password_task import forgot_password
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
@@ -55,11 +55,11 @@ class VerifyEmailEndpoint(BaseAPIView):
return Response(
{"email": "Successfully activated"}, status=status.HTTP_200_OK
)
except jwt.ExpiredSignatureError as indentifier:
except jwt.ExpiredSignatureError as _indentifier:
return Response(
{"email": "Activation expired"}, status=status.HTTP_400_BAD_REQUEST
)
except jwt.exceptions.DecodeError as indentifier:
except jwt.exceptions.DecodeError as _indentifier:
return Response(
{"email": "Invalid token"}, status=status.HTTP_400_BAD_REQUEST
)
@@ -76,7 +76,7 @@ class ForgotPasswordEndpoint(BaseAPIView):
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site

View File

@@ -4,7 +4,7 @@ import random
import string
import json
import requests
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
from django.core.exceptions import ValidationError
@@ -22,8 +22,13 @@ from sentry_sdk import capture_exception, capture_message
# Module imports
from . import BaseAPIView
from plane.db.models import User
from plane.api.serializers import UserSerializer
from plane.db.models import (
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
@@ -86,13 +91,63 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
@@ -114,7 +169,15 @@ class SignUpEndpoint(BaseAPIView):
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
@@ -176,7 +239,63 @@ class SignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
@@ -198,11 +317,14 @@ class SignInEndpoint(BaseAPIView):
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
access_token, refresh_token = get_tokens_for_user(user)
return Response(data, status=status.HTTP_200_OK)
@@ -249,11 +371,11 @@ class MagicSignInGenerateEndpoint(BaseAPIView):
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase + string.digits, k=4))
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase + string.digits, k=4))
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase + string.digits, k=4))
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
ri = redis_instance()
@@ -287,7 +409,8 @@ class MagicSignInGenerateEndpoint(BaseAPIView):
ri.set(key, json.dumps(value), ex=expiry)
current_site = settings.WEB_URL
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
@@ -319,6 +442,14 @@ class MagicSignInEndpoint(BaseAPIView):
if str(token) == str(user_token):
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
@@ -340,6 +471,8 @@ class MagicSignInEndpoint(BaseAPIView):
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
else:
user = User.objects.create(
email=email,
@@ -347,6 +480,7 @@ class MagicSignInEndpoint(BaseAPIView):
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
@@ -368,6 +502,8 @@ class MagicSignInEndpoint(BaseAPIView):
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
user.last_active = timezone.now()
user.last_login_time = timezone.now()
@@ -376,6 +512,63 @@ class MagicSignInEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,

View File

@@ -1,5 +1,6 @@
# Python imports
import zoneinfo
import json
# Django imports
from django.urls import resolve
@@ -7,6 +8,7 @@ from django.conf import settings
from django.utils import timezone
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.core.serializers.json import DjangoJSONEncoder
# Third part imports
from rest_framework import status
@@ -22,6 +24,7 @@ from django_filters.rest_framework import DjangoFilterBackend
# Module imports
from plane.utils.paginator import BasePaginator
from plane.bgtasks.webhook_task import send_webhook
class TimezoneMixin:
@@ -29,6 +32,7 @@ class TimezoneMixin:
This enables timezone conversion according
to the user set timezone
"""
def initial(self, request, *args, **kwargs):
super().initial(request, *args, **kwargs)
if request.user.is_authenticated:
@@ -37,8 +41,29 @@ class TimezoneMixin:
timezone.deactivate()
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class WebhookMixin:
webhook_event = None
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
and settings.ENABLE_WEBHOOK
):
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
action=self.request.method,
slug=self.workspace_slug,
)
return response
class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
model = None
permission_classes = [
@@ -71,18 +96,30 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
capture_exception(e)
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": f"key {e} does not exist"},
status=status.HTTP_400_BAD_REQUEST,
)
print(e) if settings.DEBUG else print("Server Error")
capture_exception(e)
@@ -99,8 +136,8 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
print(
f"{request.method} - {request.get_full_path()} of Queries: {len(connection.queries)}"
)
return response
return response
except Exception as exc:
response = self.handle_exception(exc)
return exc
@@ -120,7 +157,6 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
permission_classes = [
IsAuthenticated,
]
@@ -139,7 +175,6 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
queryset = backend().filter_queryset(self.request, queryset, self)
return queryset
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@@ -150,19 +185,29 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
return response
except Exception as e:
if isinstance(e, IntegrityError):
return Response({"error": "The payload is not valid"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "The payload is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ValidationError):
return Response({"error": "Please provide valid detail"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Please provide valid detail"},
status=status.HTTP_400_BAD_REQUEST,
)
if isinstance(e, ObjectDoesNotExist):
model_name = str(exc).split(" matching query does not exist.")[0]
return Response({"error": f"{model_name} does not exist."}, status=status.HTTP_404_NOT_FOUND)
return Response(
{"error": f"{model_name} does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
if isinstance(e, KeyError):
return Response({"error": f"key {e} does not exist"}, status=status.HTTP_400_BAD_REQUEST)
print(e) if settings.DEBUG else print("Server Error")
if settings.DEBUG:
print(e)
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)

View File

@@ -21,8 +21,8 @@ class ConfigurationEndpoint(BaseAPIView):
def get(self, request):
data = {}
data["google"] = os.environ.get("GOOGLE_CLIENT_ID", None)
data["github"] = os.environ.get("GITHUB_CLIENT_ID", None)
data["google_client_id"] = os.environ.get("GOOGLE_CLIENT_ID", None)
data["github_client_id"] = os.environ.get("GITHUB_CLIENT_ID", None)
data["github_app_name"] = os.environ.get("GITHUB_APP_NAME", None)
data["magic_login"] = (
bool(settings.EMAIL_HOST_USER) and bool(settings.EMAIL_HOST_PASSWORD)
@@ -30,4 +30,8 @@ class ConfigurationEndpoint(BaseAPIView):
data["email_password_login"] = (
os.environ.get("ENABLE_EMAIL_PASSWORD", "0") == "1"
)
data["slack_client_id"] = os.environ.get("SLACK_CLIENT_ID", None)
data["posthog_api_key"] = os.environ.get("POSTHOG_API_KEY", None)
data["posthog_host"] = os.environ.get("POSTHOG_HOST", None)
data["has_unsplash_configured"] = bool(settings.UNSPLASH_ACCESS_KEY)
return Response(data, status=status.HTTP_200_OK)
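The configuration endpoint renames the google/github keys to google_client_id/github_client_id and starts reporting Slack, PostHog and Unsplash availability, so the web client can discover which integrations are configured from a single call. An illustrative response shape; the keys come from the view above, every value is a placeholder.

# Example response shape; every value here is a placeholder.
config = {
    "google_client_id": "1234.apps.googleusercontent.com",
    "github_client_id": "Iv1.abc123",
    "github_app_name": "plane-app",
    "magic_login": True,
    "email_password_login": True,
    "slack_client_id": None,
    "posthog_api_key": None,
    "posthog_host": None,
    "has_unsplash_configured": False,
}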

View File

@@ -3,7 +3,6 @@ import json
# Django imports
from django.db.models import (
OuterRef,
Func,
F,
Q,
@@ -24,7 +23,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
@@ -49,9 +48,10 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class CycleViewSet(BaseViewSet):
class CycleViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
@@ -177,9 +177,8 @@ class CycleViewSet(BaseViewSet):
def list(self, request, slug, project_id):
queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all")
order_by = request.GET.get("order_by", "sort_order")
queryset = queryset.order_by(order_by)
queryset = queryset.order_by("-is_favorite","-created_at")
# Current Cycle
if cycle_view == "current":
@@ -480,13 +479,13 @@ class CycleViewSet(BaseViewSet):
)
)
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
# Delete the cycle
cycle.delete()
issue_activity.delay(
type="cycle.activity.deleted",
requested_data=json.dumps(
{
"cycle_id": str(pk),
"cycle_name": str(cycle.name),
"issues": [str(issue_id) for issue_id in cycle_issues],
}
),
@@ -496,13 +495,15 @@ class CycleViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
# Delete the cycle
cycle.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueViewSet(BaseViewSet):
class CycleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
@@ -512,12 +513,6 @@ class CycleIssueViewSet(BaseViewSet):
"issue__assignees__id",
]
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
cycle_id=self.kwargs.get("cycle_id"),
)
def get_queryset(self):
return self.filter_queryset(
super()
@@ -670,7 +665,7 @@ class CycleIssueViewSet(BaseViewSet):
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": issues}),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{

View File

@@ -89,4 +89,4 @@ class UnsplashEndpoint(BaseAPIView):
}
resp = requests.get(url=url, headers=headers)
return Response(resp.json(), status=status.HTTP_200_OK)
return Response(resp.json(), status=resp.status_code)

View File

@@ -39,6 +39,7 @@ from plane.utils.integrations.github import get_github_repo_details
from plane.utils.importers.jira import jira_project_issue_summary
from plane.bgtasks.importer_task import service_importer
from plane.utils.html_processor import strip_tags
from plane.api.permissions import WorkSpaceAdminPermission
class ServiceIssueImportSummaryEndpoint(BaseAPIView):
@@ -119,6 +120,9 @@ class ServiceIssueImportSummaryEndpoint(BaseAPIView):
class ImportServiceEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
def post(self, request, slug, service):
project_id = request.data.get("project_id", False)

View File

@@ -64,9 +64,7 @@ class InboxViewSet(BaseViewSet):
serializer.save(project_id=self.kwargs.get("project_id"))
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
inbox = Inbox.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
# Handle default inbox delete
if inbox.is_default:
return Response(
@@ -128,9 +126,7 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -150,7 +146,6 @@ class InboxIssueViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
def create(self, request, slug, project_id, inbox_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
@@ -198,7 +193,7 @@ class InboxIssueViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@@ -216,10 +211,20 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
# Only project members with role above 10, or the user who created the inbox issue, can edit it
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
@@ -233,8 +238,10 @@ class InboxIssueViewSet(BaseViewSet):
# Restrict the editable fields to name and description (the limit for viewers and guests)
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
@@ -256,7 +263,7 @@ class InboxIssueViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
else:
@@ -307,7 +314,9 @@ class InboxIssueViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK)
return Response(
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def retrieve(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
@@ -324,15 +333,27 @@ class InboxIssueViewSet(BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
project_member = ProjectMember.objects.get(workspace__slug=slug, project_id=project_id, member=request.user)
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id).delete()
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
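The same guard appears in both partial_update and destroy above: guests and viewers (role <= 10) may only touch inbox issues they created. Pulled into a helper, purely as a sketch:

# Sketch: the access rule used in partial_update and destroy, expressed once.
def can_modify_inbox_issue(project_member, inbox_issue, user):
    """Roles above 10 may always modify; guests/viewers only their own inbox issues."""
    if project_member.role > 10:
        return True
    return str(inbox_issue.created_by_id) == str(user.id)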
@@ -347,7 +368,10 @@ class InboxIssuePublicViewSet(BaseViewSet):
]
def get_queryset(self):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=self.kwargs.get("slug"), project_id=self.kwargs.get("project_id"))
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
)
if project_deploy_board is not None:
return self.filter_queryset(
super()
@@ -360,13 +384,17 @@ class InboxIssuePublicViewSet(BaseViewSet):
)
.select_related("issue", "workspace", "project")
)
else:
return InboxIssue.objects.none()
def list(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
filters = issue_filters(request.query_params, "GET")
issues = (
@@ -393,9 +421,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -416,9 +442,14 @@ class InboxIssuePublicViewSet(BaseViewSet):
)
def create(self, request, slug, project_id, inbox_id):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
if not request.data.get("issue", {}).get("name", False):
return Response(
@@ -466,7 +497,7 @@ class InboxIssuePublicViewSet(BaseViewSet):
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
@@ -480,34 +511,41 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
# Get the project member
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot edit inbox issues"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot edit inbox issues"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get issue data
issue_data = request.data.pop("issue", False)
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
)
# Restrict the editable fields to name and description (the limit for viewers and guests)
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get("description_html", issue.description_html),
"description": issue_data.get("description", issue.description)
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueCreateSerializer(
issue, data=issue_data, partial=True
)
issue_serializer = IssueCreateSerializer(issue, data=issue_data, partial=True)
if issue_serializer.is_valid():
current_instance = issue
@@ -524,16 +562,21 @@ class InboxIssuePublicViewSet(BaseViewSet):
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
issue_serializer.save()
return Response(issue_serializer.data, status=status.HTTP_200_OK)
return Response(issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
@@ -545,16 +588,24 @@ class InboxIssuePublicViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk):
project_deploy_board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
if project_deploy_board.inbox is None:
return Response({"error": "Inbox is not enabled for this Project Board"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Inbox is not enabled for this Project Board"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
if str(inbox_issue.created_by_id) != str(request.user.id):
return Response({"error": "You cannot delete inbox issue"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "You cannot delete inbox issue"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
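Every action of the public viewset repeats the same ProjectDeployBoard lookup and inbox check. A sketch of a shared guard (a possible refactor, not part of this diff):

# Sketch: one helper for the repeated "is the inbox enabled on the deploy board" check.
from rest_framework import status
from rest_framework.response import Response
from plane.db.models import ProjectDeployBoard


def get_inbox_enabled_board(slug, project_id):
    """Returns (board, None) when the inbox is enabled, otherwise (None, error_response)."""
    board = ProjectDeployBoard.objects.get(workspace__slug=slug, project_id=project_id)
    if board.inbox is None:
        return None, Response(
            {"error": "Inbox is not enabled for this Project Board"},
            status=status.HTTP_400_BAD_REQUEST,
        )
    return board, None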

View File

@@ -1,6 +1,6 @@
# Python imports
import uuid
import requests
# Django imports
from django.contrib.auth.hashers import make_password
@@ -25,7 +25,7 @@ from plane.utils.integrations.github import (
delete_github_installation,
)
from plane.api.permissions import WorkSpaceAdminPermission
from plane.utils.integrations.slack import slack_oauth
class IntegrationViewSet(BaseViewSet):
serializer_class = IntegrationSerializer
@@ -98,12 +98,19 @@ class WorkspaceIntegrationViewSet(BaseViewSet):
config = {"installation_id": installation_id}
if provider == "slack":
metadata = request.data.get("metadata", {})
code = request.data.get("code", False)
if not code:
return Response({"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST)
slack_response = slack_oauth(code=code)
metadata = slack_response
access_token = metadata.get("access_token", False)
team_id = metadata.get("team", {}).get("id", False)
if not metadata or not access_token or not team_id:
return Response(
{"error": "Access token and team id is required"},
{"error": "Slack could not be installed. Please try again later"},
status=status.HTTP_400_BAD_REQUEST,
)
config = {"team_id": team_id, "access_token": access_token}

View File

@@ -11,6 +11,7 @@ from plane.api.views import BaseViewSet, BaseAPIView
from plane.db.models import SlackProjectSync, WorkspaceIntegration, ProjectMember
from plane.api.serializers import SlackProjectSyncSerializer
from plane.api.permissions import ProjectBasePermission, ProjectEntityPermission
from plane.utils.integrations.slack import slack_oauth
class SlackProjectSyncViewSet(BaseViewSet):
@@ -32,25 +33,47 @@ class SlackProjectSyncViewSet(BaseViewSet):
)
def create(self, request, slug, project_id, workspace_integration_id):
serializer = SlackProjectSyncSerializer(data=request.data)
try:
code = request.data.get("code", False)
if not code:
return Response(
{"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST
)
slack_response = slack_oauth(code=code)
workspace_integration = WorkspaceIntegration.objects.get(
workspace__slug=slug, pk=workspace_integration_id
)
if serializer.is_valid():
serializer.save(
project_id=project_id,
workspace_integration_id=workspace_integration_id,
)
workspace_integration = WorkspaceIntegration.objects.get(
pk=workspace_integration_id, workspace__slug=slug
)
slack_project_sync = SlackProjectSync.objects.create(
access_token=slack_response.get("access_token"),
scopes=slack_response.get("scope"),
bot_user_id=slack_response.get("bot_user_id"),
webhook_url=slack_response.get("incoming_webhook", {}).get("url"),
data=slack_response,
team_id=slack_response.get("team", {}).get("id"),
team_name=slack_response.get("team", {}).get("name"),
workspace_integration=workspace_integration,
project_id=project_id,
)
_ = ProjectMember.objects.get_or_create(
member=workspace_integration.actor, role=20, project_id=project_id
)
serializer = SlackProjectSyncSerializer(slack_project_sync)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "Slack is already installed for the project"},
status=status.HTTP_410_GONE,
)
capture_exception(e)
return Response(
{"error": "Slack could not be installed. Please try again later"},
status=status.HTTP_400_BAD_REQUEST,
)

View File

@@ -33,13 +33,12 @@ from rest_framework.permissions import AllowAny, IsAuthenticated
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet, BaseAPIView
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
IssueCreateSerializer,
IssueActivitySerializer,
IssueCommentSerializer,
IssuePropertySerializer,
LabelSerializer,
IssueSerializer,
LabelSerializer,
IssueFlatSerializer,
@@ -85,7 +84,7 @@ from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
class IssueViewSet(BaseViewSet):
class IssueViewSet(WebhookMixin, BaseViewSet):
def get_serializer_class(self):
return (
IssueCreateSerializer
@@ -94,6 +93,7 @@ class IssueViewSet(BaseViewSet):
)
model = Issue
webhook_event = "issue"
permission_classes = [
ProjectEntityPermission,
]
@@ -235,10 +235,7 @@ class IssueViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -316,6 +313,104 @@ class IssueViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class IssueListEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
serializer = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
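Both new list endpoints count related links and attachments with a correlated subquery rather than a JOIN plus GROUP BY. The pattern in isolation, wrapped in an explicit Subquery for clarity (the code above passes the queryset directly, which Django resolves the same way):

# Sketch: per-issue related-row counts via a correlated subquery.
from django.db.models import F, Func, OuterRef, Subquery
from plane.db.models import Issue, IssueLink

link_count_subquery = Subquery(
    IssueLink.objects.filter(issue=OuterRef("id"))
    .order_by()  # clear default ordering so the subquery stays minimal
    .annotate(count=Func(F("id"), function="Count"))
    .values("count")
)
issues_with_counts = Issue.objects.annotate(link_count=link_count_subquery)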
class IssueListGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class UserWorkSpaceIssues(BaseAPIView):
@method_decorator(gzip_page)
def get(self, request, slug):
@@ -443,9 +538,7 @@ class UserWorkSpaceIssues(BaseAPIView):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
class WorkSpaceIssuesEndpoint(BaseAPIView):
@@ -502,9 +595,10 @@ class IssueActivityEndpoint(BaseAPIView):
return Response(result_list, status=status.HTTP_200_OK)
class IssueCommentViewSet(BaseViewSet):
class IssueCommentViewSet(WebhookMixin, BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
permission_classes = [
ProjectLitePermission,
]
@@ -531,6 +625,7 @@ class IssueCommentViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
@@ -623,7 +718,6 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
serializer = IssuePropertySerializer(issue_property)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug, project_id):
issue_property, _ = IssueProperty.objects.get_or_create(
user=request.user, project_id=project_id
@@ -662,8 +756,8 @@ class LabelViewSet(BaseViewSet):
.select_related("project")
.select_related("workspace")
.select_related("parent")
.order_by("name")
.distinct()
.order_by("sort_order")
)
@@ -780,6 +874,20 @@ class SubIssuesEndpoint(BaseAPIView):
updated_sub_issues = Issue.issue_objects.filter(id__in=sub_issue_ids)
# Track the issue
_ = [
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"parent": str(issue_id)}),
actor_id=str(request.user.id),
issue_id=str(sub_issue_id),
project_id=str(project_id),
current_instance=json.dumps({"parent": str(sub_issue_id)}),
epoch=int(timezone.now().timestamp()),
)
for sub_issue_id in sub_issue_ids
]
return Response(
IssueFlatSerializer(updated_sub_issues, many=True).data,
status=status.HTTP_200_OK,
@@ -1092,17 +1200,19 @@ class IssueArchiveViewSet(BaseViewSet):
archived_at__isnull=False,
pk=pk,
)
issue.archived_at = None
issue.save()
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"archived_at": None}),
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=None,
current_instance=json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
),
epoch=int(timezone.now().timestamp()),
)
issue.archived_at = None
issue.save()
return Response(IssueSerializer(issue).data, status=status.HTTP_200_OK)
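The reorder above matters: the activity entry should carry the issue as it looked while still archived, so serialization has to happen before archived_at is cleared. In miniature:

# Sketch: snapshot the pre-change state, log it, then mutate -- the order the fix above establishes.
def unarchive_with_audit(issue, log_activity):
    snapshot = {"archived_at": str(issue.archived_at)}  # state before the change
    log_activity(snapshot)                              # e.g. enqueue issue_activity with current_instance
    issue.archived_at = None
    issue.save()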
@@ -1147,7 +1257,11 @@ class IssueSubscriberViewSet(BaseViewSet):
def list(self, request, slug, project_id, issue_id):
members = (
ProjectMember.objects.filter(workspace__slug=slug, project_id=project_id)
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
@@ -1391,12 +1505,12 @@ class IssueCommentPublicViewSet(BaseViewSet):
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
member_id=self.request.user.id,
is_active=True,
)
)
)
.distinct()
).order_by("created_at")
else:
return IssueComment.objects.none()
except ProjectDeployBoard.DoesNotExist:
return IssueComment.objects.none()
@@ -1432,6 +1546,7 @@ class IssueCommentPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1522,7 +1637,6 @@ class IssueReactionPublicViewSet(BaseViewSet):
.order_by("-created_at")
.distinct()
)
else:
return IssueReaction.objects.none()
except ProjectDeployBoard.DoesNotExist:
return IssueReaction.objects.none()
@@ -1546,6 +1660,7 @@ class IssueReactionPublicViewSet(BaseViewSet):
if not ProjectMember.objects.filter(
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1618,7 +1733,6 @@ class CommentReactionPublicViewSet(BaseViewSet):
.order_by("-created_at")
.distinct()
)
else:
return CommentReaction.objects.none()
except ProjectDeployBoard.DoesNotExist:
return CommentReaction.objects.none()
@@ -1640,7 +1754,9 @@ class CommentReactionPublicViewSet(BaseViewSet):
project_id=project_id, comment_id=comment_id, actor=request.user
)
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
# Add the user for workspace tracking
_ = ProjectPublicMember.objects.get_or_create(
@@ -1713,7 +1829,6 @@ class IssueVotePublicViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
)
else:
return IssueVote.objects.none()
except ProjectDeployBoard.DoesNotExist:
return IssueVote.objects.none()
@@ -1726,7 +1841,9 @@ class IssueVotePublicViewSet(BaseViewSet):
)
# Add the user for workspace tracking
if not ProjectMember.objects.filter(
project_id=project_id, member=request.user
project_id=project_id,
member=request.user,
is_active=True,
).exists():
_ = ProjectPublicMember.objects.get_or_create(
project_id=project_id,
@@ -2160,9 +2277,7 @@ class IssueDraftViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -2227,7 +2342,7 @@ class IssueDraftViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
current_instance = json.dumps(
IssueSerializer(current_instance).data, cls=DjangoJSONEncoder
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
issue.delete()
issue_activity.delay(

View File

@@ -15,7 +15,7 @@ from rest_framework import status
from sentry_sdk import capture_exception
# Module imports
from . import BaseViewSet
from . import BaseViewSet, WebhookMixin
from plane.api.serializers import (
ModuleWriteSerializer,
ModuleSerializer,
@@ -41,11 +41,12 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
class ModuleViewSet(BaseViewSet):
class ModuleViewSet(WebhookMixin, BaseViewSet):
model = Module
permission_classes = [
ProjectEntityPermission,
]
webhook_event = "module"
def get_serializer_class(self):
return (
@@ -55,7 +56,6 @@ class ModuleViewSet(BaseViewSet):
)
def get_queryset(self):
order_by = self.request.GET.get("order_by", "sort_order")
subquery = ModuleFavorite.objects.filter(
user=self.request.user,
@@ -138,7 +138,7 @@ class ModuleViewSet(BaseViewSet):
),
)
)
.order_by(order_by, "name")
.order_by("-is_favorite","-created_at")
)
def create(self, request, slug, project_id):
@@ -266,12 +266,12 @@ class ModuleViewSet(BaseViewSet):
module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
)
module.delete()
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
{
"module_id": str(pk),
"module_name": str(module.name),
"issues": [str(issue_id) for issue_id in module_issues],
}
),
@@ -281,6 +281,7 @@ class ModuleViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
module.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -297,12 +298,6 @@ class ModuleIssueViewSet(BaseViewSet):
ProjectEntityPermission,
]
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
module_id=self.kwargs.get("module_id"),
)
def get_queryset(self):
return self.filter_queryset(
super()
@@ -446,7 +441,7 @@ class ModuleIssueViewSet(BaseViewSet):
type="module.activity.created",
requested_data=json.dumps({"modules_list": issues}),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{

View File

@@ -85,7 +85,10 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:
@@ -255,7 +258,10 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
# Created issues
if type == "created":
if WorkspaceMember.objects.filter(
workspace__slug=slug, member=request.user, role__lt=15
workspace__slug=slug,
member=request.user,
role__lt=15,
is_active=True,
).exists():
notifications = Notification.objects.none()
else:

View File

@@ -2,6 +2,7 @@
import uuid
import requests
import os
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@@ -11,7 +12,6 @@ from django.conf import settings
from rest_framework.response import Response
from rest_framework import exceptions
from rest_framework.permissions import AllowAny
from rest_framework.views import APIView
from rest_framework_simplejwt.tokens import RefreshToken
from rest_framework import status
from sentry_sdk import capture_exception
@@ -21,7 +21,14 @@ from google.oauth2 import id_token
from google.auth.transport import requests as google_auth_request
# Module imports
from plane.db.models import SocialLoginConnection, User
from plane.db.models import (
SocialLoginConnection,
User,
WorkspaceMemberInvite,
WorkspaceMember,
ProjectMemberInvite,
ProjectMember,
)
from plane.api.serializers import UserSerializer
from .base import BaseAPIView
@@ -113,7 +120,7 @@ def get_user_data(access_token: str) -> dict:
url="https://api.github.com/user/emails", headers=headers
).json()
[
_ = [
user_data.update({"email": item.get("email")})
for item in response
if item.get("primary") is True
@@ -147,7 +154,7 @@ class OauthEndpoint(BaseAPIView):
data = get_user_data(access_token)
email = data.get("email", None)
if email == None:
if email is None:
return Response(
{
"error": "Something went wrong. Please try again later or contact the support team."
@@ -158,7 +165,6 @@ class OauthEndpoint(BaseAPIView):
if "@" in email:
user = User.objects.get(email=email)
email = data["email"]
channel = "email"
mobile_number = uuid.uuid4().hex
email_verified = True
else:
@@ -170,7 +176,6 @@ class OauthEndpoint(BaseAPIView):
)
## Login Case
if not user.is_active:
return Response(
{
@@ -182,17 +187,66 @@ class OauthEndpoint(BaseAPIView):
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
user.last_login_medium = f"oauth"
user.last_login_medium = "oauth"
user.last_login_uagent = request.META.get("HTTP_USER_AGENT")
user.is_email_verified = email_verified
user.save()
access_token, refresh_token = get_tokens_for_user(user)
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -203,6 +257,7 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
@@ -223,6 +278,15 @@ class OauthEndpoint(BaseAPIView):
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
except User.DoesNotExist:
@@ -233,7 +297,6 @@ class OauthEndpoint(BaseAPIView):
if "@" in email:
email = data["email"]
mobile_number = uuid.uuid4().hex
channel = "email"
email_verified = True
else:
return Response(
@@ -263,11 +326,63 @@ class OauthEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
@@ -288,6 +403,8 @@ class OauthEndpoint(BaseAPIView):
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -298,4 +415,10 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_201_CREATED)
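The invite-acceptance block (workspace invites, then project invites, then deleting both) is duplicated verbatim in the sign-in and sign-up branches above. A sketch of a shared helper it could collapse into (the function name is hypothetical; the real code also adds a workspace membership for each project invite, omitted here for brevity):

# Sketch: factoring the duplicated invite-acceptance logic into one helper.
from plane.db.models import (
    ProjectMember,
    ProjectMemberInvite,
    WorkspaceMember,
    WorkspaceMemberInvite,
)


def process_accepted_invites(user):
    workspace_invites = WorkspaceMemberInvite.objects.filter(email=user.email, accepted=True)
    WorkspaceMember.objects.bulk_create(
        [
            WorkspaceMember(workspace_id=invite.workspace_id, member=user, role=invite.role)
            for invite in workspace_invites
        ],
        ignore_conflicts=True,
    )
    project_invites = ProjectMemberInvite.objects.filter(email=user.email, accepted=True)
    ProjectMember.objects.bulk_create(
        [
            ProjectMember(
                workspace_id=invite.workspace_id,
                project_id=invite.project_id,
                member=user,
                role=invite.role if invite.role in [5, 10, 15] else 15,
                created_by_id=invite.created_by_id,
            )
            for invite in project_invites
        ],
        ignore_conflicts=True,
    )
    workspace_invites.delete()
    project_invites.delete()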

View File

@@ -1,9 +1,19 @@
# Python imports
from datetime import timedelta, datetime, date
from datetime import timedelta, date, datetime
# Django imports
from django.db import connection
from django.db.models import Exists, OuterRef, Q, Prefetch
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.db.models import (
OuterRef,
Func,
F,
Q,
Exists,
)
# Third party imports
from rest_framework import status
@@ -15,20 +25,37 @@ from .base import BaseViewSet, BaseAPIView
from plane.api.permissions import ProjectEntityPermission
from plane.db.models import (
Page,
PageBlock,
PageFavorite,
Issue,
IssueAssignee,
IssueActivity,
PageLog,
)
from plane.api.serializers import (
PageSerializer,
PageBlockSerializer,
PageFavoriteSerializer,
PageLogSerializer,
IssueLiteSerializer,
SubPageSerializer,
)
def unarchive_archive_page_and_descendants(page_id, archived_at):
# Recursive CTE: update archived_at on the page and every descendant in one statement
sql = """
WITH RECURSIVE descendants AS (
SELECT id FROM pages WHERE id = %s
UNION ALL
SELECT pages.id FROM pages, descendants WHERE pages.parent_id = descendants.id
)
UPDATE pages SET archived_at = %s WHERE id IN (SELECT id FROM descendants);
"""
# Execute the SQL query
with connection.cursor() as cursor:
cursor.execute(sql, [page_id, archived_at])
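unarchive_archive_page_and_descendants relies on a recursive CTE so a single UPDATE reaches the page and every nested child; archive() passes a timestamp, unarchive() passes None. For contrast, an ORM-only equivalent would need one query per tree level, roughly:

# Sketch: ORM-only equivalent of the recursive CTE (one query per level instead of one statement).
from plane.db.models import Page


def archive_page_tree_orm(page_id, archived_at):
    current_level = [page_id]
    all_ids = []
    while current_level:
        all_ids.extend(current_level)
        current_level = list(
            Page.objects.filter(parent_id__in=current_level).values_list("id", flat=True)
        )
    # Pass a datetime to archive the subtree, None to unarchive it.
    Page.objects.filter(id__in=all_ids).update(archived_at=archived_at)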
class PageViewSet(BaseViewSet):
serializer_class = PageSerializer
model = Page
@@ -52,6 +79,7 @@ class PageViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(parent__isnull=True)
.filter(Q(owned_by=self.request.user) | Q(access=0))
.select_related("project")
.select_related("workspace")
@@ -59,15 +87,7 @@ class PageViewSet(BaseViewSet):
.annotate(is_favorite=Exists(subquery))
.order_by(self.request.GET.get("order_by", "-created_at"))
.prefetch_related("labels")
.order_by("name", "-is_favorite")
.prefetch_related(
Prefetch(
"blocks",
queryset=PageBlock.objects.select_related(
"page", "issue", "workspace", "project"
),
)
)
.order_by("-is_favorite","-created_at")
.distinct()
)
@@ -88,7 +108,21 @@ class PageViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
try:
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
if page.is_locked:
return Response(
{"error": "Page is locked"},
status=status.HTTP_400_BAD_REQUEST,
)
parent = request.data.get("parent", None)
if parent:
_ = Page.objects.get(
pk=parent, workspace__slug=slug, project_id=project_id
)
# Only update access if the page owner is the requesting user
if (
page.access != request.data.get("access", page.access)
@@ -100,22 +134,64 @@ class PageViewSet(BaseViewSet):
},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = PageSerializer(page, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except Page.DoesNotExist:
return Response(
{
"error": "Access cannot be updated since this page is owned by someone else"
},
status=status.HTTP_400_BAD_REQUEST,
)
def lock(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# only the owner can lock the page
if request.user.id != page.owned_by_id:
    return Response(
        {"error": "Only the page owner can lock the page"},
        status=status.HTTP_400_BAD_REQUEST,
    )
page.is_locked = True
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def unlock(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# only the owner can unlock the page
if request.user.id != page.owned_by_id:
return Response(
{"error": "Only the page owner can unlock the page"},
status=status.HTTP_400_BAD_REQUEST,
)
page.is_locked = False
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
queryset = self.get_queryset().filter(archived_at__isnull=True)
page_view = request.GET.get("page_view", False)
if not page_view:
return Response({"error": "Page View parameter is required"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Page View parameter is required"},
status=status.HTTP_400_BAD_REQUEST,
)
# All Pages
if page_view == "all":
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Recent pages
if page_view == "recent":
@@ -123,15 +199,19 @@ class PageViewSet(BaseViewSet):
day_before = current_time - timedelta(days=1)
todays_pages = queryset.filter(updated_at__date=date.today())
yesterdays_pages = queryset.filter(updated_at__date=day_before)
earlier_this_week = queryset.filter( updated_at__date__range=(
earlier_this_week = queryset.filter(
updated_at__date__range=(
(timezone.now() - timedelta(days=7)),
(timezone.now() - timedelta(days=2)),
))
)
)
return Response(
{
"today": PageSerializer(todays_pages, many=True).data,
"yesterday": PageSerializer(yesterdays_pages, many=True).data,
"earlier_this_week": PageSerializer(earlier_this_week, many=True).data,
"earlier_this_week": PageSerializer(
earlier_this_week, many=True
).data,
},
status=status.HTTP_200_OK,
)
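One small caveat in the "recent" branch: it mixes date.today() with timezone.now(), which can disagree around midnight when USE_TZ is enabled. A timezone-consistent variant (a suggestion, not what this diff does):

# Sketch: compute all three windows from a single timezone-aware clock.
from datetime import timedelta
from django.utils import timezone


def recent_page_buckets(queryset):
    now = timezone.now()
    today = timezone.localdate()
    return {
        "today": queryset.filter(updated_at__date=today),
        "yesterday": queryset.filter(updated_at__date=today - timedelta(days=1)),
        "earlier_this_week": queryset.filter(
            updated_at__date__range=(now - timedelta(days=7), now - timedelta(days=2))
        ),
    }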
@@ -139,50 +219,75 @@ class PageViewSet(BaseViewSet):
# Favorite Pages
if page_view == "favorite":
queryset = queryset.filter(is_favorite=True)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# My pages
if page_view == "created_by_me":
queryset = queryset.filter(owned_by=request.user)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Created by other Pages
if page_view == "created_by_other":
queryset = queryset.filter(~Q(owned_by=request.user), access=0)
return Response(PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK)
return Response({"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST)
class PageBlockViewSet(BaseViewSet):
serializer_class = PageBlockSerializer
model = PageBlock
permission_classes = [
ProjectEntityPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(page_id=self.kwargs.get("page_id"))
.filter(project__project_projectmember__member=self.request.user)
.select_related("project")
.select_related("workspace")
.select_related("page")
.select_related("issue")
.order_by("sort_order")
.distinct()
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"),
page_id=self.kwargs.get("page_id"),
return Response(
{"error": "No matching view found"}, status=status.HTTP_400_BAD_REQUEST
)
def archive(self, request, slug, project_id, page_id):
_ = Page.objects.get(
project_id=project_id,
owned_by_id=request.user.id,
workspace__slug=slug,
pk=page_id,
)
unarchive_archive_page_and_descendants(page_id, datetime.now())
return Response(status=status.HTTP_204_NO_CONTENT)
def unarchive(self, request, slug, project_id, page_id):
page = Page.objects.get(
project_id=project_id,
owned_by_id=request.user.id,
workspace__slug=slug,
pk=page_id,
)
page.parent = None
page.save()
unarchive_archive_page_and_descendants(page_id, None)
return Response(status=status.HTTP_204_NO_CONTENT)
def archive_list(self, request, slug, project_id):
pages = (
Page.objects.filter(
project_id=project_id,
workspace__slug=slug,
)
.filter(archived_at__isnull=False)
.filter(parent_id__isnull=True)
)
if not pages:
return Response(
{"error": "No pages found"}, status=status.HTTP_400_BAD_REQUEST
)
return Response(
PageSerializer(pages, many=True).data, status=status.HTTP_200_OK
)
class PageFavoriteViewSet(BaseViewSet):
permission_classes = [
@@ -196,6 +301,7 @@ class PageFavoriteViewSet(BaseViewSet):
return self.filter_queryset(
super()
.get_queryset()
.filter(archived_at__isnull=True)
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(user=self.request.user)
.select_related("page", "page__owned_by")
@@ -218,24 +324,62 @@ class PageFavoriteViewSet(BaseViewSet):
page_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromPageBlockEndpoint(BaseAPIView):
class PageLogEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id, page_block_id):
page_block = PageBlock.objects.get(
pk=page_block_id,
serializer_class = PageLogSerializer
model = PageLog
def post(self, request, slug, project_id, page_id):
serializer = PageLogSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id, page_id=page_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, page_id, transaction):
page_transaction = PageLog.objects.get(
workspace__slug=slug,
project_id=project_id,
page_id=page_id,
transaction=transaction,
)
serializer = PageLogSerializer(
page_transaction, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, page_id, transaction):
transaction = PageLog.objects.get(
workspace__slug=slug,
project_id=project_id,
page_id=page_id,
transaction=transaction,
)
# Delete the transaction object
transaction.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromBlockEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id):
page = Page.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=page_id,
)
issue = Issue.objects.create(
name=page_block.name,
name=request.data.get("name"),
project_id=project_id,
description=page_block.description,
description_html=page_block.description_html,
description_stripped=page_block.description_stripped,
)
_ = IssueAssignee.objects.create(
issue=issue, assignee=request.user, project_id=project_id
@@ -245,11 +389,32 @@ class CreateIssueFromPageBlockEndpoint(BaseAPIView):
issue=issue,
actor=request.user,
project_id=project_id,
comment=f"created the issue from {page_block.name} block",
comment=f"created the issue from {page.name} block",
verb="created",
)
page_block.issue = issue
page_block.save()
return Response(IssueLiteSerializer(issue).data, status=status.HTTP_200_OK)
class SubPagesEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@method_decorator(gzip_page)
def get(self, request, slug, project_id, page_id):
pages = (
PageLog.objects.filter(
page_id=page_id,
project_id=project_id,
workspace__slug=slug,
entity_name__in=["forward_link", "back_link"],
)
.filter(archived_at__isnull=True)
.select_related("project")
.select_related("workspace")
)
return Response(
SubPageSerializer(pages, many=True).data, status=status.HTTP_200_OK
)
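None of the new page actions (archive, unarchive, lock, unlock, page transactions, sub-pages) ship with their URL patterns in this diff. Purely as an illustration of how viewset actions of this kind get wired, with paths and import locations that are hypothetical rather than taken from the repository:

# Hypothetical routing sketch; the real patterns live in the project's urls modules.
from django.urls import path
from plane.api.views import PageViewSet, PageLogEndpoint  # assumed import location

urlpatterns = [
    path(
        "workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/archive/",
        PageViewSet.as_view({"post": "archive"}),
    ),
    path(
        "workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/lock/",
        PageViewSet.as_view({"post": "lock"}),
    ),
    path(
        "workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
        PageLogEndpoint.as_view(),
    ),
]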

View File

@@ -11,23 +11,22 @@ from django.db.models import (
Q,
Exists,
OuterRef,
Func,
F,
Func,
Subquery,
)
from django.core.validators import validate_email
from django.conf import settings
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework import serializers
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
# Module imports
from .base import BaseViewSet, BaseAPIView
from .base import BaseViewSet, BaseAPIView, WebhookMixin
from plane.api.serializers import (
ProjectSerializer,
ProjectListSerializer,
@@ -35,12 +34,12 @@ from plane.api.serializers import (
ProjectDetailSerializer,
ProjectMemberInviteSerializer,
ProjectFavoriteSerializer,
IssueLiteSerializer,
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
)
from plane.api.permissions import (
WorkspaceUserPermission,
ProjectBasePermission,
ProjectEntityPermission,
ProjectMemberPermission,
@@ -60,13 +59,6 @@ from plane.db.models import (
ProjectIdentifier,
Module,
Cycle,
CycleFavorite,
ModuleFavorite,
PageFavorite,
IssueViewFavorite,
Page,
IssueAssignee,
ModuleMember,
Inbox,
ProjectDeployBoard,
IssueProperty,
@@ -75,16 +67,17 @@ from plane.db.models import (
from plane.bgtasks.project_invitation_task import project_invitation
class ProjectViewSet(BaseViewSet):
class ProjectViewSet(WebhookMixin, BaseViewSet):
serializer_class = ProjectSerializer
model = Project
webhook_event = "project"
permission_classes = [
ProjectBasePermission,
]
def get_serializer_class(self, *args, **kwargs):
if self.action == "update" or self.action == "partial_update":
if self.action in ["update", "partial_update"]:
return ProjectSerializer
return ProjectDetailSerializer
@@ -112,12 +105,15 @@ class ProjectViewSet(BaseViewSet):
member=self.request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
)
)
)
.annotate(
total_members=ProjectMember.objects.filter(
project_id=OuterRef("id"), member__is_bot=False
project_id=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -139,6 +135,7 @@ class ProjectViewSet(BaseViewSet):
member_role=ProjectMember.objects.filter(
project_id=OuterRef("pk"),
member_id=self.request.user.id,
is_active=True,
).values("role")
)
.annotate(
@@ -159,6 +156,7 @@ class ProjectViewSet(BaseViewSet):
member=request.user,
project_id=OuterRef("pk"),
workspace__slug=self.kwargs.get("slug"),
is_active=True,
).values("sort_order")
projects = (
self.get_queryset()
@@ -168,6 +166,7 @@ class ProjectViewSet(BaseViewSet):
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug,
is_active=True,
).select_related("member"),
)
)
@@ -336,7 +335,7 @@ class ProjectViewSet(BaseViewSet):
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
)
except Project.DoesNotExist or Workspace.DoesNotExist as e:
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
{"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
)
@@ -347,68 +346,106 @@ class ProjectViewSet(BaseViewSet):
)
class InviteProjectEndpoint(BaseAPIView):
class ProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
email = request.data.get("email", False)
role = request.data.get("role", False)
# Check if email is provided
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
validate_email(email)
# Check if user is already a member of workspace
if ProjectMember.objects.filter(
project_id=project_id,
member__email=email,
member__is_bot=False,
).exists():
def create(self, request, slug, project_id):
emails = request.data.get("emails", [])
# Check if email is provided
if not emails:
return Response(
{"error": "User is already member of workspace"},
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
requesting_user = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, member_id=request.user.id
)
# Check if any invited user has an higher role
if len(
[
email
for email in emails
if int(email.get("role", 10)) > requesting_user.role
]
):
return Response(
{"error": "You cannot invite a user with higher role"},
status=status.HTTP_400_BAD_REQUEST,
)
user = User.objects.filter(email=email).first()
workspace = Workspace.objects.get(slug=slug)
if user is None:
token = jwt.encode(
{"email": email, "timestamp": datetime.now().timestamp()},
project_invitations = []
for email in emails:
try:
validate_email(email.get("email"))
project_invitations.append(
ProjectMemberInvite(
email=email.get("email").strip().lower(),
project_id=project_id,
workspace_id=workspace.id,
token=jwt.encode(
{
"email": email,
"timestamp": datetime.now().timestamp(),
},
settings.SECRET_KEY,
algorithm="HS256",
),
role=email.get("role", 10),
created_by=request.user,
)
project_invitation_obj = ProjectMemberInvite.objects.create(
email=email.strip().lower(),
project_id=project_id,
token=token,
role=role,
)
domain = settings.WEB_URL
project_invitation.delay(email, project_id, token, domain)
except ValidationError:
return Response(
{
"error": f"Invalid email - {email} provided a valid email address is required to send the invite"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Create workspace member invite
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = f"{request.scheme}://{request.get_host()}",
# Send invitations
for invitation in project_invitations:
project_invitation.delay(
invitation.email,
project_id,
invitation.token,
current_site,
request.user.email,
)
return Response(
{
"message": "Email sent successfully",
"id": project_invitation_obj.id,
},
status=status.HTTP_200_OK,
)
project_member = ProjectMember.objects.create(
member=user, project_id=project_id, role=role
)
_ = IssueProperty.objects.create(user=user, project_id=project_id)
return Response(
ProjectMemberSerializer(project_member).data, status=status.HTTP_200_OK
)
class UserProjectInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
@@ -422,40 +459,134 @@ class UserProjectInvitationsViewset(BaseViewSet):
.select_related("workspace", "workspace__owner", "project")
)
def create(self, request):
invitations = request.data.get("invitations")
project_invitations = ProjectMemberInvite.objects.filter(
pk__in=invitations, accepted=True
def create(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user,
workspace__slug=slug,
is_active=True,
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
project_id=project_id,
member=request.user,
role=invitation.role,
role=15 if workspace_role >= 15 else 10,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
ProjectMember(
project=invitation.project,
workspace=invitation.project.workspace,
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for invitation in project_invitations
]
for project_id in project_ids
],
ignore_conflicts=True,
)
# Delete joined project invites
project_invitations.delete()
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request, slug, project_id, pk):
project_invite = ProjectMemberInvite.objects.get(
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
email = request.data.get("email", "")
if email == "" or project_invite.email != email:
return Response(
{"error": "You do not have permission to join the project"},
status=status.HTTP_403_FORBIDDEN,
)
if project_invite.responded_at is None:
project_invite.accepted = request.data.get("accepted", False)
project_invite.responded_at = timezone.now()
project_invite.save()
if project_invite.accepted:
# Check if the user account exists
user = User.objects.filter(email=email).first()
# Check if user is a part of workspace
workspace_member = WorkspaceMember.objects.filter(
workspace__slug=slug, member=user
).first()
# Add him to workspace
if workspace_member is None:
_ = WorkspaceMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=15 if project_invite.role >= 15 else project_invite.role,
)
else:
# Else make him active
workspace_member.is_active = True
workspace_member.save()
# Check if the user was already a member of project then activate the user
project_member = ProjectMember.objects.filter(
workspace_id=project_invite.workspace_id, member=user
).first()
if project_member is None:
# Create a Project Member
_ = ProjectMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=project_invite.role,
)
else:
project_member.is_active = True
project_member.role = project_member.role
project_member.save()
return Response(
{"message": "Project Invitation Accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"message": "Project Invitation was not accepted"},
status=status.HTTP_200_OK,
)
return Response(
{"error": "You have already responded to the invitation request"},
status=status.HTTP_400_BAD_REQUEST,
)
def get(self, request, slug, project_id, pk):
project_invitation = ProjectMemberInvite.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
serializer = ProjectMemberInviteSerializer(project_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMemberViewSet(BaseViewSet):
@@ -477,102 +608,13 @@ class ProjectMemberViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(member__is_bot=False)
.filter()
.select_related("project")
.select_related("member")
.select_related("workspace", "workspace__owner")
)
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
if request.user.id == project_member.member_id:
return Response(
{"error": "You cannot update your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check while updating user roles
requested_project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
)
if (
"role" in request.data
and int(request.data.get("role", project_member.role))
> requested_project_member.role
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = ProjectMemberSerializer(
project_member, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
# check requesting user role
requesting_project_member = ProjectMember.objects.get(
workspace__slug=slug, member=request.user, project_id=project_id
)
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than yourself"},
status=status.HTTP_400_BAD_REQUEST,
)
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, project_id=project_id, user=project_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug,
project_id=project_id,
assignee=project_member.member,
).delete()
# Remove if module member
ModuleMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=project_member.member,
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug,
project_id=project_id,
owned_by=project_member.member,
).delete()
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class AddMemberToProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
]
def post(self, request, slug, project_id):
def create(self, request, slug, project_id):
members = request.data.get("members", [])
# get the project
@@ -599,8 +641,7 @@ class AddMemberToProjectEndpoint(BaseAPIView):
sort_order = [
project_member.get("sort_order")
for project_member in project_members
if str(project_member.get("member_id"))
== str(member.get("member_id"))
if str(project_member.get("member_id")) == str(member.get("member_id"))
]
bulk_project_members.append(
ProjectMember(
@@ -633,6 +674,129 @@ class AddMemberToProjectEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
def list(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
member=request.user,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
).select_related("project", "member", "workspace")
if project_member.role > 10:
serializer = ProjectMemberAdminSerializer(project_members, many=True)
else:
serializer = ProjectMemberSerializer(project_members, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
if request.user.id == project_member.member_id:
return Response(
{"error": "You cannot update your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check while updating user roles
requested_project_member = ProjectMember.objects.get(
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
if (
"role" in request.data
and int(request.data.get("role", project_member.role))
> requested_project_member.role
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = ProjectMemberSerializer(
project_member, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_project_member = ProjectMember.objects.get(
workspace__slug=slug,
member=request.user,
project_id=project_id,
is_active=True,
)
# A user cannot remove themselves
if str(project_member.id) == str(requesting_project_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# A user cannot deactivate a member with a higher role
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def leave(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
project_id=project_id,
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the project
if (
project_member.role == 20
and not ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the project as your the only admin of the project you will have to either delete the project or create an another admin",
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
project_member.is_active = False
project_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class AddTeamToProjectEndpoint(BaseAPIView):
permission_classes = [
@@ -683,46 +847,6 @@ class AddTeamToProjectEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
class ProjectMemberInvitationsViewset(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectMemberInviteDetailViewSet(BaseViewSet):
serializer_class = ProjectMemberInviteSerializer
model = ProjectMemberInvite
search_fields = []
permission_classes = [
ProjectBasePermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.select_related("project")
.select_related("workspace", "workspace__owner")
)
class ProjectIdentifierEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
@@ -766,59 +890,14 @@ class ProjectIdentifierEndpoint(BaseAPIView):
)
class ProjectJoinEndpoint(BaseAPIView):
def post(self, request, slug):
project_ids = request.data.get("project_ids", [])
# Get the workspace user role
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace
ProjectMember.objects.bulk_create(
[
ProjectMember(
project_id=project_id,
member=request.user,
role=20
if workspace_role >= 15
else (15 if workspace_role == 10 else workspace_role),
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
[
IssueProperty(
project_id=project_id,
user=request.user,
workspace=workspace,
created_by=request.user,
)
for project_id in project_ids
],
ignore_conflicts=True,
)
return Response(
{"message": "Projects joined successfully"},
status=status.HTTP_201_CREATED,
)
class ProjectUserViewsEndpoint(BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
member=request.user, project=project
member=request.user,
project=project,
is_active=True,
).first()
if project_member is None:
@@ -842,7 +921,10 @@ class ProjectUserViewsEndpoint(BaseAPIView):
class ProjectMemberUserEndpoint(BaseAPIView):
def get(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
project_id=project_id, workspace__slug=slug, member=request.user
project_id=project_id,
workspace__slug=slug,
member=request.user,
is_active=True,
)
serializer = ProjectMemberSerializer(project_member)
@@ -933,21 +1015,6 @@ class ProjectDeployBoardViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMemberEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
member__is_bot=False,
).select_related("project", "member", "workspace")
serializer = ProjectMemberSerializer(project_members, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectDeployBoardPublicSettingsEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
@@ -990,39 +1057,6 @@ class WorkspaceProjectDeployBoardEndpoint(BaseAPIView):
return Response(projects, status=status.HTTP_200_OK)
class LeaveProjectEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def delete(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
workspace__slug=slug,
member=request.user,
project_id=project_id,
)
# Only Admin case
if (
project_member.role == 20
and ProjectMember.objects.filter(
workspace__slug=slug,
role=20,
project_id=project_id,
).count()
== 1
):
return Response(
{
"error": "You cannot leave the project since you are the only admin of the project you should delete the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
project_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectPublicCoverImagesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,


@@ -47,36 +47,45 @@ class StateViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def list(self, request, slug, project_id):
state_dict = dict()
states = StateSerializer(self.get_queryset(), many=True).data
grouped = request.GET.get("grouped", False)
if grouped == "true":
state_dict = {}
for key, value in groupby(
sorted(states, key=lambda state: state["group"]),
lambda state: state.get("group"),
):
state_dict[str(key)] = list(value)
return Response(state_dict, status=status.HTTP_200_OK)
return Response(states, status=status.HTTP_200_OK)
def mark_as_default(self, request, slug, project_id, pk):
# Select all the states which are marked as default
_ = State.objects.filter(
workspace__slug=slug, project_id=project_id, default=True
).update(default=False)
_ = State.objects.filter(
workspace__slug=slug, project_id=project_id, pk=pk
).update(default=True)
return Response(status=status.HTTP_204_NO_CONTENT)
def destroy(self, request, slug, project_id, pk):
state = State.objects.get(
~Q(name="Triage"),
pk=pk, project_id=project_id, workspace__slug=slug,
pk=pk,
project_id=project_id,
workspace__slug=slug,
)
if state.default:
return Response(
{"error": "Default state cannot be deleted"}, status=False
)
return Response({"error": "Default state cannot be deleted"}, status=False)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()
if issue_exist:
return Response(
{
"error": "The state is not empty, only empty states can be deleted"
},
{"error": "The state is not empty, only empty states can be deleted"},
status=status.HTTP_400_BAD_REQUEST,
)
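
As an aside on the grouped listing above: itertools.groupby only groups adjacent items, which is why the states are sorted by group before grouping. A minimal, self-contained sketch with made-up state dicts:

from itertools import groupby

# Hypothetical serialized states; only "group" matters for the grouping shown above.
states = [
    {"name": "Todo", "group": "unstarted"},
    {"name": "In Progress", "group": "started"},
    {"name": "Backlog", "group": "backlog"},
]
state_dict = {
    key: list(value)
    for key, value in groupby(
        sorted(states, key=lambda state: state["group"]),
        lambda state: state["group"],
    )
}
# state_dict == {"backlog": [...], "started": [...], "unstarted": [...]}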


@@ -13,14 +13,7 @@ from plane.api.serializers import (
)
from plane.api.views.base import BaseViewSet, BaseAPIView
from plane.db.models import (
User,
Workspace,
WorkspaceMemberInvite,
Issue,
IssueActivity,
WorkspaceMember,
)
from plane.db.models import User, IssueActivity, WorkspaceMember
from plane.utils.paginator import BasePaginator
@@ -42,10 +35,28 @@ class UserEndpoint(BaseViewSet):
serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK)
def deactivate(self, request):
# Check whether the user is still active in any workspace
user = self.get_object()
if WorkspaceMember.objects.filter(
member=request.user, is_active=True
).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user
user.is_active = False
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class UpdateUserOnBoardedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_onboarded = request.data.get("is_onboarded", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
@@ -53,7 +64,7 @@ class UpdateUserOnBoardedEndpoint(BaseAPIView):
class UpdateUserTourCompletedEndpoint(BaseAPIView):
def patch(self, request):
user = User.objects.get(pk=request.user.id)
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)


@@ -0,0 +1,130 @@
# Django imports
from django.db import IntegrityError
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import Webhook, WebhookLog, Workspace
from plane.db.models.webhook import generate_token
from .base import BaseAPIView
from plane.api.permissions import WorkspaceOwnerPermission
from plane.api.serializers import WebhookSerializer, WebhookLogSerializer
class WebhookEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
{"error": "URL already exists for the workspace"},
status=status.HTTP_410_GONE,
)
raise IntegrityError
def get(self, request, slug, pk=None):
if pk == None:
webhooks = Webhook.objects.filter(workspace__slug=slug)
serializer = WebhookSerializer(
webhooks,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
serializer = WebhookSerializer(
webhook,
data=request.data,
partial=True,
fields=(
"id",
"url",
"is_active",
"created_at",
"updated_at",
"project",
"issue",
"cycle",
"module",
"issue_comment",
),
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, pk):
webhook = Webhook.objects.get(pk=pk, workspace__slug=slug)
webhook.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class WebhookSecretRegenerateEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def post(self, request, slug, pk):
webhook = Webhook.objects.get(workspace__slug=slug, pk=pk)
webhook.secret_key = generate_token()
webhook.save()
serializer = WebhookSerializer(webhook)
return Response(serializer.data, status=status.HTTP_200_OK)
class WebhookLogsEndpoint(BaseAPIView):
permission_classes = [
WorkspaceOwnerPermission,
]
def get(self, request, slug, webhook_id):
webhook_logs = WebhookLog.objects.filter(
workspace__slug=slug, webhook_id=webhook_id
)
serializer = WebhookLogSerializer(webhook_logs, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
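
The WebhookSerializer calls above pass a fields= keyword to limit which keys are returned. That pattern is usually backed by a dynamic-fields serializer mixin; the following is a sketch of the standard DRF recipe, not necessarily the project's actual implementation:

from rest_framework import serializers

class DynamicFieldsModelSerializer(serializers.ModelSerializer):
    """ModelSerializer accepting an optional `fields` kwarg that limits the output keys."""

    def __init__(self, *args, **kwargs):
        fields = kwargs.pop("fields", None)
        super().__init__(*args, **kwargs)
        if fields is not None:
            # Drop every declared field that was not explicitly requested
            for field_name in set(self.fields) - set(fields):
                self.fields.pop(field_name)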


@@ -2,16 +2,13 @@
import jwt
from datetime import date, datetime
from dateutil.relativedelta import relativedelta
from uuid import uuid4
# Django imports
from django.db import IntegrityError
from django.db.models import Prefetch
from django.conf import settings
from django.utils import timezone
from django.core.exceptions import ValidationError
from django.core.validators import validate_email
from django.contrib.sites.shortcuts import get_current_site
from django.db.models import (
Prefetch,
OuterRef,
@@ -28,13 +25,11 @@ from django.db.models import (
)
from django.db.models.functions import ExtractWeek, Cast, ExtractDay
from django.db.models.fields import DateField
from django.contrib.auth.hashers import make_password
# Third party modules
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from sentry_sdk import capture_exception
from rest_framework.permissions import AllowAny, IsAuthenticated
# Module imports
from plane.api.serializers import (
@@ -55,21 +50,12 @@ from . import BaseViewSet
from plane.db.models import (
User,
Workspace,
WorkspaceMember,
WorkspaceMemberInvite,
Team,
ProjectMember,
IssueActivity,
Issue,
WorkspaceTheme,
IssueAssignee,
ProjectFavorite,
CycleFavorite,
ModuleMember,
ModuleFavorite,
PageFavorite,
Page,
IssueViewFavorite,
IssueLink,
IssueAttachment,
IssueSubscriber,
@@ -109,7 +95,9 @@ class WorkSpaceViewSet(BaseViewSet):
def get_queryset(self):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -184,7 +172,9 @@ class UserWorkSpacesEndpoint(BaseAPIView):
def get(self, request):
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member__is_bot=False
workspace=OuterRef("id"),
member__is_bot=False,
is_active=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -230,23 +220,40 @@ class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
return Response({"status": not workspace}, status=status.HTTP_200_OK)
class InviteWorkspaceEndpoint(BaseAPIView):
class WorkspaceInvitationsViewset(BaseViewSet):
"""Endpoint for creating, listing and deleting workspaces"""
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def post(self, request, slug):
emails = request.data.get("emails", False)
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def create(self, request, slug):
emails = request.data.get("emails", [])
# Check if email is provided
if not emails or not len(emails):
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
)
# check for role level
# check for role level of the requesting user
requesting_user = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if any invited user has a higher role
if len(
[
email
@@ -259,15 +266,17 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace object
workspace = Workspace.objects.get(slug=slug)
# Check if user is already a member of workspace
workspace_members = WorkspaceMember.objects.filter(
workspace_id=workspace.id,
member__email__in=[email.get("email") for email in emails],
is_active=True,
).select_related("member", "workspace", "workspace__owner")
if len(workspace_members):
if workspace_members:
return Response(
{
"error": "Some users are already member of workspace",
@@ -305,35 +314,20 @@ class InviteWorkspaceEndpoint(BaseAPIView):
},
status=status.HTTP_400_BAD_REQUEST,
)
WorkspaceMemberInvite.objects.bulk_create(
# Create workspace member invite
workspace_invitations = WorkspaceMemberInvite.objects.bulk_create(
workspace_invitations, batch_size=10, ignore_conflicts=True
)
workspace_invitations = WorkspaceMemberInvite.objects.filter(
email__in=[email.get("email") for email in emails]
).select_related("workspace")
# create the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
_ = User.objects.bulk_create(
[
User(
username=str(uuid4().hex),
email=invitation.email,
password=make_password(uuid4().hex),
is_password_autoset=True,
)
for invitation in workspace_invitations
],
batch_size=100,
)
current_site = f"{request.scheme}://{request.get_host()}",
# Send invitations
for invitation in workspace_invitations:
workspace_invitation.delay(
invitation.email,
workspace.id,
invitation.token,
settings.WEB_URL,
current_site,
request.user.email,
)
@@ -344,11 +338,19 @@ class InviteWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class JoinWorkspaceEndpoint(BaseAPIView):
class WorkspaceJoinEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
"""Invitation response endpoint the user can respond to the invitation"""
def post(self, request, slug, pk):
workspace_invite = WorkspaceMemberInvite.objects.get(
@@ -357,12 +359,14 @@ class JoinWorkspaceEndpoint(BaseAPIView):
email = request.data.get("email", "")
# Check the email
if email == "" or workspace_invite.email != email:
return Response(
{"error": "You do not have permission to join the workspace"},
status=status.HTTP_403_FORBIDDEN,
)
# If already responded then return error
if workspace_invite.responded_at is None:
workspace_invite.accepted = request.data.get("accepted", False)
workspace_invite.responded_at = timezone.now()
@@ -374,12 +378,23 @@ class JoinWorkspaceEndpoint(BaseAPIView):
# If the user is present then create the workspace member
if user is not None:
WorkspaceMember.objects.create(
# Check if the user was already a member of workspace then activate the user
workspace_member = WorkspaceMember.objects.filter(
workspace=workspace_invite.workspace, member=user
).first()
if workspace_member is not None:
workspace_member.is_active = True
workspace_member.role = workspace_invite.role
workspace_member.save()
else:
# Create a Workspace
_ = WorkspaceMember.objects.create(
workspace=workspace_invite.workspace,
member=user,
role=workspace_invite.role,
)
# Set the user last_workspace_id to the accepted workspace
user.last_workspace_id = workspace_invite.workspace.id
user.save()
@@ -391,6 +406,7 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
# Workspace invitation rejected
return Response(
{"message": "Workspace Invitation was not accepted"},
status=status.HTTP_200_OK,
@@ -401,37 +417,13 @@ class JoinWorkspaceEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
class WorkspaceInvitationsViewset(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
permission_classes = [
WorkSpaceAdminPermission,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "workspace__owner", "created_by")
)
def destroy(self, request, slug, pk):
workspace_member_invite = WorkspaceMemberInvite.objects.get(
pk=pk, workspace__slug=slug
)
# delete the user if signup is disabled
if settings.DOCKERIZED and not settings.ENABLE_SIGNUP:
user = User.objects.filter(email=workspace_member_invite.email).first()
if user is not None:
user.delete()
workspace_member_invite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, pk):
workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
class UserWorkspaceInvitationsEndpoint(BaseViewSet):
class UserWorkspaceInvitationsViewSet(BaseViewSet):
serializer_class = WorkSpaceMemberInviteSerializer
model = WorkspaceMemberInvite
@@ -445,9 +437,19 @@ class UserWorkspaceInvitationsEndpoint(BaseViewSet):
)
def create(self, request):
invitations = request.data.get("invitations")
workspace_invitations = WorkspaceMemberInvite.objects.filter(pk__in=invitations)
invitations = request.data.get("invitations", [])
workspace_invitations = WorkspaceMemberInvite.objects.filter(
pk__in=invitations, email=request.user.email
).order_by("-created_at")
# If the user is already a member of workspace and was deactivated then activate the user
for invitation in workspace_invitations:
# Update the WorkspaceMember for this specific invitation
WorkspaceMember.objects.filter(
workspace_id=invitation.workspace_id, member=request.user
).update(is_active=True, role=invitation.role)
# Bulk create the user for all the workspaces
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
@@ -472,7 +474,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
model = WorkspaceMember
permission_classes = [
WorkSpaceAdminPermission,
WorkspaceEntityPermission,
]
search_fields = [
@@ -484,13 +486,41 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"), member__is_bot=False)
.filter(
workspace__slug=self.kwargs.get("slug"),
member__is_bot=False,
is_active=True,
)
.select_related("workspace", "workspace__owner")
.select_related("member")
)
def list(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user,
workspace__slug=slug,
is_active=True,
)
# Get all active workspace members
workspace_members = self.get_queryset()
if workspace_member.role > 10:
serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True)
else:
serializer = WorkSpaceMemberSerializer(
workspace_members,
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, pk):
workspace_member = WorkspaceMember.objects.get(pk=pk, workspace__slug=slug)
workspace_member = WorkspaceMember.objects.get(
pk=pk,
workspace__slug=slug,
member__is_bot=False,
is_active=True,
)
if request.user.id == workspace_member.member_id:
return Response(
{"error": "You cannot update your own role"},
@@ -499,7 +529,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Get the requested user role
requested_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if role is being updated
# One cannot update role higher than his own role
@@ -524,68 +556,121 @@ class WorkSpaceMemberViewSet(BaseViewSet):
def destroy(self, request, slug, pk):
# Check the user role who is deleting the user
workspace_member = WorkspaceMember.objects.get(workspace__slug=slug, pk=pk)
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
pk=pk,
member__is_bot=False,
is_active=True,
)
# check requesting user role
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
if str(workspace_member.id) == str(requesting_workspace_member.id):
return Response(
{
"error": "You cannot remove yourself from the workspace. Please use leave workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
if requesting_workspace_member.role < workspace_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for the only member in the workspace
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
member__is_bot=False,
).count()
== 1
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{"error": "Cannot delete the only Admin for the workspace"},
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the user also from all the projects
ProjectMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Remove all favorites
ProjectFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
CycleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
ModuleFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
PageFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
IssueViewFavorite.objects.filter(
workspace__slug=slug, user=workspace_member.member
).delete()
# Also remove issue from issue assigned
IssueAssignee.objects.filter(
workspace__slug=slug, assignee=workspace_member.member
).delete()
# Deactivate the users from the projects where the user is part of
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Remove if module member
ModuleMember.objects.filter(
workspace__slug=slug, member=workspace_member.member
).delete()
# Delete owned Pages
Page.objects.filter(
workspace__slug=slug, owned_by=workspace_member.member
).delete()
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
workspace_member.delete()
def leave(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug,
member=request.user,
is_active=True,
)
# Check if the leaving user is the only admin of the workspace
if (
workspace_member.role == 20
and not WorkspaceMember.objects.filter(
workspace__slug=slug,
role=20,
is_active=True,
).count()
> 1
):
return Response(
{
"error": "You cannot leave the workspace as your the only admin of the workspace you will have to either delete the workspace or create an another admin"
},
status=status.HTTP_400_BAD_REQUEST,
)
if (
Project.objects.annotate(
total_members=Count("project_projectmember"),
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__role=20,
),
),
)
.filter(total_members=1, member_with_role=1, workspace__slug=slug)
.exists()
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Deactivate the user in all projects they are part of
_ = ProjectMember.objects.filter(
workspace__slug=slug,
member_id=workspace_member.member_id,
is_active=True,
).update(is_active=False)
# Deactivate the user
workspace_member.is_active = False
workspace_member.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -613,7 +698,9 @@ class TeamMemberViewSet(BaseViewSet):
def create(self, request, slug):
members = list(
WorkspaceMember.objects.filter(
workspace__slug=slug, member__id__in=request.data.get("members", [])
workspace__slug=slug,
member__id__in=request.data.get("members", []),
is_active=True,
)
.annotate(member_str_id=Cast("member", output_field=CharField()))
.distinct()
@@ -642,23 +729,6 @@ class TeamMemberViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class UserWorkspaceInvitationEndpoint(BaseViewSet):
model = WorkspaceMemberInvite
serializer_class = WorkSpaceMemberInviteSerializer
permission_classes = [
AllowAny,
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(pk=self.kwargs.get("pk"))
.select_related("workspace")
)
class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
def get(self, request):
user = User.objects.get(pk=request.user.id)
@@ -695,7 +765,9 @@ class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView):
def get(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
member=request.user, workspace__slug=slug
member=request.user,
workspace__slug=slug,
is_active=True,
)
serializer = WorkspaceMemberMeSerializer(workspace_member)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -704,7 +776,9 @@ class WorkspaceMemberUserEndpoint(BaseAPIView):
class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
def post(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
workspace_member.view_props = request.data.get("view_props", {})
workspace_member.save()
@@ -1030,7 +1104,9 @@ class WorkspaceUserProfileEndpoint(BaseAPIView):
user_data = User.objects.get(pk=user_id)
requesting_workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
workspace__slug=slug,
member=request.user,
is_active=True,
)
projects = []
if requesting_workspace_member.role >= 10:
@@ -1234,9 +1310,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
return Response(
issues, status=status.HTTP_200_OK
)
return Response(issues, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):
@@ -1250,44 +1324,3 @@ class WorkspaceLabelsEndpoint(BaseAPIView):
project__project_projectmember__member=request.user,
).values("parent", "name", "color", "id", "project_id", "workspace__slug")
return Response(labels, status=status.HTTP_200_OK)
class WorkspaceMembersEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
def get(self, request, slug):
workspace_members = WorkspaceMember.objects.filter(
workspace__slug=slug,
member__is_bot=False,
).select_related("workspace", "member")
serialzier = WorkSpaceMemberSerializer(workspace_members, many=True)
return Response(serialzier.data, status=status.HTTP_200_OK)
class LeaveWorkspaceEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
def delete(self, request, slug):
workspace_member = WorkspaceMember.objects.get(
workspace__slug=slug, member=request.user
)
# Only Admin case
if (
workspace_member.role == 20
and WorkspaceMember.objects.filter(workspace__slug=slug, role=20).count()
== 1
):
return Response(
{
"error": "You cannot leave the workspace since you are the only admin of the workspace you should delete the workspace"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Delete the member from workspace
workspace_member.delete()
return Response(status=status.HTTP_204_NO_CONTENT)


@@ -0,0 +1,47 @@
# Django imports
from django.utils import timezone
from django.db.models import Q
# Third party imports
from rest_framework import authentication
from rest_framework.exceptions import AuthenticationFailed
# Module imports
from plane.db.models import APIToken
class APIKeyAuthentication(authentication.BaseAuthentication):
"""
Authentication with an API Key
"""
www_authenticate_realm = "api"
media_type = "application/json"
auth_header_name = "X-Api-Key"
def get_api_token(self, request):
return request.headers.get(self.auth_header_name)
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)
except APIToken.DoesNotExist:
raise AuthenticationFailed("Given API token is not valid")
# save api token last used
api_token.last_used = timezone.now()
api_token.save(update_fields=["last_used"])
return (api_token.user, api_token.token)
def authenticate(self, request):
token = self.get_api_token(request=request)
if not token:
return None
# Validate the API token
user, token = self.validate_api_token(token)
return user, token
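
For reference, a client would authenticate against this backend by sending its token in the X-Api-Key header (the header name comes from the class above); the base URL, path and token below are placeholders:

import requests

response = requests.get(
    "https://plane.example.com/api/v1/workspaces/",  # hypothetical endpoint
    headers={"X-Api-Key": "<your-api-token>"},
)
response.raise_for_status()
print(response.json())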


@@ -0,0 +1,5 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "plane.authentication"


@@ -408,7 +408,6 @@ def analytic_export_task(email, data, slug):
distribution,
x_axis,
y_axis,
segment,
key,
assignee_details,
label_details,


@@ -23,7 +23,7 @@ def email_verification(first_name, email, token, current_site):
from_email_string = settings.EMAIL_FROM
subject = f"Verify your Email!"
subject = "Verify your Email!"
context = {
"first_name": first_name,


@@ -4,7 +4,6 @@ import io
import json
import boto3
import zipfile
from urllib.parse import urlparse, urlunparse
# Django imports
from django.conf import settings


@@ -8,8 +8,6 @@ from django.conf import settings
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.db.models import User
@shared_task
@@ -21,7 +19,7 @@ def forgot_password(first_name, email, uidb64, token, current_site):
from_email_string = settings.EMAIL_FROM
subject = f"Reset Your Password - Plane"
subject = "Reset Your Password - Plane"
context = {
"first_name": first_name,


@@ -2,8 +2,6 @@
import json
import requests
import uuid
import jwt
from datetime import datetime
# Django imports
from django.conf import settings
@@ -27,7 +25,6 @@ from plane.db.models import (
User,
IssueProperty,
)
from .workspace_invitation_task import workspace_invitation
from plane.bgtasks.user_welcome_task import send_welcome_slack
@@ -58,7 +55,7 @@ def service_importer(service, importer_id):
ignore_conflicts=True,
)
[
_ = [
send_welcome_slack.delay(
str(user.id),
True,
@@ -76,6 +73,12 @@ def service_importer(service, importer_id):
]
)
# Check if any of the users are already member of workspace
_ = WorkspaceMember.objects.filter(
member__in=[user for user in workspace_users],
workspace_id=importer.workspace_id,
).update(is_active=True)
# Add new users to Workspace and project automatically
WorkspaceMember.objects.bulk_create(
[
@@ -157,7 +160,7 @@ def service_importer(service, importer_id):
)
# Create repo sync
repo_sync = GithubRepositorySync.objects.create(
_ = GithubRepositorySync.objects.create(
repository=repo,
workspace_integration=workspace_integration,
actor=workspace_integration.actor,
@@ -179,7 +182,7 @@ def service_importer(service, importer_id):
ImporterSerializer(importer).data,
cls=DjangoJSONEncoder,
)
res = requests.post(
_ = requests.post(
f"{settings.PROXY_BASE_URL}/hooks/workspaces/{str(importer.workspace_id)}/projects/{str(importer.project_id)}/importers/{str(service)}/",
json=import_data_json,
headers=headers,


@@ -82,7 +82,7 @@ def track_description(
if (
last_activity is not None
and last_activity.field == "description"
and actor_id == last_activity.actor_id
and actor_id == str(last_activity.actor_id)
):
last_activity.created_at = timezone.now()
last_activity.save(update_fields=["created_at"])
@@ -131,7 +131,7 @@ def track_parent(
else "",
field="parent",
project_id=project_id,
workspace=workspace_id,
workspace_id=workspace_id,
comment=f"updated the parent issue to",
old_identifier=old_parent.id if old_parent is not None else None,
new_identifier=new_parent.id if new_parent is not None else None,
@@ -276,7 +276,7 @@ def track_labels(
issue_activities,
epoch,
):
requested_labels = set([str(lab) for lab in requested_data.get("labels_list", [])])
requested_labels = set([str(lab) for lab in requested_data.get("labels", [])])
current_labels = set([str(lab) for lab in current_instance.get("labels", [])])
added_labels = requested_labels - current_labels
@@ -334,9 +334,7 @@ def track_assignees(
issue_activities,
epoch,
):
requested_assignees = set(
[str(asg) for asg in requested_data.get("assignees_list", [])]
)
requested_assignees = set([str(asg) for asg in requested_data.get("assignees", [])])
current_assignees = set([str(asg) for asg in current_instance.get("assignees", [])])
added_assignees = requested_assignees - current_assignees
@@ -363,6 +361,7 @@ def track_assignees(
for dropped_assignee in dropped_assginees:
assignee = User.objects.get(pk=dropped_assignee)
issue_activities.append(
IssueActivity(
issue_id=issue_id,
actor_id=actor_id,
verb="updated",
@@ -375,6 +374,7 @@ def track_assignees(
old_identifier=assignee.id,
epoch=epoch,
)
)
def track_estimate_points(
@@ -418,13 +418,14 @@ def track_archive_at(
issue_activities,
epoch,
):
if current_instance.get("archived_at") != requested_data.get("archived_at"):
if requested_data.get("archived_at") is None:
issue_activities.append(
IssueActivity(
issue_id=issue_id,
project_id=project_id,
workspace_id=workspace_id,
comment=f"has restored the issue",
comment="has restored the issue",
verb="updated",
actor_id=actor_id,
field="archived_at",
@@ -439,7 +440,7 @@ def track_archive_at(
issue_id=issue_id,
project_id=project_id,
workspace_id=workspace_id,
comment=f"Plane has archived the issue",
comment="Plane has archived the issue",
verb="updated",
actor_id=actor_id,
field="archived_at",
@@ -523,8 +524,8 @@ def update_issue_activity(
"description_html": track_description,
"target_date": track_target_date,
"start_date": track_start_date,
"labels_list": track_labels,
"assignees_list": track_assignees,
"labels": track_labels,
"assignees": track_assignees,
"estimate_point": track_estimate_points,
"archived_at": track_archive_at,
"closed_to": track_closed_to,
@@ -536,7 +537,7 @@ def update_issue_activity(
)
for key in requested_data:
func = ISSUE_ACTIVITY_MAPPER.get(key, None)
func = ISSUE_ACTIVITY_MAPPER.get(key)
if func is not None:
func(
requested_data=requested_data,
@@ -690,6 +691,10 @@ def create_cycle_issue_activity(
new_cycle = Cycle.objects.filter(
pk=updated_record.get("new_cycle_id", None)
).first()
issue = Issue.objects.filter(pk=updated_record.get("issue_id")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -712,6 +717,10 @@ def create_cycle_issue_activity(
cycle = Cycle.objects.filter(
pk=created_record.get("fields").get("cycle")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -746,22 +755,27 @@ def delete_cycle_issue_activity(
)
cycle_id = requested_data.get("cycle_id", "")
cycle_name = requested_data.get("cycle_name", "")
cycle = Cycle.objects.filter(pk=cycle_id).first()
issues = requested_data.get("issues")
for issue in issues:
current_issue = Issue.objects.filter(pk=issue).first()
if issue:
current_issue.updated_at = timezone.now()
current_issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=issue,
actor_id=actor_id,
verb="deleted",
old_value=cycle.name if cycle is not None else "",
old_value=cycle.name if cycle is not None else cycle_name,
new_value="",
field="cycles",
project_id=project_id,
workspace_id=workspace_id,
comment=f"removed this issue from {cycle.name if cycle is not None else None}",
old_identifier=cycle.id if cycle is not None else None,
comment=f"removed this issue from {cycle.name if cycle is not None else cycle_name}",
old_identifier=cycle_id if cycle_id is not None else None,
epoch=epoch,
)
)
@@ -793,6 +807,10 @@ def create_module_issue_activity(
new_module = Module.objects.filter(
pk=updated_record.get("new_module_id", None)
).first()
issue = Issue.objects.filter(pk=updated_record.get("issue_id")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
@@ -815,6 +833,10 @@ def create_module_issue_activity(
module = Module.objects.filter(
pk=created_record.get("fields").get("module")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=created_record.get("fields").get("issue"),
@@ -848,22 +870,27 @@ def delete_module_issue_activity(
)
module_id = requested_data.get("module_id", "")
module_name = requested_data.get("module_name", "")
module = Module.objects.filter(pk=module_id).first()
issues = requested_data.get("issues")
for issue in issues:
current_issue = Issue.objects.filter(pk=issue).first()
if issue:
current_issue.updated_at = timezone.now()
current_issue.save(update_fields=["updated_at"])
issue_activities.append(
IssueActivity(
issue_id=issue,
actor_id=actor_id,
verb="deleted",
old_value=module.name if module is not None else "",
old_value=module.name if module is not None else module_name,
new_value="",
field="modules",
project_id=project_id,
workspace_id=workspace_id,
comment=f"removed this issue from ",
old_identifier=module.id if module is not None else None,
comment=f"removed this issue from {module.name if module is not None else module_name}",
old_identifier=module_id if module_id is not None else None,
epoch=epoch,
)
)
@@ -1451,10 +1478,11 @@ def issue_activity(
issue_activities = []
project = Project.objects.get(pk=project_id)
issue = Issue.objects.filter(pk=issue_id).first()
workspace_id = project.workspace_id
if issue is not None:
if issue_id is not None:
issue = Issue.objects.filter(pk=issue_id).first()
if issue:
try:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
@@ -1534,6 +1562,8 @@ def issue_activity(
IssueActivitySerializer(issue_activities_created, many=True).data,
cls=DjangoJSONEncoder,
),
requested_data=requested_data,
current_instance=current_instance,
)
return


@@ -12,7 +12,7 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.db.models import Issue, Project, State
from plane.db.models import Issue, Project, State, Page
from plane.bgtasks.issue_activites_task import issue_activity
@@ -20,6 +20,7 @@ from plane.bgtasks.issue_activites_task import issue_activity
def archive_and_close_old_issues():
archive_old_issues()
close_old_issues()
delete_archived_pages()
def archive_old_issues():
@@ -59,7 +60,7 @@ def archive_old_issues():
# Check if Issues
if issues:
# Set the archive time to current time
archive_at = timezone.now()
archive_at = timezone.now().date()
issues_to_update = []
for issue in issues:
@@ -71,16 +72,16 @@ def archive_old_issues():
Issue.objects.bulk_update(
issues_to_update, ["archived_at"], batch_size=100
)
[
_ = [
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"archived_at": str(archive_at)}),
actor_id=str(project.created_by_id),
issue_id=issue.id,
project_id=project_id,
current_instance=None,
current_instance=json.dumps({"archived_at": None}),
subscriber=False,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
for issue in issues_to_update
]
@@ -142,17 +143,21 @@ def close_old_issues():
# Bulk Update the issues and log the activity
if issues_to_update:
Issue.objects.bulk_update(issues_to_update, ["state"], batch_size=100)
Issue.objects.bulk_update(
issues_to_update, ["state"], batch_size=100
)
[
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"closed_to": str(issue.state_id)}),
requested_data=json.dumps(
{"closed_to": str(issue.state_id)}
),
actor_id=str(project.created_by_id),
issue_id=issue.id,
project_id=project_id,
current_instance=None,
subscriber=False,
epoch=int(timezone.now().timestamp())
epoch=int(timezone.now().timestamp()),
)
for issue in issues_to_update
]
@@ -162,3 +167,20 @@ def close_old_issues():
print(e)
capture_exception(e)
return
def delete_archived_pages():
try:
pages_to_delete = Page.objects.filter(
archived_at__isnull=False,
archived_at__lte=(timezone.now() - timedelta(days=30)),
)
pages_to_delete._raw_delete(pages_to_delete.db)
return
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
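
archive_and_close_old_issues above is a plain shared task, so something still has to trigger it periodically. A sketch of a Celery beat entry that could do that; the dotted task path and the timing are assumptions for illustration only:

from celery import Celery
from celery.schedules import crontab

app = Celery("plane")  # stands in for the project's Celery application

app.conf.beat_schedule = {
    "archive-and-close-old-issues": {
        # Dotted path is an assumption; use the real module path of the task
        "task": "plane.bgtasks.issue_automation_task.archive_and_close_old_issues",
        "schedule": crontab(hour=0, minute=0),  # once a day at midnight
    },
}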


@@ -17,7 +17,7 @@ def magic_link(email, key, token, current_site):
from_email_string = settings.EMAIL_FROM
subject = f"Login for Plane"
subject = "Login for Plane"
context = {"magic_url": abs_url, "code": token}


@@ -1,20 +1,193 @@
# Python imports
import json
# Django imports
from django.utils import timezone
import uuid
# Module imports
from plane.db.models import IssueSubscriber, Project, IssueAssignee, Issue, Notification
from plane.db.models import (
IssueMention,
IssueSubscriber,
Project,
User,
IssueAssignee,
Issue,
Notification,
IssueComment,
IssueActivity
)
# Third Party imports
from celery import shared_task
from bs4 import BeautifulSoup
# =========== Issue Description Html Parsing and Notification Functions ======================
def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
aggregated_issue_mentions = []
for mention_id in new_mentions:
aggregated_issue_mentions.append(
IssueMention(
mention_id=mention_id,
issue=issue,
project=project,
workspace_id=project.workspace_id
)
)
IssueMention.objects.bulk_create(
aggregated_issue_mentions, batch_size=100)
IssueMention.objects.filter(
issue=issue, mention__in=removed_mention).delete()
def get_new_mentions(requested_instance, current_instance):
# requested_data is the newer instance of the current issue
# current_instance is the older instance of the current issue, saved in the database
# extract mentions from both instances of the data
mentions_older = extract_mentions(current_instance)
mentions_newer = extract_mentions(requested_instance)
# Getting Set Difference from mentions_newer
new_mentions = [
mention for mention in mentions_newer if mention not in mentions_older]
return new_mentions
# Get Removed Mention
def get_removed_mentions(requested_instance, current_instance):
# requested_data is the newer instance of the current issue
# current_instance is the older instance of the current issue, saved in the database
# extract mentions from both instances of the data
mentions_older = extract_mentions(current_instance)
mentions_newer = extract_mentions(requested_instance)
# Getting Set Difference from mentions_newer
removed_mentions = [
mention for mention in mentions_older if mention not in mentions_newer]
return removed_mentions
# Adds mentions as subscribers
def extract_mentions_as_subscribers(project_id, issue_id, mentions):
# mentions is an array of User IDs representing the FILTERED set of mentioned users
bulk_mention_subscribers = []
for mention_id in mentions:
# If the mentioned user is not already subscribed to the issue, they must be sent a mention notification
if not IssueSubscriber.objects.filter(
issue_id=issue_id,
subscriber_id=mention_id,
project_id=project_id,
).exists() and not IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id,
assignee_id=mention_id
).exists() and not Issue.objects.filter(
project_id=project_id, pk=issue_id, created_by_id=mention_id
).exists():
project = Project.objects.get(pk=project_id)
bulk_mention_subscribers.append(IssueSubscriber(
workspace_id=project.workspace_id,
project_id=project_id,
issue_id=issue_id,
subscriber_id=mention_id,
))
return bulk_mention_subscribers
# Parse Issue Description & extracts mentions
def extract_mentions(issue_instance):
try:
# issue_instance is a JSON-encoded payload containing the description_html and other activity data.
mentions = []
# Convert string to dictionary
data = json.loads(issue_instance)
html = data.get("description_html")
soup = BeautifulSoup(html, 'html.parser')
mention_tags = soup.find_all(
'mention-component', attrs={'target': 'users'})
mentions = [mention_tag['id'] for mention_tag in mention_tags]
return list(set(mentions))
except Exception as e:
return []
# =========== Comment Parsing and Notification Functions ======================
def extract_comment_mentions(comment_value):
try:
mentions = []
soup = BeautifulSoup(comment_value, 'html.parser')
mentions_tags = soup.find_all(
'mention-component', attrs={'target': 'users'}
)
for mention_tag in mentions_tags:
mentions.append(mention_tag['id'])
return list(set(mentions))
except Exception as e:
return []
def get_new_comment_mentions(new_value, old_value):
mentions_newer = extract_comment_mentions(new_value)
if old_value is None:
return mentions_newer
mentions_older = extract_comment_mentions(old_value)
# Getting Set Difference from mentions_newer
new_mentions = [
mention for mention in mentions_newer if mention not in mentions_older]
return new_mentions
def createMentionNotification(project, notification_comment, issue, actor_id, mention_id, issue_id, activity):
return Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mentioned",
triggered_by_id=actor_id,
receiver_id=mention_id,
entity_identifier=issue_id,
entity_name="issue",
project=project,
message=notification_comment,
data={
"issue": {
"id": str(issue_id),
"name": str(issue.name),
"identifier": str(issue.project.identifier),
"sequence_id": issue.sequence_id,
"state_name": issue.state.name,
"state_group": issue.state.group,
},
"issue_activity": {
"id": str(activity.get("id")),
"verb": str(activity.get("verb")),
"field": str(activity.get("field")),
"actor": str(activity.get("actor_id")),
"new_value": str(activity.get("new_value")),
"old_value": str(activity.get("old_value")),
}
},
)
@shared_task
def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activities_created):
def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activities_created, requested_data, current_instance):
issue_activities_created = (
json.loads(issue_activities_created) if issue_activities_created is not None else None
json.loads(
issue_activities_created) if issue_activities_created is not None else None
)
if type not in [
"cycle.activity.created",
@@ -33,39 +206,99 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
]:
# Create Notifications
bulk_notifications = []
issue_subscribers = list(
IssueSubscriber.objects.filter(project_id=project_id, issue_id=issue_id)
.exclude(subscriber_id=actor_id)
.values_list("subscriber", flat=True)
)
"""
Mention Tasks
1. Perform Diffing and Extract the mentions, that mention notification needs to be sent
2. From the latest set of mentions, extract the users which are not a subscribers & make them subscribers
"""
# Get new mentions from the newer instance
new_mentions = get_new_mentions(
requested_instance=requested_data, current_instance=current_instance)
removed_mention = get_removed_mentions(
requested_instance=requested_data, current_instance=current_instance)
comment_mentions = []
all_comment_mentions = []
# Get New Subscribers from the mentions of the newer instance
requested_mentions = extract_mentions(
issue_instance=requested_data)
mention_subscribers = extract_mentions_as_subscribers(
project_id=project_id, issue_id=issue_id, mentions=requested_mentions)
for issue_activity in issue_activities_created:
issue_comment = issue_activity.get("issue_comment")
issue_comment_new_value = issue_activity.get("new_value")
issue_comment_old_value = issue_activity.get("old_value")
if issue_comment is not None:
# TODO: Maybe save the comment mentions, so that in future, we can filter out the issues based on comment mentions as well.
all_comment_mentions = all_comment_mentions + extract_comment_mentions(issue_comment_new_value)
new_comment_mentions = get_new_comment_mentions(old_value=issue_comment_old_value, new_value=issue_comment_new_value)
comment_mentions = comment_mentions + new_comment_mentions
comment_mention_subscribers = extract_mentions_as_subscribers( project_id=project_id, issue_id=issue_id, mentions=all_comment_mentions)
"""
We will not send subscription activity notification to the below mentioned user sets
- Those who have been newly mentioned in the issue description, we will send mention notification to them.
- When the activity is a comment_created and there exist a mention in the comment, then we have to send the "mention_in_comment" notification
- When the activity is a comment_updated and there exist a mention change, then also we have to send the "mention_in_comment" notification
"""
issue_assignees = list(
IssueAssignee.objects.filter(project_id=project_id, issue_id=issue_id)
.exclude(assignee_id=actor_id)
IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(assignee_id__in=list(new_mentions + comment_mentions))
.values_list("assignee", flat=True)
)
issue_subscribers = issue_subscribers + issue_assignees
issue_subscribers = list(
IssueSubscriber.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(subscriber_id__in=list(new_mentions + comment_mentions + [actor_id]))
.values_list("subscriber", flat=True)
)
issue = Issue.objects.filter(pk=issue_id).first()
if (issue.created_by_id is not None and str(issue.created_by_id) != str(actor_id)):
issue_subscribers = issue_subscribers + [issue.created_by_id]
if subscriber:
# add the user to issue subscriber
try:
if str(issue.created_by_id) != str(actor_id) and uuid.UUID(actor_id) not in issue_assignees:
_ = IssueSubscriber.objects.get_or_create(
issue_id=issue_id, subscriber_id=actor_id
project_id=project_id, issue_id=issue_id, subscriber_id=actor_id
)
except Exception as e:
pass
project = Project.objects.get(pk=project_id)
for subscriber in list(set(issue_subscribers)):
issue_subscribers = list(set(issue_subscribers + issue_assignees) - {uuid.UUID(actor_id)})
for subscriber in issue_subscribers:
if subscriber in issue_subscribers:
sender = "in_app:issue_activities:subscribed"
if issue.created_by_id is not None and subscriber == issue.created_by_id:
sender = "in_app:issue_activities:created"
if subscriber in issue_assignees:
sender = "in_app:issue_activities:assigned"
for issue_activity in issue_activities_created:
issue_comment = issue_activity.get("issue_comment")
if issue_comment is not None:
issue_comment = IssueComment.objects.get(
id=issue_comment, issue_id=issue_id, project_id=project_id, workspace_id=project.workspace_id)
bulk_notifications.append(
Notification(
workspace=project.workspace,
sender="in_app:issue_activities",
sender=sender,
triggered_by_id=actor_id,
receiver_id=subscriber,
entity_identifier=issue_id,
@@ -89,7 +322,7 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
"new_value": str(issue_activity.get("new_value")),
"old_value": str(issue_activity.get("old_value")),
"issue_comment": str(
issue_activity.get("issue_comment").comment_stripped
issue_comment.comment_stripped
if issue_activity.get("issue_comment") is not None
else ""
),
@@ -98,5 +331,88 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
)
)
# Add Mentioned as Issue Subscribers
IssueSubscriber.objects.bulk_create(
mention_subscribers + comment_mention_subscribers, batch_size=100)
last_activity = (
IssueActivity.objects.filter(issue_id=issue_id)
.order_by("-created_at")
.first()
)
actor = User.objects.get(pk=actor_id)
for mention_id in comment_mentions:
if (mention_id != actor_id):
for issue_activity in issue_activities_created:
notification = createMentionNotification(
project=project,
issue=issue,
notification_comment=f"{actor.display_name} has mentioned you in a comment in issue {issue.name}",
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
)
bulk_notifications.append(notification)
for mention_id in new_mentions:
if (mention_id != actor_id):
if (
last_activity is not None
and last_activity.field == "description"
and actor_id == str(last_activity.actor_id)
):
bulk_notifications.append(
Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mentioned",
triggered_by_id=actor_id,
receiver_id=mention_id,
entity_identifier=issue_id,
entity_name="issue",
project=project,
message=f"You have been mentioned in the issue {issue.name}",
data={
"issue": {
"id": str(issue_id),
"name": str(issue.name),
"identifier": str(issue.project.identifier),
"sequence_id": issue.sequence_id,
"state_name": issue.state.name,
"state_group": issue.state.group,
},
"issue_activity": {
"id": str(last_activity.id),
"verb": str(last_activity.verb),
"field": str(last_activity.field),
"actor": str(last_activity.actor_id),
"new_value": str(last_activity.new_value),
"old_value": str(last_activity.old_value),
},
},
)
)
else:
for issue_activity in issue_activities_created:
notification = createMentionNotification(
project=project,
issue=issue,
notification_comment=f"You have been mentioned in the issue {issue.name}",
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
)
bulk_notifications.append(notification)
# save new mentions for the particular issue and remove the mentions that have been deleted from the description
update_mentions_for_issue(issue=issue, project=project, new_mentions=new_mentions,
removed_mention=removed_mention)
# Bulk create notifications
Notification.objects.bulk_create(bulk_notifications, batch_size=100)
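
For reference, a minimal, hypothetical sketch of what a mention-sync helper along these lines could look like, assuming IssueMention rows keyed by (issue, mention) as created in the migration further down; the actual update_mentions_for_issue defined earlier in this task file may differ:

from plane.db.models import IssueMention

def sync_issue_mentions(issue, project, new_mentions, removed_mention):
    # Record newly mentioned users; skip pairs that already exist.
    IssueMention.objects.bulk_create(
        [
            IssueMention(
                issue=issue,
                mention_id=mention_id,
                project=project,
                workspace_id=project.workspace_id,
            )
            for mention_id in new_mentions
        ],
        batch_size=100,
        ignore_conflicts=True,
    )
    # Drop mentions that were removed from the description.
    IssueMention.objects.filter(issue=issue, mention_id__in=removed_mention).delete()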


@@ -13,23 +13,24 @@ from plane.db.models import Project, User, ProjectMemberInvite
@shared_task
def project_invitation(email, project_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
project = Project.objects.get(pk=project_id)
project_member_invite = ProjectMemberInvite.objects.get(
token=token, email=email
)
relativelink = f"/project-member-invitation/{project_member_invite.id}"
relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
abs_url = current_site + relativelink
from_email_string = settings.EMAIL_FROM
subject = f"{project.created_by.first_name or project.created_by.email} invited you to join {project.name} on Plane"
subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"
context = {
"email": email,
"first_name": project.created_by.first_name,
"first_name": user.first_name,
"project_name": project.name,
"invitation_url": abs_url,
}


@@ -0,0 +1,139 @@
import requests
import uuid
import hashlib
import json
# Django imports
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from plane.db.models import Webhook, WebhookLog
@shared_task(
bind=True,
autoretry_for=(requests.RequestException,),
retry_backoff=600,
max_retries=5,
retry_jitter=True,
)
def webhook_task(self, webhook, slug, event, event_data, action):
try:
webhook = Webhook.objects.get(id=webhook, workspace__slug=slug)
headers = {
"Content-Type": "application/json",
"User-Agent": "Autopilot",
"X-Plane-Delivery": str(uuid.uuid4()),
"X-Plane-Event": event,
}
# Sign the payload if the webhook has a secret key configured
if webhook.secret_key:
# Concatenate the data and the secret key
message = event_data + webhook.secret_key
# Create a SHA-256 hash of the message
sha256 = hashlib.sha256()
sha256.update(message.encode("utf-8"))
signature = sha256.hexdigest()
headers["X-Plane-Signature"] = signature
event_data = json.loads(event_data) if event_data is not None else None
action = {
"POST": "create",
"PATCH": "update",
"PUT": "update",
"DELETE": "delete",
}.get(action, action)
payload = {
"event": event,
"action": action,
"webhook_id": str(webhook.id),
"workspace_id": str(webhook.workspace_id),
"data": event_data,
}
# Send the webhook event
response = requests.post(
webhook.url,
headers=headers,
json=payload,
timeout=30,
)
# Log the webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=str(response.status_code),
response_headers=str(response.headers),
response_body=str(response.text),
retry_count=str(self.request.retries),
)
except requests.RequestException as e:
# Log the failed webhook request
WebhookLog.objects.create(
workspace_id=str(webhook.workspace_id),
webhook_id=str(webhook.id),
event_type=str(event),
request_method=str(action),
request_headers=str(headers),
request_body=str(payload),
response_status=500,
response_headers="",
response_body=str(e),
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
return
raise requests.RequestException()
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
@shared_task()
def send_webhook(event, event_data, action, slug):
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
if event == "project":
webhooks = webhooks.filter(project=True)
if event == "issue":
webhooks = webhooks.filter(issue=True)
if event == "module":
webhooks = webhooks.filter(module=True)
if event == "cycle":
webhooks = webhooks.filter(cycle=True)
if event == "issue-comment":
webhooks = webhooks.filter(issue_comment=True)
for webhook in webhooks:
webhook_task.delay(webhook.id, slug, event, event_data, action)
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)
return
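
As a side note on the signing scheme in webhook_task above: a minimal, hypothetical helper showing how the X-Plane-Signature value could be reproduced for verification, assuming the receiver has the same serialized event data string (the signature is computed before json.loads) and the webhook's secret key:

import hashlib
import hmac

def compute_plane_signature(event_data: str, secret_key: str) -> str:
    # Mirrors the signing step in webhook_task: SHA-256 over the
    # serialized event data concatenated with the webhook secret key.
    return hashlib.sha256((event_data + secret_key).encode("utf-8")).hexdigest()

def signature_matches(received: str, event_data: str, secret_key: str) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(received, compute_plane_signature(event_data, secret_key))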


@@ -11,25 +11,33 @@ from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError
# Module imports
from plane.db.models import User, Workspace, WorkspaceMemberInvite
@shared_task
def workspace_invitation(email, workspace_id, token, current_site, invitor):
try:
user = User.objects.get(email=invitor)
workspace = Workspace.objects.get(pk=workspace_id)
workspace_member_invite = WorkspaceMemberInvite.objects.get(
token=token, email=email
)
# Relative link
relative_link = (
f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
)
# The complete url including the domain
abs_url = current_site + relative_link
# The email from
from_email_string = settings.EMAIL_FROM
subject = f"{invitor or email} invited you to join {workspace.name} on Plane"
# Subject of the email
subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
context = {
"email": email,


@@ -3,7 +3,7 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api
import uuid
@@ -40,8 +40,8 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token', models.CharField(default=plane.db.models.api.generate_token, max_length=255, unique=True)),
('label', models.CharField(default=plane.db.models.api.generate_label_token, max_length=255)),
('user_type', models.PositiveSmallIntegerField(choices=[(0, 'Human'), (1, 'Bot')], default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='apitoken_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),


@@ -1,21 +0,0 @@
# Generated by Django 4.2.5 on 2023-10-18 12:04
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
]


@@ -0,0 +1,984 @@
# Generated by Django 4.2.5 on 2023-11-15 09:47
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.issue
import uuid
import random
def random_sort_ordering(apps, schema_editor):
Label = apps.get_model("db", "Label")
bulk_labels = []
for label in Label.objects.all():
label.sort_order = random.randint(0,65535)
bulk_labels.append(label)
Label.objects.bulk_update(bulk_labels, ["sort_order"], batch_size=1000)
class Migration(migrations.Migration):
dependencies = [
('db', '0045_issueactivity_epoch_workspacemember_issue_props_and_more'),
]
operations = [
migrations.AddField(
model_name='label',
name='sort_order',
field=models.FloatField(default=65535),
),
migrations.AlterField(
model_name='analyticview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='analyticview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='apitoken',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='apitoken',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycle',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycle',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycle',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cyclefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='cycleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='cycleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='cycleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='cycleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimate',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimate',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimate',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimate',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='estimatepoint',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='estimatepoint',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='estimatepoint',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='fileasset',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='fileasset',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubcommentsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubissuesync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubissuesync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubissuesync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepository',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepository',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepository',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepository',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='githubrepositorysync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='importer',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='importer',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='importer',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='importer',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inbox',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inbox',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inbox',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inbox',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='inboxissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='inboxissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='inboxissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='inboxissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='integration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='integration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueactivity',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueactivity',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueactivity',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueactivity',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueassignee',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueassignee',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueassignee',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueassignee',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueattachment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueattachment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueattachment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueattachment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueblocker',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueblocker',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueblocker',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueblocker',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuecomment',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuecomment',
name='issue',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_comments', to='db.issue'),
),
migrations.AlterField(
model_name='issuecomment',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuecomment',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuecomment',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueproperty',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueproperty',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueproperty',
name='properties',
field=models.JSONField(default=plane.db.models.issue.get_default_properties),
),
migrations.AlterField(
model_name='issueproperty',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueproperty',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issuesequence',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issuesequence',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issuesequence',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issuesequence',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueview',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueview',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueview',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueview',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='issueviewfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='label',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='label',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='label',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='label',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='module',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='module',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='module',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='module',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='moduleissue',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='moduleissue',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='moduleissue',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='moduleissue',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulelink',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulelink',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulelink',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulelink',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='modulemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='modulemember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='modulemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='modulemember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='page',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='page',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='page',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='page',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pageblock',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pageblock',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pageblock',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pageblock',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagefavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagefavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagefavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='pagelabel',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='pagelabel',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='pagelabel',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='pagelabel',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='project',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='project',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectfavorite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectfavorite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectidentifier',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectidentifier',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmember',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmember',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='projectmemberinvite',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='slackprojectsync',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='socialloginconnection',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='state',
name='project',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project'),
),
migrations.AlterField(
model_name='state',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='state',
name='workspace',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace'),
),
migrations.AlterField(
model_name='team',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='team',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='teammember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='teammember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspace',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspace',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspaceintegration',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacemember',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacemember',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacememberinvite',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='created_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By'),
),
migrations.AlterField(
model_name='workspacetheme',
name='updated_by',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By'),
),
migrations.CreateModel(
name='IssueMention',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to='db.issue')),
('mention', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_mention', to=settings.AUTH_USER_MODEL)),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Mention',
'verbose_name_plural': 'Issue Mentions',
'db_table': 'issue_mentions',
'ordering': ('-created_at',),
'unique_together': {('issue', 'mention')},
},
),
migrations.RunPython(random_sort_ordering),
]

View File

@@ -0,0 +1,131 @@
# Generated by Django 4.2.5 on 2023-11-15 11:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.api
import plane.db.models.webhook
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0046_label_sort_order_alter_analyticview_created_by_and_more'),
]
operations = [
migrations.CreateModel(
name='Webhook',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('url', models.URLField(validators=[plane.db.models.webhook.validate_schema, plane.db.models.webhook.validate_domain])),
('is_active', models.BooleanField(default=True)),
('secret_key', models.CharField(default=plane.db.models.webhook.generate_token, max_length=255)),
('project', models.BooleanField(default=False)),
('issue', models.BooleanField(default=False)),
('module', models.BooleanField(default=False)),
('cycle', models.BooleanField(default=False)),
('issue_comment', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_webhooks', to='db.workspace')),
],
options={
'verbose_name': 'Webhook',
'verbose_name_plural': 'Webhooks',
'db_table': 'webhooks',
'ordering': ('-created_at',),
'unique_together': {('workspace', 'url')},
},
),
migrations.AddField(
model_name='apitoken',
name='description',
field=models.TextField(blank=True),
),
migrations.AddField(
model_name='apitoken',
name='expired_at',
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name='apitoken',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='apitoken',
name='last_used',
field=models.DateTimeField(null=True),
),
migrations.AddField(
model_name='projectmember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name='workspacemember',
name='is_active',
field=models.BooleanField(default=True),
),
migrations.AlterField(
model_name='apitoken',
name='token',
field=models.CharField(db_index=True, default=plane.db.models.api.generate_token, max_length=255, unique=True),
),
migrations.CreateModel(
name='WebhookLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('event_type', models.CharField(blank=True, max_length=255, null=True)),
('request_method', models.CharField(blank=True, max_length=10, null=True)),
('request_headers', models.TextField(blank=True, null=True)),
('request_body', models.TextField(blank=True, null=True)),
('response_status', models.TextField(blank=True, null=True)),
('response_headers', models.TextField(blank=True, null=True)),
('response_body', models.TextField(blank=True, null=True)),
('retry_count', models.PositiveSmallIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('webhook', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='logs', to='db.webhook')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='webhook_logs', to='db.workspace')),
],
options={
'verbose_name': 'Webhook Log',
'verbose_name_plural': 'Webhook Logs',
'db_table': 'webhook_logs',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='APIActivityLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('token_identifier', models.CharField(max_length=255)),
('path', models.CharField(max_length=255)),
('method', models.CharField(max_length=10)),
('query_params', models.TextField(blank=True, null=True)),
('headers', models.TextField(blank=True, null=True)),
('body', models.TextField(blank=True, null=True)),
('response_code', models.PositiveIntegerField()),
('response_body', models.TextField(blank=True, null=True)),
('ip_address', models.GenericIPAddressField(blank=True, null=True)),
('user_agent', models.CharField(blank=True, max_length=512, null=True)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
'verbose_name': 'API Activity Log',
'verbose_name_plural': 'API Activity Logs',
'db_table': 'api_activity_logs',
'ordering': ('-created_at',),
},
),
]

View File

@@ -0,0 +1,54 @@
# Generated by Django 4.2.5 on 2023-11-13 12:53
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0047_webhook_apitoken_description_apitoken_expired_at_and_more'),
]
operations = [
migrations.CreateModel(
name='PageLog',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('transaction', models.UUIDField(default=uuid.uuid4)),
('entity_identifier', models.UUIDField(null=True)),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('mention', 'Mention')], max_length=30, verbose_name='Transaction Type')),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_log', to='db.page')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Page Log',
'verbose_name_plural': 'Page Logs',
'db_table': 'page_logs',
'ordering': ('-created_at',),
'unique_together': {('page', 'transaction')}
},
),
migrations.AddField(
model_name='page',
name='archived_at',
field=models.DateField(null=True),
),
migrations.AddField(
model_name='page',
name='is_locked',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='page',
name='parent',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='child_page', to='db.page'),
),
]

View File

@@ -0,0 +1,72 @@
# Generated by Django 4.2.5 on 2023-11-15 09:16
# Python imports
import uuid
from django.db import migrations
def update_pages(apps, schema_editor):
try:
Page = apps.get_model("db", "Page")
PageBlock = apps.get_model("db", "PageBlock")
PageLog = apps.get_model("db", "PageLog")
updated_pages = []
page_logs = []
# looping through all the pages
for page in Page.objects.all():
page_blocks = PageBlock.objects.filter(
page_id=page.id, project_id=page.project_id, workspace_id=page.workspace_id
).order_by("sort_order")
if page_blocks:
# looping through all the page blocks in a page
for page_block in page_blocks:
if page_block.issue is not None:
project_identifier = page.project.identifier
sequence_id = page_block.issue.sequence_id
transaction = uuid.uuid4().hex
embed_component = f'<issue-embed-component id="{transaction}" entity_name="issue" entity_identifier="{page_block.issue_id}" sequence_id="{sequence_id}" project_identifier="{project_identifier}" title="{page_block.name}"></issue-embed-component>'
page.description_html += embed_component
# create the page transaction for the issue
page_logs.append(
PageLog(
page_id=page_block.page_id,
transaction=transaction,
entity_identifier=page_block.issue_id,
entity_name="issue",
project_id=page.project_id,
workspace_id=page.workspace_id,
created_by_id=page_block.created_by_id,
updated_by_id=page_block.updated_by_id,
)
)
else:
# adding the page block name and description to the page description
page.description_html += f"<h2>{page_block.name}</h2>"
page.description_html += page_block.description_html
updated_pages.append(page)
Page.objects.bulk_update(
updated_pages,
["description_html"],
batch_size=100,
)
PageLog.objects.bulk_create(page_logs, batch_size=100)
except Exception as e:
print(e)
class Migration(migrations.Migration):
dependencies = [
("db", "0048_auto_20231116_0713"),
]
operations = [
migrations.RunPython(update_pages),
]
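For reference, a minimal sketch (not part of the migration) of the embed markup that update_pages appends to description_html for each issue block; every identifier below is made up for illustration.
# Illustrative sketch of the markup produced by the data migration above.
from uuid import uuid4

transaction = uuid4().hex
issue_id = uuid4()
embed_component = (
    f'<issue-embed-component id="{transaction}" entity_name="issue" '
    f'entity_identifier="{issue_id}" sequence_id="42" '
    f'project_identifier="PLN" title="Example block"></issue-embed-component>'
)
print(embed_component)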

View File

@@ -27,12 +27,12 @@ from .issue import (
IssueActivity,
IssueProperty,
IssueComment,
IssueBlocker,
IssueLabel,
IssueAssignee,
Label,
IssueBlocker,
IssueRelation,
IssueMention,
IssueLink,
IssueSequence,
IssueAttachment,
@@ -54,7 +54,7 @@ from .view import GlobalView, IssueView, IssueViewFavorite
from .module import Module, ModuleMember, ModuleIssue, ModuleLink, ModuleFavorite
from .api_token import APIToken
from .api import APIToken, APIActivityLog
from .integration import (
WorkspaceIntegration,
@@ -68,7 +68,7 @@ from .integration import (
from .importer import Importer
from .page import Page, PageBlock, PageFavorite, PageLabel
from .page import Page, PageLog, PageFavorite, PageLabel
from .estimate import Estimate, EstimatePoint
@@ -79,3 +79,5 @@ from .analytic import AnalyticView
from .notification import Notification
from .exporter import ExporterHistory
from .webhook import Webhook, WebhookLog

View File

@@ -0,0 +1,80 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return "plane_api_" + uuid4().hex
class APIToken(BaseModel):
# Meta information
label = models.CharField(max_length=255, default=generate_label_token)
description = models.TextField(blank=True)
is_active = models.BooleanField(default=True)
last_used = models.DateTimeField(null=True)
# Token
token = models.CharField(
max_length=255, unique=True, default=generate_token, db_index=True
)
# User Information
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
expired_at = models.DateTimeField(blank=True, null=True)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokens"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.id)
class APIActivityLog(BaseModel):
token_identifier = models.CharField(max_length=255)
# Request Info
path = models.CharField(max_length=255)
method = models.CharField(max_length=10)
query_params = models.TextField(null=True, blank=True)
headers = models.TextField(null=True, blank=True)
body = models.TextField(null=True, blank=True)
# Response info
response_code = models.PositiveIntegerField()
response_body = models.TextField(null=True, blank=True)
# Meta information
ip_address = models.GenericIPAddressField(null=True, blank=True)
user_agent = models.CharField(max_length=512, null=True, blank=True)
class Meta:
verbose_name = "API Activity Log"
verbose_name_plural = "API Activity Logs"
db_table = "api_activity_logs"
ordering = ("-created_at",)
def __str__(self):
return str(self.token_identifier)
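As a rough usage sketch (Django shell, assuming an existing user and workspace; names are illustrative), the new generate_token default yields prefixed, indexed tokens with the added metadata fields defaulting sensibly.
# Minimal sketch, assuming `user` and `workspace` already exist in the shell.
from plane.db.models import APIToken

token = APIToken.objects.create(user=user, workspace=workspace, label="ci-token")
assert token.token.startswith("plane_api_")          # prefix from generate_token
assert token.is_active and token.expired_at is None  # new fields default sensibly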

View File

@@ -1,41 +0,0 @@
# Python imports
from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from .base import BaseModel
def generate_label_token():
return uuid4().hex
def generate_token():
return uuid4().hex + uuid4().hex
class APIToken(BaseModel):
token = models.CharField(max_length=255, unique=True, default=generate_token)
label = models.CharField(max_length=255, default=generate_label_token)
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="bot_tokens",
)
user_type = models.PositiveSmallIntegerField(
choices=((0, "Human"), (1, "Bot")), default=0
)
workspace = models.ForeignKey(
"db.Workspace", related_name="api_tokens", on_delete=models.CASCADE, null=True
)
class Meta:
verbose_name = "API Token"
verbose_name_plural = "API Tokems"
db_table = "api_tokens"
ordering = ("-created_at",)
def __str__(self):
return str(self.user.name)

View File

@@ -6,7 +6,6 @@ from django.db import models
# Module imports
from plane.db.models import ProjectBaseModel
from plane.db.mixins import AuditModel
class GithubRepository(ProjectBaseModel):

View File

@@ -132,25 +132,7 @@ class Issue(ProjectBaseModel):
self.state = default_state
except ImportError:
pass
else:
try:
from plane.db.models import State, PageBlock
# Check if the current issue state and completed state id are same
if self.state.group == "completed":
self.completed_at = timezone.now()
# check if there are any page blocks
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=timezone.now()
)
else:
PageBlock.objects.filter(issue_id=self.id).filter().update(
completed_at=None
)
self.completed_at = None
except ImportError:
pass
if self._state.adding:
# Get the maximum display_id value from the database
last_id = IssueSequence.objects.filter(project=self.project).aggregate(
@@ -228,6 +210,25 @@ class IssueRelation(ProjectBaseModel):
def __str__(self):
return f"{self.issue.name} {self.related_issue.name}"
class IssueMention(ProjectBaseModel):
issue = models.ForeignKey(
Issue, on_delete=models.CASCADE, related_name="issue_mention"
)
mention = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="issue_mention",
)
class Meta:
unique_together = ["issue", "mention"]
verbose_name = "Issue Mention"
verbose_name_plural = "Issue Mentions"
db_table = "issue_mentions"
ordering = ("-created_at",)
def __str__(self):
return f"{self.issue.name} {self.mention.email}"
class IssueAssignee(ProjectBaseModel):
issue = models.ForeignKey(
@@ -412,6 +413,7 @@ class Label(ProjectBaseModel):
name = models.CharField(max_length=255)
description = models.TextField(blank=True)
color = models.CharField(max_length=255, blank=True)
sort_order = models.FloatField(default=65535)
class Meta:
unique_together = ["name", "project"]
@@ -420,6 +422,18 @@ class Label(ProjectBaseModel):
db_table = "labels"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
if self._state.adding:
# Get the maximum sequence value from the database
last_id = Label.objects.filter(project=self.project).aggregate(
largest=models.Max("sort_order")
)["largest"]
# if last_id is not None
if last_id is not None:
self.sort_order = last_id + 10000
super(Label, self).save(*args, **kwargs)
def __str__(self):
return str(self.name)
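A quick sketch of the resulting ordering (Django shell, assuming a fresh project with no labels yet and that ProjectBaseModel fills in the remaining required fields): each subsequent label lands 10000 past the current maximum sort_order.
# Minimal sketch, assuming `project` already exists and has no labels yet.
from plane.db.models import Label

first = Label.objects.create(project=project, name="bug")       # keeps the 65535 default
second = Label.objects.create(project=project, name="feature")  # max sort_order + 10000
assert second.sort_order == first.sort_order + 10000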

View File

@@ -1,3 +1,5 @@
import uuid
# Django imports
from django.db import models
from django.conf import settings
@@ -22,6 +24,15 @@ class Page(ProjectBaseModel):
labels = models.ManyToManyField(
"db.Label", blank=True, related_name="pages", through="db.PageLabel"
)
parent = models.ForeignKey(
"self",
on_delete=models.CASCADE,
null=True,
blank=True,
related_name="child_page",
)
archived_at = models.DateField(null=True)
is_locked = models.BooleanField(default=False)
class Meta:
verbose_name = "Page"
@@ -34,6 +45,42 @@ class Page(ProjectBaseModel):
return f"{self.owned_by.email} <{self.name}>"
class PageLog(ProjectBaseModel):
TYPE_CHOICES = (
("to_do", "To Do"),
("issue", "issue"),
("image", "Image"),
("video", "Video"),
("file", "File"),
("link", "Link"),
("cycle","Cycle"),
("module", "Module"),
("back_link", "Back Link"),
("forward_link", "Forward Link"),
("mention", "Mention"),
)
transaction = models.UUIDField(default=uuid.uuid4)
page = models.ForeignKey(
Page, related_name="page_log", on_delete=models.CASCADE
)
entity_identifier = models.UUIDField(null=True)
entity_name = models.CharField(
max_length=30,
choices=TYPE_CHOICES,
verbose_name="Transaction Type",
)
class Meta:
unique_together = ["page", "transaction"]
verbose_name = "Page Log"
verbose_name_plural = "Page Logs"
db_table = "page_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.page.name} {self.type}"
class PageBlock(ProjectBaseModel):
page = models.ForeignKey("db.Page", on_delete=models.CASCADE, related_name="blocks")
name = models.CharField(max_length=255)

View File

@@ -4,9 +4,6 @@ from uuid import uuid4
# Django imports
from django.db import models
from django.conf import settings
from django.template.defaultfilters import slugify
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.core.validators import MinValueValidator, MaxValueValidator
# Module imports
@@ -169,6 +166,7 @@ class ProjectMember(ProjectBaseModel):
default_props = models.JSONField(default=get_default_props)
preferences = models.JSONField(default=get_default_preferences)
sort_order = models.FloatField(default=65535)
is_active = models.BooleanField(default=True)
def save(self, *args, **kwargs):
if self._state.adding:

View File

@@ -0,0 +1,90 @@
# Python imports
from uuid import uuid4
from urllib.parse import urlparse
# Django imports
from django.db import models
from django.core.exceptions import ValidationError
# Module imports
from plane.db.models import BaseModel
def generate_token():
return "plane_wh_" + uuid4().hex
def validate_schema(value):
parsed_url = urlparse(value)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")
def validate_domain(value):
parsed_url = urlparse(value)
domain = parsed_url.netloc
if domain in ["localhost", "127.0.0.1"]:
raise ValidationError("Local URLs are not allowed.")
class Webhook(BaseModel):
workspace = models.ForeignKey(
"db.Workspace",
on_delete=models.CASCADE,
related_name="workspace_webhooks",
)
url = models.URLField(
validators=[
validate_schema,
validate_domain,
]
)
is_active = models.BooleanField(default=True)
secret_key = models.CharField(max_length=255, default=generate_token)
project = models.BooleanField(default=False)
issue = models.BooleanField(default=False)
module = models.BooleanField(default=False)
cycle = models.BooleanField(default=False)
issue_comment = models.BooleanField(default=False)
def __str__(self):
return f"{self.workspace.slug} {self.url}"
class Meta:
unique_together = ["workspace", "url"]
verbose_name = "Webhook"
verbose_name_plural = "Webhooks"
db_table = "webhooks"
ordering = ("-created_at",)
class WebhookLog(BaseModel):
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="webhook_logs"
)
# Associated webhook
webhook = models.ForeignKey(Webhook, on_delete=models.CASCADE, related_name="logs")
# Basic request details
event_type = models.CharField(max_length=255, blank=True, null=True)
request_method = models.CharField(max_length=10, blank=True, null=True)
request_headers = models.TextField(blank=True, null=True)
request_body = models.TextField(blank=True, null=True)
# Response details
response_status = models.TextField(blank=True, null=True)
response_headers = models.TextField(blank=True, null=True)
response_body = models.TextField(blank=True, null=True)
# Retry Count
retry_count = models.PositiveSmallIntegerField(default=0)
class Meta:
verbose_name = "Webhook Log"
verbose_name_plural = "Webhook Logs"
db_table = "webhook_logs"
ordering = ("-created_at",)
def __str__(self):
return f"{self.event_type} {str(self.webhook.url)}"

View File

@@ -99,6 +99,7 @@ class WorkspaceMember(BaseModel):
view_props = models.JSONField(default=get_default_props)
default_props = models.JSONField(default=get_default_props)
issue_props = models.JSONField(default=get_issue_props)
is_active = models.BooleanField(default=True)
class Meta:
unique_together = ["workspace", "member"]

View File

@@ -0,0 +1,40 @@
from plane.db.models import APIToken, APIActivityLog
class APITokenLogMiddleware:
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
request_body = request.body
response = self.get_response(request)
self.process_request(request, response, request_body)
return response
def process_request(self, request, response, request_body):
api_key_header = "X-Api-Key"
api_key = request.headers.get(api_key_header)
# If the API key is present, log the request
if api_key:
try:
APIActivityLog.objects.create(
token_identifier=api_key,
path=request.path,
method=request.method,
query_params=request.META.get("QUERY_STRING", ""),
headers=str(request.headers),
body=(request_body.decode('utf-8') if request_body else None),
response_body=(
response.content.decode("utf-8") if response.content else None
),
response_code=response.status_code,
ip_address=request.META.get("REMOTE_ADDR", None),
user_agent=request.META.get("HTTP_USER_AGENT", None),
)
except Exception as e:
print(e)
# If the token does not exist, you can decide whether to log this as an invalid attempt
pass
return None
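For context, a middleware like this is enabled through the MIDDLEWARE setting; the module path and position below are assumptions for illustration, not part of this diff.
# settings.py (illustrative): register the API request logging middleware.
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "django.contrib.sessions.middleware.SessionMiddleware",
    "django.middleware.common.CommonMiddleware",
    # ...other project middleware...
    "plane.middleware.api_log_middleware.APITokenLogMiddleware",  # assumed module path
]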

View File

@@ -1,33 +0,0 @@
import jwt
import pytz
from django.conf import settings
from django.utils import timezone
from plane.db.models import User
class UserMiddleware(object):
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
try:
if request.headers.get("Authorization"):
authorization_header = request.headers.get("Authorization")
access_token = authorization_header.split(" ")[1]
decoded = jwt.decode(
access_token, settings.SECRET_KEY, algorithms=["HS256"]
)
id = decoded['user_id']
user = User.objects.get(id=id)
user.last_active = timezone.now()
user.token_updated_at = None
user.save()
timezone.activate(pytz.timezone(user.user_timezone))
except Exception as e:
print(e)
response = self.get_response(request)
return response

View File

Some files were not shown because too many files have changed in this diff.