Compare commits

...

162 Commits

Author SHA1 Message Date
gurusainath
43366dc355 clean-up: remove logs 2023-12-07 14:40:51 +05:30
gurusainath
77eef2c899 fix: update operaton for member invitations in workspace 2023-12-07 14:38:17 +05:30
gurusainath
7e5320b5d1 Merge branch 'develop' of gurusainath:makeplane/plane into dev/license_engine 2023-12-07 14:37:31 +05:30
gurusainath
c7eae99c89 Merge branch 'dev/license_engine' of gurusainath:makeplane/plane into dev/license_engine 2023-12-07 14:37:19 +05:30
NarayanBavisetti
368834c577 Merge branch 'dev/license_engine' of github.com:makeplane/plane into dev/license_engine 2023-12-07 14:29:59 +05:30
NarayanBavisetti
056b12d7d7 chore: changed my sequence 2023-12-07 14:29:43 +05:30
rahulramesha
03c8aeed57 Chore: Minify build for plane packages (#3017)
* minify @plane/ui build

* minify all the packages
2023-12-07 14:26:17 +05:30
pablohashescobar
58ac2955fa Merge branch 'dev/license_engine' of github.com:makeplane/plane into dev/license_engine 2023-12-07 14:21:55 +05:30
pablohashescobar
30e764e917 dev: custom port for takeoff script 2023-12-07 14:21:45 +05:30
pablohashescobar
99aab52c46 dev: patch endpoint for workspace 2023-12-07 14:21:28 +05:30
NarayanBavisetti
bd25013fee chore: changed the EMAIL_FROM 2023-12-07 14:07:52 +05:30
NarayanBavisetti
1fd7372c18 chore: added SKIP_ENV_VAR 2023-12-07 13:58:57 +05:30
NarayanBavisetti
279044e895 chore: cleanup 2023-12-07 12:51:10 +05:30
NarayanBavisetti
5a364855e8 chore: changed config variables 2023-12-07 12:42:02 +05:30
Henit Chobisa
6c8df73ad4 [ FEATURE ] New Issue Widget for displaying issues inside document-editor (#2920)
* feat: added heading 3 in the editor summary markings

* feat: fixed editor and summary bar sizing

* feat: added `issue-embed` extension

* feat: exposed issue embed extension

* feat: added main embed config configuration to document editor body

* feat: added peek overview and issue embed fetch function

* feat: enabled slash commands to take additonal suggestions from editors

* chore: replaced `IssueEmbedWidget` into widget extension

* chore: removed issue embed from previous places

* feat: added issue embed suggestion extension

* feat: added issue embed suggestion renderer

* feat: added issue embed suggestions into extensions module

* feat: added issues in issueEmbedConfiguration in document editor

* chore: package fixes

* chore: removed log statements

* feat: added title updation logic into document editor

* fix: issue suggestion items, not rendering issue widget on enter

* feat: added error card for issue widget

* feat: improved focus logic for issue search and navigate

* feat: appended transactionid for issueWidgetTransaction

* chore: packages update

* feat: disabled editing of title in readonly mode

* feat: added issueEmbedConfig in readonly editor

* fix: issue suggestions not loading after structure changed to object

* feat: added toast messages for success/error messages from doc editor

* fix: issue suggestions sorting issue

* fix: formatting errors resolved

* fix: infinite reloading of the readonly document editor

* fix: css in avatar of issue widget card

* feat: added show alert on pages reload

* feat: added saving state for the pages editor

* fix: issue with heading 3 in side bar view

* style: updated issue suggestions dropdown ui

* fix: Pages intiliazation and mutation with updated MobX store

* fixed image uploads being cancelled on refocus due to swr

* fix: issue with same description rerendering empty content fixed

* fix: scroll in issue suggestion view

* fix: added submission prop

* fix: Updated the comment update to take issue id in inbox issues

* feat:changed date representation in IssueEmbedCard

* fix: page details mutation with optimistic updates using swr

* fix: menu options in read only editor with auth fixed

* fix: add error handling for title and page desc

* fixed yarn.lock

* fix: read-only editor title wrapping

* fix: build error with rich text editor

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: Palanikannan1437 <73993394+Palanikannan1437@users.noreply.github.com>
2023-12-07 12:04:21 +05:30
Lakhan Baheti
074e35525c fix: spreadsheet layout bugs (#3016)
* fix: date picker z visibility

* fix: typo in empty issue screen

* fix: spread sheet column rightmost border
2023-12-07 11:59:46 +05:30
NarayanBavisetti
b912bb21ad chore: removed the print statement 2023-12-07 11:52:48 +05:30
pablohashescobar
7b8a3d2e95 dev: update instance configuration function 2023-12-06 22:34:36 +05:30
Aaryan Khandelwal
8ea42aa0b6 fix: remove all unused variables and added dependecies to useEffect and useCallback (#3013) 2023-12-06 20:31:42 +05:30
sriram veeraghanta
1df067ff61 fix: upgrading types react package (#3014) 2023-12-06 20:07:39 +05:30
guru_sainath
a79dbdadb5 fix: issue layouts bugs and ui fixes (#3012)
* fix: initial issue creation issue in the list layout

* fix kanban drag n drop and updating properties

* reduce z index of spreadsheet bottom row to not overlap with other elements

* fix state update by using state id instead of state detail's id

* fix add default use state for description

* add create issue button for project views to be at par with production

* save draft issues from modal

* chore: added save view button in all layouts applied filters

* use useEffect instead of swr for fetching issue details for peek overview

* fix: resolved kanban dnd

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-12-06 19:58:47 +05:30
Bavisetti Narayan
bffba6b9dc chore: Page auth and other improvements (#3011)
* chore: project query optimised

* chore: page permissions changed
2023-12-06 19:30:40 +05:30
Aaryan Khandelwal
bd4b5040b2 chore: remove unused fields from the god mode (#3007) 2023-12-06 19:29:45 +05:30
pablohashescobar
9adb02036d dev: remove license engine base url 2023-12-06 19:25:40 +05:30
Lakhan Baheti
0d762d74dc fix: kanban board block's menu & drop delete. (#2987)
* fix: kanban board block menu click

* fix: menu active/disable

* fix: drag n drop delete modal

* fix: quick action button in all the layouts

* chore: toast for drag & drop api
2023-12-06 19:21:24 +05:30
pablohashescobar
1984a242d0 dev: remove license engine communication 2023-12-06 19:19:42 +05:30
Henit Chobisa
567c3fadc0 feat: added custom blockquote extension for resolving enter key behaviour (#2997) 2023-12-06 19:17:47 +05:30
Lakhan Baheti
cb0af255f9 fix: custom analytic grouped bar tooltip value as ID (#3003)
* fix: tooltip value is coming as ID

* fix lint named module
2023-12-06 19:15:07 +05:30
Aaryan Khandelwal
d204cc7d6c chore: added authorization to pages (#3006)
* chore: updated pages authorization

* chore: updated pages empty state image
2023-12-06 19:13:42 +05:30
Anmol Singh Bhatia
5317de5919 chore: plane logo without text updated (#3008) 2023-12-06 19:11:34 +05:30
Anmol Singh Bhatia
b499e2c5a5 fix: bug fixes (#3010)
* fix: project view modal auto close bug fix

* fix: issue peek overview label select permission validation added
2023-12-06 19:10:53 +05:30
Nikhil
956d7acfa6 dev: user password reset management command (#3000) 2023-12-06 18:29:53 +05:30
Nikhil
4040441ab1 dev: remove unused packages (#3009)
* dev: remove unused packages

* dev: remove gunicorn config
2023-12-06 18:28:34 +05:30
Bavisetti Narayan
dc5c0fe5bf chore: posthog event for workspace invite (#2989)
* chore: posthog event for workspace invite

* chore: updated event names, added all the existing events to workspace metrics group

* chore: seperated workspace invite

* fix: workspace invite accept event updated

---------

Co-authored-by: Ramesh Kumar Chandra <rameshkumar2299@gmail.com>
2023-12-06 17:15:14 +05:30
Jorge
675e747fed Add CodeQL workflow (#1452) 2023-12-06 17:07:55 +05:30
Manish Gupta
c9517610da modified docker image repo names (#3004) 2023-12-06 16:47:14 +05:30
Anmol Singh Bhatia
0e1ee80512 chore: updated plane deploy sign-in workflows for cloud and self-hosted instances (#2999)
* chore: deploy onboarding workflow

* chore: sign in workflow improvement

* fix: build error
2023-12-06 16:42:57 +05:30
guru_sainath
0fc273aca6 clean-up: removed labels in the filters and handled redirection issue from peek overview and ui changes (#3002) 2023-12-06 16:27:33 +05:30
Lakhan Baheti
d6b23fe380 fix: bugs & improvements (#2998)
* fix: create more toggle in update issue modal

* fix: spreadsheet estimate column hide

* fix: flickering in all the layouts

* fix: logs
2023-12-06 14:29:07 +05:30
Aaryan Khandelwal
11987994a1 chore: updated sign-in workflows for cloud and self-hosted instances (#2994)
* chore: update onboarding workflow

* dev: update user count tasks

* fix: forgot password endpoint

* dev: instance and onboarding updates

* chore: update sign-in workflow for cloud and self-hosted instances (#2993)

* chore: updated auth services

* chore: new signin workflow updated

* chore: updated content

* chore: instance admin setup

* dev: update instance verification task

* dev: run the instance verification task every 4 hours

* dev: update migrations

* chore: update latest features image

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-12-06 14:22:59 +05:30
Anmol Singh Bhatia
b957e4e935 fix: bug fixes and improvement (#2992)
* chore: issue sidebar permission bug fix and not authorized page redirection added

* chore: unauthorized project setting page improvement

* fix: build error fix
2023-12-06 14:22:06 +05:30
M. Palanikannan
e7ac7e1da8 fix: Image Resizing and PR (#2996)
* added image min width and height programatically

* fixed editor initialization for peek view and inbox issues

* fixed ts issues with issue id in inbox
2023-12-06 12:16:34 +05:30
rahulramesha
ac18906604 fix: bugs related to issues (#2995)
* hide properties in list and kanban with 0 or nil values

* module and cycle mutation from peek overlay

* fix peek over view title change while switching

* fix create issue fetching

* fix build errors by mutating the values as well
2023-12-06 01:02:18 +05:30
Anmol Singh Bhatia
aec50e2c48 [FED-1147] chore: module link mobx integration (#2990)
* chore: link type updated

* chore: mobx implementation for module link

* chore: update module mutation logic updated and toast alert added
2023-12-05 19:32:25 +05:30
guru_sainath
8dee7e51ca chore: workspace profile issues, kanabn DND upgrade, implemented filters in plaen deploy (#2991) 2023-12-05 17:26:57 +05:30
Anmol Singh Bhatia
8b2d78ef92 chore: space ui component revamp and bug fixes (#2980)
* chore: replace space ui component with plane ui component

* fix: space project icon and user pic bug

* chore: code refactor

* fix: profile section navbar fix
2023-12-05 16:07:25 +05:30
Lakhan Baheti
9649f42ff3 chore: email invite accept validation (#2965)
* fix: empty state flickering on accepting only invitation

* fix: redirection from workspace-invitaion to onboarding

* chore: onboarding step 1 skip on accepting invite from email

* fix: dashboard redirection path
2023-12-05 16:06:43 +05:30
sabith-tu
db1166411a style: image picker, spreadsheet view title, icons (#2988)
* style: image picker, spreadsheet view title, icons

* fix: build error fix
2023-12-05 16:05:50 +05:30
Lakhan Baheti
effad33f67 chore: issue update status in peekview & detail (#2985) 2023-12-05 14:40:29 +05:30
Aaryan Khandelwal
fd73f18d95 fix: leave project mutation (#2976) 2023-12-05 13:44:01 +05:30
Nikhil
32bfa4d7cf fix: sentry dsn error (#2981) 2023-12-05 13:42:16 +05:30
Anmol Singh Bhatia
7b12d54d83 chore: draft issue layout and permission validation (#2982)
* chore: create draft issue option added in draft issue layout and permission validation added

* chore: create draft issue option added in draft issue list layout and permission validation added
2023-12-05 13:41:31 +05:30
Anmol Singh Bhatia
2cf847e8a7 chore: module and cycle sidebar date mutation fix (#2986) 2023-12-05 13:40:29 +05:30
Bavisetti Narayan
e2524799d2 chore: html validation (#2970)
* chore: changed api serializers

* chore: state status code

* chore: removed sorted keys
2023-12-05 13:39:09 +05:30
rahulramesha
8a8eea38f9 fix: Permission levels for project settings (#2978)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by asignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102

* use Enums instead of numbers

* fix Estimates setting permission for admin

* disable access to project settings for viewers and guests

* fix project unautorized flicker

* add observer to estimates

* add permissions to member list
2023-12-04 20:03:23 +05:30
rahulramesha
b527ec582f fix: mutation on transfer issues from cycle (#2979)
* fix cycle issues mutation on transfering issues

* fix transfer issues from cycle
2023-12-04 20:02:14 +05:30
Aaryan Khandelwal
fa990ed444 refactor: custom hook for sign in redirection (#2969) 2023-12-04 15:04:04 +05:30
Nikhil
79fa860b28 chore: instance (#2955) 2023-12-04 14:48:40 +05:30
rahulramesha
59110c7205 fix: sub display filter params for fetching issues (#2972)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by asignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102
2023-12-02 18:23:07 +05:30
Aaryan Khandelwal
9756abece4 chore: update instance admin sign in endpoint (#2973) 2023-12-02 18:19:59 +05:30
guru_sainath
f4aa059da6 fix: global issues properties updation issue resolved (#2974) 2023-12-02 18:19:29 +05:30
rahulramesha
d22c258c00 fix: V3 release blocker bugs (#2968)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by asignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes
2023-12-01 18:51:52 +05:30
sabith-tu
65253c6383 style: empty state for analytics, views and pages (#2967) 2023-12-01 18:19:17 +05:30
guru_sainath
8c462f96ee fix: corrected rendering of workspace-level labels, members, and states in project view (#2966)
* fix: dynamic issue properties filters in project, workspace and profile level

* clean-up: removed logs from the project store
2023-12-01 17:35:33 +05:30
Aaryan Khandelwal
332e56bb0d style: updated the UI of the instance admin setup and the sign in workflow (#2962)
* style: updated the UI of the signin and instance setups

* fix: form validations and mutations

* fix: updated Link tags in accordance to next v14

* chore: latest features image, reset password redirection
2023-12-01 15:50:01 +05:30
Lakhan Baheti
eca96da837 fix: create workspace form validation (#2958)
* fix: create workspace form validation

* fix: textfield placeholder typo

* fix: name field onchange
2023-12-01 15:18:48 +05:30
guru_sainath
5446447231 chore: workspace global issues (#2964)
* dev: global issues store

* build-error: all issues render

* build-error: build error resolved in global view store
2023-12-01 15:16:36 +05:30
Anmol Singh Bhatia
dec1fb5a63 chore: profile issue display filters content updated (#2963) 2023-12-01 15:14:18 +05:30
Anmol Singh Bhatia
2b63827caa chore: create issue modal improvement (#2960) 2023-12-01 15:12:25 +05:30
Manish Gupta
5993fc38f0 branch build fixes (#2961) 2023-12-01 13:40:16 +05:30
Anmol Singh Bhatia
ab37ce979a fix: view modal overlapping (#2956) 2023-12-01 13:27:16 +05:30
sriram veeraghanta
c6764111a3 fix: upgrading to nextjs 14 (#2959) 2023-12-01 13:25:48 +05:30
Manish Gupta
4a2f8d93b1 fix: branch build 2 (#2957)
* branch build fix

* removed quotes
2023-12-01 13:23:50 +05:30
Manish Gupta
5db9b2b638 branch build fix (#2954) 2023-12-01 12:46:18 +05:30
Bavisetti Narayan
e2b704ea6f chore: removed django settings module (#2953) 2023-12-01 12:21:17 +05:30
Nikhil
452c9f0d5b dev: update email templates (#2948)
* dev: update magic link email

* dev: forgot password mail

* dev: workspace invitation update

* dev: update email templates and task

* dev: remove email verification template

* dev: change all conversation links to issues
2023-11-30 13:30:13 +05:30
rahulramesha
139c0857eb fix: quick add positioning (#2949)
* fix quick add posutioning for kanban and spreadsheet

* fix kanban quick add project identifier
2023-11-30 12:22:43 +05:30
Nikhil
2556d052de chore: status code changed (#2947) 2023-11-29 21:46:54 +05:30
Nikhil
c65915665b dev: transactional emails (#2946) 2023-11-29 21:46:18 +05:30
Nikhil
6ece5a57e2 dev: instance registration (#2912)
* dev: remove auto script for registration

* dev: make all of the instance admins as owners when adding a instance admin

* dev: remove sign out endpoint

* dev: update takeoff script to register the instance

* dev:  reapply instance model

* dev: check none for instance configuration encryptions

* dev: encrypting secrets configuration

* dev: user workflow for registration in instances

* dev: add email automation configuration

* dev: remove unused imports

* dev: reallign migrations

* dev: reconfigure license engine registrations

* dev: move email check to background worker

* dev: add sign up

* chore: signup error message

* dev: updated onboarding workflows and instance setting

* dev: updated template for magic login

* chore: page migration changed

* dev: updated migrations and authentication for license and update template for workspace invite

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:33:37 +05:30
Anmol Singh Bhatia
a779fa1497 dev: instance setup workflow (#2935)
* chore: instance type updated

* chore: instance not ready screen added

* chore: instance layout added

* chore: instance magic sign in endpoint and type added

* chore: instance admin password endpoint added

* chore: instance setup page added

* chore: instance setup form added

* chore: instance layout updated

* fix: instance admin workflow setup

* fix: admin workflow setup

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 20:33:08 +05:30
sriram veeraghanta
5eb7740833 fix: removed unused packages and upgraded to next 14 (#2944)
* fix: upgrading next package and removed unused deps

* chore: unused variable removed

* chore: next image icon fix

* chore: unused component removed

* chore: next image icon fix

* chore: replace use-debounce with lodash debounce

* chore: unused component removed

* resolved: fixed issue with next link component

* fix: updates in next config

* fix: updating types pages

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2023-11-29 20:32:10 +05:30
guru_sainath
86408fcd46 chore: replaced v3 issues endpoints (#2945)
* chore: removed v3 endpoints

* chore: replace v3 issues to normal issues endpoints

* build-error: Bulid error is new issue structure

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:26:43 +05:30
Anmol Singh Bhatia
86de3188b1 chore: cycle and module status indicator improvement (#2942) 2023-11-29 20:02:41 +05:30
Anmol Singh Bhatia
c5996382bc style: empty state improvement (#2943) 2023-11-29 20:02:13 +05:30
rahulramesha
2796754e5f fix: v3 issues for the layouts (#2941)
* fix drag n drop exception error

* fix peek overlay close buttons

* fix project empty state view

* fix cycle and module empty state view

* add ai options to inbox issue creation

* fix inbox filters for viewers

* fix inbox filters for viewers for project

* disable editing permission for members and viewers

* define accurate types for drag and drop
2023-11-29 19:58:27 +05:30
Lakhan Baheti
3488dc3014 style: deactivate acount modal (#2940) 2023-11-29 19:12:12 +05:30
Bavisetti Narayan
c7a3d8f0a0 chore: v3 global issues (#2938) 2023-11-29 19:08:14 +05:30
Aaryan Khandelwal
6d97dd814a chore: updated sign in workflow (#2939)
* chore: new sign in workflow

* chore: request new code button added

* chore: create new password form added

* fix: build errors

* chore: remove unused components

* chore: update submitting state texts

* fix: oauth sign in process
2023-11-29 19:07:33 +05:30
Lakhan Baheti
804c03bd15 chore: redirection to profile after workpace delete/leave (#2937) 2023-11-29 16:52:54 +05:30
Manish Gupta
59981e3e68 fix: Branch Build and Self hosting fixes (#2930)
* Branch build yml modified to create preview and latest tags

* self host install modified to handle public image only

* testing update-docker

* testing

* wip

* rolled back to orignal

* selfhosting readme updated
2023-11-29 14:58:31 +05:30
Aaryan Khandelwal
1be1b9f4a3 chore: issue peek overview (#2918)
* chore: autorun for the issue detail store

* fix: labels mutation

* chore: remove old peek overview code

* chore: move add to cycle and module logic to store

* fix: build errors

* chore: add peekProjectId query param for the peek overview

* chore: update profile layout

* fix: multiple workspaces

* style: Issue activity and link design improvements in Peek overview.
* fix issue with labels not occupying full widht.
* fix links overflow issue.
* add tooltip in links to display entire link.
* add functionality to copy links to clipboard.

* chore: peek overview for all the layouts

* fix: build errors

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 14:25:57 +05:30
Lakhan Baheti
7cd02401c1 fix: spreadsheet layout issue properties (#2936)
* fix: spredsheet layout state column state name & tootltip

* fix: label select dropdown first item auto active state

* fix: priority column padding & tooltip position
2023-11-29 14:20:58 +05:30
Bavisetti Narayan
6d175c0e56 chore: api rate limiting (#2933) 2023-11-29 14:14:26 +05:30
Bavisetti Narayan
6a4f521b47 chore: api webhooks validation (#2928)
* chore: api webhooks update

* chore: webhooks signature validation
2023-11-29 14:13:03 +05:30
rahulramesha
cdc4fd27a5 fix issue sorting and add sorting to other properties (#2931) 2023-11-29 14:02:39 +05:30
Aaryan Khandelwal
f4af3db7b6 chore: update the content of webhooks form (#2932) 2023-11-29 14:02:13 +05:30
Aaryan Khandelwal
4f64f431a7 chore: update profile settings layout (#2925)
* chore: update profile layout

* fix: multiple workspaces

* chore: removed breadcrumbs

* chore: fix sidebar collapsed state
2023-11-29 13:48:07 +05:30
Anmol Singh Bhatia
537901f046 style: workspace sidebar scroll fix and improvement (#2934) 2023-11-29 13:32:35 +05:30
Lakhan Baheti
404a61aee5 fix: google auth button content alignment (#2929) 2023-11-29 13:29:38 +05:30
Lakhan Baheti
1a2b1e648d style: switch or delete account modal (#2926)
* style: switch or delete account modal

* fix: popover text color

* fix: typo
2023-11-29 13:28:24 +05:30
guru_sainath
3400c119bc fix: drag and drop implementation in calendar layout and kanban layout (#2921)
* fix profile issue filters and kanban

* chore: calendar drag and drop

* chore: kanban drag and drop

* dev: remove issue from the kanban layout and resolved build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-28 19:17:38 +05:30
sabith-tu
d5853405ca style: new empty state ui (#2923) 2023-11-28 18:47:52 +05:30
rahulramesha
db510dcfcd fix: all functionalities for profile, archived and draft issues (#2922)
* fix profile issue filters and kanban

* fix profile draft and archived issues
2023-11-28 18:15:46 +05:30
Lakhan Baheti
62c0615012 style: member role visibility (#2919)
* style: member role visibility

* fix: build errors
2023-11-28 17:11:04 +05:30
Bavisetti Narayan
3914a75334 chore: deactivated user workflow change (#2888)
* chore: deactivated user workflow change

* chore: removed archived and draft from v3 by default

* chore: draft and archive update

* chore: bool field

* chore: fall back workspace updated

* chore: workspace member active
2023-11-28 17:08:05 +05:30
Aaryan Khandelwal
0cbb201348 fix: workspace settings pages authorization (#2915)
* fix: workspace settings pages authorization

* chore: user cannot add a member with a higher role than theirs

* chore: update workspace general settings auth
2023-11-28 17:05:42 +05:30
rahulramesha
f7264364bd add functionality for addition of existing issues to modules and cycles (#2913)
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-28 14:50:37 +05:30
Manish Gupta
35f8ffa5ab migration script added (#2914) 2023-11-28 13:29:08 +05:30
M. Palanikannan
6607caade7 fix: Image restoration fixed (marks/unmarks an image to be deleted after a week) (#2859)
* image restoration fixed (marks an image to be deleted after a week)

* removed clgs

* added image constraints

* formatted editor-core package using yarn format

* lite-text-editor nothing to format

* rich-text-editor nothing to format

* formatted document-editor with prettier

* modified file service to follow api change

* fixed more formatting in document editor

* fixed all instances of types with that from the package

* fixed delete to work consistently (minor optimizations turned off)

* stop duplicate images inside editor

* restore image on editor creation

say if user A deletes image number 2, user B was also in the same issue and in their screen the image was there, if user B makes certain changes and that gets saved in backend, according to user B image 2 should exist but since user A deleted it, it'll not get restored and get deleted in 7 days, hence I've added a check such that whenever a issue loads we restore all images by default

* added restore image function with types

* replaced all instances to have restore image logic

* fixed issue detail for peek view

* disabled option to insert table inside a table
2023-11-28 11:34:20 +05:30
Manish Gupta
8ee8270697 removed container names for selfhosting (#2907) 2023-11-28 11:31:04 +05:30
Aaryan Khandelwal
41ab962dd7 chore: update get invitation details endpoint (#2902) 2023-11-28 11:30:03 +05:30
Prateek Shourya
c22c6bb9b2 Fix: bug fixes and UI / UX improvements (#2906)
* Fix: issue with project publish modal data not updating immediately.

* fix: issue with workspace list not scrollable in profile settings.

* fix: update redirect workspace slug logic to redirect to prev workspace instead of `/`.

* style: update API tokens and webhooks empty state designs.
2023-11-28 11:29:01 +05:30
sriram veeraghanta
67de6d0729 fix: adding ai assistance to pages (#2905)
* fix: adding ai modal to pages

* fix: pages overflow

* chore: update pages UI

* fix: updating page description while using ai assistance

* fix: gpt assistant modal height and position

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-27 20:39:18 +05:30
M. Palanikannan
f361cd045e image can't be inserted inside table (#2904)
* image can't be inserted inside table

Now we've diabled image icon from showing up if the cursor is inside a table node or if a table cell is selected

* added drag drop support for document editor

* fixed missing dependencies
2023-11-27 20:37:40 +05:30
Prateek Shourya
06d3cd7e73 refactor: Instance admin setting and UI updates. (#2889)
* refactor: shift instance admin restriction content to seperate component.
fix: instance components export logic.

* style: fix sidebar dropdown `God Mode` icon padding.

* style: update profile settings user dropdown menu width.

* fix: update input type to `password` for Client Secret and API/ Access Key fields.

* style: update loader design for all forms.

* fix: typo

* style: ui updates.

* chore: add show/ hide button for all password fields.
2023-11-27 19:41:47 +05:30
Aaryan Khandelwal
10c52bf89b refactor: keyboard shortcuts modal (#2822)
* refactor: keyboard shortcuts modal

* chore: updated search logic

* refactor: divided the modal component into granular components
2023-11-27 18:41:59 +05:30
Anmol Singh Bhatia
b717518fbe fix: workspace dropdown scroll (#2900) 2023-11-27 18:25:40 +05:30
Aaryan Khandelwal
f48cd6f50c fix: profile layout flicker (#2898)
* fix: user profile layout flicker

* chore: update import statements
2023-11-27 18:25:16 +05:30
Lakhan Baheti
3203ae6549 fix: progress panel default open (#2894) 2023-11-27 17:21:14 +05:30
Lakhan Baheti
b8f603f920 chore: signup removed (#2890) 2023-11-27 17:17:55 +05:30
Anmol Singh Bhatia
96ff76af94 [FED-1018] chore: workspace dropdown improvement (#2891)
* fix: workspace dropdown improvement

* style: sidebar workspace icon alignment
2023-11-27 17:17:09 +05:30
Anmol Singh Bhatia
830675741f [FED-1054] fix: join project mutation (#2892)
* fix: join project mutation

* chore: code refactor
2023-11-27 17:16:46 +05:30
srinivas pendem
e489ad50dc Update README.md (#2893)
./setup.sh removed

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-27 17:16:13 +05:30
Aaryan Khandelwal
3dc18bc8fd refactor: webhooks (#2896)
* refactor: webhooks workflow

* chore: update delete modal content
2023-11-27 17:15:48 +05:30
Anmol Singh Bhatia
2d04917951 chore: instance admins endpoint added and ui/ux improvement (#2895)
* style: sidebar improvement

* style: header height consistency

* chore: layout consistency and general page improvement

* chore: layout, email form and image form improvement

* chore: instance admins endpoint intergrated and code refactor

* chore: code refactor

* chore: google client secret section removed
2023-11-27 17:15:11 +05:30
guru_sainath
2bf7e63625 issues rendering in all issue layouts fir profile and project issues and global issues store implementation (#2886)
* dev: draft and archived issue store

* connect draft and archived issues

* kanban for draft issues

* fix filter store for calendar and kanban

* dev: profile issues store and draft issues filters in header

* disble issue creation for draft issues

* dev: profile issues store filters

* disable kanban properties in draft issues

* dev: profile issues store filters

* dev: seperated adding issues to the cycle and module as seperate methds in cycle and module store

* dev: workspace profile issues store

* dev: sub group issues in the swimlanes

* profile issues and create issue connection

* fix profile issues

* fix spreadsheet issues

* fix dissapearing project from create issue modal

* page level modifications

* fix additional bugs

* dev: issues profile and global iisues and filters update

* fix issue related bugs

* fix project views for list and kanban

* fix build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-27 14:15:33 +05:30
Anmol Singh Bhatia
eb78fd6088 fix: resolve modal overlapping issue (#2885) 2023-11-27 12:16:59 +05:30
Lakhan Baheti
202ecd21df fix: bug fixes & UI improvements (#2884)
* chore: access restriction for api tokens

* fix: on create module total issues undefined

* fix: cycle board card typo

* chore: fetch modules after creation

* fix: peek module on delete

* fix: peek cycle on delete

* fix: cycle detail sidebar copy link toast

* chore: router replace -> push
2023-11-27 12:15:10 +05:30
Aaryan Khandelwal
b2ac7b9ac6 chore: revamp the API tokens workflow (#2880)
* chore: added getLayout method to api tokens pages

* revamp: api tokens workflow

* chore: add title validation and update types

* chore: minor UI updates

* chore: update route
2023-11-27 12:14:06 +05:30
Lakhan Baheti
51dff31926 fix: user state after logout (#2849)
* fix: user state after logout

* chore: user state handle with mobx

* chore: signout update for profile setting

* fix: minor fixes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-25 23:04:56 +05:30
sriram veeraghanta
e89f152779 fix: remove slack notification on build branch workflow (#2881) 2023-11-25 22:43:27 +05:30
Lakhan Baheti
3c9f57f8f4 fix: workspace & user avatar tooltip (#2851)
* fix: workspace & user avatar tooltip

* chore: user name update while typing on top right avatar

* chore: imports placement

* fix: rendering condition

* chore: component re-arrangement

* fix: imports
2023-11-25 21:31:09 +05:30
Bavisetti Narayan
1bc859c68c chore: seperated delete endpoint for file upload (#2870) 2023-11-25 21:28:03 +05:30
Ramesh Kumar Chandra
11d57a5bf0 fix: track events updated, extra parameters added, added events for issues, pages, states, cycles (#2875)
* fix: event tracking method updated to store, chore: updated and added events for workspace, projects and create issue

* fix: posthog auth event tracking

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-25 21:26:26 +05:30
Prateek Shourya
2980c7b00d Feat: God Mode UI Updates and More Config Settings (#2877)
* feat: Images in Plane config screen.
* feat: Enable/ Disable Magic Login config toggle.
* style: UX copy and design updates across all screens.
* style: SSO and OAuth Screen revamp.
* style: Enter God Mode button for Profile Settings sidebar.
* fix: update input type to password for password fields.
2023-11-25 21:23:50 +05:30
Anmol Singh Bhatia
5c6a59ba35 dev: badge component added in planu ui package (#2876) 2023-11-25 21:21:03 +05:30
Anmol Singh Bhatia
a3ea7c8f10 fix: issue peek overview state select dropdown overflow fix (#2873) 2023-11-25 21:18:54 +05:30
Anmol Singh Bhatia
cb922fb113 fix: module sidebar date select fix and code refactor (#2872) 2023-11-25 21:18:16 +05:30
sriram veeraghanta
06564ee856 fix: remove slack notify (#2871)
* fix: remove slack notifications on workflows

* fix: bugfix
2023-11-24 14:31:44 +05:30
Nikhil
c7e6118804 refactor: image upload modals, file size limit added to config (#2868)
* chore: add file size limit as config in the config api

* refactor: image upload modals

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-24 13:23:46 +05:30
Manish Gupta
069b8b3ed9 Updated the slack notification message to PR Title (#2869) 2023-11-24 13:11:21 +05:30
Lakhan Baheti
38a5b7bec0 chore: added error toast for invitation (#2853) 2023-11-24 12:47:02 +05:30
Bavisetti Narayan
236caaafe8 chore: user deactivation and login restriction (#2855)
* chore: user deactivation

* chore: deactivation and login disabled

* chore: added get configuration value

* chore: serializer message change

* chore: instance admin passowrd change

* chore: removed triage

* chore: v3 endpoint for user profile

* chore: added enable signin

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-24 12:22:24 +05:30
Nikhil
a6d5eab634 chore: api and webhook refactor (#2861)
* chore: bug fix

* dev: changes in api endpoints for invitations and inbox

* chore: improvements

* dev: update webhook send

* dev: webhook validation and fix webhook flow for app

* dev: error messages for deactivation

* chore: api fixes

* dev: update webhook and workspace leave

* chore: issue comment

* dev: default values for environment variables

* dev: make the user active if he was already part of project member

* chore: webhook cycle and module event

* dev: disable ssl for emails

* dev: webhooks restructuring

* dev: updated webhook configuration

* dev: webhooks

* dev: state get object

* dev: update workspace slug validation

* dev: remove deactivation flag if max retries exceeded

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-24 12:19:26 +05:30
sriram veeraghanta
8d76c96a6f fix: adding slack notification when build is failed to upload to docker (#2862)
* fix: removing logs

* fix: adding slack notification when build is failed to upload to docker

* minor changes

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
2023-11-24 12:17:31 +05:30
Aaryan Khandelwal
97be4b60ae chore: update profile and God mode routes (#2860)
* chore: update profile and god mode routes

* fix: profile activity loader

* chore: update profile route in the change password page
2023-11-24 12:16:37 +05:30
Bavisetti Narayan
dece103873 chore: user activity in profile page (#2856)
* chore: user activity endpoint change

* chore: added workspace detail in activity serializer
2023-11-23 21:00:49 +05:30
Anmol Singh Bhatia
c6125876be fix: view date filter select fix (#2858) 2023-11-23 20:40:41 +05:30
Anmol Singh Bhatia
1f85bf2302 style: module ui improvement (#2838) 2023-11-23 20:39:58 +05:30
Anmol Singh Bhatia
20baba3bb0 style: issue activity section improvement (#2836) 2023-11-23 20:39:18 +05:30
Ramesh Kumar Chandra
85907b32d1 feat: change password page (#2847) 2023-11-23 20:38:50 +05:30
sabith-tu
ef2bef83dc style: removing extra options heading and drop down icon (#2852) 2023-11-23 20:38:05 +05:30
Aaryan Khandelwal
6e7a96394a fix: page scroll area (#2850) 2023-11-23 18:22:25 +05:30
Aaryan Khandelwal
5726f6955c dev: added tailwind merge helper function (#2844) 2023-11-23 17:21:47 +05:30
Aaryan Khandelwal
82665a35ee fix: archived issues infinite call (#2848) 2023-11-23 16:58:08 +05:30
sriram veeraghanta
4efd225599 fix: updated document editor package in web and space apps (#2846) 2023-11-23 15:27:20 +05:30
sriram veeraghanta
2481706581 chore: optimizations and file name changes (#2845)
* fix: deepsource antipatterns

* fix: deepsource exclude file patterns

* chore: file name changes and removed unwanted variables

* fix: changing version number for editor
2023-11-23 15:09:46 +05:30
guru_sainath
a17b08dd15 chore: implemented new store and issue layouts for issues and updated new data structure for issues (#2843)
* fix: Implemented new workflow in the issue store and updated the quick add workflow in list layout

* fix: initial load and mutaion of issues in list layout

* dev: implemented the new project issues store with grouped, subGrouped and unGrouped issue computed functions

* dev: default display properties data made as a function

* conflict: merge conflict resolved

* dev: implemented quick add logic in kanban

* chore: implemented quick add logic in calendar and spreadsheet layout

* fix: spreadsheet layout quick add fix

* dev: optimised the issues workflow and handled the issues order_by filter

* dev: project issue CRUD operations in new issue store architecture

* dev: issues filtering in calendar layout

* fix: build error

* dev/issue_filters_store

* chore: updated filters computed structure

* conflict: merge conflicts resolved in project issues

* dev: implemented gantt chart for project issues using the new mobx store

* dev: initialized cycle and module issue filters store

* dev: issue store and list layout store updates

* dev: quick add and update, delete issue in the list

* refactor list root changes

* dev: store new structure

* refactor spreadsheet and gnatt project roots

* fix errors for base gantt and spreadsheet roots

* connect Calendar project view

* minor house keeping

* connect Kanban View to th enew store

* generalise base calendar issue actions

* dev: store project issues and issue filters

* dev: store project issues and filters

* dev: updated undefined with displayFilters in project issue store

* Add Quick add to all the layouts

* connect module views to store

* dev: Rendering list issues in project issues

* dev: removed console log

* dev: module filters store

* fix errors and connect modules list and quick add for list

* dev: module issue store

* dev: modle filter store issue fixed and updates cycle issue filters

* minor house keeping changes

* dev: cycle issues and cycle filters

* connecty cycles to teh store

* dev: project view issues and issue filtrs

* connect project views

* dev: updated applied filters in layouts

* dev: replaced project id with view id in project views

* dev: in cycle and module store made cycledId and moduleId as optional

* fix minor issues and build errots

* dev: project draft and archived issues store and filters

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-23 14:47:04 +05:30
Aaryan Khandelwal
a7d6b528bd chore: deactivate user option added (#2841)
* dev: deactivate user option added

* chore: new layout for profile settings

* fix: build errors

* fix: user profile activity
2023-11-23 14:44:06 +05:30
Lakhan Baheti
9ba724b78d fix: onboarding bugs & improvements (#2839)
* fix: terms & condition alignment

* fix: onboarding page scrolling

* fix: create workspace name clear

* fix: setup profile sidebar workspace name

* fix: invite team screen button text

* fix: inner div min height

* fix: allow single invite also in invite member

* fix: UI clipping in invite members

* fix: signin screen scroll

* fix: sidebar notification icon

* fix: sidebar project name & icon

* fix: user detail bottom image alignment

* fix: step indicator in invite member

* fix: try different account modal state

* fix: setup profile remove image

* fix: workspace slug clear

* fix: invite member UI & focus

* fix: step indicator size

* fix: inner div placement

* fix: invite member validation logic

* fix: cuurent user data persistency

* fix: sidebar animation colors

* feat: signup & resend

* fix: sign out theme persist from popover

* fix: imports

* chore: signin responsiveness

* fix: sign-in, sign-up top padding
2023-11-23 13:45:00 +05:30
Bavisetti Narayan
c2da9783a3 chore: change password endpoint (#2842) 2023-11-23 13:44:50 +05:30
936 changed files with 37208 additions and 19537 deletions


@@ -1,5 +1,11 @@
version = 1
exclude_patterns = [
"bin/**",
"**/node_modules/",
"**/*.min.js"
]
[[analyzers]]
name = "shell"


@@ -1,62 +1,30 @@
name: Branch Build
on:
pull_request:
types:
types:
- closed
branches:
branches:
- master
- release
- preview
- qa
- develop
release:
types: [released, prereleased]
env:
TARGET_BRANCH: ${{ github.event.pull_request.base.ref }}
TARGET_BRANCH: ${{ github.event.pull_request.base.ref || github.event.release.target_commitish }}
jobs:
branch_build_and_push:
if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) }}
branch_build_setup:
if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) || github.event_name == 'release' }}
name: Build-Push Web/Space/API/Proxy Docker Image
runs-on: ubuntu-20.04
steps:
- name: Check out the repo
uses: actions/checkout@v3.3.0
# - name: Set Target Branch Name on PR close
# if: ${{ github.event_name == 'pull_request' && github.event.action =='closed' }}
# run: echo "TARGET_BRANCH=${{ github.event.pull_request.base.ref }}" >> $GITHUB_ENV
# - name: Set Target Branch Name on other than PR close
# if: ${{ github.event_name == 'push' }}
# run: echo "TARGET_BRANCH=${{ github.ref_name }}" >> $GITHUB_ENV
- uses: ASzc/change-string-case-action@v2
id: gh_branch_upper_lower
with:
string: ${{env.TARGET_BRANCH}}
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_slash
with:
source: ${{ steps.gh_branch_upper_lower.outputs.lowercase }}
find: '/'
replace: '-'
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_dot
with:
source: ${{ steps.gh_branch_replace_slash.outputs.value }}
find: '.'
replace: ''
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_clean
with:
source: ${{ steps.gh_branch_replace_dot.outputs.value }}
find: '_'
replace: ''
- name: Uploading Proxy Source
uses: actions/upload-artifact@v3
with:
@@ -77,7 +45,6 @@ jobs:
!./nginx
!./deploy
!./space
- name: Uploading Space Source
uses: actions/upload-artifact@v3
with:
@@ -89,12 +56,24 @@ jobs:
!./deploy
!./web
outputs:
gh_branch_name: ${{ steps.gh_branch_clean.outputs.value }}
gh_branch_name: ${{ env.TARGET_BRANCH }}
branch_build_push_frontend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
FRONTEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Frontend Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:${{ github.event.release.tag_name }}
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-ce:stable
else
TAG=${{ env.FRONTEND_TAG }}
fi
echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -114,7 +93,7 @@ jobs:
context: .
file: ./web/Dockerfile.web
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.FRONTEND_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
@@ -123,8 +102,20 @@ jobs:
branch_build_push_space:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
SPACE_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Space Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:${{ github.event.release.tag_name }}
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-ce:stable
else
TAG=${{ env.SPACE_TAG }}
fi
echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -144,7 +135,7 @@ jobs:
context: .
file: ./space/Dockerfile.space
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.SPACE_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
@@ -153,8 +144,20 @@ jobs:
branch_build_push_backend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
BACKEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Backend Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:${{ github.event.release.tag_name }}
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-ce:stable
else
TAG=${{ env.BACKEND_TAG }}
fi
echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -175,7 +178,7 @@ jobs:
file: ./Dockerfile.api
platforms: linux/amd64
push: true
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.BACKEND_TAG }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
@@ -183,8 +186,20 @@ jobs:
branch_build_push_proxy:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
PROXY_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Proxy Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ] && [ "${{ github.event_name }}" == "release" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:latest,${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:${{ github.event.release.tag_name }}
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-ce:stable
else
TAG=${{ env.PROXY_TAG }}
fi
echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -205,7 +220,7 @@ jobs:
context: .
file: ./Dockerfile
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.PROXY_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1

.github/workflows/codeql.yml

@@ -0,0 +1,65 @@
name: "CodeQL"
on:
push:
branches: [ 'develop', 'hot-fix', 'stage-release' ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ 'develop' ]
schedule:
- cron: '53 19 * * 5'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'python', 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Use only 'java' to analyze code written in Java, Kotlin or both
# Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v3
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"


@@ -1,107 +0,0 @@
name: Update Docker Images for Plane on Release
on:
release:
types: [released, prereleased]
jobs:
build_push_backend:
name: Build and Push Api Server Docker Image
runs-on: ubuntu-20.04
steps:
- name: Check out the repo
uses: actions/checkout@v3.3.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
- name: Login to Docker Hub
uses: docker/login-action@v2.1.0
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Extract metadata (tags, labels) for Docker (Docker Hub) from Github Release
id: metaFrontend
uses: docker/metadata-action@v4.3.0
with:
images: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend
tags: |
type=ref,event=tag
- name: Extract metadata (tags, labels) for Docker (Docker Hub) from Github Release
id: metaBackend
uses: docker/metadata-action@v4.3.0
with:
images: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend
tags: |
type=ref,event=tag
- name: Extract metadata (tags, labels) for Docker (Docker Hub) from Github Release
id: metaSpace
uses: docker/metadata-action@v4.3.0
with:
images: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space
tags: |
type=ref,event=tag
- name: Extract metadata (tags, labels) for Docker (Docker Hub) from Github Release
id: metaProxy
uses: docker/metadata-action@v4.3.0
with:
images: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy
tags: |
type=ref,event=tag
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./web/Dockerfile.web
platforms: linux/amd64
tags: ${{ steps.metaFrontend.outputs.tags }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKET_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and Push Backend to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: ./apiserver
file: ./apiserver/Dockerfile.api
platforms: linux/amd64
push: true
tags: ${{ steps.metaBackend.outputs.tags }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKET_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and Push Plane-Deploy to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: .
file: ./space/Dockerfile.space
platforms: linux/amd64
push: true
tags: ${{ steps.metaSpace.outputs.tags }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKET_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and Push Plane-Proxy to Docker Hub
uses: docker/build-push-action@v4.0.0
with:
context: ./nginx
file: ./nginx/Dockerfile
platforms: linux/amd64
push: true
tags: ${{ steps.metaProxy.outputs.tags }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKET_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}


@@ -79,7 +79,6 @@ COPY apiserver/manage.py manage.py
COPY apiserver/plane plane/
COPY apiserver/templates templates/
COPY apiserver/gunicorn.config.py ./
RUN apk --no-cache add "bash~=5.2"
COPY apiserver/bin ./bin/


@@ -76,7 +76,6 @@ NEXT_PUBLIC_ENABLE_OAUTH=0
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.selfhosted" # deprecated
# Error logs
SENTRY_DSN=""


@@ -57,10 +57,6 @@ Setting up local environment is extremely easy and straight forward. Follow the
1. Review the `.env` files available in various folders. Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system
1. Run the docker command to initiate various services `docker compose -f docker-compose-local.yml up -d`
```bash
./setup.sh
```
You are ready to make changes to the code. Do not forget to refresh the browser (in case id does not auto-reload)
Thats it!


@@ -14,6 +14,11 @@ PGHOST="plane-db"
PGDATABASE="plane"
DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
# Oauth variables
GOOGLE_CLIENT_ID=""
GITHUB_CLIENT_ID=""
GITHUB_CLIENT_SECRET=""
# Redis Settings
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
@@ -50,7 +55,6 @@ NGINX_PORT=80
# SignUps
ENABLE_SIGNUP="1"
# Enable Email/Password Signup
ENABLE_EMAIL_PASSWORD="1"


@@ -44,7 +44,6 @@ COPY manage.py manage.py
COPY plane plane/
COPY templates templates/
COPY package.json package.json
COPY gunicorn.config.py ./
USER root
RUN apk --no-cache add "bash~=5.2"
COPY ./bin ./bin/


@@ -3,18 +3,28 @@ set -e
python manage.py wait_for_db
python manage.py migrate
# Set default value for ENABLE_REGISTRATION
ENABLE_REGISTRATION=${ENABLE_REGISTRATION:-1}
# Create the default bucket
#!/bin/bash
# Check if ENABLE_REGISTRATION is not set to '0'
if [ "$ENABLE_REGISTRATION" != "0" ]; then
# Register instance
python manage.py register_instance
# Load the configuration variable
python manage.py configure_instance
fi
# Collect system information
HOSTNAME=$(hostname)
MAC_ADDRESS=$(ip link show | awk '/ether/ {print $2}' | head -n 1)
CPU_INFO=$(cat /proc/cpuinfo)
MEMORY_INFO=$(free -h)
DISK_INFO=$(df -h)
# Concatenate information and compute SHA-256 hash
SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256sum | awk '{print $1}')
# Export the variables
export MACHINE_SIGNATURE=$SIGNATURE
# Register instance
python manage.py register_instance $MACHINE_SIGNATURE
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python manage.py create_bucket
exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:8000 --max-requests 1200 --max-requests-jitter 1000 --access-logfile -
exec gunicorn -w $GUNICORN_WORKERS -k uvicorn.workers.UvicornWorker plane.asgi:application --bind 0.0.0.0:${PORT:-8000} --max-requests 1200 --max-requests-jitter 1000 --access-logfile -

apiserver/file.txt


@@ -1,6 +0,0 @@
from psycogreen.gevent import patch_psycopg
def post_fork(server, worker):
patch_psycopg()
worker.log.info("Made Psycopg2 Green")


@@ -1,9 +1,8 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
rate = '60/minute'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
@@ -15,13 +14,10 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
now = self.timer()
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])

View File

@@ -9,8 +9,9 @@ from .issue import (
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
IssueExpandSerializer,
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer
from .module import ModuleSerializer, ModuleIssueSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .inbox import InboxIssueSerializer

View File

@@ -30,6 +30,11 @@ class CycleSerializer(BaseSerializer):
model = Cycle
fields = "__all__"
read_only_fields = [
"id",
"created_at",
"updated_at",
"created_by",
"updated_by",
"workspace",
"project",
"owned_by",
@@ -46,4 +51,11 @@ class CycleIssueSerializer(BaseSerializer):
"workspace",
"project",
"cycle",
]
]
class CycleLiteSerializer(BaseSerializer):
class Meta:
model = Cycle
fields = "__all__"

View File

@@ -8,6 +8,12 @@ class InboxIssueSerializer(BaseSerializer):
model = InboxIssue
fields = "__all__"
read_only_fields = [
"project",
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]

View File

@@ -1,3 +1,6 @@
from lxml import html
# Django imports
from django.utils import timezone
@@ -19,7 +22,10 @@ from plane.db.models import (
ProjectMember,
)
from .base import BaseSerializer
from .cycle import CycleSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleLiteSerializer
from .user import UserLiteSerializer
from .state import StateLiteSerializer
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
@@ -40,8 +46,8 @@ class IssueSerializer(BaseSerializer):
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
@@ -49,6 +55,10 @@ class IssueSerializer(BaseSerializer):
"created_at",
"updated_at",
]
exclude = [
"description",
"description_stripped",
]
def validate(self, data):
if (
@@ -57,12 +67,21 @@ class IssueSerializer(BaseSerializer):
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date")
try:
if(data.get("description_html", None) is not None):
parsed = html.fromstring(data["description_html"])
parsed_str = html.tostring(parsed, encoding='unicode')
data["description_html"] = parsed_str
except Exception as e:
raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
# Validate assignees are from project
if data.get("assignees", []):
print(data.get("assignees"))
data["assignees"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
is_active=True,
member_id__in=data["assignees"],
).values_list("member_id", flat=True)
@@ -88,7 +107,7 @@ class IssueSerializer(BaseSerializer):
if (
data.get("parent")
and not Issue.objects.filter(
workspce_id=self.context.get("workspace_id"), pk=data.get("parent")
workspace_id=self.context.get("workspace_id"), pk=data.get("parent")
).exists()
):
raise serializers.ValidationError(
@@ -231,8 +250,13 @@ class LabelSerializer(BaseSerializer):
model = Label
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
@@ -241,13 +265,14 @@ class IssueLinkSerializer(BaseSerializer):
model = IssueLink
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"issue",
]
# Validation if url already exists
@@ -266,13 +291,14 @@ class IssueAttachmentSerializer(BaseSerializer):
model = IssueAttachment
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
]
@@ -281,39 +307,87 @@ class IssueCommentSerializer(BaseSerializer):
class Meta:
model = IssueComment
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
]
exclude = [
"comment_stripped",
"comment_json",
]
def validate(self, data):
try:
if(data.get("comment_html", None) is not None):
parsed = html.fromstring(data["comment_html"])
parsed_str = html.tostring(parsed, encoding='unicode')
data["comment_html"] = parsed_str
except Exception as e:
raise serializers.ValidationError(f"Invalid HTML: {str(e)}")
return data
class IssueActivitySerializer(BaseSerializer):
class Meta:
model = IssueActivity
fields = "__all__"
exclude = [
"created_by",
"udpated_by",
"updated_by",
]
class CycleIssueSerializer(BaseSerializer):
cycle = CycleSerializer(read_only=True)
class Meta:
fields = [
"cycle",
]
class ModuleIssueSerializer(BaseSerializer):
module = ModuleSerializer(read_only=True)
class Meta:
fields = [
"module",
]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id",
"name",
"color",
]
class IssueExpandSerializer(BaseSerializer):
cycle = CycleLiteSerializer(source="issue_cycle.cycle", read_only=True)
module = ModuleLiteSerializer(source="issue_module.module", read_only=True)
labels = LabelLiteSerializer(read_only=True, many=True)
assignees = UserLiteSerializer(read_only=True, many=True)
state = StateLiteSerializer(read_only=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]

View File

@@ -21,7 +21,6 @@ class ModuleSerializer(BaseSerializer):
write_only=True,
required=False,
)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
@@ -33,6 +32,7 @@ class ModuleSerializer(BaseSerializer):
model = Module
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
@@ -55,7 +55,6 @@ class ModuleSerializer(BaseSerializer):
raise serializers.ValidationError("Start date cannot exceed target date")
if data.get("members", []):
print(data.get("members"))
data["members"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
member_id__in=data["members"],
@@ -152,4 +151,11 @@ class ModuleLinkSerializer(BaseSerializer):
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return ModuleLink.objects.create(**validated_data)
return ModuleLink.objects.create(**validated_data)
class ModuleLiteSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"

View File

@@ -20,8 +20,13 @@ class ProjectSerializer(BaseSerializer):
model = Project
fields = "__all__"
read_only_fields = [
"workspace",
"id",
'emoji',
"workspace",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
def validate(self, data):

View File

@@ -16,6 +16,11 @@ class StateSerializer(BaseSerializer):
model = State
fields = "__all__"
read_only_fields = [
"id",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
]

View File

@@ -11,10 +11,6 @@ class UserLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"is_bot",
"display_name",
]
read_only_fields = [
"id",
"is_bot",
]
read_only_fields = fields

View File

@@ -13,7 +13,7 @@ urlpatterns = [
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
@@ -23,7 +23,7 @@ urlpatterns = [
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),

View File

@@ -5,12 +5,12 @@ from plane.api.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),

View File

@@ -15,27 +15,27 @@ urlpatterns = [
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueAPIEndpoint.as_view(),
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/",
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/",
LabelAPIEndpoint.as_view(),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/<uuid:pk>/",
LabelAPIEndpoint.as_view(),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/",
IssueLinkAPIEndpoint.as_view(),
name="link",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/<uuid:pk>/",
IssueLinkAPIEndpoint.as_view(),
name="link",
),
@@ -50,12 +50,12 @@ urlpatterns = [
name="comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activites/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/",
IssueActivityAPIEndpoint.as_view(),
name="activity",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activites/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/<uuid:pk>/",
IssueActivityAPIEndpoint.as_view(),
name="activity",
),
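Assuming a self-hosted instance and an API key (both placeholders below), a quick client-side check of the renamed paths, passing the `X-Api-Key` header that `ApiKeyRateThrottle` reads, might look like this sketch:

```python
import requests

BASE = "http://localhost:8000/api/v1"       # hypothetical base URL; adjust for your deployment
HEADERS = {"X-Api-Key": "plane_api_xxx"}    # placeholder API key

slug, project_id, issue_id = "my-workspace", "<project-uuid>", "<issue-uuid>"

# Renamed from .../issue-labels/ to .../labels/
labels = requests.get(
    f"{BASE}/workspaces/{slug}/projects/{project_id}/labels/", headers=HEADERS
)

# Typo fix: .../activites/ is now .../activities/
activities = requests.get(
    f"{BASE}/workspaces/{slug}/projects/{project_id}/issues/{issue_id}/activities/",
    headers=HEADERS,
)
print(labels.status_code, activities.status_code)
```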

View File

@@ -19,7 +19,7 @@ urlpatterns = [
name="module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:issue_id>/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),

View File

@@ -9,7 +9,7 @@ urlpatterns = [
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/",
ProjectAPIEndpoint.as_view(),
name="project",
),

View File

@@ -8,4 +8,9 @@ urlpatterns = [
StateAPIEndpoint.as_view(),
name="states",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:state_id>/",
StateAPIEndpoint.as_view(),
name="states",
),
]

View File

@@ -7,7 +7,6 @@ from django.conf import settings
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework.views import APIView
@@ -36,28 +35,33 @@ class TimezoneMixin:
else:
timezone.deactivate()
class WebhookMixin:
webhook_event = None
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
# Check whether the webhook should be sent
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
# Push the object to delay
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
payload=response.data,
kw=self.kwargs,
action=self.request.method,
slug=self.workspace_slug,
bulk=self.bulk,
)
return response
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
authentication_classes = [
APIKeyAuthentication,
@@ -139,13 +143,13 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get('X-RateLimit-Remaining')
ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
if ratelimit_remaining is not None:
response['X-RateLimit-Remaining'] = ratelimit_remaining
response["X-RateLimit-Remaining"] = ratelimit_remaining
ratelimit_reset = request.META.get('X-RateLimit-Reset')
ratelimit_reset = request.META.get("X-RateLimit-Reset")
if ratelimit_reset is not None:
response['X-RateLimit-Reset'] = ratelimit_reset
response["X-RateLimit-Reset"] = ratelimit_reset
return response
@@ -169,4 +173,4 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
expand = [
expand for expand in self.request.GET.get("expand", "").split(",") if expand
]
return expand if expand else None
return expand if expand else None
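Because `finalize_response` copies `X-RateLimit-Remaining` and `X-RateLimit-Reset` (a Unix timestamp) onto the response, a client can back off before tripping the 60/minute throttle. A hedged sketch, with a placeholder URL and key:

```python
import time
import requests

resp = requests.get(
    "http://localhost:8000/api/v1/workspaces/my-workspace/projects/",  # placeholder URL
    headers={"X-Api-Key": "plane_api_xxx"},                            # placeholder key
)

remaining = int(resp.headers.get("X-RateLimit-Remaining", "1"))
reset_at = int(resp.headers.get("X-RateLimit-Reset", "0"))

if remaining == 0:
    # Sleep until the advertised reset timestamp before issuing the next request
    time.sleep(max(0.0, reset_at - time.time()))
```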

View File

@@ -17,7 +17,6 @@ from plane.app.permissions import ProjectEntityPermission
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
@@ -142,7 +141,6 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
)
queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all")
queryset = queryset.order_by("-is_favorite", "-created_at")
# Current Cycle
if cycle_view == "current":
@@ -293,7 +291,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=None,
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -305,14 +303,15 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle issues.
This viewset automatically provides `list`, `create`,
and `destroy` actions related to cycle issues.
"""
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
@@ -457,7 +456,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Capture Issue Activity
issue_activity.delay(
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": issues}),
requested_data=json.dumps({"cycles_list": str(issues)}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
@@ -478,9 +477,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, cycle_id, pk):
def delete(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
)
issue_id = cycle_issue.issue_id
cycle_issue.delete()
@@ -493,7 +492,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=str(issue_id),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=None,
epoch=int(timezone.now().timestamp()),

View File

@@ -14,7 +14,7 @@ from rest_framework.response import Response
from .base import BaseAPIView
from plane.app.permissions import ProjectLitePermission
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
from plane.db.models import InboxIssue, Issue, State, ProjectMember
from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
from plane.bgtasks.issue_activites_task import issue_activity
@@ -37,29 +37,39 @@ class InboxIssueAPIEndpoint(BaseAPIView):
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(
inbox = Inbox.objects.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
).first()
project = Project.objects.get(
workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
)
if inbox is None and not project.inbox_view:
return InboxIssue.objects.none()
return (
InboxIssue.objects.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=self.kwargs.get("inbox_id"),
inbox_id=inbox.id,
)
.select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
)
def get(self, request, slug, project_id, inbox_id, pk=None):
if pk:
issue_queryset = self.get_queryset().get(pk=pk)
issues_data = InboxIssueSerializer(
issue_queryset,
def get(self, request, slug, project_id, issue_id=None):
if issue_id:
inbox_issue_queryset = self.get_queryset().get(issue_id=issue_id)
inbox_issue_data = InboxIssueSerializer(
inbox_issue_queryset,
fields=self.fields,
expand=self.expand,
).data
return Response(
issues_data,
inbox_issue_data,
status=status.HTTP_200_OK,
)
issue_queryset = self.get_queryset()
@@ -74,12 +84,30 @@ class InboxIssueAPIEndpoint(BaseAPIView):
).data,
)
def post(self, request, slug, project_id, inbox_id):
def post(self, request, slug, project_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for valid priority
if not request.data.get("issue", {}).get("priority", "none") in [
"low",
@@ -123,21 +151,45 @@ class InboxIssueAPIEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
inbox_id=inbox_id,
inbox_issue = InboxIssue.objects.create(
inbox_id=inbox.id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "in-app"),
)
serializer = IssueSerializer(issue)
serializer = InboxIssueSerializer(inbox_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
def patch(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member
project_member = ProjectMember.objects.get(
workspace__slug=slug,
@@ -145,6 +197,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
member=request.user,
is_active=True,
)
# Only project admins/members, or the user who created the inbox issue, can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
@@ -159,7 +212,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
if bool(issue_data):
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
pk=issue_id, workspace__slug=slug, project_id=project_id
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
@@ -183,7 +236,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
type="issue.activity.updated",
requested_data=requested_data,
actor_id=str(request.user.id),
issue_id=str(issue.id),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=json.dumps(
IssueSerializer(current_instance).data,
@@ -208,7 +261,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if the issue is rejected or marked as duplicate
if serializer.data["status"] in [-1, 2]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -222,7 +275,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if it is accepted
if serializer.data["status"] in [1]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -244,10 +297,33 @@ class InboxIssueAPIEndpoint(BaseAPIView):
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def delete(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
def delete(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member
project_member = ProjectMember.objects.get(
workspace__slug=slug,
@@ -256,6 +332,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
is_active=True,
)
# Check the inbox issue created
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
@@ -268,8 +345,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
workspace__slug=slug, project_id=project_id, pk=issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -22,7 +22,6 @@ from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.parsers import MultiPartParser, FormParser
# Module imports
from .base import BaseAPIView, WebhookMixin
@@ -41,14 +40,12 @@ from plane.db.models import (
IssueComment,
IssueActivity,
)
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.issue_activites_task import issue_activity
from plane.api.serializers import (
IssueSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
)
@@ -103,7 +100,6 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
@@ -112,7 +108,6 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
issue_queryset = (
self.get_queryset()
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
@@ -278,7 +273,7 @@ class LabelAPIEndpoint(BaseAPIView):
def get_queryset(self):
return (
Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
Label.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.select_related("project")
@@ -302,29 +297,29 @@ class LabelAPIEndpoint(BaseAPIView):
)
def get(self, request, slug, project_id, pk=None):
if pk:
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(
label,
fields=self.fields,
expand=self.expand,
if pk is None:
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda labels: LabelSerializer(
labels,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda labels: LabelSerializer(
labels,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, data=request.data, partial=True)
return Response(serializer.data, status=status.HTTP_200_OK)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
@@ -356,25 +351,31 @@ class IssueLinkAPIEndpoint(BaseAPIView):
.distinct()
)
def get(self, request, slug, project_id, pk=None):
if pk:
label = self.get_queryset().get(pk=pk)
def get(self, request, slug, project_id, issue_id, pk=None):
if pk is None:
issue_links = self.get_queryset()
serializer = IssueLinkSerializer(
label,
issue_links,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_links: IssueLinkSerializer(
issue_links,
many=True,
fields=self.fields,
expand=self.expand,
).data,
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_links: IssueLinkSerializer(
issue_links,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
issue_link = self.get_queryset().get(pk=pk)
serializer = IssueLinkSerializer(
issue_link,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug, project_id, issue_id):
serializer = IssueLinkSerializer(data=request.data)
@@ -449,7 +450,7 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
@@ -587,7 +588,7 @@ class IssueActivityAPIEndpoint(BaseAPIView):
serializer = IssueActivitySerializer(issue_activities)
return Response(serializer.data, status=status.HTTP_200_OK)
self.paginate(
return self.paginate(
request=request,
queryset=(issue_activities),
on_results=lambda issue_activity: IssueActivitySerializer(

View File

@@ -129,6 +129,14 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
serializer = ModuleSerializer(module)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk):
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
serializer = ModuleSerializer(module, data=request.data)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
@@ -168,7 +176,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=None,
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -186,7 +194,8 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module"
webhook_event = "module_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
@@ -323,7 +332,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Capture Issue Activity
issue_activity.delay(
type="module.activity.created",
requested_data=json.dumps({"modules_list": issues}),
requested_data=json.dumps({"modules_list": str(issues)}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
@@ -343,9 +352,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, module_id, pk):
def delete(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk
workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
)
module_issue.delete()
issue_activity.delay(
@@ -357,7 +366,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),

View File

@@ -94,8 +94,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
.distinct()
)
def get(self, request, slug, pk=None):
if pk is None:
def get(self, request, slug, project_id=None):
if project_id is None:
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
@@ -114,7 +114,7 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
).select_related("member"),
)
)
.order_by("sort_order", "name")
.order_by(request.GET.get("order_by", "sort_order"))
)
return self.paginate(
request=request,
@@ -123,15 +123,13 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
projects, many=True, fields=self.fields, expand=self.expand,
).data,
)
else:
project = self.get_queryset().get(workspace__slug=slug, pk=pk)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug):
try:
workspace = Workspace.objects.get(slug=slug)
serializer = ProjectSerializer(
data={**request.data}, context={"workspace_id": workspace.id}
)
@@ -236,10 +234,10 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_410_GONE,
)
def patch(self, request, slug, pk=None):
def patch(self, request, slug, project_id=None):
try:
workspace = Workspace.objects.get(slug=slug)
project = Project.objects.get(pk=pk)
project = Project.objects.get(pk=project_id)
serializer = ProjectSerializer(
project,
@@ -260,7 +258,7 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
name="Triage",
group="backlog",
description="Default state for managing all Inbox Issues",
project_id=pk,
project_id=project_id,
color="#ff7700",
)
@@ -282,4 +280,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
)
def delete(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -23,10 +23,8 @@ class StateAPIEndpoint(BaseAPIView):
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
return (
State.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(~Q(name="Triage"))
@@ -42,9 +40,9 @@ class StateAPIEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
serializer = StateSerializer(self.get_queryset().get(pk=pk))
def get(self, request, slug, project_id, state_id=None):
if state_id:
serializer = StateSerializer(self.get_queryset().get(pk=state_id))
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
@@ -57,19 +55,19 @@ class StateAPIEndpoint(BaseAPIView):
).data,
)
def delete(self, request, slug, project_id, pk):
def delete(self, request, slug, project_id, state_id):
state = State.objects.get(
~Q(name="Triage"),
pk=pk,
pk=state_id,
project_id=project_id,
workspace__slug=slug,
)
if state.default:
return Response({"error": "Default state cannot be deleted"}, status=False)
return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()
issue_exist = Issue.issue_objects.filter(state=state_id).exists()
if issue_exist:
return Response(
@@ -80,8 +78,8 @@ class StateAPIEndpoint(BaseAPIView):
state.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, project_id, pk=None):
state = State.objects.filter(workspace__slug=slug, project_id=project_id, pk=pk)
def patch(self, request, slug, project_id, state_id=None):
state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
serializer = StateSerializer(state, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()

View File

@@ -99,7 +99,6 @@ class WorkspaceViewerPermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__gte=10,
is_active=True,
).exists()

View File

@@ -1,45 +0,0 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get('X-Api-Key')
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META['X-RateLimit-Remaining'] = max(0, available)
request.META['X-RateLimit-Reset'] = reset_time
return allowed

View File

@@ -225,6 +225,7 @@ class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = IssueActivity

View File

@@ -103,16 +103,19 @@ class ProjectListSerializer(DynamicBaseSerializer):
members = serializers.SerializerMethodField()
def get_members(self, obj):
project_members = ProjectMember.objects.filter(
project_id=obj.id,
is_active=True,
).values(
"id",
"member_id",
"member__display_name",
"member__avatar",
)
return list(project_members)
project_members = getattr(obj, "members_list", None)
if project_members is not None:
# Filter members by the project ID
return [
{
"id": member.id,
"member_id": member.member_id,
"member__display_name": member.member.display_name,
"member__avatar": member.member.avatar,
}
for member in project_members
]
return []
class Meta:
model = Project

View File

@@ -26,6 +26,8 @@ class UserSerializer(BaseSerializer):
"token_updated_at",
"is_onboarded",
"is_bot",
"is_password_autoset",
"is_email_verified",
]
extra_kwargs = {"password": {"write_only": True}}
@@ -60,6 +62,8 @@ class UserMeSerializer(BaseSerializer):
"theme",
"last_workspace_id",
"use_case",
"is_password_autoset",
"is_email_verified",
]
read_only_fields = fields
@@ -80,9 +84,18 @@ class UserMeSettingsSerializer(BaseSerializer):
workspace_invites = WorkspaceMemberInvite.objects.filter(
email=obj.email
).count()
if obj.last_workspace_id is not None:
if (
obj.last_workspace_id is not None
and Workspace.objects.filter(
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).exists()
):
workspace = Workspace.objects.filter(
pk=obj.last_workspace_id, workspace_member__member=obj.id
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).first()
return {
"last_workspace_id": obj.last_workspace_id,
@@ -95,7 +108,9 @@ class UserMeSettingsSerializer(BaseSerializer):
}
else:
fallback_workspace = (
Workspace.objects.filter(workspace_member__member_id=obj.id)
Workspace.objects.filter(
workspace_member__member_id=obj.id, workspace_member__is_active=True
)
.order_by("created_at")
.first()
)
@@ -154,14 +169,25 @@ class ChangePasswordSerializer(serializers.Serializer):
Serializer for password change endpoint.
"""
old_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True, min_length=8)
confirm_password = serializers.CharField(required=True, min_length=8)
def validate(self, data):
if data.get("old_password") == data.get("new_password"):
raise serializers.ValidationError(
{"error": "New password cannot be same as old password."}
)
if data.get("new_password") != data.get("confirm_password"):
raise serializers.ValidationError(
{"error": "Confirm password should be same as the new password."}
)
return data
class ResetPasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for password change endpoint.
"""
new_password = serializers.CharField(required=True)
confirm_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True, min_length=8)

View File

@@ -1,3 +1,9 @@
# Python imports
import urllib
import socket
import ipaddress
from urllib.parse import urlparse
# Third party imports
from rest_framework import serializers
@@ -8,6 +14,76 @@ from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
def create(self, validated_data):
url = validated_data.get("url", None)
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return Webhook.objects.create(**validated_data)
def update(self, instance, validated_data):
url = validated_data.get("url", None)
if url:
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return super().update(instance, validated_data)
class Meta:
model = Webhook
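The `create`/`update` guards above reduce to: resolve the webhook hostname and refuse anything private, loopback, or on a disallowed domain (only `plane.so` plus the request's own host appear in the diff). A condensed, standalone sketch of that check, not the serializer itself:

```python
import socket
import ipaddress
from urllib.parse import urlparse

def is_safe_webhook_url(url: str, disallowed_domains=("plane.so",)) -> bool:
    """Return False for URLs resolving to private/loopback IPs or blocked domains."""
    hostname = urlparse(url).hostname
    if not hostname:
        return False
    # Reject disallowed domains and their subdomains
    if any(hostname == d or hostname.endswith("." + d) for d in disallowed_domains):
        return False
    try:
        addresses = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return False
    if not addresses:
        return False
    for addr in addresses:
        ip = ipaddress.ip_address(addr[4][0])
        if ip.is_private or ip.is_loopback:
            return False
    return True

print(is_safe_webhook_url("https://hooks.example.com/plane"))  # True if it resolves publicly
print(is_safe_webhook_url("http://127.0.0.1:9000/hook"))       # False
```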

View File

@@ -21,6 +21,23 @@ class WorkSpaceSerializer(BaseSerializer):
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
def validated(self, data):
if data.get("slug") in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
"password",
]:
raise serializers.ValidationError({"slug": "Slug is not valid"})
class Meta:
model = Workspace
fields = "__all__"
@@ -78,6 +95,16 @@ class WorkSpaceMemberInviteSerializer(BaseSerializer):
class Meta:
model = WorkspaceMemberInvite
fields = "__all__"
read_only_fields = [
"id",
"email",
"token",
"workspace",
"message",
"responded_at",
"created_at",
"updated_at",
]
class TeamSerializer(BaseSerializer):

View File

@@ -4,6 +4,7 @@ from django.urls import path
from plane.app.views import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
)
@@ -28,4 +29,13 @@ urlpatterns = [
UserAssetsEndpoint.as_view(),
name="user-file-assets",
),
path(
"workspaces/file-assets/<uuid:workspace_id>/<str:asset_key>/restore/",
FileAssetViewSet.as_view(
{
"post": "restore",
}
),
name="file-assets-restore",
),
]

View File

@@ -5,18 +5,16 @@ from rest_framework_simplejwt.views import TokenRefreshView
from plane.app.views import (
# Authentication
SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicGenerateEndpoint,
MagicSignInEndpoint,
MagicSignInGenerateEndpoint,
OauthEndpoint,
EmailCheckEndpoint,
## End Authentication
# Auth Extended
ForgotPasswordEndpoint,
VerifyEmailEndpoint,
ResetPasswordEndpoint,
RequestEmailVerificationEndpoint,
ChangePasswordEndpoint,
## End Auth Extender
# API Tokens
@@ -27,24 +25,15 @@ from plane.app.views import (
urlpatterns = [
# Social Auth
path("email-check/", EmailCheckEndpoint.as_view(), name="email"),
path("social-auth/", OauthEndpoint.as_view(), name="oauth"),
# Auth
path("sign-up/", SignUpEndpoint.as_view(), name="sign-up"),
path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
# Magic Sign In/Up
path(
"magic-generate/", MagicSignInGenerateEndpoint.as_view(), name="magic-generate"
),
# magic sign in
path("magic-generate/", MagicGenerateEndpoint.as_view(), name="magic-generate"),
path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
# Email verification
path("email-verify/", VerifyEmailEndpoint.as_view(), name="email-verify"),
path(
"request-email-verify/",
RequestEmailVerificationEndpoint.as_view(),
name="request-reset-email",
),
# Password Manipulation
path(
"users/me/change-password/",

View File

@@ -7,7 +7,6 @@ from plane.app.views import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleIssueGroupedEndpoint,
)
@@ -44,11 +43,6 @@ urlpatterns = [
),
name="project-issue-cycle",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueGroupedEndpoint.as_view(),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
CycleIssueViewSet.as_view(

View File

@@ -3,8 +3,6 @@ from django.urls import path
from plane.app.views import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
@@ -37,16 +35,6 @@ urlpatterns = [
),
name="project-issue",
),
path(
"v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListEndpoint.as_view(),
name="project-issue",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListGroupedEndpoint.as_view(),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(

View File

@@ -7,7 +7,6 @@ from plane.app.views import (
ModuleLinkViewSet,
ModuleFavoriteViewSet,
BulkImportModulesEndpoint,
ModuleIssueGroupedEndpoint,
)
@@ -44,11 +43,6 @@ urlpatterns = [
),
name="project-module-issues",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueGroupedEndpoint.as_view(),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
ModuleIssueViewSet.as_view(

View File

@@ -65,7 +65,7 @@ urlpatterns = [
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
"users/me/workspaces/<str:slug>/projects/invitations/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
@@ -75,7 +75,7 @@ urlpatterns = [
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
"workspaces/<str:slug>/projects/<uuid:project_id>/join/<uuid:pk>/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),

View File

@@ -7,6 +7,7 @@ from plane.app.views import (
UpdateUserTourCompletedEndpoint,
UserActivityEndpoint,
ChangePasswordEndpoint,
SetUserPasswordEndpoint,
## End User
## Workspaces
UserWorkSpacesEndpoint,
@@ -63,7 +64,7 @@ urlpatterns = [
name="user-tour",
),
path(
"users/workspaces/<str:slug>/activities/",
"users/me/activities/",
UserActivityEndpoint.as_view(),
name="user-activities",
),
@@ -89,5 +90,10 @@ urlpatterns = [
UserWorkspaceDashboardEndpoint.as_view(),
name="user-workspace-dashboard",
),
path(
"users/me/set-password/",
SetUserPasswordEndpoint.as_view(),
name="set-password",
),
## End User Graph
]

View File

@@ -5,7 +5,7 @@ from plane.app.views import (
IssueViewViewSet,
GlobalViewViewSet,
GlobalViewIssuesViewSet,
IssueViewFavoriteViewSet,
IssueViewFavoriteViewSet,
)

View File

@@ -65,6 +65,7 @@ urlpatterns = [
{
"delete": "destroy",
"get": "retrieve",
"patch": "partial_update",
}
),
name="workspace-invitations",

View File

@@ -58,13 +58,10 @@ from .cycle import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleIssueGroupedEndpoint,
)
from .asset import FileAssetEndpoint, UserAssetsEndpoint
from .asset import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
from .issue import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
WorkSpaceIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
@@ -85,20 +82,19 @@ from .issue import (
)
from .auth_extended import (
VerifyEmailEndpoint,
RequestEmailVerificationEndpoint,
ForgotPasswordEndpoint,
ResetPasswordEndpoint,
ChangePasswordEndpoint,
SetUserPasswordEndpoint,
EmailCheckEndpoint,
MagicGenerateEndpoint,
)
from .authentication import (
SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicSignInEndpoint,
MagicSignInGenerateEndpoint,
)
from .module import (
@@ -106,7 +102,6 @@ from .module import (
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
ModuleIssueGroupedEndpoint,
)
from .api import ApiTokenEndpoint
@@ -135,7 +130,6 @@ from .page import (
PageFavoriteViewSet,
PageLogEndpoint,
SubPagesEndpoint,
CreateIssueFromBlockEndpoint,
)
from .search import GlobalSearchEndpoint, IssueSearchEndpoint
@@ -168,4 +162,8 @@ from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint
from .webhook import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)

View File

@@ -4,7 +4,7 @@ from rest_framework.response import Response
from rest_framework.parsers import MultiPartParser, FormParser, JSONParser
# Module imports
from .base import BaseAPIView
from .base import BaseAPIView, BaseViewSet
from plane.db.models import FileAsset, Workspace
from plane.app.serializers import FileAssetSerializer
@@ -34,10 +34,20 @@ class FileAssetEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, workspace_id, asset_key):
def delete(self, request, workspace_id, asset_key):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = request.data.get("is_deleted", file_asset.is_deleted)
file_asset.is_deleted = True
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class FileAssetViewSet(BaseViewSet):
def restore(self, request, workspace_id, asset_key):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = False
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -63,8 +73,6 @@ class UserAssetsEndpoint(BaseAPIView):
def delete(self, request, asset_key):
file_asset = FileAsset.objects.get(asset=asset_key, created_by=request.user)
# Delete the file from storage
file_asset.asset.delete(save=False)
# Delete the file object
file_asset.delete()
file_asset.is_deleted = True
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -1,5 +1,9 @@
## Python imports
import jwt
import uuid
import os
import json
import random
import string
## Django imports
from django.contrib.auth.tokens import PasswordResetTokenGenerator
@@ -8,80 +12,117 @@ from django.utils.encoding import (
smart_bytes,
DjangoUnicodeDecodeError,
)
from django.contrib.auth.hashers import make_password
from django.utils.http import urlsafe_base64_decode, urlsafe_base64_encode
from django.core.validators import validate_email
from django.core.exceptions import ValidationError
from django.conf import settings
## Third Party Imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework import permissions
from rest_framework.permissions import AllowAny
from rest_framework_simplejwt.tokens import RefreshToken
from sentry_sdk import capture_exception
## Module imports
from . import BaseAPIView
from plane.app.serializers import (
ChangePasswordSerializer,
ResetPasswordSerializer,
UserSerializer,
)
from plane.db.models import User
from plane.bgtasks.email_verification_task import email_verification
from plane.db.models import User, WorkspaceMemberInvite
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.forgot_password_task import forgot_password
from plane.license.models import Instance
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
from plane.bgtasks.event_tracking_task import auth_events
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = request.META.get('HTTP_ORIGIN')
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
return Response(
{"message": "Email sent successfully"}, status=status.HTTP_200_OK
)
def get_tokens_for_user(user):
refresh = RefreshToken.for_user(user)
return (
str(refresh.access_token),
str(refresh),
)
class VerifyEmailEndpoint(BaseAPIView):
def get(self, request):
token = request.GET.get("token")
try:
payload = jwt.decode(token, settings.SECRET_KEY, algorithms="HS256")
user = User.objects.get(id=payload["user_id"])
def generate_magic_token(email):
key = "magic_" + str(email)
if not user.is_email_verified:
user.is_email_verified = True
user.save()
return Response(
{"email": "Successfully activated"}, status=status.HTTP_200_OK
)
except jwt.ExpiredSignatureError as _indentifier:
return Response(
{"email": "Activation expired"}, status=status.HTTP_400_BAD_REQUEST
)
except jwt.exceptions.DecodeError as _indentifier:
return Response(
{"email": "Invalid token"}, status=status.HTTP_400_BAD_REQUEST
)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
# Initialize the redis instance
ri = redis_instance()
# Check if the key already exists in redis
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return key, token, False
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
return key, token, True
def generate_password_token(user):
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
return uidb64, token
class ForgotPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email")
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = request.META.get('HTTP_ORIGIN')
try:
validate_email(email)
except ValidationError:
return Response(
{"error": "Please enter a valid email"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the user
user = User.objects.filter(email=email).first()
if user:
# Get the reset token for user
uidb64, token = generate_password_token(user=user)
current_site = request.META.get("HTTP_ORIGIN")
# send the forgot password email
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site
)
return Response(
{"message": "Check your email to reset your password"},
status=status.HTTP_200_OK,
@@ -92,30 +133,40 @@ class ForgotPasswordEndpoint(BaseAPIView):
class ResetPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
permission_classes = [
AllowAny,
]
def post(self, request, uidb64, token):
try:
# Decode the id from the uidb64
id = smart_str(urlsafe_base64_decode(uidb64))
user = User.objects.get(id=id)
# check if the token is valid for the user
if not PasswordResetTokenGenerator().check_token(user, token):
return Response(
{"error": "token is not valid, please check the new one"},
{"error": "Token is invalid"},
status=status.HTTP_401_UNAUTHORIZED,
)
serializer = ResetPasswordSerializer(data=request.data)
# Reset the password
serializer = ResetPasswordSerializer(data=request.data)
if serializer.is_valid():
# set_password also hashes the password that the user will get
user.set_password(serializer.data.get("new_password"))
user.is_password_autoset = False
user.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
# Log the user in
# Generate access token for the user
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(response)
return Response(data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except DjangoUnicodeDecodeError as indentifier:
@@ -128,24 +179,289 @@ class ResetPasswordEndpoint(BaseAPIView):
class ChangePasswordEndpoint(BaseAPIView):
def post(self, request):
serializer = ChangePasswordSerializer(data=request.data)
user = User.objects.get(pk=request.user.id)
if serializer.is_valid():
# Check old password
if not user.object.check_password(serializer.data.get("old_password")):
if not user.check_password(serializer.data.get("old_password")):
return Response(
{"old_password": ["Wrong password."]},
{"error": "Old password is not correct"},
status=status.HTTP_400_BAD_REQUEST,
)
# set_password also hashes the password that the user will get
self.object.set_password(serializer.data.get("new_password"))
self.object.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
}
return Response(response)
user.set_password(serializer.data.get("new_password"))
user.is_password_autoset = False
user.save()
return Response(
{"message": "Password updated successfully"}, status=status.HTTP_200_OK
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class SetUserPasswordEndpoint(BaseAPIView):
def post(self, request):
user = User.objects.get(pk=request.user.id)
password = request.data.get("password", False)
# If the user password is not autoset then return error
if not user.is_password_autoset:
return Response(
{
"error": "Your password is already set please change your password from profile"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Validate the password: it must be provided and at least 8 characters long
if not password or len(str(password)) < 8:
return Response(
{"error": "Password is not valid"}, status=status.HTTP_400_BAD_REQUEST
)
# Set the user password
user.set_password(password)
user.is_password_autoset = False
user.save()
serializer = UserSerializer(user)
return Response(serializer.data, status=status.HTTP_200_OK)
class MagicGenerateEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email", False)
# Check the instance registration
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
if not email:
return Response(
{"error": "Please provide a valid email address"},
status=status.HTTP_400_BAD_REQUEST,
)
# Clean up the email
email = email.strip().lower()
validate_email(email)
# Check whether the email already exists
if not User.objects.filter(email=email).exists():
# Create a user
_ = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
ri = redis_instance()
key = "magic_" + str(email)
# Check if the key already exists in Redis
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
# If the smtp is configured send through here
current_site = request.META.get("HTTP_ORIGIN")
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
class EmailCheckEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
# Check the instance registration
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get configuration values
ENABLE_SIGNUP, ENABLE_MAGIC_LINK_LOGIN = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
"default": os.environ.get("ENABLE_SIGNUP"),
},
{
"key": "ENABLE_MAGIC_LINK_LOGIN",
"default": os.environ.get("ENABLE_MAGIC_LINK_LOGIN"),
},
]
)
email = request.data.get("email", False)
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
)
# validate the email
try:
validate_email(email)
except ValidationError:
return Response(
{"error": "Email is not valid"}, status=status.HTTP_400_BAD_REQUEST
)
# Check if the user exists
user = User.objects.filter(email=email).first()
current_site = request.META.get("HTTP_ORIGIN")
# If new user
if user is None:
# Create the user
if (
ENABLE_SIGNUP == "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Create the user with default values
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
if not bool(ENABLE_MAGIC_LINK_LOGIN):
return Response(
{"error": "Magic link sign in is disabled."},
status=status.HTTP_400_BAD_REQUEST,
)
# Send event
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=True,
)
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
# Trigger the email
magic_link.delay(email, "magic_" + str(email), token, current_site)
return Response(
{"is_password_autoset": user.is_password_autoset, "is_existing": False},
status=status.HTTP_200_OK,
)
# Existing user
else:
if user.is_password_autoset:
## Generate a random token
if not bool(ENABLE_MAGIC_LINK_LOGIN):
return Response(
{"error": "Magic link sign in is disabled."},
status=status.HTTP_400_BAD_REQUEST,
)
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=False,
)
# Generate magic token
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
# Trigger the email
magic_link.delay(email, key, token, current_site)
return Response(
{
"is_password_autoset": user.is_password_autoset,
"is_existing": True,
},
status=status.HTTP_200_OK,
)
else:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="EMAIL",
first_time=False,
)
# User should enter password to login
return Response(
{
"is_password_autoset": user.is_password_autoset,
"is_existing": True,
},
status=status.HTTP_200_OK,
)
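In every branch, EmailCheckEndpoint answers with the same two flags, which is what lets the frontend decide between a magic-code prompt and a password prompt. A small hedged sketch of that client-side branching; the endpoint path is an assumption, the two flags come from the responses above:

import requests

resp = requests.post(
    "https://plane.example.com/api/email-check/",  # assumed path
    json={"email": "user@example.com"},
)
payload = resp.json()

if not payload["is_existing"] or payload["is_password_autoset"]:
    next_screen = "enter_magic_code"  # a code has just been emailed by the endpoint
else:
    next_screen = "enter_password"    # existing user with a real password
print(next_screen)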


@@ -1,10 +1,7 @@
# Python imports
import os
import uuid
import random
import string
import json
import requests
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@@ -18,8 +15,7 @@ from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework import status
from rest_framework_simplejwt.tokens import RefreshToken
from sentry_sdk import capture_exception, capture_message
from sentry_sdk import capture_message
# Module imports
from . import BaseAPIView
@@ -31,7 +27,9 @@ from plane.db.models import (
ProjectMember,
)
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
from plane.license.models import Instance
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.event_tracking_task import auth_events
def get_tokens_for_user(user):
@@ -46,17 +44,16 @@ class SignUpEndpoint(BaseAPIView):
permission_classes = (AllowAny,)
def post(self, request):
if not settings.ENABLE_SIGNUP:
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
email = request.data.get("email", False)
password = request.data.get("password", False)
## Raise exception if any of the above are missing
if not email or not password:
return Response(
@@ -64,8 +61,8 @@ class SignUpEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Validate the email
email = email.strip().lower()
try:
validate_email(email)
except ValidationError as e:
@@ -74,6 +71,31 @@ class SignUpEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# get configuration values
# Get configuration values
ENABLE_SIGNUP, = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
"default": os.environ.get("ENABLE_SIGNUP"),
},
]
)
# If sign up is disabled and the user has no pending invite, disallow account creation
if (
ENABLE_SIGNUP == "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the user already exists
if User.objects.filter(email=email).exists():
return Response(
@@ -85,6 +107,7 @@ class SignUpEndpoint(BaseAPIView):
user.set_password(password)
# Set last active timestamps for the user
user.is_password_autoset = False
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -92,93 +115,13 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
@@ -186,6 +129,14 @@ class SignInEndpoint(BaseAPIView):
permission_classes = (AllowAny,)
def post(self, request):
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
email = request.data.get("email", False)
password = request.data.get("password", False)
@@ -196,8 +147,8 @@ class SignInEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Validate email
email = email.strip().lower()
try:
validate_email(email)
except ValidationError as e:
@@ -206,33 +157,53 @@ class SignInEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Get the user
user = User.objects.filter(email=email).first()
if user is None:
return Response(
{
"error": "Sorry, we could not find a user with the provided credentials. Please try again."
},
status=status.HTTP_403_FORBIDDEN,
)
# Existing user
if user:
# Check user password
if not user.check_password(password):
return Response(
{
"error": "Sorry, we could not find a user with the provided credentials. Please try again."
},
status=status.HTTP_403_FORBIDDEN,
)
# Sign up Process
if not user.check_password(password):
return Response(
{
"error": "Sorry, we could not find a user with the provided credentials. Please try again."
},
status=status.HTTP_403_FORBIDDEN,
# Create the user
else:
ENABLE_SIGNUP, = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
"default": os.environ.get("ENABLE_SIGNUP"),
},
]
)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
# Create the user
if (
ENABLE_SIGNUP == "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(password),
is_password_autoset=False,
)
# Set last active timestamps for the user
user.is_active = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -288,7 +259,8 @@ class SignInEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -296,30 +268,16 @@ class SignInEndpoint(BaseAPIView):
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
# Send event
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="EMAIL",
first_time=False,
)
access_token, refresh_token = get_tokens_for_user(user)
data = {
@@ -352,77 +310,20 @@ class SignOutEndpoint(BaseAPIView):
return Response({"message": "success"}, status=status.HTTP_200_OK)
class MagicSignInGenerateEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email", False)
if not email:
return Response(
{"error": "Please provide a valid email address"},
status=status.HTTP_400_BAD_REQUEST,
)
# Clean up
email = email.strip().lower()
validate_email(email)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
ri = redis_instance()
key = "magic_" + str(email)
# Check if the key already exists in Redis
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
class MagicSignInEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
user_token = request.data.get("token", "").strip()
key = request.data.get("key", False).strip().lower()
@@ -441,71 +342,27 @@ class MagicSignInEndpoint(BaseAPIView):
email = data["email"]
if str(token) == str(user_token):
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
else:
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
user = User.objects.get(email=email)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
# Send event
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=False,
)
user.is_active = True
user.is_email_verified = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -561,7 +418,8 @@ class MagicSignInEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)


@@ -43,20 +43,25 @@ class TimezoneMixin:
class WebhookMixin:
webhook_event = None
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
# Check for the case should webhook be sent
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
# Push the object to delay
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
payload=response.data,
kw=self.kwargs,
action=self.request.method,
slug=self.workspace_slug,
bulk=self.bulk,
)
return response
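WebhookMixin only fires for mutating methods (POST, PATCH, DELETE) that returned a 2xx status, and a viewset opts in simply by declaring webhook_event — plus bulk when an endpoint writes many rows at once — exactly as CycleIssueViewSet and ModuleIssueViewSet do elsewhere in this diff. A minimal sketch of that opt-in contract; the viewset name and serializer are hypothetical:

class ExampleIssueViewSet(WebhookMixin, BaseViewSet):
    # Hypothetical viewset shown only to illustrate how the mixin is wired in.
    serializer_class = IssueSerializer
    model = Issue
    webhook_event = "issue"  # event name passed through to send_webhook.delay
    bulk = False             # set True when create() persists a list of objects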


@@ -11,7 +11,6 @@ from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.license.models import Instance, InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@@ -21,84 +20,101 @@ class ConfigurationEndpoint(BaseAPIView):
]
def get(self, request):
instance_configuration = InstanceConfiguration.objects.values("key", "value")
# Get all the configuration
(
GOOGLE_CLIENT_ID,
GITHUB_CLIENT_ID,
GITHUB_APP_NAME,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
ENABLE_MAGIC_LINK_LOGIN,
ENABLE_EMAIL_PASSWORD,
SLACK_CLIENT_ID,
POSTHOG_API_KEY,
POSTHOG_HOST,
UNSPLASH_ACCESS_KEY,
OPENAI_API_KEY,
) = get_configuration_value(
[
{
"key": "GOOGLE_CLIENT_ID",
"default": os.environ.get("GOOGLE_CLIENT_ID", None),
},
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID", None),
},
{
"key": "GITHUB_APP_NAME",
"default": os.environ.get("GITHUB_APP_NAME", None),
},
{
"key": "EMAIL_HOST_USER",
"default": os.environ.get("EMAIL_HOST_USER", None),
},
{
"key": "EMAIL_HOST_PASSWORD",
"default": os.environ.get("EMAIL_HOST_PASSWORD", None),
},
{
"key": "ENABLE_MAGIC_LINK_LOGIN",
"default": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "1"),
},
{
"key": "ENABLE_EMAIL_PASSWORD",
"default": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
},
{
"key": "SLACK_CLIENT_ID",
"default": os.environ.get("SLACK_CLIENT_ID", "1"),
},
{
"key": "POSTHOG_API_KEY",
"default": os.environ.get("POSTHOG_API_KEY", "1"),
},
{
"key": "POSTHOG_HOST",
"default": os.environ.get("POSTHOG_HOST", "1"),
},
{
"key": "UNSPLASH_ACCESS_KEY",
"default": os.environ.get("UNSPLASH_ACCESS_KEY", "1"),
},
{
"key": "OPENAI_API_KEY",
"default": os.environ.get("OPENAI_API_KEY", "1"),
},
]
)
data = {}
# Authentication
data["google_client_id"] = get_configuration_value(
instance_configuration,
"GOOGLE_CLIENT_ID",
os.environ.get("GOOGLE_CLIENT_ID", None),
)
data["github_client_id"] = get_configuration_value(
instance_configuration,
"GITHUB_CLIENT_ID",
os.environ.get("GITHUB_CLIENT_ID", None),
)
data["github_app_name"] = get_configuration_value(
instance_configuration,
"GITHUB_APP_NAME",
os.environ.get("GITHUB_APP_NAME", None),
)
data["google_client_id"] = GOOGLE_CLIENT_ID
data["github_client_id"] = GITHUB_CLIENT_ID
data["github_app_name"] = GITHUB_APP_NAME
data["magic_login"] = (
bool(
get_configuration_value(
instance_configuration,
"EMAIL_HOST_USER",
os.environ.get("GITHUB_APP_NAME", None),
),
)
and bool(
get_configuration_value(
instance_configuration,
"EMAIL_HOST_PASSWORD",
os.environ.get("GITHUB_APP_NAME", None),
)
)
) and get_configuration_value(
instance_configuration, "ENABLE_MAGIC_LINK_LOGIN", "0"
) == "1"
data["email_password_login"] = (
get_configuration_value(
instance_configuration, "ENABLE_EMAIL_PASSWORD", "0"
)
== "1"
)
bool(EMAIL_HOST_USER) and bool(EMAIL_HOST_PASSWORD)
) and ENABLE_MAGIC_LINK_LOGIN == "1"
data["email_password_login"] = ENABLE_EMAIL_PASSWORD == "1"
# Slack client
data["slack_client_id"] = get_configuration_value(
instance_configuration,
"SLACK_CLIENT_ID",
os.environ.get("SLACK_CLIENT_ID", None),
)
data["slack_client_id"] = SLACK_CLIENT_ID
# Posthog
data["posthog_api_key"] = get_configuration_value(
instance_configuration,
"POSTHOG_API_KEY",
os.environ.get("POSTHOG_API_KEY", None),
)
data["posthog_host"] = get_configuration_value(
instance_configuration,
"POSTHOG_HOST",
os.environ.get("POSTHOG_HOST", None),
)
data["posthog_api_key"] = POSTHOG_API_KEY
data["posthog_host"] = POSTHOG_HOST
# Unsplash
data["has_unsplash_configured"] = bool(
get_configuration_value(
instance_configuration,
"UNSPLASH_ACCESS_KEY",
os.environ.get("UNSPLASH_ACCESS_KEY", None),
)
)
data["has_unsplash_configured"] = UNSPLASH_ACCESS_KEY
# Open AI settings
data["has_openai_configured"] = bool(
get_configuration_value(
instance_configuration,
"OPENAI_API_KEY",
os.environ.get("OPENAI_API_KEY", None),
)
)
data["has_openai_configured"] = bool(OPENAI_API_KEY)
# File size settings
data["file_size_limit"] = float(os.environ.get("FILE_SIZE_LIMIT", 5242880))
# is self managed
data["is_self_managed"] = bool(int(os.environ.get("IS_SELF_MANAGED", "1")))
return Response(data, status=status.HTTP_200_OK)
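The refactor above switches get_configuration_value from per-key lookups to a single batched call: callers pass a list of {"key", "default"} dicts and tuple-unpack the result in the same order (note the single-value unpacking `ENABLE_SIGNUP, = ...` used elsewhere in this diff). A plausible shape for the helper, inferred from its call sites; the real implementation in plane.license.utils.instance_value may differ:

from plane.license.models import InstanceConfiguration

def get_configuration_value(keys):
    # Inferred sketch, not the code shipped in this PR.
    stored = {
        row["key"]: row["value"]
        for row in InstanceConfiguration.objects.values("key", "value")
    }
    # Values come back in request order so callers can tuple-unpack them.
    return tuple(stored.get(item["key"]) or item["default"] for item in keys)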


@@ -502,7 +502,10 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
class CycleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
@@ -536,9 +539,8 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, cycle_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
@@ -573,24 +575,9 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
)
)
issues_data = IssueStateSerializer(issues, many=True).data
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
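This is the new response shape used across the issue list endpoints in this diff: server-side group_results grouping is dropped and the API returns a flat mapping keyed by issue id, leaving grouping to the client. A hedged sketch of the client-side grouping that implies; the function below is illustrative and not part of the repository:

from collections import defaultdict

def group_issues(issue_dict, group_by="priority"):
    # Client-side replacement for the removed server-side group_results call.
    grouped = defaultdict(list)
    for issue in issue_dict.values():
        grouped[issue.get(group_by)].append(issue)
    return dict(grouped)

# Usage: grouped = group_issues(response.json(), group_by="state")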
def create(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", [])
@@ -688,7 +675,6 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
)
issue_id = cycle_issue.issue_id
cycle_issue.delete()
issue_activity.delay(
type="cycle.activity.deleted",
requested_data=json.dumps(
@@ -698,64 +684,15 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
}
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=str(cycle_issue.issue_id),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
cycle_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id, cycle_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_cycle__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.filter(**filters)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class CycleDateCheckEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,


@@ -1,5 +1,6 @@
# Python imports
import requests
import os
# Third party imports
from openai import OpenAI
@@ -15,23 +16,31 @@ from plane.app.permissions import ProjectEntityPermission
from plane.db.models import Workspace, Project
from plane.app.serializers import ProjectLiteSerializer, WorkspaceLiteSerializer
from plane.utils.integrations.github import get_release_notes
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
class GPTIntegrationEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id):
OPENAI_API_KEY, GPT_ENGINE = get_configuration_value(
[
{
"key": "OPENAI_API_KEY",
"default": os.environ.get("OPENAI_API_KEY", None),
},
{
"key": "GPT_ENGINE",
"default": os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"),
},
]
)
# Get the configuration value
instance_configuration = InstanceConfiguration.objects.values("key", "value")
api_key = get_configuration_value(instance_configuration, "OPENAI_API_KEY")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
# Check the keys
if not api_key or not gpt_engine:
if not OPENAI_API_KEY or not GPT_ENGINE:
return Response(
{"error": "OpenAI API key and engine is required"},
status=status.HTTP_400_BAD_REQUEST,
@@ -47,16 +56,12 @@ class GPTIntegrationEndpoint(BaseAPIView):
final_text = task + "\n" + prompt
instance_configuration = InstanceConfiguration.objects.values("key", "value")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
client = OpenAI(
api_key=api_key,
api_key=OPENAI_API_KEY,
)
response = client.chat.completions.create(
model=gpt_engine,
model=GPT_ENGINE,
messages=[{"role": "user", "content": final_text}],
)
@@ -83,16 +88,28 @@ class ReleaseNotesEndpoint(BaseAPIView):
class UnsplashEndpoint(BaseAPIView):
def get(self, request):
UNSPLASH_ACCESS_KEY, = get_configuration_value(
[
{
"key": "UNSPLASH_ACCESS_KEY",
"default": os.environ.get("UNSPLASH_ACCESS_KEY"),
}
]
)
# Check unsplash access key
if not UNSPLASH_ACCESS_KEY:
return Response([], status=status.HTTP_200_OK)
# Query parameters
query = request.GET.get("query", False)
page = request.GET.get("page", 1)
per_page = request.GET.get("per_page", 20)
url = (
f"https://api.unsplash.com/search/photos/?client_id={settings.UNSPLASH_ACCESS_KEY}&query={query}&page=${page}&per_page={per_page}"
f"https://api.unsplash.com/search/photos/?client_id={UNSPLASH_ACCESS_KEY}&query={query}&page=${page}&per_page={per_page}"
if query
else f"https://api.unsplash.com/photos/?client_id={settings.UNSPLASH_ACCESS_KEY}&page={page}&per_page={per_page}"
else f"https://api.unsplash.com/photos/?client_id={UNSPLASH_ACCESS_KEY}&page={page}&per_page={per_page}"
)
headers = {


@@ -4,6 +4,7 @@ import random
from itertools import chain
# Django imports
from django.db import models
from django.utils import timezone
from django.db.models import (
Prefetch,
@@ -132,6 +133,7 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
@@ -215,25 +217,9 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -311,104 +297,6 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class IssueListEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
serializer = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
class IssueListGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class UserWorkSpaceIssues(BaseAPIView):
@method_decorator(gzip_page)
def get(self, request, slug):
@@ -596,7 +484,7 @@ class IssueActivityEndpoint(BaseAPIView):
class IssueCommentViewSet(WebhookMixin, BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
@@ -1083,6 +971,7 @@ class IssueArchiveViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
show_sub_issues = request.GET.get("show_sub_issues", "true")
@@ -1173,14 +1062,9 @@ class IssueArchiveViewSet(BaseViewSet):
else issue_queryset.filter(parent__isnull=True)
)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
return Response(group_results(issues, group_by), status=status.HTTP_200_OK)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def retrieve(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
@@ -1580,6 +1464,7 @@ class IssueDraftViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
@@ -1662,18 +1547,9 @@ class IssueDraftViewSet(BaseViewSet):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
grouped_results = group_results(issues, group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)


@@ -283,9 +283,12 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueViewSet(BaseViewSet):
class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module_issue"
bulk = True
filterset_fields = [
"issue__labels__id",
@@ -321,9 +324,8 @@ class ModuleIssueViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, module_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
@@ -357,24 +359,9 @@ class ModuleIssueViewSet(BaseViewSet):
.values("count")
)
)
issues_data = IssueStateSerializer(issues, many=True).data
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id, module_id):
issues = request.data.get("issues", [])
@@ -461,7 +448,6 @@ class ModuleIssueViewSet(BaseViewSet):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk
)
module_issue.delete()
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
@@ -471,63 +457,15 @@ class ModuleIssueViewSet(BaseViewSet):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=str(module_issue.issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
module_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id, module_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_module__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.filter(**filters)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class ModuleLinkViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,


@@ -2,7 +2,6 @@
import uuid
import requests
import os
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@@ -29,7 +28,10 @@ from plane.db.models import (
ProjectMemberInvite,
ProjectMember,
)
from plane.bgtasks.event_tracking_task import auth_events
from .base import BaseAPIView
from plane.license.models import Instance
from plane.license.utils.instance_value import get_configuration_value
def get_tokens_for_user(user):
@@ -133,10 +135,37 @@ class OauthEndpoint(BaseAPIView):
def post(self, request):
try:
# Check if instance is registered or not
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
medium = request.data.get("medium", False)
id_token = request.data.get("credential", False)
client_id = request.data.get("clientId", False)
GOOGLE_CLIENT_ID, GITHUB_CLIENT_ID = get_configuration_value(
[
{
"key": "GOOGLE_CLIENT_ID",
"default": os.environ.get("GOOGLE_CLIENT_ID"),
},
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID"),
},
]
)
if not GOOGLE_CLIENT_ID or not GITHUB_CLIENT_ID:
return Response(
{"error": "Github or Google login is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
if not medium or not id_token:
return Response(
{
@@ -174,15 +203,7 @@ class OauthEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
## Login Case
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
user.is_active = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -239,7 +260,8 @@ class OauthEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -256,29 +278,17 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
# Send event
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium=medium.upper(),
first_time=False,
)
access_token, refresh_token = get_tokens_for_user(user)
@@ -289,7 +299,26 @@ class OauthEndpoint(BaseAPIView):
return Response(data, status=status.HTTP_200_OK)
except User.DoesNotExist:
## Signup Case
ENABLE_SIGNUP, = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
"default": os.environ.get("ENABLE_SIGNUP", "0"),
}
]
)
if (
ENABLE_SIGNUP == "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
username = uuid.uuid4().hex
@@ -305,7 +334,7 @@ class OauthEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
user = User(
user = User.objects.create(
username=username,
email=email,
mobile_number=mobile_number,
@@ -316,7 +345,6 @@ class OauthEndpoint(BaseAPIView):
)
user.set_password(uuid.uuid4().hex)
user.is_password_autoset = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -373,7 +401,8 @@ class OauthEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -381,29 +410,16 @@ class OauthEndpoint(BaseAPIView):
workspace_member_invites.delete()
project_member_invites.delete()
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
# Send event
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium=medium.upper(),
first_time=True,
)
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -420,4 +436,5 @@ class OauthEndpoint(BaseAPIView):
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_201_CREATED)
return Response(data, status=status.HTTP_201_CREATED)

View File

@@ -22,6 +22,7 @@ from plane.db.models import (
IssueAssignee,
IssueActivity,
PageLog,
ProjectMember,
)
from plane.app.serializers import (
PageSerializer,
@@ -140,12 +141,6 @@ class PageViewSet(BaseViewSet):
pk=page_id, workspace__slug=slug, project_id=project_id
).first()
# only the owner can lock the page
if request.user.id != page.owned_by_id:
return Response(
{"error": "Only the page owner can lock the page"},
)
page.is_locked = True
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -155,12 +150,6 @@ class PageViewSet(BaseViewSet):
pk=page_id, workspace__slug=slug, project_id=project_id
).first()
# only the owner can unlock the page
if request.user.id != page.owned_by_id:
return Response(
{"error": "Only the page owner can unlock the page"},
status=status.HTTP_400_BAD_REQUEST,
)
page.is_locked = False
page.save()
@@ -175,10 +164,16 @@ class PageViewSet(BaseViewSet):
def archive(self, request, slug, project_id, page_id):
page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)
if page.owned_by_id != request.user.id:
# only the owner and admin can archive the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
).exists()
or request.user.id != page.owned_by_id
):
return Response(
{"error": "Only the owner of the page can archive a page"},
status=status.HTTP_204_NO_CONTENT,
{"error": "Only the owner and admin can archive the page"},
status=status.HTTP_400_BAD_REQUEST,
)
unarchive_archive_page_and_descendants(page_id, datetime.now())
@@ -188,9 +183,15 @@ class PageViewSet(BaseViewSet):
def unarchive(self, request, slug, project_id, page_id):
page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)
if page.owned_by_id != request.user.id:
# only the owner and admin can unarchive the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
).exists()
or request.user.id != page.owned_by_id
):
return Response(
{"error": "Only the owner of the page can unarchive a page"},
{"error": "Only the owner and admin can un archive the page"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -216,6 +217,18 @@ class PageViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
# only the owner and admin can delete the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
).exists()
or request.user.id != page.owned_by_id
):
return Response(
{"error": "Only the owner and admin can delete the page"},
status=status.HTTP_400_BAD_REQUEST,
)
if page.archived_at is None:
return Response(
{"error": "The page should be archived before deleting"},
@@ -227,7 +240,6 @@ class PageViewSet(BaseViewSet):
parent_id=pk, project_id=project_id, workspace__slug=slug
).update(parent=None)
page.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -310,36 +322,6 @@ class PageLogEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
class CreateIssueFromBlockEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id, page_id):
page = Page.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=page_id,
)
issue = Issue.objects.create(
name=request.data.get("name"),
project_id=project_id,
)
_ = IssueAssignee.objects.create(
issue=issue, assignee=request.user, project_id=project_id
)
_ = IssueActivity.objects.create(
issue=issue,
actor=request.user,
project_id=project_id,
comment=f"created the issue from {page.name} block",
verb="created",
)
return Response(IssueLiteSerializer(issue).data, status=status.HTTP_200_OK)
class SubPagesEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,


@@ -165,6 +165,7 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
workspace__slug=slug,
is_active=True,
).select_related("member"),
to_attr='members_list'
)
)
.order_by("sort_order", "name")
@@ -388,7 +389,7 @@ class ProjectInvitationsViewset(BaseViewSet):
{"error": "You cannot invite a user with higher role"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.get(slug=slug)
project_invitations = []
@@ -424,7 +425,7 @@ class ProjectInvitationsViewset(BaseViewSet):
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get('HTTP_ORIGIN')
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in project_invitations:
@@ -469,6 +470,13 @@ class UserProjectInvitationsViewset(BaseViewSet):
workspace_role = workspace_member.role
workspace = workspace_member.workspace
# If the user was already part of the workspace, reactivate their project memberships
_ = ProjectMember.objects.filter(
workspace__slug=slug,
project_id__in=project_ids,
member=request.user,
).update(is_active=True)
ProjectMember.objects.bulk_create(
[
ProjectMember(
@@ -1040,4 +1048,4 @@ class ProjectDeployBoardViewSet(BaseViewSet):
project_deploy_board.save()
serializer = ProjectDeployBoardSerializer(project_deploy_board)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.data, status=status.HTTP_200_OK)


@@ -77,7 +77,7 @@ class StateViewSet(BaseViewSet):
)
if state.default:
return Response({"error": "Default state cannot be deleted"}, status=False)
return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()


@@ -12,11 +12,14 @@ from plane.app.serializers import (
)
from plane.app.views.base import BaseViewSet, BaseAPIView
from plane.db.models import User, IssueActivity, WorkspaceMember
from plane.db.models import User, IssueActivity, WorkspaceMember, ProjectMember
from plane.license.models import Instance, InstanceAdmin
from plane.utils.paginator import BasePaginator
from django.db.models import Q, F, Count, Case, When, IntegerField
class UserEndpoint(BaseViewSet):
serializer_class = UserSerializer
model = User
@@ -45,16 +48,79 @@ class UserEndpoint(BaseViewSet):
def deactivate(self, request):
# Check all workspace user is active
user = self.get_object()
if WorkspaceMember.objects.filter(member=request.user, is_active=True).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
projects_to_deactivate = []
workspaces_to_deactivate = []
projects = ProjectMember.objects.filter(
member=request.user, is_active=True
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
default=0,
output_field=IntegerField(),
)
),
total_members=Count("id"),
)
for project in projects:
if project.other_admin_exists > 0 or (project.total_members == 1):
project.is_active = False
projects_to_deactivate.append(project)
else:
return Response(
{
"error": "You cannot deactivate account as you are the only admin in some projects."
},
status=status.HTTP_400_BAD_REQUEST,
)
workspaces = WorkspaceMember.objects.filter(
member=request.user, is_active=True
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
default=0,
output_field=IntegerField(),
)
),
total_members=Count("id"),
)
for workspace in workspaces:
if workspace.other_admin_exists > 0 or (workspace.total_members == 1):
workspace.is_active = False
workspaces_to_deactivate.append(workspace)
else:
return Response(
{
"error": "You cannot deactivate account as you are the only admin in some workspaces."
},
status=status.HTTP_400_BAD_REQUEST,
)
ProjectMember.objects.bulk_update(
projects_to_deactivate, ["is_active"], batch_size=100
)
WorkspaceMember.objects.bulk_update(
workspaces_to_deactivate, ["is_active"], batch_size=100
)
# Deactivate the user
user.is_active = False
user.last_workspace_id = None
user.is_tour_completed = False
user.is_onboarded = False
user.onboarding_step = {
"workspace_join": False,
"profile_complete": False,
"workspace_create": False,
"workspace_invite": False,
}
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -76,10 +142,10 @@ class UpdateUserTourCompletedEndpoint(BaseAPIView):
class UserActivityEndpoint(BaseAPIView, BasePaginator):
def get(self, request, slug):
queryset = IssueActivity.objects.filter(
actor=request.user, workspace__slug=slug
).select_related("actor", "workspace", "issue", "project")
def get(self, request):
queryset = IssueActivity.objects.filter(actor=request.user).select_related(
"actor", "workspace", "issue", "project"
)
return self.paginate(
request=request,
@@ -88,3 +154,4 @@ class UserActivityEndpoint(BaseAPIView, BasePaginator):
issue_activities, many=True
).data,
)


@@ -20,7 +20,7 @@ from rest_framework.response import Response
from rest_framework import status
# Module imports
from . import BaseViewSet
from . import BaseViewSet, BaseAPIView
from plane.app.serializers import (
GlobalViewSerializer,
IssueViewSerializer,
@@ -95,6 +95,7 @@ class GlobalViewIssuesViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
@@ -177,25 +178,13 @@ class GlobalViewIssuesViewSet(BaseViewSet):
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class IssueViewViewSet(BaseViewSet):


@@ -20,9 +20,10 @@ class WebhookEndpoint(BaseAPIView):
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
serializer = WebhookSerializer(
data=request.data, context={"request": request}
)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
@@ -79,6 +80,7 @@ class WebhookEndpoint(BaseAPIView):
serializer = WebhookSerializer(
webhook,
data=request.data,
context={request: request},
partial=True,
fields=(
"id",

View File

@@ -73,8 +73,7 @@ from plane.app.permissions import (
)
from plane.bgtasks.workspace_invitation_task import workspace_invitation
from plane.utils.issue_filters import issue_filters
from plane.utils.grouper import group_results
from plane.bgtasks.event_tracking_task import workspace_invite_event
class WorkSpaceViewSet(BaseViewSet):
model = Workspace
@@ -113,7 +112,10 @@ class WorkSpaceViewSet(BaseViewSet):
return (
self.filter_queryset(super().get_queryset().select_related("owner"))
.order_by("name")
.filter(workspace_member__member=self.request.user)
.filter(
workspace_member__member=self.request.user,
workspace_member__is_active=True,
)
.annotate(total_members=member_count)
.annotate(total_issues=issue_count)
.select_related("owner")
@@ -189,17 +191,21 @@ class UserWorkSpacesEndpoint(BaseAPIView):
)
workspace = (
(
Workspace.objects.prefetch_related(
Prefetch("workspace_member", queryset=WorkspaceMember.objects.all())
Workspace.objects.prefetch_related(
Prefetch(
"workspace_member",
queryset=WorkspaceMember.objects.filter(
member=request.user, is_active=True
),
)
.filter(
workspace_member__member=request.user,
)
.select_related("owner")
)
.select_related("owner")
.annotate(total_members=member_count)
.annotate(total_issues=issue_count)
.filter(
workspace_member__member=request.user, workspace_member__is_active=True
)
.distinct()
)
serializer = WorkSpaceSerializer(self.filter_queryset(workspace), many=True)
@@ -319,7 +325,7 @@ class WorkspaceInvitationsViewset(BaseViewSet):
workspace_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get('HTTP_ORIGIN')
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in workspace_invitations:
@@ -400,6 +406,16 @@ class WorkspaceJoinEndpoint(BaseAPIView):
# Delete the invitation
workspace_invite.delete()
# Send event
workspace_invite_event.delay(
user=user.id if user is not None else None,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="MEMBER_ACCEPTED",
accepted_from="EMAIL",
)
return Response(
{"message": "Workspace Invitation Accepted"},
@@ -418,7 +434,9 @@ class WorkspaceJoinEndpoint(BaseAPIView):
)
def get(self, request, slug, pk):
workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
workspace_invitation = WorkspaceMemberInvite.objects.get(
workspace__slug=slug, pk=pk
)
serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -590,7 +608,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__member_id=workspace_member.id,
project_projectmember__role=20,
),
),
@@ -600,7 +618,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
"error": "User is a part of some projects where they are the only admin, they should either leave that project or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -635,7 +653,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "You cannot leave the workspace as your the only admin of the workspace you will have to either delete the workspace or create an another admin"
"error": "You cannot leave the workspace as you are the only admin of the workspace you will have to either delete the workspace or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -656,7 +674,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
"error": "You are a part of some projects where you are the only admin, you should either leave the project or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -1198,6 +1216,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
]
def get(self, request, slug, user_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
@@ -1299,18 +1318,11 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
grouped_results = group_results(issues, group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):

View File

@@ -1,6 +1,8 @@
# Python imports
import csv
import io
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
@@ -16,8 +18,7 @@ from sentry_sdk import capture_exception
from plane.db.models import Issue
from plane.utils.analytics_plot import build_graph_plot
from plane.utils.issue_filters import issue_filters
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.utils.instance_value import get_email_configuration
row_mapping = {
"state__name": "State",
@@ -32,7 +33,7 @@ row_mapping = {
"priority": "Priority",
"estimate": "Estimate",
"issue_cycle__cycle_id": "Cycle",
"issue_module__module_id": "Module"
"issue_module__module_id": "Module",
}
ASSIGNEE_ID = "assignees__id"
@@ -42,7 +43,7 @@ CYCLE_ID = "issue_cycle__cycle_id"
MODULE_ID = "issue_module__module_id"
def send_export_email(email, slug, csv_buffer):
def send_export_email(email, slug, csv_buffer, rows):
"""Helper function to send export email."""
subject = "Your Export is ready"
html_content = render_to_string("emails/exports/analytics.html", {})
@@ -50,20 +51,33 @@ def send_export_email(email, slug, csv_buffer):
csv_buffer.seek(0)
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration()
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
use_ssl=bool(get_configuration_value(instance_configuration, "EMAIL_USE_SSL", "0")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
msg.attach(f"{slug}-analytics.csv", csv_buffer.getvalue())
msg.send(fail_silently=False)
return
def get_assignee_details(slug, filters):
@@ -431,8 +445,11 @@ def analytic_export_task(email, data, slug):
)
csv_buffer = generate_csv_from_rows(rows)
send_export_email(email, slug, csv_buffer)
send_export_email(email, slug, csv_buffer, rows)
return
except Exception as e:
print(e)
if settings.DEBUG:
print(e)
capture_exception(e)
return

View File

@@ -1,57 +0,0 @@
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def email_verification(first_name, email, token, current_site):
try:
realtivelink = "/request-email-verification/" + "?token=" + str(token)
abs_url = current_site + realtivelink
from_email_string = settings.EMAIL_FROM
subject = "Verify your Email!"
context = {
"first_name": first_name,
"verification_url": abs_url,
}
html_content = render_to_string("emails/auth/email_verification.html", context)
text_content = strip_tags(html_content)
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return
except Exception as e:
# Print logs if in DEBUG mode
if settings.DEBUG:
print(e)
capture_exception(e)
return

View File

@@ -0,0 +1,78 @@
import uuid
import os
# third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from posthog import Posthog
# module imports
from plane.license.utils.instance_value import get_configuration_value
def posthogConfiguration():
POSTHOG_API_KEY, POSTHOG_HOST = get_configuration_value(
[
{
"key": "POSTHOG_API_KEY",
"default": os.environ.get("POSTHOG_API_KEY", None),
},
{
"key": "POSTHOG_HOST",
"default": os.environ.get("POSTHOG_HOST", None),
},
]
)
if POSTHOG_API_KEY and POSTHOG_HOST:
return POSTHOG_API_KEY, POSTHOG_HOST
else:
return None, None
@shared_task
def auth_events(user, email, user_agent, ip, event_name, medium, first_time):
try:
POSTHOG_API_KEY, POSTHOG_HOST = posthogConfiguration()
if POSTHOG_API_KEY and POSTHOG_HOST:
posthog = Posthog(POSTHOG_API_KEY, host=POSTHOG_HOST)
posthog.capture(
email,
event=event_name,
properties={
"event_id": uuid.uuid4().hex,
"user": {"email": email, "id": str(user)},
"device_ctx": {
"ip": ip,
"user_agent": user_agent,
},
"medium": medium,
"first_time": first_time
}
)
except Exception as e:
capture_exception(e)
@shared_task
def workspace_invite_event(user, email, user_agent, ip, event_name, accepted_from):
try:
POSTHOG_API_KEY, POSTHOG_HOST = posthogConfiguration()
if POSTHOG_API_KEY and POSTHOG_HOST:
posthog = Posthog(POSTHOG_API_KEY, host=POSTHOG_HOST)
posthog.capture(
email,
event=event_name,
properties={
"event_id": uuid.uuid4().hex,
"user": {"email": email, "id": str(user)},
"device_ctx": {
"ip": ip,
"user_agent": user_agent,
},
"accepted_from": accepted_from
}
)
except Exception as e:
capture_exception(e)

View File
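The new event tracking module above exposes auth_events and workspace_invite_event as Celery tasks; the workspace join view earlier in this diff already enqueues the latter with .delay(...). A hypothetical sketch of enqueuing the auth event from a sign-in handler; the import path, event name, and medium are assumptions, only the task signature comes from the diff:

# Assumed to live next to workspace_invite_event, which the workspace views import
# from plane.bgtasks.event_tracking_task.
from plane.bgtasks.event_tracking_task import auth_events

def track_sign_in(request, user):
    # Hypothetical values; only the keyword names mirror the task definition above.
    auth_events.delay(
        user=str(user.id),
        email=user.email,
        user_agent=request.META.get("HTTP_USER_AGENT"),
        ip=request.META.get("REMOTE_ADDR"),
        event_name="SIGN_IN",
        medium="EMAIL",
        first_time=False,
    )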

@@ -1,3 +1,8 @@
# Python import
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -9,39 +14,53 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def forgot_password(first_name, email, uidb64, token, current_site):
try:
realtivelink = f"/accounts/reset-password/?uidb64={uidb64}&token={token}"
abs_url = current_site + realtivelink
relative_link = (
f"/accounts/password/?uidb64={uidb64}&token={token}&email={email}"
)
abs_url = str(current_site) + relative_link
from_email_string = settings.EMAIL_FROM
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration()
subject = "Reset Your Password - Plane"
subject = "A new password to your Plane account has been requested"
context = {
"first_name": first_name,
"forgot_password_url": abs_url,
"email": email,
}
html_content = render_to_string("emails/auth/forgot_password.html", context)
text_content = strip_tags(html_content)
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -1,3 +1,8 @@
# Python imports
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -9,38 +14,48 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def magic_link(email, key, token, current_site):
try:
realtivelink = f"/magic-sign-in/?password={token}&key={key}"
abs_url = current_site + realtivelink
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration()
subject = "Login for Plane"
context = {"magic_url": abs_url, "code": token}
# Send the mail
subject = f"Your unique Plane login code is {token}"
context = {"code": token, "email": email}
html_content = render_to_string("emails/auth/magic_signin.html", context)
text_content = strip_tags(html_content)
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
msg.attach_alternative(html_content, "text/html")
msg.send()
return
except Exception as e:
print(e)
capture_exception(e)
# Print logs if in DEBUG mode
if settings.DEBUG:

View File

@@ -190,6 +190,7 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
issue_activities_created) if issue_activities_created is not None else None
)
if type not in [
"issue.activity.deleted",
"cycle.activity.created",
"cycle.activity.deleted",
"module.activity.created",

View File

@@ -1,3 +1,6 @@
# Python import
import os
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -10,8 +13,7 @@ from sentry_sdk import capture_exception
# Module imports
from plane.db.models import Project, User, ProjectMemberInvite
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def project_invitation(email, project_id, token, current_site, invitor):
@@ -25,8 +27,6 @@ def project_invitation(email, project_id, token, current_site, invitor):
relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
abs_url = current_site + relativelink
from_email_string = settings.EMAIL_FROM
subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"
context = {
@@ -46,16 +46,31 @@ def project_invitation(email, project_id, token, current_site, invitor):
project_member_invite.save()
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration()
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -2,15 +2,67 @@ import requests
import uuid
import hashlib
import json
import hmac
# Django imports
from django.conf import settings
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from plane.db.models import Webhook, WebhookLog
from plane.db.models import (
Webhook,
WebhookLog,
Project,
Issue,
Cycle,
Module,
ModuleIssue,
CycleIssue,
IssueComment,
)
from plane.api.serializers import (
ProjectSerializer,
IssueSerializer,
CycleSerializer,
ModuleSerializer,
CycleIssueSerializer,
ModuleIssueSerializer,
IssueCommentSerializer,
IssueExpandSerializer,
)
SERIALIZER_MAPPER = {
"project": ProjectSerializer,
"issue": IssueExpandSerializer,
"cycle": CycleSerializer,
"module": ModuleSerializer,
"cycle_issue": CycleIssueSerializer,
"module_issue": ModuleIssueSerializer,
"issue_comment": IssueCommentSerializer,
}
MODEL_MAPPER = {
"project": Project,
"issue": Issue,
"cycle": Cycle,
"module": Module,
"cycle_issue": CycleIssue,
"module_issue": ModuleIssue,
"issue_comment": IssueComment,
}
def get_model_data(event, event_id, many=False):
model = MODEL_MAPPER.get(event)
if many:
queryset = model.objects.filter(pk__in=event_id)
else:
queryset = model.objects.get(pk=event_id)
serializer = SERIALIZER_MAPPER.get(event)
return serializer(queryset, many=many).data
@shared_task(
@@ -31,18 +83,12 @@ def webhook_task(self, webhook, slug, event, event_data, action):
"X-Plane-Event": event,
}
# Your secret key
if webhook.secret_key:
# Concatenate the data and the secret key
message = event_data + webhook.secret_key
# Create a SHA-256 hash of the message
sha256 = hashlib.sha256()
sha256.update(message.encode("utf-8"))
signature = sha256.hexdigest()
headers["X-Plane-Signature"] = signature
event_data = json.loads(event_data) if event_data is not None else None
# # Your secret key
event_data = (
json.loads(json.dumps(event_data, cls=DjangoJSONEncoder))
if event_data is not None
else None
)
action = {
"POST": "create",
@@ -59,6 +105,16 @@ def webhook_task(self, webhook, slug, event, event_data, action):
"data": event_data,
}
# Use HMAC for generating signature
if webhook.secret_key:
hmac_signature = hmac.new(
webhook.secret_key.encode("utf-8"),
json.dumps(payload).encode("utf-8"),
hashlib.sha256,
)
signature = hmac_signature.hexdigest()
headers["X-Plane-Signature"] = signature
# Send the webhook event
response = requests.post(
webhook.url,
@@ -96,10 +152,6 @@ def webhook_task(self, webhook, slug, event, event_data, action):
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
return
raise requests.RequestException()
except Exception as e:
@@ -110,7 +162,7 @@ def webhook_task(self, webhook, slug, event, event_data, action):
@shared_task()
def send_webhook(event, event_data, action, slug):
def send_webhook(event, payload, kw, action, slug, bulk):
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
@@ -120,17 +172,48 @@ def send_webhook(event, event_data, action, slug):
if event == "issue":
webhooks = webhooks.filter(issue=True)
if event == "module":
if event == "module" or event == "module_issue":
webhooks = webhooks.filter(module=True)
if event == "cycle":
if event == "cycle" or event == "cycle_issue":
webhooks = webhooks.filter(cycle=True)
if event == "issue-comment":
if event == "issue_comment":
webhooks = webhooks.filter(issue_comment=True)
for webhook in webhooks:
webhook_task.delay(webhook.id, slug, event, event_data, action)
if webhooks:
if action in ["POST", "PATCH"]:
if bulk and event in ["cycle_issue", "module_issue"]:
event_data = IssueExpandSerializer(
Issue.objects.filter(
pk__in=[
str(event.get("issue")) for event in payload
]
).prefetch_related("issue_cycle", "issue_module"), many=True
).data
event = "issue"
action = "PATCH"
else:
event_data = [
get_model_data(
event=event,
event_id=payload.get("id") if isinstance(payload, dict) else None,
many=False,
)
]
if action == "DELETE":
event_data = [{"id": kw.get("pk")}]
for webhook in webhooks:
for data in event_data:
webhook_task.delay(
webhook=webhook.id,
slug=slug,
event=event,
event_data=data,
action=action,
)
except Exception as e:
if settings.DEBUG:

View File
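The webhook task now signs the serialized payload with HMAC-SHA256 under the webhook's secret key instead of hashing payload plus secret with plain SHA-256. A receiver can verify X-Plane-Signature with the same construction; a minimal sketch, assuming the receiver re-serializes the body exactly as the sender did before hashing:

import hashlib
import hmac
import json

def verify_plane_signature(secret_key, payload, received_signature):
    """Recompute the HMAC-SHA256 digest the sender puts in X-Plane-Signature
    and compare it in constant time."""
    expected = hmac.new(
        secret_key.encode("utf-8"),
        json.dumps(payload).encode("utf-8"),
        hashlib.sha256,
    ).hexdigest()
    return hmac.compare_digest(expected, received_signature)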

@@ -1,3 +1,8 @@
# Python imports
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -12,8 +17,7 @@ from slack_sdk.errors import SlackApiError
# Module imports
from plane.db.models import Workspace, WorkspaceMemberInvite, User
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.utils.instance_value import get_email_configuration
@shared_task
@@ -30,19 +34,26 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
relative_link = f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
# The complete url including the domain
abs_url = current_site + relative_link
abs_url = str(current_site) + relative_link
# The email from
from_email_string = settings.EMAIL_FROM
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration()
# Subject of the email
subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
subject = f"{user.first_name or user.display_name or user.email} has invited you to join them in {workspace.name} on Plane"
context = {
"email": email,
"first_name": invitor,
"first_name": user.first_name or user.display_name or user.email,
"workspace_name": workspace.name,
"invitation_url": abs_url,
"abs_url": abs_url,
}
html_content = render_to_string(
@@ -54,27 +65,18 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
workspace_member_invite.message = text_content
workspace_member_invite.save()
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(
get_configuration_value(instance_configuration, "EMAIL_PORT", "587")
),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(
instance_configuration, "EMAIL_HOST_PASSWORD"
),
use_tls=bool(
get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")
),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"),
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
@@ -94,6 +96,7 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
return
except (Workspace.DoesNotExist, WorkspaceMemberInvite.DoesNotExist) as e:
print("Workspace or WorkspaceMember Invite Does not exists")
return
except Exception as e:
# Print logs if in DEBUG mode

View File

@@ -33,4 +33,4 @@ app.conf.beat_schedule = {
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
app.conf.beat_scheduler = 'django_celery_beat.schedulers.DatabaseScheduler'
app.conf.beat_scheduler = "django_celery_beat.schedulers.DatabaseScheduler"

View File

@@ -0,0 +1,54 @@
# Python imports
import getpass
# Django imports
from django.core.management import BaseCommand
# Module imports
from plane.db.models import User
class Command(BaseCommand):
help = "Reset password of the user with the given email"
def add_arguments(self, parser):
# Positional argument
parser.add_argument("email", type=str, help="user email")
def handle(self, *args, **options):
# get the user email from console
email = options.get("email", False)
# raise error if email is not present
if not email:
self.stderr.write("Error: Email is required")
return
# filter the user
user = User.objects.filter(email=email).first()
# Raise error if the user is not present
if not user:
self.stderr.write(f"Error: User with {email} does not exists")
return
# get password for the user
password = getpass.getpass("Password: ")
confirm_password = getpass.getpass("Password (again): ")
# If the passwords don't match, raise an error
if password != confirm_password:
self.stderr.write("Error: Your passwords didn't match.")
return
# Blank passwords should not be allowed
if password.strip() == "":
self.stderr.write("Error: Blank passwords aren't allowed.")
return
# Set user password
user.set_password(password)
user.is_password_autoset = False
user.save()
self.stdout.write(self.style.SUCCESS(f"User password updated succesfully"))

View File

@@ -21,7 +21,7 @@ class Migration(migrations.Migration):
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('transaction', models.UUIDField(default=uuid.uuid4)),
('entity_identifier', models.UUIDField(null=True)),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('mention', 'Mention')], max_length=30, verbose_name='Transaction Type')),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('page_mention', 'Page Mention'), ('user_mention', 'User Mention')], max_length=30, verbose_name='Transaction Type')),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_log', to='db.page')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),

View File

@@ -1,6 +1,11 @@
# Generated by Django 4.2.5 on 2023-11-17 08:48
from django.db import migrations, models
import plane.db.models.workspace
def user_password_autoset(apps, schema_editor):
User = apps.get_model("db", "User")
User.objects.update(is_password_autoset=True)
class Migration(migrations.Migration):
@@ -20,4 +25,15 @@ class Migration(migrations.Migration):
name='organization_size',
field=models.CharField(blank=True, max_length=20, null=True),
),
migrations.AddField(
model_name='fileasset',
name='is_deleted',
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name='workspace',
name='slug',
field=models.SlugField(max_length=48, unique=True, validators=[plane.db.models.workspace.slug_validator]),
),
migrations.RunPython(user_password_autoset),
]

View File

@@ -1,18 +0,0 @@
# Generated by Django 4.2.7 on 2023-11-20 08:26
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('db', '0050_user_use_case_alter_workspace_organization_size'),
]
operations = [
migrations.AddField(
model_name='fileasset',
name='is_deleted',
field=models.BooleanField(default=False),
),
]

View File

@@ -7,7 +7,6 @@ from django.db import models
from django.conf import settings
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.utils import timezone
from django.core.validators import MinValueValidator, MaxValueValidator
from django.core.exceptions import ValidationError
@@ -141,8 +140,10 @@ class Issue(ProjectBaseModel):
)["largest"]
# aggregate can return None! Check it first.
# If it isn't None, just use the last ID specified (which should be the greatest) and add one to it
if last_id is not None:
if last_id:
self.sequence_id = last_id + 1
else:
self.sequence_id = 1
largest_sort_order = Issue.objects.filter(
project=self.project, state=self.state

View File

@@ -51,9 +51,9 @@ class Module(ProjectBaseModel):
def save(self, *args, **kwargs):
if self._state.adding:
smallest_sort_order = Module.objects.filter(
project=self.project
).aggregate(smallest=models.Min("sort_order"))["smallest"]
smallest_sort_order = Module.objects.filter(project=self.project).aggregate(
smallest=models.Min("sort_order")
)["smallest"]
if smallest_sort_order is not None:
self.sort_order = smallest_sort_order - 10000

View File

@@ -57,7 +57,8 @@ class PageLog(ProjectBaseModel):
("module", "Module"),
("back_link", "Back Link"),
("forward_link", "Forward Link"),
("mention", "Mention"),
("page_mention", "Page Mention"),
("user_mention", "User Mention"),
)
transaction = models.UUIDField(default=uuid.uuid4)
page = models.ForeignKey(

View File

@@ -16,7 +16,6 @@ def generate_token():
def validate_schema(value):
parsed_url = urlparse(value)
print(parsed_url)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")

View File

@@ -1,6 +1,7 @@
# Django imports
from django.db import models
from django.conf import settings
from django.core.exceptions import ValidationError
# Module imports
from . import BaseModel
@@ -50,7 +51,7 @@ def get_default_props():
"state": True,
"sub_issue_count": True,
"updated_on": True,
}
},
}
@@ -63,6 +64,23 @@ def get_issue_props():
}
def slug_validator(value):
if value in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
]:
raise ValidationError("Slug is not valid")
class Workspace(BaseModel):
name = models.CharField(max_length=80, verbose_name="Workspace Name")
logo = models.URLField(verbose_name="Logo", blank=True, null=True)
@@ -71,7 +89,7 @@ class Workspace(BaseModel):
on_delete=models.CASCADE,
related_name="owner_workspace",
)
slug = models.SlugField(max_length=48, db_index=True, unique=True)
slug = models.SlugField(max_length=48, db_index=True, unique=True, validators=[slug_validator,])
organization_size = models.CharField(max_length=20, blank=True, null=True)
def __str__(self):

View File

@@ -1 +1 @@
from .instance import InstanceOwnerPermission, InstanceAdminPermission
from .instance import InstanceAdminPermission

View File

@@ -5,20 +5,6 @@ from rest_framework.permissions import BasePermission
from plane.license.models import Instance, InstanceAdmin
class InstanceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
instance = Instance.objects.first()
return InstanceAdmin.objects.filter(
role=20,
instance=instance,
user=request.user,
).exists()
class InstanceAdminPermission(BasePermission):
def has_permission(self, request, view):

View File

@@ -2,7 +2,7 @@
from plane.license.models import Instance, InstanceAdmin, InstanceConfiguration
from plane.app.serializers import BaseSerializer
from plane.app.serializers import UserAdminLiteSerializer
from plane.license.utils.encryption import decrypt_data
class InstanceSerializer(BaseSerializer):
primary_owner_details = UserAdminLiteSerializer(source="primary_owner", read_only=True)
@@ -12,14 +12,13 @@ class InstanceSerializer(BaseSerializer):
fields = "__all__"
read_only_fields = [
"id",
"primary_owner",
"primary_email",
"instance_id",
"license_key",
"api_key",
"version",
"email",
"last_checked_at",
"is_setup_done",
]
@@ -40,3 +39,11 @@ class InstanceConfigurationSerializer(BaseSerializer):
class Meta:
model = InstanceConfiguration
fields = "__all__"
def to_representation(self, instance):
data = super().to_representation(instance)
# Decrypt secrets value
if instance.is_encrypted and instance.value is not None:
data["value"] = decrypt_data(instance.value)
return data

View File

@@ -1,6 +1,7 @@
from .instance import (
InstanceEndpoint,
TransferPrimaryOwnerEndpoint,
InstanceAdminEndpoint,
InstanceConfigurationEndpoint,
InstanceAdminSignInEndpoint,
SignUpScreenVisitedEndpoint,
)

View File

@@ -2,13 +2,22 @@
import json
import os
import requests
import uuid
import random
import string
# Django imports
from django.utils import timezone
from django.contrib.auth.hashers import make_password
from django.core.validators import validate_email
from django.core.exceptions import ValidationError
from django.conf import settings
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework_simplejwt.tokens import RefreshToken
# Module imports
from plane.app.views import BaseAPIView
@@ -19,102 +28,35 @@ from plane.license.api.serializers import (
InstanceConfigurationSerializer,
)
from plane.license.api.permissions import (
InstanceOwnerPermission,
InstanceAdminPermission,
)
from plane.db.models import User
from plane.license.utils.encryption import encrypt_data
class InstanceEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "PATCH"]:
self.permission_classes = [
InstanceOwnerPermission,
if self.request.method == "PATCH":
return [
InstanceAdminPermission(),
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceEndpoint, self).get_permissions()
def post(self, request):
# Check if the instance is registered
instance = Instance.objects.first()
# If instance is None then register this instance
if instance is None:
with open("package.json", "r") as file:
# Load JSON content from the file
data = json.load(file)
license_engine_base_url = os.environ.get("LICENSE_ENGINE_BASE_URL")
if not license_engine_base_url:
raise Response(
{"error": "LICENSE_ENGINE_BASE_URL is required"},
status=status.HTTP_400_BAD_REQUEST,
)
headers = {"Content-Type": "application/json"}
payload = {
"email": request.user.email,
"version": data.get("version", 0.1),
}
response = requests.post(
f"{license_engine_base_url}/api/instances",
headers=headers,
data=json.dumps(payload),
)
if response.status_code == 201:
data = response.json()
# Create instance
instance = Instance.objects.create(
instance_name="Plane Free",
instance_id=data.get("id"),
license_key=data.get("license_key"),
api_key=data.get("api_key"),
version=data.get("version"),
primary_email=data.get("email"),
primary_owner=request.user,
last_checked_at=timezone.now(),
)
# Create instance admin
_ = InstanceAdmin.objects.create(
user=request.user,
instance=instance,
role=20,
)
return Response(
{
"message": f"Instance succesfully registered with owner: {instance.primary_owner.email}"
},
status=status.HTTP_201_CREATED,
)
return Response(
{"error": "Instance could not be registered"},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return Response(
{
"message": f"Instance already registered with instance owner: {instance.primary_owner.email}"
},
status=status.HTTP_200_OK,
)
return [
AllowAny(),
]
def get(self, request):
instance = Instance.objects.first()
# get the instance
if instance is None:
return Response({"activated": False}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"is_activated": False, "is_setup_done": False},
status=status.HTTP_200_OK,
)
# Return instance
serializer = InstanceSerializer(instance)
serializer.data["activated"] = True
return Response(serializer.data, status=status.HTTP_200_OK)
data = serializer.data
data["is_activated"] = True
return Response(data, status=status.HTTP_200_OK)
def patch(self, request):
# Get the instance
@@ -126,58 +68,15 @@ class InstanceEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class TransferPrimaryOwnerEndpoint(BaseAPIView):
permission_classes = [
InstanceOwnerPermission,
]
# Transfer the owner of the instance
def post(self, request):
instance = Instance.objects.first()
# Get the email of the new user
email = request.data.get("email", False)
if not email:
return Response(
{"error": "User is required"}, status=status.HTTP_400_BAD_REQUEST
)
# Get users
user = User.objects.get(email=email)
# Save the instance user
instance.primary_owner = user
instance.primary_email = user.email
instance.save(update_fields=["owner", "email"])
# Add the user to admin
_ = InstanceAdmin.objects.get_or_create(
instance=instance,
user=user,
role=20,
)
return Response(
{"message": "Owner successfully updated"}, status=status.HTTP_200_OK
)
class InstanceAdminEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "DELETE"]:
self.permission_classes = [
InstanceOwnerPermission,
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceAdminEndpoint, self).get_permissions()
permission_classes = [
InstanceAdminPermission,
]
# Create an instance admin
def post(self, request):
email = request.data.get("email", False)
role = request.data.get("role", 15)
role = request.data.get("role", 20)
if not email:
return Response(
@@ -230,18 +129,137 @@ class InstanceConfigurationEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request):
configurations = InstanceConfiguration.objects.filter(key__in=request.data.keys())
configurations = InstanceConfiguration.objects.filter(
key__in=request.data.keys()
)
bulk_configurations = []
for configuration in configurations:
configuration.value = request.data.get(configuration.key, configuration.value)
value = request.data.get(configuration.key, configuration.value)
if configuration.is_encrypted:
configuration.value = encrypt_data(value)
else:
configuration.value = value
bulk_configurations.append(configuration)
InstanceConfiguration.objects.bulk_update(
bulk_configurations,
["value"],
batch_size=100
bulk_configurations, ["value"], batch_size=100
)
serializer = InstanceConfigurationSerializer(configurations, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
def get_tokens_for_user(user):
refresh = RefreshToken.for_user(user)
return (
str(refresh.access_token),
str(refresh),
)
class InstanceAdminSignInEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
# Check instance first
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
# check if the instance is already activated
if InstanceAdmin.objects.first():
return Response(
{"error": "Admin for this instance is already registered"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the email and password from the request data
email = request.data.get("email", False)
password = request.data.get("password", False)
# Return an error if the email or password is not present
if not email or not password:
return Response(
{"error": "Email and password are required"},
status=status.HTTP_400_BAD_REQUEST,
)
# Validate the email
email = email.strip().lower()
try:
validate_email(email)
except ValidationError as e:
return Response(
{"error": "Please provide a valid email address."},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if a user with this email already exists
user = User.objects.filter(email=email).first()
# Existing user
if user:
# Check user password
if not user.check_password(password):
return Response(
{
"error": "Sorry, we could not find a user with the provided credentials. Please try again."
},
status=status.HTTP_403_FORBIDDEN,
)
else:
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(password),
is_password_autoset=False,
)
# Set the last active details for the user
user.is_active = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
user.last_login_uagent = request.META.get("HTTP_USER_AGENT")
user.token_updated_at = timezone.now()
user.save()
# Register the user as an instance admin
_ = InstanceAdmin.objects.create(
user=user,
instance=instance,
)
# Make the setup flag True
instance.is_setup_done = True
instance.save()
# get tokens for user
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
class SignUpScreenVisitedEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
instance.is_signup_screen_visited = True
instance.save()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -2,46 +2,131 @@
import os
# Django imports
from django.core.management.base import BaseCommand, CommandError
from django.utils import timezone
from django.core.management.base import BaseCommand
from django.conf import settings
# Module imports
from plane.license.models import InstanceConfiguration
class Command(BaseCommand):
help = "Configure instance variables"
def handle(self, *args, **options):
config_keys = {
# Authentication Settings
"GOOGLE_CLIENT_ID": os.environ.get("GOOGLE_CLIENT_ID"),
"GOOGLE_CLIENT_SECRET": os.environ.get("GOOGLE_CLIENT_SECRET"),
"GITHUB_CLIENT_ID": os.environ.get("GITHUB_CLIENT_ID"),
"GITHUB_CLIENT_SECRET": os.environ.get("GITHUB_CLIENT_SECRET"),
"ENABLE_SIGNUP": os.environ.get("ENABLE_SIGNUP", "1"),
"ENABLE_EMAIL_PASSWORD": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
"ENABLE_MAGIC_LINK_LOGIN": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0"),
# Email Settings
"EMAIL_HOST": os.environ.get("EMAIL_HOST", ""),
"EMAIL_HOST_USER": os.environ.get("EMAIL_HOST_USER", ""),
"EMAIL_HOST_PASSWORD": os.environ.get("EMAIL_HOST_PASSWORD"),
"EMAIL_PORT": os.environ.get("EMAIL_PORT", "587"),
"EMAIL_FROM": os.environ.get("EMAIL_FROM", ""),
"EMAIL_USE_TLS": os.environ.get("EMAIL_USE_TLS", "1"),
"EMAIL_USE_SSL": os.environ.get("EMAIL_USE_SSL", "0"),
# Open AI Settings
"OPENAI_API_BASE": os.environ.get("", "https://api.openai.com/v1"),
"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "sk-"),
"GPT_ENGINE": os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"),
}
from plane.license.utils.encryption import encrypt_data
for key, value in config_keys.items():
config_keys = [
# Authentication Settings
{
"key": "ENABLE_SIGNUP",
"value": os.environ.get("ENABLE_SIGNUP", "1"),
"category": "AUTHENTICATION",
"is_encrypted": False,
},
{
"key": "ENABLE_EMAIL_PASSWORD",
"value": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
"category": "AUTHENTICATION",
"is_encrypted": False,
},
{
"key": "ENABLE_MAGIC_LINK_LOGIN",
"value": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0"),
"category": "AUTHENTICATION",
"is_encrypted": False,
},
{
"key": "GOOGLE_CLIENT_ID",
"value": os.environ.get("GOOGLE_CLIENT_ID"),
"category": "GOOGLE",
"is_encrypted": False,
},
{
"key": "GITHUB_CLIENT_ID",
"value": os.environ.get("GITHUB_CLIENT_ID"),
"category": "GITHUB",
"is_encrypted": False,
},
{
"key": "GITHUB_CLIENT_SECRET",
"value": os.environ.get("GITHUB_CLIENT_SECRET"),
"category": "GITHUB",
"is_encrypted": True,
},
{
"key": "EMAIL_HOST",
"value": os.environ.get("EMAIL_HOST", ""),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "EMAIL_HOST_USER",
"value": os.environ.get("EMAIL_HOST_USER", ""),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "EMAIL_HOST_PASSWORD",
"value": os.environ.get("EMAIL_HOST_PASSWORD", ""),
"category": "SMTP",
"is_encrypted": True,
},
{
"key": "EMAIL_PORT",
"value": os.environ.get("EMAIL_PORT", "587"),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "EMAIL_FROM",
"value": os.environ.get("EMAIL_FROM", ""),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "EMAIL_USE_TLS",
"value": os.environ.get("EMAIL_USE_TLS", "1"),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "OPENAI_API_KEY",
"value": os.environ.get("OPENAI_API_KEY"),
"category": "OPENAI",
"is_encrypted": True,
},
{
"key": "GPT_ENGINE",
"value": os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"),
"category": "SMTP",
"is_encrypted": False,
},
{
"key": "UNSPLASH_ACCESS_KEY",
"value": os.environ.get("UNSPLASH_ACESS_KEY", ""),
"category": "UNSPLASH",
"is_encrypted": True,
},
]
for item in config_keys:
obj, created = InstanceConfiguration.objects.get_or_create(
key=key
key=item.get("key")
)
if created:
obj.value = value
obj.category = item.get("category")
obj.is_encrypted = item.get("is_encrypted", False)
if item.get("is_encrypted", False):
obj.value = encrypt_data(item.get("value"))
else:
obj.value = item.get("value")
obj.save()
self.stdout.write(self.style.SUCCESS(f"{key} loaded with value from environment variable."))
self.stdout.write(
self.style.SUCCESS(
f"{obj.key} loaded with value from environment variable."
)
)
else:
self.stdout.write(self.style.WARNING(f"{key} configuration already exists"))
self.stdout.write(
self.style.WARNING(f"{obj.key} configuration already exists")
)

View File

@@ -1,23 +1,25 @@
# Python imports
import json
import os
import requests
import uuid
import secrets
# Django imports
from django.core.management.base import BaseCommand, CommandError
from django.utils import timezone
from django.core.exceptions import ValidationError
from django.core.validators import validate_email
from django.conf import settings
# Module imports
from plane.license.models import Instance
from plane.db.models import User
from plane.license.models import Instance, InstanceAdmin
class Command(BaseCommand):
help = "Check if instance in registered else register"
def add_arguments(self, parser):
# Positional argument
parser.add_argument('machine_signature', type=str, help='Machine signature')
def handle(self, *args, **options):
# Check if the instance is registered
instance = Instance.objects.first()
@@ -28,77 +30,37 @@ class Command(BaseCommand):
# Load JSON content from the file
data = json.load(file)
admin_email = os.environ.get("ADMIN_EMAIL")
machine_signature = options.get("machine_signature", "machine-signature")
try:
validate_email(admin_email)
except ValidationError:
CommandError(f"{admin_email} is not a valid ADMIN_EMAIL")
# Raise an exception if the admin email is not provided
if not admin_email:
raise CommandError("ADMIN_EMAIL is required")
# Check if the admin email user exists
user = User.objects.filter(email=admin_email).first()
# If the user does not exist create the user and add him to the database
if user is None:
user = User.objects.create(email=admin_email, username=uuid.uuid4().hex)
user.set_password(uuid.uuid4().hex)
user.save()
license_engine_base_url = os.environ.get("LICENSE_ENGINE_BASE_URL")
if not license_engine_base_url:
raise CommandError("LICENSE_ENGINE_BASE_URL is required")
headers = {"Content-Type": "application/json"}
if not machine_signature:
raise CommandError("Machine signature is required")
payload = {
"email": user.email,
"instance_key": settings.INSTANCE_KEY,
"version": data.get("version", 0.1),
"machine_signature": machine_signature,
"user_count": User.objects.filter(is_bot=False).count(),
}
response = requests.post(
f"{license_engine_base_url}/api/instances/",
headers=headers,
data=json.dumps(payload),
instance = Instance.objects.create(
instance_name="Plane Free",
instance_id=secrets.token_hex(12),
license_key=None,
api_key=secrets.token_hex(8),
version=payload.get("version"),
last_checked_at=timezone.now(),
user_count=payload.get("user_count", 0),
)
if response.status_code == 201:
data = response.json()
# Create instance
instance = Instance.objects.create(
instance_name="Plane Free",
instance_id=data.get("id"),
license_key=data.get("license_key"),
api_key=data.get("api_key"),
version=data.get("version"),
primary_email=data.get("email"),
primary_owner=user,
last_checked_at=timezone.now(),
self.stdout.write(
self.style.SUCCESS(
f"Instance registered"
)
# Create instance admin
_ = InstanceAdmin.objects.create(
user=user,
instance=instance,
role=20,
)
self.stdout.write(
self.style.SUCCESS(
f"Instance succesfully registered with owner: {instance.primary_owner.email}"
)
)
return
self.stdout.write(self.style.WARNING("Instance could not be registered"))
return
)
else:
self.stdout.write(
self.style.SUCCESS(
f"Instance already registered with instance owner: {instance.primary_owner.email}"
f"Instance already registered"
)
)
return

View File

@@ -1,4 +1,4 @@
# Generated by Django 4.2.5 on 2023-11-15 14:22
# Generated by Django 4.2.7 on 2023-12-06 06:49
from django.conf import settings
from django.db import migrations, models
@@ -27,13 +27,15 @@ class Migration(migrations.Migration):
('license_key', models.CharField(blank=True, max_length=256, null=True)),
('api_key', models.CharField(max_length=16)),
('version', models.CharField(max_length=10)),
('primary_email', models.CharField(max_length=256)),
('last_checked_at', models.DateTimeField()),
('namespace', models.CharField(blank=True, max_length=50, null=True)),
('is_telemetry_enabled', models.BooleanField(default=True)),
('is_support_required', models.BooleanField(default=True)),
('is_setup_done', models.BooleanField(default=False)),
('is_signup_screen_visited', models.BooleanField(default=False)),
('user_count', models.PositiveBigIntegerField(default=0)),
('is_verified', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('primary_owner', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='instance_primary_owner', to=settings.AUTH_USER_MODEL)),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
@@ -51,6 +53,8 @@ class Migration(migrations.Migration):
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('key', models.CharField(max_length=100, unique=True)),
('value', models.TextField(blank=True, default=None, null=True)),
('category', models.TextField()),
('is_encrypted', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
@@ -67,7 +71,8 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('role', models.PositiveIntegerField(choices=[(20, 'Owner'), (15, 'Admin')], default=15)),
('role', models.PositiveIntegerField(choices=[(20, 'Admin')], default=20)),
('is_verified', models.BooleanField(default=False)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('instance', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='admins', to='license.instance')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
@@ -78,6 +83,7 @@ class Migration(migrations.Migration):
'verbose_name_plural': 'Instance Admins',
'db_table': 'instance_admins',
'ordering': ('-created_at',),
'unique_together': {('instance', 'user')},
},
),
]

View File

@@ -1,19 +0,0 @@
# Generated by Django 4.2.5 on 2023-11-16 09:45
from django.conf import settings
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('license', '0001_initial'),
]
operations = [
migrations.AlterUniqueTogether(
name='instanceadmin',
unique_together={('instance', 'user')},
),
]

View File

@@ -4,11 +4,9 @@ from django.conf import settings
# Module imports
from plane.db.models import BaseModel
from plane.db.mixins import AuditModel
ROLE_CHOICES = (
(20, "Owner"),
(15, "Admin"),
(20, "Admin"),
)
@@ -20,20 +18,19 @@ class Instance(BaseModel):
license_key = models.CharField(max_length=256, null=True, blank=True)
api_key = models.CharField(max_length=16)
version = models.CharField(max_length=10)
# User information
primary_email = models.CharField(max_length=256)
primary_owner = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.SET_NULL,
null=True,
related_name="instance_primary_owner",
)
# Instance specifics
last_checked_at = models.DateTimeField()
namespace = models.CharField(max_length=50, blank=True, null=True)
# telemetry and support
is_telemetry_enabled = models.BooleanField(default=True)
is_support_required = models.BooleanField(default=True)
# is setup done
is_setup_done = models.BooleanField(default=False)
# signup screen
is_signup_screen_visited = models.BooleanField(default=False)
# users
user_count = models.PositiveBigIntegerField(default=0)
is_verified = models.BooleanField(default=False)
class Meta:
verbose_name = "Instance"
@@ -50,7 +47,8 @@ class InstanceAdmin(BaseModel):
related_name="instance_owner",
)
instance = models.ForeignKey(Instance, on_delete=models.CASCADE, related_name="admins")
role = models.PositiveIntegerField(choices=ROLE_CHOICES, default=15)
role = models.PositiveIntegerField(choices=ROLE_CHOICES, default=20)
is_verified = models.BooleanField(default=False)
class Meta:
unique_together = ["instance", "user"]
@@ -64,6 +62,8 @@ class InstanceConfiguration(BaseModel):
# The instance configuration variables
key = models.CharField(max_length=100, unique=True)
value = models.TextField(null=True, blank=True, default=None)
category = models.TextField()
is_encrypted = models.BooleanField(default=False)
class Meta:
verbose_name = "Instance Configuration"

View File

@@ -2,9 +2,10 @@ from django.urls import path
from plane.license.api.views import (
InstanceEndpoint,
TransferPrimaryOwnerEndpoint,
InstanceAdminEndpoint,
InstanceConfigurationEndpoint,
InstanceAdminSignInEndpoint,
SignUpScreenVisitedEndpoint,
)
urlpatterns = [
@@ -13,11 +14,6 @@ urlpatterns = [
InstanceEndpoint.as_view(),
name="instance",
),
path(
"instances/transfer-primary-owner/",
TransferPrimaryOwnerEndpoint.as_view(),
name="instance",
),
path(
"instances/admins/",
InstanceAdminEndpoint.as_view(),
@@ -33,4 +29,14 @@ urlpatterns = [
InstanceConfigurationEndpoint.as_view(),
name="instance-configuration",
),
path(
"instances/admins/sign-in/",
InstanceAdminSignInEndpoint.as_view(),
name="instance-admin-sign-in",
),
path(
"instances/admins/sign-up-screen-visited/",
SignUpScreenVisitedEndpoint.as_view(),
name="instance-sign-up",
),
]

View File

@@ -0,0 +1,22 @@
import base64
import hashlib
from django.conf import settings
from cryptography.fernet import Fernet
def derive_key(secret_key):
# Use a key derivation function to get a suitable encryption key
dk = hashlib.pbkdf2_hmac('sha256', secret_key.encode(), b'salt', 100000)
return base64.urlsafe_b64encode(dk)
def encrypt_data(data):
cipher_suite = Fernet(derive_key(settings.SECRET_KEY))
encrypted_data = cipher_suite.encrypt(data.encode())
return encrypted_data.decode() # Convert bytes to string
def decrypt_data(encrypted_data):
cipher_suite = Fernet(derive_key(settings.SECRET_KEY))
decrypted_data = cipher_suite.decrypt(encrypted_data.encode()) # Convert string back to bytes
return decrypted_data.decode()
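The helpers above derive a Fernet key from settings.SECRET_KEY with PBKDF2-HMAC-SHA256 and a fixed salt, so the same key is reproduced on every call and encrypted configuration values survive restarts. A self-contained sketch of the round trip outside Django; the secret below is a placeholder for SECRET_KEY:

import base64
import hashlib

from cryptography.fernet import Fernet

def derive_key(secret_key):
    # Same derivation as above: PBKDF2-HMAC-SHA256 with a static salt,
    # urlsafe-base64 encoded to the 32-byte key format Fernet expects.
    dk = hashlib.pbkdf2_hmac("sha256", secret_key.encode(), b"salt", 100000)
    return base64.urlsafe_b64encode(dk)

secret = "placeholder-secret-key"  # stands in for settings.SECRET_KEY
cipher = Fernet(derive_key(secret))

token = cipher.encrypt(b"smtp-password").decode()    # stored as text, e.g. in InstanceConfiguration.value
plaintext = cipher.decrypt(token.encode()).decode()  # decrypted when the value is read back
assert plaintext == "smtp-password"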

Some files were not shown because too many files have changed in this diff.