Compare commits

...

115 Commits

Author SHA1 Message Date
NarayanBavisetti
bc8d51dd97 chore: space issues v3 2023-12-04 22:23:59 +05:30
NarayanBavisetti
3c80b87365 Merge branch 'develop' of github.com:makeplane/plane into develop 2023-12-04 22:04:06 +05:30
rahulramesha
8a8eea38f9 fix: Permission levels for project settings (#2978)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102

* use Enums instead of numbers

* fix Estimates setting permission for admin

* disable access to project settings for viewers and guests

* fix project unauthorized flicker

* add observer to estimates

* add permissions to member list
2023-12-04 20:03:23 +05:30
rahulramesha
b527ec582f fix: mutation on transfer issues from cycle (#2979)
* fix cycle issues mutation on transferring issues

* fix transfer issues from cycle
2023-12-04 20:02:14 +05:30
Aaryan Khandelwal
fa990ed444 refactor: custom hook for sign in redirection (#2969) 2023-12-04 15:04:04 +05:30
Nikhil
79fa860b28 chore: instance (#2955) 2023-12-04 14:48:40 +05:30
NarayanBavisetti
f2b3a645bd Merge branch 'develop' of github.com:makeplane/plane into develop 2023-12-04 12:11:46 +05:30
NarayanBavisetti
4cd70f3b73 Merge branch 'develop' of github.com:makeplane/plane into develop 2023-12-04 12:11:36 +05:30
rahulramesha
59110c7205 fix: sub display filter params for fetching issues (#2972)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes

* fix show sub issue filter FED-1102
2023-12-02 18:23:07 +05:30
Aaryan Khandelwal
9756abece4 chore: update instance admin sign in endpoint (#2973) 2023-12-02 18:19:59 +05:30
guru_sainath
f4aa059da6 fix: global issues properties update issue resolved (#2974) 2023-12-02 18:19:29 +05:30
rahulramesha
d22c258c00 fix: V3 release blocker bugs (#2968)
* fix add subgroup issue FED-1101

* fix subgroup by None assignee FED-1100

* fix grouping by assignee or labels FED-1096

* fix create view popup FED-1093

* fix subgroup exception in swimlanes
2023-12-01 18:51:52 +05:30
sabith-tu
65253c6383 style: empty state for analytics, views and pages (#2967) 2023-12-01 18:19:17 +05:30
guru_sainath
8c462f96ee fix: corrected rendering of workspace-level labels, members, and states in project view (#2966)
* fix: dynamic issue properties filters in project, workspace and profile level

* clean-up: removed logs from the project store
2023-12-01 17:35:33 +05:30
Aaryan Khandelwal
332e56bb0d style: updated the UI of the instance admin setup and the sign in workflow (#2962)
* style: updated the UI of the signin and instance setups

* fix: form validations and mutations

* fix: updated Link tags in accordance to next v14

* chore: latest features image, reset password redirection
2023-12-01 15:50:01 +05:30
Lakhan Baheti
eca96da837 fix: create workspace form validation (#2958)
* fix: create workspace form validation

* fix: textfield placeholder typo

* fix: name field onchange
2023-12-01 15:18:48 +05:30
guru_sainath
5446447231 chore: workspace global issues (#2964)
* dev: global issues store

* build-error: all issues render

* build-error: build error resolved in global view store
2023-12-01 15:16:36 +05:30
Anmol Singh Bhatia
dec1fb5a63 chore: profile issue display filters content updated (#2963) 2023-12-01 15:14:18 +05:30
Anmol Singh Bhatia
2b63827caa chore: create issue modal improvement (#2960) 2023-12-01 15:12:25 +05:30
Manish Gupta
5993fc38f0 branch build fixes (#2961) 2023-12-01 13:40:16 +05:30
Anmol Singh Bhatia
ab37ce979a fix: view modal overlapping (#2956) 2023-12-01 13:27:16 +05:30
sriram veeraghanta
c6764111a3 fix: upgrading to nextjs 14 (#2959) 2023-12-01 13:25:48 +05:30
Manish Gupta
4a2f8d93b1 fix: branch build 2 (#2957)
* branch build fix

* removed quotes
2023-12-01 13:23:50 +05:30
Manish Gupta
5db9b2b638 branch build fix (#2954) 2023-12-01 12:46:18 +05:30
Bavisetti Narayan
e2b704ea6f chore: removed django settings module (#2953) 2023-12-01 12:21:17 +05:30
Nikhil
452c9f0d5b dev: update email templates (#2948)
* dev: update magic link email

* dev: forgot password mail

* dev: workspace invitation update

* dev: update email templates and task

* dev: remove email verification template

* dev: change all conversation links to issues
2023-11-30 13:30:13 +05:30
rahulramesha
139c0857eb fix: quick add positioning (#2949)
* fix quick add positioning for kanban and spreadsheet

* fix kanban quick add project identifier
2023-11-30 12:22:43 +05:30
Nikhil
2556d052de chore: status code changed (#2947) 2023-11-29 21:46:54 +05:30
Nikhil
c65915665b dev: transactional emails (#2946) 2023-11-29 21:46:18 +05:30
Nikhil
6ece5a57e2 dev: instance registration (#2912)
* dev: remove auto script for registration

* dev: make all of the instance admins owners when adding an instance admin

* dev: remove sign out endpoint

* dev: update takeoff script to register the instance

* dev:  reapply instance model

* dev: check none for instance configuration encryptions

* dev: encrypting secrets configuration

* dev: user workflow for registration in instances

* dev: add email automation configuration

* dev: remove unused imports

* dev: realign migrations

* dev: reconfigure license engine registrations

* dev: move email check to background worker

* dev: add sign up

* chore: signup error message

* dev: updated onboarding workflows and instance setting

* dev: updated template for magic login

* chore: page migration changed

* dev: updated migrations and authentication for license and update template for workspace invite

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:33:37 +05:30
Anmol Singh Bhatia
a779fa1497 dev: instance setup workflow (#2935)
* chore: instance type updated

* chore: instance not ready screen added

* chore: instance layout added

* chore: instance magic sign in endpoint and type added

* chore: instance admin password endpoint added

* chore: instance setup page added

* chore: instance setup form added

* chore: instance layout updated

* fix: instance admin workflow setup

* fix: admin workflow setup

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 20:33:08 +05:30
sriram veeraghanta
5eb7740833 fix: removed unused packages and upgraded to next 14 (#2944)
* fix: upgrading next package and removed unused deps

* chore: unused variable removed

* chore: next image icon fix

* chore: unused component removed

* chore: next image icon fix

* chore: replace use-debounce with lodash debounce

* chore: unused component removed

* resolved: fixed issue with next link component

* fix: updates in next config

* fix: updating types pages

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2023-11-29 20:32:10 +05:30
guru_sainath
86408fcd46 chore: replaced v3 issues endpoints (#2945)
* chore: removed v3 endpoints

* chore: replace v3 issues to normal issues endpoints

* build-error: build error in new issue structure

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-29 20:26:43 +05:30
Anmol Singh Bhatia
86de3188b1 chore: cycle and module status indicator improvement (#2942) 2023-11-29 20:02:41 +05:30
Anmol Singh Bhatia
c5996382bc style: empty state improvement (#2943) 2023-11-29 20:02:13 +05:30
rahulramesha
2796754e5f fix: v3 issues for the layouts (#2941)
* fix drag n drop exception error

* fix peek overlay close buttons

* fix project empty state view

* fix cycle and module empty state view

* add ai options to inbox issue creation

* fix inbox filters for viewers

* fix inbox filters for viewers for project

* disable editing permission for members and viewers

* define accurate types for drag and drop
2023-11-29 19:58:27 +05:30
Lakhan Baheti
3488dc3014 style: deactivate account modal (#2940) 2023-11-29 19:12:12 +05:30
Bavisetti Narayan
c7a3d8f0a0 chore: v3 global issues (#2938) 2023-11-29 19:08:14 +05:30
Aaryan Khandelwal
6d97dd814a chore: updated sign in workflow (#2939)
* chore: new sign in workflow

* chore: request new code button added

* chore: create new password form added

* fix: build errors

* chore: remove unused components

* chore: update submitting state texts

* fix: oauth sign in process
2023-11-29 19:07:33 +05:30
Lakhan Baheti
804c03bd15 chore: redirection to profile after workspace delete/leave (#2937) 2023-11-29 16:52:54 +05:30
Manish Gupta
59981e3e68 fix: Branch Build and Self hosting fixes (#2930)
* Branch build yml modified to create preview and latest tags

* self host install modified to handle public image only

* testing update-docker

* testing

* wip

* rolled back to original

* selfhosting readme updated
2023-11-29 14:58:31 +05:30
Aaryan Khandelwal
1be1b9f4a3 chore: issue peek overview (#2918)
* chore: autorun for the issue detail store

* fix: labels mutation

* chore: remove old peek overview code

* chore: move add to cycle and module logic to store

* fix: build errors

* chore: add peekProjectId query param for the peek overview

* chore: update profile layout

* fix: multiple workspaces

* style: Issue activity and link design improvements in Peek overview.
* fix issue with labels not occupying full width.
* fix links overflow issue.
* add tooltip in links to display entire link.
* add functionality to copy links to clipboard.

* chore: peek overview for all the layouts

* fix: build errors

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-29 14:25:57 +05:30
Lakhan Baheti
7cd02401c1 fix: spreadsheet layout issue properties (#2936)
* fix: spreadsheet layout state column state name & tooltip

* fix: label select dropdown first item auto active state

* fix: priority column padding & tooltip position
2023-11-29 14:20:58 +05:30
Bavisetti Narayan
6d175c0e56 chore: api rate limiting (#2933) 2023-11-29 14:14:26 +05:30
Bavisetti Narayan
6a4f521b47 chore: api webhooks validation (#2928)
* chore: api webhooks update

* chore: webhooks signature validation
2023-11-29 14:13:03 +05:30
rahulramesha
cdc4fd27a5 fix issue sorting and add sorting to other properties (#2931) 2023-11-29 14:02:39 +05:30
Aaryan Khandelwal
f4af3db7b6 chore: update the content of webhooks form (#2932) 2023-11-29 14:02:13 +05:30
Aaryan Khandelwal
4f64f431a7 chore: update profile settings layout (#2925)
* chore: update profile layout

* fix: multiple workspaces

* chore: removed breadcrumbs

* chore: fix sidebar collapsed state
2023-11-29 13:48:07 +05:30
Anmol Singh Bhatia
537901f046 style: workspace sidebar scroll fix and improvement (#2934) 2023-11-29 13:32:35 +05:30
Lakhan Baheti
404a61aee5 fix: google auth button content alignment (#2929) 2023-11-29 13:29:38 +05:30
Lakhan Baheti
1a2b1e648d style: switch or delete account modal (#2926)
* style: switch or delete account modal

* fix: popover text color

* fix: typo
2023-11-29 13:28:24 +05:30
guru_sainath
3400c119bc fix: drag and drop implementation in calendar layout and kanban layout (#2921)
* fix profile issue filters and kanban

* chore: calendar drag and drop

* chore: kanban drag and drop

* dev: remove issue from the kanban layout and resolved build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-28 19:17:38 +05:30
sabith-tu
d5853405ca style: new empty state ui (#2923) 2023-11-28 18:47:52 +05:30
rahulramesha
db510dcfcd fix: all functionalities for profile, archived and draft issues (#2922)
* fix profile issue filters and kanban

* fix profile draft and archived issues
2023-11-28 18:15:46 +05:30
Lakhan Baheti
62c0615012 style: member role visibility (#2919)
* style: member role visibility

* fix: build errors
2023-11-28 17:11:04 +05:30
Bavisetti Narayan
3914a75334 chore: deactivated user workflow change (#2888)
* chore: deactivated user workflow change

* chore: removed archived and draft from v3 by default

* chore: draft and archive update

* chore: bool field

* chore: fall back workspace updated

* chore: workspace member active
2023-11-28 17:08:05 +05:30
Aaryan Khandelwal
0cbb201348 fix: workspace settings pages authorization (#2915)
* fix: workspace settings pages authorization

* chore: user cannot add a member with a higher role than theirs (see the sketch after this entry)

* chore: update workspace general settings auth
2023-11-28 17:05:42 +05:30
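The #2915 entry above notes that a user cannot add a member with a higher role than their own. A minimal sketch of that rule, assuming a numeric role scheme where a higher number means more privilege; the role values and function name below are invented for illustration, not Plane's actual code.

```typescript
// Sketch only: numeric roles are assumed (higher number = more privileged).
type Role = 5 | 10 | 15 | 20; // e.g. guest, viewer, member, admin

interface Invite {
  email: string;
  role: Role;
}

// Reject any invitation whose role exceeds the inviter's own role.
function validateInvites(inviterRole: Role, invites: Invite[]): string[] {
  return invites
    .filter((invite) => invite.role > inviterRole)
    .map((invite) => `${invite.email}: cannot assign a role higher than your own`);
}

// A member (15) inviting an admin (20) produces one validation error.
console.log(validateInvites(15, [{ email: "a@example.com", role: 20 }]));
```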
rahulramesha
f7264364bd add functionality for addition of existing issues to modules and cycles (#2913)
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-28 14:50:37 +05:30
Manish Gupta
35f8ffa5ab migration script added (#2914) 2023-11-28 13:29:08 +05:30
M. Palanikannan
6607caade7 fix: Image restoration fixed (marks/unmarks an image to be deleted after a week) (#2859)
* image restoration fixed (marks an image to be deleted after a week)

* removed console logs

* added image constraints

* formatted editor-core package using yarn format

* lite-text-editor nothing to format

* rich-text-editor nothing to format

* formatted document-editor with prettier

* modified file service to follow api change

* fixed more formatting in document editor

* fixed all instances of types to use those from the package

* fixed delete to work consistently (minor optimizations turned off)

* stop duplicate images inside editor

* restore image on editor creation

say if user A deletes image number 2 while user B also has the same issue open and the image is still on their screen; if user B then makes changes that get saved in the backend, image 2 should exist according to user B, but since user A deleted it, it will not get restored and will get deleted in 7 days. Hence I've added a check such that whenever an issue loads we restore all images by default (a sketch follows this entry).

* added restore image function with types

* replaced all instances to have restore image logic

* fixed issue detail for peek view

* disabled option to insert table inside a table
2023-11-28 11:34:20 +05:30
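A minimal sketch of the restore-on-load check described in #2859 above: when an issue's editor loads, every image referenced in the stored description is marked to be kept again, so a deletion triggered by another user does not let it expire. The endpoint, helper names, and storage format here are assumptions for illustration, not the editor package's real API.

```typescript
// Hypothetical call: ask the backend to unmark one uploaded asset for deletion.
async function restoreImage(assetSrc: string): Promise<void> {
  await fetch(`/api/assets/restore/?src=${encodeURIComponent(assetSrc)}`, { method: "POST" });
}

// Pull every image src out of the serialized editor HTML (assumed storage format).
function extractImageSources(html: string): string[] {
  const doc = new DOMParser().parseFromString(html, "text/html");
  return Array.from(doc.querySelectorAll("img"))
    .map((img) => img.getAttribute("src") ?? "")
    .filter(Boolean);
}

// On editor creation, restore all images referenced by the document so none of
// them stay scheduled for the 7-day deletion another user may have triggered.
export async function restoreAllImagesOnLoad(descriptionHtml: string): Promise<void> {
  await Promise.all(extractImageSources(descriptionHtml).map(restoreImage));
}
```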
Manish Gupta
8ee8270697 removed container names for selfhosting (#2907) 2023-11-28 11:31:04 +05:30
Aaryan Khandelwal
41ab962dd7 chore: update get invitation details endpoint (#2902) 2023-11-28 11:30:03 +05:30
Prateek Shourya
c22c6bb9b2 Fix: bug fixes and UI / UX improvements (#2906)
* Fix: issue with project publish modal data not updating immediately.

* fix: issue with workspace list not scrollable in profile settings.

* fix: update redirect workspace slug logic to redirect to prev workspace instead of `/`.

* style: update API tokens and webhooks empty state designs.
2023-11-28 11:29:01 +05:30
sriram veeraghanta
67de6d0729 fix: adding ai assistance to pages (#2905)
* fix: adding ai modal to pages

* fix: pages overflow

* chore: update pages UI

* fix: updating page description while using ai assistance

* fix: gpt assistant modal height and position

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-27 20:39:18 +05:30
M. Palanikannan
f361cd045e image can't be inserted inside table (#2904)
* image can't be inserted inside table

Now we've disabled the image icon from showing up if the cursor is inside a table node or if a table cell is selected (a sketch follows this entry)

* added drag drop support for document editor

* fixed missing dependencies
2023-11-27 20:37:40 +05:30
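The #2904 entry above hides the image option while the cursor is inside a table. A hedged sketch of that guard, assuming a TipTap-style editor where `isActive("table")` reports whether the selection sits inside a table node; the toolbar-item shape is invented for illustration.

```typescript
// Minimal editor surface needed for the check (TipTap editors expose isActive()).
interface EditorLike {
  isActive: (nodeOrMarkName: string) => boolean;
}

interface ToolbarItem {
  key: string;
  hidden?: boolean;
}

// Hide the image item whenever the selection is inside a table node or a table
// cell, so an image can never be inserted into a table from the toolbar.
function withTableGuard(editor: EditorLike, items: ToolbarItem[]): ToolbarItem[] {
  const insideTable = editor.isActive("table") || editor.isActive("tableCell");
  return items.map((item) => (item.key === "image" ? { ...item, hidden: insideTable } : item));
}
```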
Prateek Shourya
06d3cd7e73 refactor: Instance admin setting and UI updates. (#2889)
* refactor: shift instance admin restriction content to a separate component.
fix: instance components export logic.

* style: fix sidebar dropdown `God Mode` icon padding.

* style: update profile settings user dropdown menu width.

* fix: update input type to `password` for Client Secret and API/ Access Key fields.

* style: update loader design for all forms.

* fix: typo

* style: ui updates.

* chore: add show/ hide button for all password fields.
2023-11-27 19:41:47 +05:30
Aaryan Khandelwal
10c52bf89b refactor: keyboard shortcuts modal (#2822)
* refactor: keyboard shortcuts modal

* chore: updated search logic

* refactor: divided the modal component into granular components
2023-11-27 18:41:59 +05:30
Anmol Singh Bhatia
b717518fbe fix: workspace dropdown scroll (#2900) 2023-11-27 18:25:40 +05:30
Aaryan Khandelwal
f48cd6f50c fix: profile layout flicker (#2898)
* fix: user profile layout flicker

* chore: update import statements
2023-11-27 18:25:16 +05:30
Lakhan Baheti
3203ae6549 fix: progress panel default open (#2894) 2023-11-27 17:21:14 +05:30
Lakhan Baheti
b8f603f920 chore: signup removed (#2890) 2023-11-27 17:17:55 +05:30
Anmol Singh Bhatia
96ff76af94 [FED-1018] chore: workspace dropdown improvement (#2891)
* fix: workspace dropdown improvement

* style: sidebar workspace icon alignment
2023-11-27 17:17:09 +05:30
Anmol Singh Bhatia
830675741f [FED-1054] fix: join project mutation (#2892)
* fix: join project mutation

* chore: code refactor
2023-11-27 17:16:46 +05:30
srinivas pendem
e489ad50dc Update README.md (#2893)
./setup.sh removed

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-27 17:16:13 +05:30
Aaryan Khandelwal
3dc18bc8fd refactor: webhooks (#2896)
* refactor: webhooks workflow

* chore: update delete modal content
2023-11-27 17:15:48 +05:30
Anmol Singh Bhatia
2d04917951 chore: instance admins endpoint added and ui/ux improvement (#2895)
* style: sidebar improvement

* style: header height consistency

* chore: layout consistency and general page improvement

* chore: layout, email form and image form improvement

* chore: instance admins endpoint integrated and code refactor

* chore: code refactor

* chore: google client secret section removed
2023-11-27 17:15:11 +05:30
guru_sainath
2bf7e63625 issues rendering in all issue layouts for profile and project issues and global issues store implementation (#2886)
* dev: draft and archived issue store

* connect draft and archived issues

* kanban for draft issues

* fix filter store for calendar and kanban

* dev: profile issues store and draft issues filters in header

* disable issue creation for draft issues

* dev: profile issues store filters

* disable kanban properties in draft issues

* dev: profile issues store filters

* dev: separated adding issues to the cycle and module as separate methods in the cycle and module store

* dev: workspace profile issues store

* dev: sub group issues in the swimlanes

* profile issues and create issue connection

* fix profile issues

* fix spreadsheet issues

* fix disappearing project from create issue modal

* page level modifications

* fix additional bugs

* dev: issues profile and global issues and filters update

* fix issue related bugs

* fix project views for list and kanban

* fix build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-27 14:15:33 +05:30
Anmol Singh Bhatia
eb78fd6088 fix: resolve modal overlapping issue (#2885) 2023-11-27 12:16:59 +05:30
Lakhan Baheti
202ecd21df fix: bug fixes & UI improvements (#2884)
* chore: access restriction for api tokens

* fix: on create module total issues undefined

* fix: cycle board card typo

* chore: fetch modules after creation

* fix: peek module on delete

* fix: peek cycle on delete

* fix: cycle detail sidebar copy link toast

* chore: router replace -> push
2023-11-27 12:15:10 +05:30
Aaryan Khandelwal
b2ac7b9ac6 chore: revamp the API tokens workflow (#2880)
* chore: added getLayout method to api tokens pages

* revamp: api tokens workflow

* chore: add title validation and update types

* chore: minor UI updates

* chore: update route
2023-11-27 12:14:06 +05:30
Lakhan Baheti
51dff31926 fix: user state after logout (#2849)
* fix: user state after logout

* chore: user state handle with mobx

* chore: signout update for profile setting

* fix: minor fixes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-25 23:04:56 +05:30
sriram veeraghanta
e89f152779 fix: remove slack notification on build branch workflow (#2881) 2023-11-25 22:43:27 +05:30
Lakhan Baheti
3c9f57f8f4 fix: workspace & user avatar tooltip (#2851)
* fix: workspace & user avatar tooltip

* chore: user name update while typing on top right avatar

* chore: imports placement

* fix: rendering condition

* chore: component re-arrangement

* fix: imports
2023-11-25 21:31:09 +05:30
Bavisetti Narayan
1bc859c68c chore: separated delete endpoint for file upload (#2870) 2023-11-25 21:28:03 +05:30
Ramesh Kumar Chandra
11d57a5bf0 fix: track events updated, extra parameters added, added events for issues, pages, states, cycles (#2875)
* fix: event tracking method updated to use the store, chore: updated and added events for workspace, projects and create issue (see the sketch after this entry)

* fix: posthog auth event tracking

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-25 21:26:26 +05:30
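A rough sketch of the store-level event tracking described in #2875 above, assuming PostHog as the analytics client (the second bullet mentions it) and that `posthog.init` has already run at app bootstrap. The event name and extra parameters are illustrative, not the exact ones added in that PR.

```typescript
import posthog from "posthog-js";

// Illustrative extra parameters attached to every event.
interface TrackContext {
  workspaceSlug?: string;
  projectId?: string;
  [key: string]: unknown;
}

// Centralised tracker, called from store actions instead of scattered components.
export function trackEvent(eventName: string, context: TrackContext = {}): void {
  posthog.capture(eventName, context);
}

// Example: an issue-creation action reporting its workspace and project context.
trackEvent("issue_created", { workspaceSlug: "my-workspace", projectId: "123" });
```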
Prateek Shourya
2980c7b00d Feat: God Mode UI Updates and More Config Settings (#2877)
* feat: Images in Plane config screen.
* feat: Enable/ Disable Magic Login config toggle.
* style: UX copy and design updates across all screens.
* style: SSO and OAuth Screen revamp.
* style: Enter God Mode button for Profile Settings sidebar.
* fix: update input type to password for password fields.
2023-11-25 21:23:50 +05:30
Anmol Singh Bhatia
5c6a59ba35 dev: badge component added in plane ui package (#2876) 2023-11-25 21:21:03 +05:30
Anmol Singh Bhatia
a3ea7c8f10 fix: issue peek overview state select dropdown overflow fix (#2873) 2023-11-25 21:18:54 +05:30
Anmol Singh Bhatia
cb922fb113 fix: module sidebar date select fix and code refactor (#2872) 2023-11-25 21:18:16 +05:30
sriram veeraghanta
06564ee856 fix: remove slack notify (#2871)
* fix: remove slack notifications on workflows

* fix: bugfix
2023-11-24 14:31:44 +05:30
Nikhil
c7e6118804 refactor: image upload modals, file size limit added to config (#2868)
* chore: add file size limit as config in the config api

* refactor: image upload modals

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2023-11-24 13:23:46 +05:30
Manish Gupta
069b8b3ed9 Updated the slack notification message to PR Title (#2869) 2023-11-24 13:11:21 +05:30
Lakhan Baheti
38a5b7bec0 chore: added error toast for invitation (#2853) 2023-11-24 12:47:02 +05:30
Bavisetti Narayan
236caaafe8 chore: user deactivation and login restriction (#2855)
* chore: user deactivation

* chore: deactivation and login disabled

* chore: added get configuration value

* chore: serializer message change

* chore: instance admin password change

* chore: removed triage

* chore: v3 endpoint for user profile

* chore: added enable signin

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2023-11-24 12:22:24 +05:30
Nikhil
a6d5eab634 chore: api and webhook refactor (#2861)
* chore: bug fix

* dev: changes in api endpoints for invitations and inbox

* chore: improvements

* dev: update webhook send

* dev: webhook validation and fix webhook flow for app

* dev: error messages for deactivation

* chore: api fixes

* dev: update webhook and workspace leave

* chore: issue comment

* dev: default values for environment variables

* dev: make the user active if they were already a project member

* chore: webhook cycle and module event

* dev: disable ssl for emails

* dev: webhooks restructuring

* dev: updated webhook configuration

* dev: webhooks

* dev: state get object

* dev: update workspace slug validation

* dev: remove deactivation flag if max retries exceeded

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2023-11-24 12:19:26 +05:30
sriram veeraghanta
8d76c96a6f fix: adding slack notification when build is failed to upload to docker (#2862)
* fix: removing logs

* fix: adding slack notification when build is failed to upload to docker

* minor changes

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
2023-11-24 12:17:31 +05:30
Aaryan Khandelwal
97be4b60ae chore: update profile and God mode routes (#2860)
* chore: update profile and god mode routes

* fix: profile activity loader

* chore: update profile route in the change password page
2023-11-24 12:16:37 +05:30
Bavisetti Narayan
dece103873 chore: user activity in profile page (#2856)
* chore: user activity endpoint change

* chore: added workspace detail in activity serializer
2023-11-23 21:00:49 +05:30
Anmol Singh Bhatia
c6125876be fix: view date filter select fix (#2858) 2023-11-23 20:40:41 +05:30
Anmol Singh Bhatia
1f85bf2302 style: module ui improvement (#2838) 2023-11-23 20:39:58 +05:30
Anmol Singh Bhatia
20baba3bb0 style: issue activity section improvement (#2836) 2023-11-23 20:39:18 +05:30
Ramesh Kumar Chandra
85907b32d1 feat: change password page (#2847) 2023-11-23 20:38:50 +05:30
sabith-tu
ef2bef83dc style: removing extra options heading and drop down icon (#2852) 2023-11-23 20:38:05 +05:30
Aaryan Khandelwal
6e7a96394a fix: page scroll area (#2850) 2023-11-23 18:22:25 +05:30
Aaryan Khandelwal
5726f6955c dev: added tailwind merge helper function (#2844) 2023-11-23 17:21:47 +05:30
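Entry #2844 adds a Tailwind merge helper. The conventional shape of such a helper is shown below as a guess at what was added (the exact export name in the Plane UI package may differ): `clsx` handles conditional class names and `tailwind-merge` resolves conflicting utilities so the last one wins.

```typescript
import clsx, { type ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";

// Merge conditional class names, letting the last conflicting Tailwind utility win.
export function cn(...inputs: ClassValue[]): string {
  return twMerge(clsx(inputs));
}

// "p-2" is overridden by "p-4"; "hidden" is dropped because its condition is false.
cn("p-2", "text-sm", { hidden: false }, "p-4"); // => "text-sm p-4"
```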
Aaryan Khandelwal
82665a35ee fix: archived issues infinite call (#2848) 2023-11-23 16:58:08 +05:30
sriram veeraghanta
4efd225599 fix: updated document editor package in web and space apps (#2846) 2023-11-23 15:27:20 +05:30
sriram veeraghanta
2481706581 chore: optimizations and file name changes (#2845)
* fix: deepsource antipatterns

* fix: deepsource exclude file patterns

* chore: file name changes and removed unwanted variables

* fix: changing version number for editor
2023-11-23 15:09:46 +05:30
guru_sainath
a17b08dd15 chore: implemented new store and issue layouts for issues and updated new data structure for issues (#2843)
* fix: Implemented new workflow in the issue store and updated the quick add workflow in list layout

* fix: initial load and mutation of issues in list layout

* dev: implemented the new project issues store with grouped, subGrouped and unGrouped issue computed functions (a sketch follows this entry)

* dev: default display properties data made as a function

* conflict: merge conflict resolved

* dev: implemented quick add logic in kanban

* chore: implemented quick add logic in calendar and spreadsheet layout

* fix: spreadsheet layout quick add fix

* dev: optimised the issues workflow and handled the issues order_by filter

* dev: project issue CRUD operations in new issue store architecture

* dev: issues filtering in calendar layout

* fix: build error

* dev/issue_filters_store

* chore: updated filters computed structure

* conflict: merge conflicts resolved in project issues

* dev: implemented gantt chart for project issues using the new mobx store

* dev: initialized cycle and module issue filters store

* dev: issue store and list layout store updates

* dev: quick add and update, delete issue in the list

* refactor list root changes

* dev: store new structure

* refactor spreadsheet and gantt project roots

* fix errors for base gantt and spreadsheet roots

* connect Calendar project view

* minor housekeeping

* connect Kanban View to the new store

* generalise base calendar issue actions

* dev: store project issues and issue filters

* dev: store project issues and filters

* dev: updated undefined with displayFilters in project issue store

* Add Quick add to all the layouts

* connect module views to store

* dev: Rendering list issues in project issues

* dev: removed console log

* dev: module filters store

* fix errors and connect modules list and quick add for list

* dev: module issue store

* dev: module filter store issue fixed and updated cycle issue filters

* minor housekeeping changes

* dev: cycle issues and cycle filters

* connect cycles to the store

* dev: project view issues and issue filters

* connect project views

* dev: updated applied filters in layouts

* dev: replaced project id with view id in project views

* dev: in cycle and module store made cycleId and moduleId optional

* fix minor issues and build errors

* dev: project draft and archived issues store and filters

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2023-11-23 14:47:04 +05:30
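The #2843 entry above describes project issue stores with grouped, sub-grouped, and un-grouped computed functions. A much-reduced MobX sketch of that shape, with invented field names, showing how grouping can live in a computed getter on the store instead of inside the layout components:

```typescript
import { computed, makeObservable, observable } from "mobx";

interface IssueLite {
  id: string;
  stateId: string;
}

// Sketch only: the real store also handles sub-grouping, ordering, and filters.
class ProjectIssuesStoreSketch {
  issues: Record<string, IssueLite> = {};

  constructor() {
    makeObservable(this, {
      issues: observable,
      groupedIssues: computed,
    });
  }

  // The un-grouped list is just the map's values; the grouped view is derived,
  // never stored, so every layout reads from the same source of truth.
  get groupedIssues(): Record<string, IssueLite[]> {
    const groups: Record<string, IssueLite[]> = {};
    for (const issue of Object.values(this.issues)) {
      (groups[issue.stateId] ??= []).push(issue);
    }
    return groups;
  }
}
```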
Aaryan Khandelwal
a7d6b528bd chore: deactivate user option added (#2841)
* dev: deactivate user option added

* chore: new layout for profile settings

* fix: build errors

* fix: user profile activity
2023-11-23 14:44:06 +05:30
Lakhan Baheti
9ba724b78d fix: onboarding bugs & improvements (#2839)
* fix: terms & condition alignment

* fix: onboarding page scrolling

* fix: create workspace name clear

* fix: setup profile sidebar workspace name

* fix: invite team screen button text

* fix: inner div min height

* fix: allow single invite also in invite member

* fix: UI clipping in invite members

* fix: signin screen scroll

* fix: sidebar notification icon

* fix: sidebar project name & icon

* fix: user detail bottom image alignment

* fix: step indicator in invite member

* fix: try different account modal state

* fix: setup profile remove image

* fix: workspace slug clear

* fix: invite member UI & focus

* fix: step indicator size

* fix: inner div placement

* fix: invite member validation logic

* fix: current user data persistence

* fix: sidebar animation colors

* feat: signup & resend

* fix: sign out theme persist from popover

* fix: imports

* chore: signin responsiveness

* fix: sign-in, sign-up top padding
2023-11-23 13:45:00 +05:30
Bavisetti Narayan
c2da9783a3 chore: change password endpoint (#2842) 2023-11-23 13:44:50 +05:30
Anmol Singh Bhatia
784be47e91 [FED-888] fix: parent issue select modal improvement (#2837)
This PR includes an improvement to the parent issue select modal.
2023-11-22 16:16:52 +05:30
sriram veeraghanta
0ceb9974f6 dev: hub compose file update (#2376) (#2444) (#2445)
* docker-compose-hub modified for envs

* bug:fix recent page hiding last item on scroll #1468 (#2411)

* wip

* fixed the AMD build on ARM

---------

Co-authored-by: Manish Gupta <59428681+manishg3@users.noreply.github.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2023-10-16 13:00:58 +05:30
Prashant Indurkar
e0fcc0c876 bug:fix recent page hiding last item on scroll #1468 (#2411) 2023-10-12 16:35:00 +05:30
812 changed files with 30449 additions and 16023 deletions

View File

@@ -1,5 +1,11 @@
version = 1
exclude_patterns = [
"bin/**",
"**/node_modules/",
"**/*.min.js"
]
[[analyzers]]
name = "shell"

View File

@@ -1,13 +1,13 @@
name: Branch Build
on:
pull_request:
types:
types:
- closed
branches:
branches:
- master
- release
- preview
- qa
- develop
@@ -15,7 +15,7 @@ env:
TARGET_BRANCH: ${{ github.event.pull_request.base.ref }}
jobs:
branch_build_and_push:
branch_build_setup:
if: ${{ (github.event_name == 'pull_request' && github.event.action =='closed' && github.event.pull_request.merged == true) }}
name: Build-Push Web/Space/API/Proxy Docker Image
runs-on: ubuntu-20.04
@@ -23,40 +23,7 @@ jobs:
steps:
- name: Check out the repo
uses: actions/checkout@v3.3.0
# - name: Set Target Branch Name on PR close
# if: ${{ github.event_name == 'pull_request' && github.event.action =='closed' }}
# run: echo "TARGET_BRANCH=${{ github.event.pull_request.base.ref }}" >> $GITHUB_ENV
# - name: Set Target Branch Name on other than PR close
# if: ${{ github.event_name == 'push' }}
# run: echo "TARGET_BRANCH=${{ github.ref_name }}" >> $GITHUB_ENV
- uses: ASzc/change-string-case-action@v2
id: gh_branch_upper_lower
with:
string: ${{env.TARGET_BRANCH}}
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_slash
with:
source: ${{ steps.gh_branch_upper_lower.outputs.lowercase }}
find: '/'
replace: '-'
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_replace_dot
with:
source: ${{ steps.gh_branch_replace_slash.outputs.value }}
find: '.'
replace: ''
- uses: mad9000/actions-find-and-replace-string@2
id: gh_branch_clean
with:
source: ${{ steps.gh_branch_replace_dot.outputs.value }}
find: '_'
replace: ''
- name: Uploading Proxy Source
uses: actions/upload-artifact@v3
with:
@@ -77,7 +44,6 @@ jobs:
!./nginx
!./deploy
!./space
- name: Uploading Space Source
uses: actions/upload-artifact@v3
with:
@@ -89,12 +55,24 @@ jobs:
!./deploy
!./web
outputs:
gh_branch_name: ${{ steps.gh_branch_clean.outputs.value }}
gh_branch_name: ${{ env.TARGET_BRANCH }}
branch_build_push_frontend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
FRONTEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Frontend Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:latest
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "release" ] || [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend:preview
else
TAG=${{ env.FRONTEND_TAG }}
fi
echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -114,7 +92,7 @@ jobs:
context: .
file: ./web/Dockerfile.web
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-frontend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.FRONTEND_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
@@ -123,8 +101,20 @@ jobs:
branch_build_push_space:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
SPACE_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Space Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space:latest
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "release" ] || [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-space:preview
else
TAG=${{ env.SPACE_TAG }}
fi
echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -144,7 +134,7 @@ jobs:
context: .
file: ./space/Dockerfile.space
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-space-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.SPACE_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
@@ -153,8 +143,20 @@ jobs:
branch_build_push_backend:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
BACKEND_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Backend Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:latest
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "release" ] || [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-backend:preview
else
TAG=${{ env.BACKEND_TAG }}
fi
echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -175,7 +177,7 @@ jobs:
file: ./Dockerfile.api
platforms: linux/amd64
push: true
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-backend-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.BACKEND_TAG }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
@@ -183,8 +185,20 @@ jobs:
branch_build_push_proxy:
runs-on: ubuntu-20.04
needs: [ branch_build_and_push ]
needs: [branch_build_setup]
env:
PROXY_TAG: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_setup.outputs.gh_branch_name }}
steps:
- name: Set Proxy Docker Tag
run: |
if [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "master" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:latest
elif [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "release" ] || [ "${{ needs.branch_build_setup.outputs.gh_branch_name }}" == "preview" ]; then
TAG=${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:preview,${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy:preview
else
TAG=${{ env.PROXY_TAG }}
fi
echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2.5.0
@@ -205,7 +219,7 @@ jobs:
context: .
file: ./Dockerfile
platforms: linux/amd64
tags: ${{ secrets.DOCKERHUB_USERNAME }}/plane-proxy-private:${{ needs.branch_build_and_push.outputs.gh_branch_name }}
tags: ${{ env.PROXY_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1

View File

@@ -76,7 +76,6 @@ NEXT_PUBLIC_ENABLE_OAUTH=0
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
DJANGO_SETTINGS_MODULE="plane.settings.selfhosted" # deprecated
# Error logs
SENTRY_DSN=""

View File

@@ -57,10 +57,6 @@ Setting up local environment is extremely easy and straight forward. Follow the
1. Review the `.env` files available in various folders. Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system
1. Run the docker command to initiate various services `docker compose -f docker-compose-local.yml up -d`
```bash
./setup.sh
```
You are ready to make changes to the code. Do not forget to refresh the browser (in case id does not auto-reload)
Thats it!

View File

@@ -14,6 +14,11 @@ PGHOST="plane-db"
PGDATABASE="plane"
DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
# Oauth variables
GOOGLE_CLIENT_ID=""
GITHUB_CLIENT_ID=""
GITHUB_CLIENT_SECRET=""
# Redis Settings
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
@@ -50,7 +55,6 @@ NGINX_PORT=80
# SignUps
ENABLE_SIGNUP="1"
# Enable Email/Password Signup
ENABLE_EMAIL_PASSWORD="1"

View File

@@ -3,16 +3,26 @@ set -e
python manage.py wait_for_db
python manage.py migrate
# Set default value for ENABLE_REGISTRATION
ENABLE_REGISTRATION=${ENABLE_REGISTRATION:-1}
# Create the default bucket
#!/bin/bash
# Check if ENABLE_REGISTRATION is not set to '0'
if [ "$ENABLE_REGISTRATION" != "0" ]; then
# Register instance
python manage.py register_instance
# Load the configuration variable
python manage.py configure_instance
fi
# Collect system information
HOSTNAME=$(hostname)
MAC_ADDRESS=$(ip link show | awk '/ether/ {print $2}' | head -n 1)
CPU_INFO=$(cat /proc/cpuinfo)
MEMORY_INFO=$(free -h)
DISK_INFO=$(df -h)
# Concatenate information and compute SHA-256 hash
SIGNATURE=$(echo "$HOSTNAME$MAC_ADDRESS$CPU_INFO$MEMORY_INFO$DISK_INFO" | sha256sum | awk '{print $1}')
# Export the variables
export MACHINE_SIGNATURE=$SIGNATURE
# Register instance
python manage.py register_instance $MACHINE_SIGNATURE
# Load the configuration variable
python manage.py configure_instance
# Create the default bucket
python manage.py create_bucket

View File

@@ -1,9 +1,8 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
rate = '60/minute'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
@@ -15,13 +14,10 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
now = self.timer()
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])

View File

@@ -9,8 +9,9 @@ from .issue import (
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
IssueExpandSerializer,
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer
from .module import ModuleSerializer, ModuleIssueSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .inbox import InboxIssueSerializer

View File

@@ -46,4 +46,11 @@ class CycleIssueSerializer(BaseSerializer):
"workspace",
"project",
"cycle",
]
]
class CycleLiteSerializer(BaseSerializer):
class Meta:
model = Cycle
fields = "__all__"

View File

@@ -8,6 +8,12 @@ class InboxIssueSerializer(BaseSerializer):
model = InboxIssue
fields = "__all__"
read_only_fields = [
"project",
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]

View File

@@ -19,7 +19,10 @@ from plane.db.models import (
ProjectMember,
)
from .base import BaseSerializer
from .cycle import CycleSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleLiteSerializer
from .user import UserLiteSerializer
from .state import StateLiteSerializer
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
@@ -42,6 +45,7 @@ class IssueSerializer(BaseSerializer):
model = Issue
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
@@ -60,9 +64,9 @@ class IssueSerializer(BaseSerializer):
# Validate assignees are from project
if data.get("assignees", []):
print(data.get("assignees"))
data["assignees"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
is_active=True,
member_id__in=data["assignees"],
).values_list("member_id", flat=True)
@@ -88,7 +92,7 @@ class IssueSerializer(BaseSerializer):
if (
data.get("parent")
and not Issue.objects.filter(
workspce_id=self.context.get("workspace_id"), pk=data.get("parent")
workspace_id=self.context.get("workspace_id"), pk=data.get("parent")
).exists()
):
raise serializers.ValidationError(
@@ -231,8 +235,13 @@ class LabelSerializer(BaseSerializer):
model = Label
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
@@ -241,13 +250,14 @@ class IssueLinkSerializer(BaseSerializer):
model = IssueLink
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"issue",
]
# Validation if url already exists
@@ -266,13 +276,14 @@ class IssueAttachmentSerializer(BaseSerializer):
model = IssueAttachment
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
]
@@ -282,38 +293,72 @@ class IssueCommentSerializer(BaseSerializer):
class Meta:
model = IssueComment
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"issue",
"created_by",
"updated_by",
"created_at",
"updated_at",
"workspace",
"project",
"issue",
]
class IssueActivitySerializer(BaseSerializer):
class Meta:
model = IssueActivity
fields = "__all__"
exclude = [
"created_by",
"udpated_by",
"updated_by",
]
class CycleIssueSerializer(BaseSerializer):
cycle = CycleSerializer(read_only=True)
class Meta:
fields = [
"cycle",
]
class ModuleIssueSerializer(BaseSerializer):
module = ModuleSerializer(read_only=True)
class Meta:
fields = [
"module",
]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id",
"name",
"color",
]
class IssueExpandSerializer(BaseSerializer):
cycle = CycleLiteSerializer(source="issue_cycle.cycle", read_only=True)
module = ModuleLiteSerializer(source="issue_module.module", read_only=True)
labels = LabelLiteSerializer(read_only=True, many=True)
assignees = UserLiteSerializer(read_only=True, many=True)
state = StateLiteSerializer(read_only=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
"updated_by",
"created_at",
"updated_at",
]

View File

@@ -21,7 +21,6 @@ class ModuleSerializer(BaseSerializer):
write_only=True,
required=False,
)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
cancelled_issues = serializers.IntegerField(read_only=True)
completed_issues = serializers.IntegerField(read_only=True)
@@ -33,6 +32,7 @@ class ModuleSerializer(BaseSerializer):
model = Module
fields = "__all__"
read_only_fields = [
"id",
"workspace",
"project",
"created_by",
@@ -55,7 +55,6 @@ class ModuleSerializer(BaseSerializer):
raise serializers.ValidationError("Start date cannot exceed target date")
if data.get("members", []):
print(data.get("members"))
data["members"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
member_id__in=data["members"],
@@ -152,4 +151,11 @@ class ModuleLinkSerializer(BaseSerializer):
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return ModuleLink.objects.create(**validated_data)
return ModuleLink.objects.create(**validated_data)
class ModuleLiteSerializer(BaseSerializer):
class Meta:
model = Module
fields = "__all__"

View File

@@ -20,8 +20,12 @@ class ProjectSerializer(BaseSerializer):
model = Project
fields = "__all__"
read_only_fields = [
"workspace",
"id",
"workspace",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
def validate(self, data):

View File

@@ -11,10 +11,6 @@ class UserLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"is_bot",
"display_name",
]
read_only_fields = [
"id",
"is_bot",
]
read_only_fields = fields

View File

@@ -13,7 +13,7 @@ urlpatterns = [
name="cycles",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:pk>/",
CycleAPIEndpoint.as_view(),
name="cycles",
),
@@ -23,7 +23,7 @@ urlpatterns = [
name="cycle-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
CycleIssueAPIEndpoint.as_view(),
name="cycle-issues",
),

View File

@@ -5,12 +5,12 @@ from plane.api.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),

View File

@@ -15,27 +15,27 @@ urlpatterns = [
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueAPIEndpoint.as_view(),
name="issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/",
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/",
LabelAPIEndpoint.as_view(),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-labels/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/labels/<uuid:pk>/",
LabelAPIEndpoint.as_view(),
name="label",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/",
IssueLinkAPIEndpoint.as_view(),
name="link",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-links/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/links/<uuid:pk>/",
IssueLinkAPIEndpoint.as_view(),
name="link",
),
@@ -50,12 +50,12 @@ urlpatterns = [
name="comment",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activites/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/",
IssueActivityAPIEndpoint.as_view(),
name="activity",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activites/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/activities/<uuid:pk>/",
IssueActivityAPIEndpoint.as_view(),
name="activity",
),

View File

@@ -19,7 +19,7 @@ urlpatterns = [
name="module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:issue_id>/",
ModuleIssueAPIEndpoint.as_view(),
name="module-issues",
),

View File

@@ -9,7 +9,7 @@ urlpatterns = [
name="project",
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/",
ProjectAPIEndpoint.as_view(),
name="project",
),

View File

@@ -8,4 +8,9 @@ urlpatterns = [
StateAPIEndpoint.as_view(),
name="states",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/states/<uuid:state_id>/",
StateAPIEndpoint.as_view(),
name="states",
),
]

View File

@@ -7,7 +7,6 @@ from django.conf import settings
from django.db import IntegrityError
from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from rest_framework.views import APIView
@@ -36,28 +35,33 @@ class TimezoneMixin:
else:
timezone.deactivate()
class WebhookMixin:
webhook_event = None
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
# Check for the case should webhook be sent
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
# Push the object to delay
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
payload=response.data,
kw=self.kwargs,
action=self.request.method,
slug=self.workspace_slug,
bulk=self.bulk,
)
return response
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
authentication_classes = [
APIKeyAuthentication,
@@ -139,13 +143,13 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get('X-RateLimit-Remaining')
ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
if ratelimit_remaining is not None:
response['X-RateLimit-Remaining'] = ratelimit_remaining
response["X-RateLimit-Remaining"] = ratelimit_remaining
ratelimit_reset = request.META.get('X-RateLimit-Reset')
ratelimit_reset = request.META.get("X-RateLimit-Reset")
if ratelimit_reset is not None:
response['X-RateLimit-Reset'] = ratelimit_reset
response["X-RateLimit-Reset"] = ratelimit_reset
return response
@@ -169,4 +173,4 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
expand = [
expand for expand in self.request.GET.get("expand", "").split(",") if expand
]
return expand if expand else None
return expand if expand else None

View File

@@ -17,7 +17,6 @@ from plane.app.permissions import ProjectEntityPermission
from plane.api.serializers import (
CycleSerializer,
CycleIssueSerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
@@ -142,7 +141,6 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
)
queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all")
queryset = queryset.order_by("-is_favorite", "-created_at")
# Current Cycle
if cycle_view == "current":
@@ -293,7 +291,7 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=None,
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -305,14 +303,15 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to cycle issues.
This viewset automatically provides `list`, `create`,
and `destroy` actions related to cycle issues.
"""
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
@@ -457,7 +456,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Capture Issue Activity
issue_activity.delay(
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": issues}),
requested_data=json.dumps({"cycles_list": str(issues)}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
@@ -478,9 +477,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, cycle_id, pk):
def delete(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
)
issue_id = cycle_issue.issue_id
cycle_issue.delete()
@@ -493,7 +492,7 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=str(issue_id),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=None,
epoch=int(timezone.now().timestamp()),

View File

@@ -14,7 +14,7 @@ from rest_framework.response import Response
from .base import BaseAPIView
from plane.app.permissions import ProjectLitePermission
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
from plane.db.models import InboxIssue, Issue, State, ProjectMember
from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
from plane.bgtasks.issue_activites_task import issue_activity
@@ -37,29 +37,39 @@ class InboxIssueAPIEndpoint(BaseAPIView):
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(
inbox = Inbox.objects.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
).first()
project = Project.objects.get(
workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
)
if inbox is None and not project.inbox_view:
return InboxIssue.objects.none()
return (
InboxIssue.objects.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=self.kwargs.get("inbox_id"),
inbox_id=inbox.id,
)
.select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
)
def get(self, request, slug, project_id, inbox_id, pk=None):
if pk:
issue_queryset = self.get_queryset().get(pk=pk)
issues_data = InboxIssueSerializer(
issue_queryset,
def get(self, request, slug, project_id, issue_id=None):
if issue_id:
inbox_issue_queryset = self.get_queryset().get(issue_id=issue_id)
inbox_issue_data = InboxIssueSerializer(
inbox_issue_queryset,
fields=self.fields,
expand=self.expand,
).data
return Response(
issues_data,
inbox_issue_data,
status=status.HTTP_200_OK,
)
issue_queryset = self.get_queryset()
@@ -74,12 +84,30 @@ class InboxIssueAPIEndpoint(BaseAPIView):
).data,
)
def post(self, request, slug, project_id, inbox_id):
def post(self, request, slug, project_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for valid priority
if not request.data.get("issue", {}).get("priority", "none") in [
"low",
@@ -123,21 +151,45 @@ class InboxIssueAPIEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
# create an inbox issue
InboxIssue.objects.create(
inbox_id=inbox_id,
inbox_issue = InboxIssue.objects.create(
inbox_id=inbox.id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "in-app"),
)
serializer = IssueSerializer(issue)
serializer = InboxIssueSerializer(inbox_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
def patch(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member
project_member = ProjectMember.objects.get(
workspace__slug=slug,
@@ -145,6 +197,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
member=request.user,
is_active=True,
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
@@ -159,7 +212,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
if bool(issue_data):
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
pk=issue_id, workspace__slug=slug, project_id=project_id
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
@@ -183,7 +236,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
type="issue.activity.updated",
requested_data=requested_data,
actor_id=str(request.user.id),
issue_id=str(issue.id),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=json.dumps(
IssueSerializer(current_instance).data,
@@ -208,7 +261,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if the issue is rejected or marked as duplicate
if serializer.data["status"] in [-1, 2]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -222,7 +275,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if it is accepted
if serializer.data["status"] in [1]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -244,10 +297,33 @@ class InboxIssueAPIEndpoint(BaseAPIView):
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
)
def delete(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
def delete(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project settings"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
)
# Get the project member
project_member = ProjectMember.objects.get(
workspace__slug=slug,
@@ -256,6 +332,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
is_active=True,
)
# Check the inbox issue created
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
request.user.id
):
@@ -268,8 +345,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
workspace__slug=slug, project_id=project_id, pk=issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(status=status.HTTP_204_NO_CONTENT)
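Every handler in this inbox-issue endpoint now follows the same preamble: resolve the project's inbox instead of trusting an inbox_id from the URL, bail out if the inbox feature is disabled, then look the record up by issue_id. A minimal sketch of that shared lookup — the helper name is hypothetical, the models and filters are as in the diff:

    def resolve_inbox_issue(slug, project_id, issue_id):
        # hypothetical helper illustrating the lookup pattern repeated in get/patch/delete above
        inbox = Inbox.objects.filter(workspace__slug=slug, project_id=project_id).first()
        project = Project.objects.get(workspace__slug=slug, pk=project_id)
        if inbox is None and not project.inbox_view:
            return None  # caller responds with 400 "inbox is not enabled for this project"
        return InboxIssue.objects.get(
            issue_id=issue_id,
            workspace__slug=slug,
            project_id=project_id,
            inbox_id=inbox.id,
        )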

View File

@@ -22,7 +22,6 @@ from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.parsers import MultiPartParser, FormParser
# Module imports
from .base import BaseAPIView, WebhookMixin
@@ -41,14 +40,12 @@ from plane.db.models import (
IssueComment,
IssueActivity,
)
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.issue_activites_task import issue_activity
from plane.api.serializers import (
IssueSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
)
@@ -103,7 +100,6 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
@@ -112,7 +108,6 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
issue_queryset = (
self.get_queryset()
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
@@ -278,7 +273,7 @@ class LabelAPIEndpoint(BaseAPIView):
def get_queryset(self):
return (
Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
Label.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.select_related("project")
@@ -302,29 +297,29 @@ class LabelAPIEndpoint(BaseAPIView):
)
def get(self, request, slug, project_id, pk=None):
if pk:
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(
label,
fields=self.fields,
expand=self.expand,
if pk is None:
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda labels: LabelSerializer(
labels,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda labels: LabelSerializer(
labels,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, data=request.data, partial=True)
return Response(serializer.data, status=status.HTTP_200_OK)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
@@ -356,25 +351,31 @@ class IssueLinkAPIEndpoint(BaseAPIView):
.distinct()
)
def get(self, request, slug, project_id, pk=None):
if pk:
label = self.get_queryset().get(pk=pk)
def get(self, request, slug, project_id, issue_id, pk=None):
if pk is None:
issue_links = self.get_queryset()
serializer = IssueLinkSerializer(
label,
issue_links,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_links: IssueLinkSerializer(
issue_links,
many=True,
fields=self.fields,
expand=self.expand,
).data,
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_links: IssueLinkSerializer(
issue_links,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
issue_link = self.get_queryset().get(pk=pk)
serializer = IssueLinkSerializer(
issue_link,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug, project_id, issue_id):
serializer = IssueLinkSerializer(data=request.data)
@@ -449,7 +450,7 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
@@ -587,7 +588,7 @@ class IssueActivityAPIEndpoint(BaseAPIView):
serializer = IssueActivitySerializer(issue_activities)
return Response(serializer.data, status=status.HTTP_200_OK)
self.paginate(
return self.paginate(
request=request,
queryset=(issue_activities),
on_results=lambda issue_activity: IssueActivitySerializer(
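Several fixes in this file share one shape: a detail request short-circuits before pagination, and a list request returns the paginator's result (the missing return on the activity listing above is that same bug). A condensed sketch of the GET pattern, using the label endpoint's names from the diff:

    def get(self, request, slug, project_id, pk=None):
        if pk is None:
            # list: hand the queryset to the paginator and return its response
            return self.paginate(
                request=request,
                queryset=self.get_queryset(),
                on_results=lambda labels: LabelSerializer(labels, many=True).data,
            )
        # detail: serialize a single instance
        label = self.get_queryset().get(pk=pk)
        return Response(LabelSerializer(label).data, status=status.HTTP_200_OK)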

View File

@@ -129,6 +129,14 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
serializer = ModuleSerializer(module)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk):
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
serializer = ModuleSerializer(module, data=request.data)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
@@ -168,7 +176,7 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=None,
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -186,7 +194,8 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module"
webhook_event = "module_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
@@ -323,7 +332,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Capture Issue Activity
issue_activity.delay(
type="module.activity.created",
requested_data=json.dumps({"modules_list": issues}),
requested_data=json.dumps({"modules_list": str(issues)}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
@@ -343,9 +352,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, module_id, pk):
def delete(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk
workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
)
module_issue.delete()
issue_activity.delay(
@@ -357,7 +366,7 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),

View File

@@ -94,8 +94,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
.distinct()
)
def get(self, request, slug, pk=None):
if pk is None:
def get(self, request, slug, project_id=None):
if project_id is None:
sort_order_query = ProjectMember.objects.filter(
member=request.user,
project_id=OuterRef("pk"),
@@ -114,7 +114,7 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
).select_related("member"),
)
)
.order_by("sort_order", "name")
.order_by(request.GET.get("order_by", "sort_order"))
)
return self.paginate(
request=request,
@@ -123,15 +123,13 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
projects, many=True, fields=self.fields, expand=self.expand,
).data,
)
else:
project = self.get_queryset().get(workspace__slug=slug, pk=pk)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug):
try:
workspace = Workspace.objects.get(slug=slug)
serializer = ProjectSerializer(
data={**request.data}, context={"workspace_id": workspace.id}
)
@@ -236,10 +234,10 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
status=status.HTTP_410_GONE,
)
def patch(self, request, slug, pk=None):
def patch(self, request, slug, project_id=None):
try:
workspace = Workspace.objects.get(slug=slug)
project = Project.objects.get(pk=pk)
project = Project.objects.get(pk=project_id)
serializer = ProjectSerializer(
project,
@@ -260,7 +258,7 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
name="Triage",
group="backlog",
description="Default state for managing all Inbox Issues",
project_id=pk,
project_id=project_id,
color="#ff7700",
)
@@ -282,4 +280,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
)
)
def delete(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
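With these changes the project routes take project_id instead of pk, a delete handler exists, and the list ordering can be overridden from the query string (defaulting to sort_order). An illustrative client call — the base URL, auth header and token are placeholders; only the order_by parameter comes from the diff:

    import requests

    response = requests.get(
        "https://plane.example.com/api/v1/workspaces/<slug>/projects/",  # placeholder host/path
        headers={"X-Api-Key": "<api-token>"},                            # placeholder credentials
        params={"order_by": "-created_at"},                              # ordering override added above
    )
    projects = response.json()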

View File

@@ -23,10 +23,8 @@ class StateAPIEndpoint(BaseAPIView):
]
def get_queryset(self):
return self.filter_queryset(
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
return (
State.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(project__project_projectmember__member=self.request.user)
.filter(~Q(name="Triage"))
@@ -42,9 +40,9 @@ class StateAPIEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def get(self, request, slug, project_id, pk=None):
if pk:
serializer = StateSerializer(self.get_queryset().get(pk=pk))
def get(self, request, slug, project_id, state_id=None):
if state_id:
serializer = StateSerializer(self.get_queryset().get(pk=state_id))
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
@@ -57,10 +55,10 @@ class StateAPIEndpoint(BaseAPIView):
).data,
)
def delete(self, request, slug, project_id, pk):
def delete(self, request, slug, project_id, state_id):
state = State.objects.get(
~Q(name="Triage"),
pk=pk,
pk=state_id,
project_id=project_id,
workspace__slug=slug,
)
@@ -69,7 +67,7 @@ class StateAPIEndpoint(BaseAPIView):
return Response({"error": "Default state cannot be deleted"}, status=False)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()
issue_exist = Issue.issue_objects.filter(state=state_id).exists()
if issue_exist:
return Response(
@@ -80,8 +78,8 @@ class StateAPIEndpoint(BaseAPIView):
state.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, project_id, pk=None):
state = State.objects.filter(workspace__slug=slug, project_id=project_id, pk=pk)
def patch(self, request, slug, project_id, state_id=None):
state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
serializer = StateSerializer(state, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()

View File

@@ -99,7 +99,6 @@ class WorkspaceViewerPermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__gte=10,
is_active=True,
).exists()

View File

@@ -1,45 +0,0 @@
from django.utils import timezone
from rest_framework.throttling import SimpleRateThrottle
class ApiKeyRateThrottle(SimpleRateThrottle):
scope = 'api_key'
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get('X-Api-Key')
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f'{self.scope}:{api_key}'
def allow_request(self, request, view):
# Calculate the current time as a Unix timestamp
now = timezone.now().timestamp()
# Use the parent class's method to check if the request is allowed
allowed = super().allow_request(request, view)
if allowed:
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META['X-RateLimit-Remaining'] = max(0, available)
request.META['X-RateLimit-Reset'] = reset_time
return allowed
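The hunk above removes the custom ApiKeyRateThrottle module entirely. For reference, a scoped throttle like this is normally registered through DRF settings; a minimal sketch, with the dotted path and rate as assumptions rather than values taken from the repository:

    # settings.py (sketch)
    REST_FRAMEWORK = {
        "DEFAULT_THROTTLE_CLASSES": [
            "plane.api.rate_limit.ApiKeyRateThrottle",   # assumed module path
        ],
        "DEFAULT_THROTTLE_RATES": {
            "api_key": "60/minute",                      # illustrative rate for the 'api_key' scope
        },
    }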

View File

@@ -225,6 +225,7 @@ class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = IssueActivity

View File

@@ -26,6 +26,8 @@ class UserSerializer(BaseSerializer):
"token_updated_at",
"is_onboarded",
"is_bot",
"is_password_autoset",
"is_email_verified",
]
extra_kwargs = {"password": {"write_only": True}}
@@ -60,6 +62,8 @@ class UserMeSerializer(BaseSerializer):
"theme",
"last_workspace_id",
"use_case",
"is_password_autoset",
"is_email_verified",
]
read_only_fields = fields
@@ -80,9 +84,18 @@ class UserMeSettingsSerializer(BaseSerializer):
workspace_invites = WorkspaceMemberInvite.objects.filter(
email=obj.email
).count()
if obj.last_workspace_id is not None:
if (
obj.last_workspace_id is not None
and Workspace.objects.filter(
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).exists()
):
workspace = Workspace.objects.filter(
pk=obj.last_workspace_id, workspace_member__member=obj.id
pk=obj.last_workspace_id,
workspace_member__member=obj.id,
workspace_member__is_active=True,
).first()
return {
"last_workspace_id": obj.last_workspace_id,
@@ -95,7 +108,9 @@ class UserMeSettingsSerializer(BaseSerializer):
}
else:
fallback_workspace = (
Workspace.objects.filter(workspace_member__member_id=obj.id)
Workspace.objects.filter(
workspace_member__member_id=obj.id, workspace_member__is_active=True
)
.order_by("created_at")
.first()
)
@@ -154,14 +169,25 @@ class ChangePasswordSerializer(serializers.Serializer):
Serializer for password change endpoint.
"""
old_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True, min_length=8)
confirm_password = serializers.CharField(required=True, min_length=8)
def validate(self, data):
if data.get("old_password") == data.get("new_password"):
raise serializers.ValidationError(
{"error": "New password cannot be same as old password."}
)
if data.get("new_password") != data.get("confirm_password"):
raise serializers.ValidationError(
{"error": "Confirm password should be same as the new password."}
)
return data
class ResetPasswordSerializer(serializers.Serializer):
model = User
"""
Serializer for password change endpoint.
"""
new_password = serializers.CharField(required=True)
confirm_password = serializers.CharField(required=True)
new_password = serializers.CharField(required=True, min_length=8)
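The password serializers above now enforce an 8-character minimum; ChangePasswordSerializer additionally gains a confirm_password field and two object-level checks. A quick usage sketch of the new validation behaviour:

    serializer = ChangePasswordSerializer(data={
        "old_password": "hunter2-old",
        "new_password": "hunter2-old",      # same as the old password
        "confirm_password": "hunter2-old",
    })
    serializer.is_valid()   # False: "New password cannot be same as old password."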

View File

@@ -1,3 +1,9 @@
# Python imports
import urllib
import socket
import ipaddress
from urllib.parse import urlparse
# Third party imports
from rest_framework import serializers
@@ -8,6 +14,76 @@ from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
def create(self, validated_data):
url = validated_data.get("url", None)
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return Webhook.objects.create(**validated_data)
def update(self, instance, validated_data):
url = validated_data.get("url", None)
if url:
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
return super().update(instance, validated_data)
class Meta:
model = Webhook
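Both create and update now refuse webhook URLs that resolve to private or loopback addresses, or that target the serving domain and its subdomains. A standalone sketch of the IP part of that guard — the function name is hypothetical, the stdlib calls mirror the diff:

    import ipaddress
    import socket
    from urllib.parse import urlparse

    def resolves_to_blocked_ip(url: str) -> bool:
        # True when the webhook target should be rejected, mirroring the checks above
        hostname = urlparse(url).hostname
        if not hostname:
            return True
        try:
            infos = socket.getaddrinfo(hostname, None)
        except socket.gaierror:
            return True
        if not infos:
            return True
        return any(
            ipaddress.ip_address(info[4][0]).is_private
            or ipaddress.ip_address(info[4][0]).is_loopback
            for info in infos
        )

    # resolves_to_blocked_ip("http://127.0.0.1:8000/hook")  -> True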

View File

@@ -21,6 +21,23 @@ class WorkSpaceSerializer(BaseSerializer):
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
def validated(self, data):
if data.get("slug") in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
"password",
]:
raise serializers.ValidationError({"slug": "Slug is not valid"})
class Meta:
model = Workspace
fields = "__all__"
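For comparison, DRF runs object-level checks through a serializer method named validate that returns the cleaned data once it passes; a minimal self-contained sketch of that contract using the same reserved-slug idea:

    from rest_framework import serializers

    class SlugSerializer(serializers.Serializer):
        slug = serializers.CharField()

        def validate(self, data):
            # DRF calls validate() from is_valid(); it must return the validated data
            if data["slug"] in {"api", "accounts", "god-mode"}:
                raise serializers.ValidationError({"slug": "Slug is not valid"})
            return data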

View File

@@ -4,6 +4,7 @@ from django.urls import path
from plane.app.views import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
)
@@ -28,4 +29,13 @@ urlpatterns = [
UserAssetsEndpoint.as_view(),
name="user-file-assets",
),
path(
"workspaces/file-assets/<uuid:workspace_id>/<str:asset_key>/restore/",
FileAssetViewSet.as_view(
{
"post": "restore",
}
),
name="file-assets-restore",
),
]

View File

@@ -5,18 +5,15 @@ from rest_framework_simplejwt.views import TokenRefreshView
from plane.app.views import (
# Authentication
SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicSignInEndpoint,
MagicSignInGenerateEndpoint,
OauthEndpoint,
EmailCheckEndpoint,
## End Authentication
# Auth Extended
ForgotPasswordEndpoint,
VerifyEmailEndpoint,
ResetPasswordEndpoint,
RequestEmailVerificationEndpoint,
ChangePasswordEndpoint,
## End Auth Extender
# API Tokens
@@ -27,24 +24,14 @@ from plane.app.views import (
urlpatterns = [
# Social Auth
path("email-check/", EmailCheckEndpoint.as_view(), name="email"),
path("social-auth/", OauthEndpoint.as_view(), name="oauth"),
# Auth
path("sign-up/", SignUpEndpoint.as_view(), name="sign-up"),
path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
# Magic Sign In/Up
path(
"magic-generate/", MagicSignInGenerateEndpoint.as_view(), name="magic-generate"
),
# magic sign in
path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
# Email verification
path("email-verify/", VerifyEmailEndpoint.as_view(), name="email-verify"),
path(
"request-email-verify/",
RequestEmailVerificationEndpoint.as_view(),
name="request-reset-email",
),
# Password Manipulation
path(
"users/me/change-password/",

View File

@@ -7,7 +7,6 @@ from plane.app.views import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleIssueGroupedEndpoint,
)
@@ -44,11 +43,6 @@ urlpatterns = [
),
name="project-issue-cycle",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
CycleIssueGroupedEndpoint.as_view(),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
CycleIssueViewSet.as_view(

View File

@@ -3,8 +3,6 @@ from django.urls import path
from plane.app.views import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
BulkDeleteIssuesEndpoint,
@@ -37,16 +35,6 @@ urlpatterns = [
),
name="project-issue",
),
path(
"v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListEndpoint.as_view(),
name="project-issue",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/issues/",
IssueListGroupedEndpoint.as_view(),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(

View File

@@ -7,7 +7,6 @@ from plane.app.views import (
ModuleLinkViewSet,
ModuleFavoriteViewSet,
BulkImportModulesEndpoint,
ModuleIssueGroupedEndpoint,
)
@@ -44,11 +43,6 @@ urlpatterns = [
),
name="project-module-issues",
),
path(
"v3/workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/",
ModuleIssueGroupedEndpoint.as_view(),
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
ModuleIssueViewSet.as_view(

View File

@@ -65,7 +65,7 @@ urlpatterns = [
name="project-member-invite",
),
path(
"users/me/invitations/projects/",
"users/me/workspaces/<str:slug>/projects/invitations/",
UserProjectInvitationsViewset.as_view(
{
"get": "list",
@@ -75,7 +75,7 @@ urlpatterns = [
name="user-project-invitations",
),
path(
"workspaces/<str:slug>/projects/join/",
"workspaces/<str:slug>/projects/<uuid:project_id>/join/<uuid:pk>/",
ProjectJoinEndpoint.as_view(),
name="project-join",
),

View File

@@ -7,6 +7,7 @@ from plane.app.views import (
UpdateUserTourCompletedEndpoint,
UserActivityEndpoint,
ChangePasswordEndpoint,
SetUserPasswordEndpoint,
## End User
## Workspaces
UserWorkSpacesEndpoint,
@@ -63,7 +64,7 @@ urlpatterns = [
name="user-tour",
),
path(
"users/workspaces/<str:slug>/activities/",
"users/me/activities/",
UserActivityEndpoint.as_view(),
name="user-activities",
),
@@ -89,5 +90,10 @@ urlpatterns = [
UserWorkspaceDashboardEndpoint.as_view(),
name="user-workspace-dashboard",
),
path(
"users/me/set-password/",
SetUserPasswordEndpoint.as_view(),
name="set-password",
),
## End User Graph
]

View File

@@ -5,7 +5,7 @@ from plane.app.views import (
IssueViewViewSet,
GlobalViewViewSet,
GlobalViewIssuesViewSet,
IssueViewFavoriteViewSet,
IssueViewFavoriteViewSet,

)

View File

@@ -58,13 +58,10 @@ from .cycle import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleIssueGroupedEndpoint,
)
from .asset import FileAssetEndpoint, UserAssetsEndpoint
from .asset import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
from .issue import (
IssueViewSet,
IssueListEndpoint,
IssueListGroupedEndpoint,
WorkSpaceIssuesEndpoint,
IssueActivityEndpoint,
IssueCommentViewSet,
@@ -85,20 +82,18 @@ from .issue import (
)
from .auth_extended import (
VerifyEmailEndpoint,
RequestEmailVerificationEndpoint,
ForgotPasswordEndpoint,
ResetPasswordEndpoint,
ChangePasswordEndpoint,
SetUserPasswordEndpoint,
EmailCheckEndpoint,
)
from .authentication import (
SignUpEndpoint,
SignInEndpoint,
SignOutEndpoint,
MagicSignInEndpoint,
MagicSignInGenerateEndpoint,
)
from .module import (
@@ -106,7 +101,6 @@ from .module import (
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
ModuleIssueGroupedEndpoint,
)
from .api import ApiTokenEndpoint
@@ -168,4 +162,8 @@ from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .webhook import WebhookEndpoint, WebhookLogsEndpoint, WebhookSecretRegenerateEndpoint
from .webhook import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)

View File

@@ -4,7 +4,7 @@ from rest_framework.response import Response
from rest_framework.parsers import MultiPartParser, FormParser, JSONParser
# Module imports
from .base import BaseAPIView
from .base import BaseAPIView, BaseViewSet
from plane.db.models import FileAsset, Workspace
from plane.app.serializers import FileAssetSerializer
@@ -34,10 +34,20 @@ class FileAssetEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, workspace_id, asset_key):
def delete(self, request, workspace_id, asset_key):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = request.data.get("is_deleted", file_asset.is_deleted)
file_asset.is_deleted = True
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class FileAssetViewSet(BaseViewSet):
def restore(self, request, workspace_id, asset_key):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = False
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -63,8 +73,6 @@ class UserAssetsEndpoint(BaseAPIView):
def delete(self, request, asset_key):
file_asset = FileAsset.objects.get(asset=asset_key, created_by=request.user)
# Delete the file from storage
file_asset.asset.delete(save=False)
# Delete the file object
file_asset.delete()
file_asset.is_deleted = True
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
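File assets are now soft-deleted: both delete handlers set is_deleted instead of removing the stored object, and the new FileAssetViewSet.restore clears the flag again through the restore route added in the URL file earlier. An illustrative round trip — host, identifiers and credentials are placeholders, and the delete path is assumed to mirror the restore path shown in the diff:

    import requests

    BASE = "https://plane.example.com/api"            # placeholder host
    HEADERS = {"Authorization": "Bearer <token>"}      # placeholder credentials

    # Soft-delete: the server flips is_deleted to True instead of dropping the file
    requests.delete(f"{BASE}/workspaces/file-assets/<workspace_id>/<asset_key>/", headers=HEADERS)

    # Undo it through the restore action introduced in this diff
    requests.post(f"{BASE}/workspaces/file-assets/<workspace_id>/<asset_key>/restore/", headers=HEADERS)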

View File

@@ -1,5 +1,9 @@
## Python imports
import jwt
import uuid
import os
import json
import random
import string
## Django imports
from django.contrib.auth.tokens import PasswordResetTokenGenerator
@@ -8,80 +12,114 @@ from django.utils.encoding import (
smart_bytes,
DjangoUnicodeDecodeError,
)
from django.contrib.auth.hashers import make_password
from django.utils.http import urlsafe_base64_decode, urlsafe_base64_encode
from django.core.validators import validate_email
from django.core.exceptions import ValidationError
from django.conf import settings
## Third Party Imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework import permissions
from rest_framework.permissions import AllowAny
from rest_framework_simplejwt.tokens import RefreshToken
from sentry_sdk import capture_exception
## Module imports
from . import BaseAPIView
from plane.app.serializers import (
ChangePasswordSerializer,
ResetPasswordSerializer,
UserSerializer,
)
from plane.db.models import User
from plane.bgtasks.email_verification_task import email_verification
from plane.db.models import User, WorkspaceMemberInvite
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.forgot_password_task import forgot_password
from plane.license.models import Instance, InstanceConfiguration
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
from plane.bgtasks.user_count_task import update_user_instance_user_count
from plane.bgtasks.event_tracking_task import auth_events
def get_tokens_for_user(user):
refresh = RefreshToken.for_user(user)
return (
str(refresh.access_token),
str(refresh),
)
class RequestEmailVerificationEndpoint(BaseAPIView):
def get(self, request):
token = RefreshToken.for_user(request.user).access_token
current_site = request.META.get('HTTP_ORIGIN')
email_verification.delay(
request.user.first_name, request.user.email, token, current_site
)
return Response(
{"message": "Email sent successfully"}, status=status.HTTP_200_OK
)
def generate_magic_token(email):
key = "magic_" + str(email)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
# Initialize the redis instance
ri = redis_instance()
# Check if the key already exists in python
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return key, token, False
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
return key, token, True
class VerifyEmailEndpoint(BaseAPIView):
def get(self, request):
token = request.GET.get("token")
try:
payload = jwt.decode(token, settings.SECRET_KEY, algorithms="HS256")
user = User.objects.get(id=payload["user_id"])
def generate_password_token(user):
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
if not user.is_email_verified:
user.is_email_verified = True
user.save()
return Response(
{"email": "Successfully activated"}, status=status.HTTP_200_OK
)
except jwt.ExpiredSignatureError as _indentifier:
return Response(
{"email": "Activation expired"}, status=status.HTTP_400_BAD_REQUEST
)
except jwt.exceptions.DecodeError as _indentifier:
return Response(
{"email": "Invalid token"}, status=status.HTTP_400_BAD_REQUEST
)
return uidb64, token
class ForgotPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email")
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
uidb64 = urlsafe_base64_encode(smart_bytes(user.id))
token = PasswordResetTokenGenerator().make_token(user)
current_site = request.META.get('HTTP_ORIGIN')
try:
validate_email(email)
except ValidationError:
return Response({"error": "Please enter a valid email"}, status=status.HTTP_400_BAD_REQUEST)
# Get the user
user = User.objects.filter(email=email).first()
if user:
# Get the reset token for user
uidb64, token = get_tokens_for_user(user=user)
current_site = request.META.get("HTTP_ORIGIN")
# send the forgot password email
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site
)
return Response(
{"message": "Check your email to reset your password"},
status=status.HTTP_200_OK,
@@ -92,30 +130,38 @@ class ForgotPasswordEndpoint(BaseAPIView):
class ResetPasswordEndpoint(BaseAPIView):
permission_classes = [permissions.AllowAny]
permission_classes = [AllowAny,]
def post(self, request, uidb64, token):
try:
# Decode the id from the uidb64
id = smart_str(urlsafe_base64_decode(uidb64))
user = User.objects.get(id=id)
# check if the token is valid for the user
if not PasswordResetTokenGenerator().check_token(user, token):
return Response(
{"error": "token is not valid, please check the new one"},
{"error": "Token is invalid"},
status=status.HTTP_401_UNAUTHORIZED,
)
serializer = ResetPasswordSerializer(data=request.data)
# Reset the password
serializer = ResetPasswordSerializer(data=request.data)
if serializer.is_valid():
# set_password also hashes the password that the user will get
user.set_password(serializer.data.get("new_password"))
user.is_password_autoset = False
user.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
# Log the user in
# Generate access token for the user
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(response)
return Response(data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except DjangoUnicodeDecodeError as indentifier:
@@ -128,24 +174,216 @@ class ResetPasswordEndpoint(BaseAPIView):
class ChangePasswordEndpoint(BaseAPIView):
def post(self, request):
serializer = ChangePasswordSerializer(data=request.data)
user = User.objects.get(pk=request.user.id)
if serializer.is_valid():
# Check old password
if not user.object.check_password(serializer.data.get("old_password")):
if not user.check_password(serializer.data.get("old_password")):
return Response(
{"old_password": ["Wrong password."]},
{"error": "Old password is not correct"},
status=status.HTTP_400_BAD_REQUEST,
)
# set_password also hashes the password that the user will get
self.object.set_password(serializer.data.get("new_password"))
self.object.save()
response = {
"status": "success",
"code": status.HTTP_200_OK,
"message": "Password updated successfully",
}
return Response(response)
user.set_password(serializer.data.get("new_password"))
user.is_password_autoset = False
user.save()
return Response(
{"message": "Password updated successfully"}, status=status.HTTP_200_OK
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class SetUserPasswordEndpoint(BaseAPIView):
def post(self, request):
user = User.objects.get(pk=request.user.id)
password = request.data.get("password", False)
# If the user password is not autoset then return error
if not user.is_password_autoset:
return Response(
{
"error": "Your password is already set please change your password from profile"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check password validation
if not password and len(str(password)) < 8:
return Response(
{"error": "Password is not valid"}, status=status.HTTP_400_BAD_REQUEST
)
# Set the user password
user.set_password(password)
user.is_password_autoset = False
user.save()
serializer = UserSerializer(user)
return Response(serializer.data, status=status.HTTP_200_OK)
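SetUserPasswordEndpoint backs the users/me/set-password/ route added earlier and only applies while is_password_autoset is still true — typically an account created through the magic-code flow that never chose a password. An illustrative call; host and token are placeholders:

    import requests

    requests.post(
        "https://plane.example.com/api/users/me/set-password/",   # route added in the user URLs above
        headers={"Authorization": "Bearer <access-token>"},        # placeholder JWT
        json={"password": "a-much-longer-passphrase"},
    )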
class EmailCheckEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
# Check the instance registration
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the configurations
instance_configuration = InstanceConfiguration.objects.values("key", "value")
email = request.data.get("email", False)
type = request.data.get("type", "magic_code")
if not email:
return Response({"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST)
# validate the email
try:
validate_email(email)
except ValidationError:
return Response({"error": "Email is not valid"}, status=status.HTTP_400_BAD_REQUEST)
# Check if the user exists
user = User.objects.filter(email=email).first()
current_site = request.META.get("HTTP_ORIGIN")
# If new user
if user is None:
# Create the user
if (
get_configuration_value(
instance_configuration,
"ENABLE_SIGNUP",
os.environ.get("ENABLE_SIGNUP", "0"),
)
== "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Create the user with default values
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
# Update instance user count
update_user_instance_user_count.delay()
# Case when the user selects magic code
if type == "magic_code":
if not bool(get_configuration_value(
instance_configuration,
"ENABLE_MAGIC_LINK_LOGIN",
os.environ.get("ENABLE_MAGIC_LINK_LOGIN")),
):
return Response(
{"error": "Magic link sign in is disabled."},
status=status.HTTP_400_BAD_REQUEST,
)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=True,
)
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response({"error": "Max attempts exhausted. Please try again later."}, status=status.HTTP_400_BAD_REQUEST)
# Trigger the email
magic_link.delay(email, "magic_" + str(email), token, current_site)
return Response({"is_password_autoset": user.is_password_autoset}, status=status.HTTP_200_OK)
else:
# Get the uidb64 and token for the user
uidb64, token = generate_password_token(user=user)
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site
)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="EMAIL",
first_time=True,
)
# Automatically send the email
return Response({"is_password_autoset": user.is_password_autoset}, status=status.HTTP_200_OK)
# Existing user
else:
if type == "magic_code":
## Generate a random token
if not bool(get_configuration_value(
instance_configuration,
"ENABLE_MAGIC_LINK_LOGIN",
os.environ.get("ENABLE_MAGIC_LINK_LOGIN")),
):
return Response(
{"error": "Magic link sign in is disabled."},
status=status.HTTP_400_BAD_REQUEST,
)
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=False,
)
# Generate magic token
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response({"error": "Max attempts exhausted. Please try again later."}, status=status.HTTP_400_BAD_REQUEST)
# Trigger the email
magic_link.delay(email, key, token, current_site)
return Response({"is_password_autoset": user.is_password_autoset}, status=status.HTTP_200_OK)
else:
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="EMAIL",
first_time=False,
)
if user.is_password_autoset:
# send email
uidb64, token = generate_password_token(user=user)
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site
)
return Response({"is_password_autoset": user.is_password_autoset}, status=status.HTTP_200_OK)
else:
# User should enter password to login
return Response({"is_password_autoset": user.is_password_autoset}, status=status.HTTP_200_OK)

View File

@@ -1,25 +1,22 @@
# Python imports
import os
import uuid
import random
import string
import json
import requests
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
from django.core.exceptions import ValidationError
from django.core.validators import validate_email
from django.conf import settings
from django.contrib.auth.hashers import make_password
# Third party imports
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework import status
from rest_framework_simplejwt.tokens import RefreshToken
from sentry_sdk import capture_exception, capture_message
from sentry_sdk import capture_message
# Module imports
from . import BaseAPIView
@@ -31,7 +28,10 @@ from plane.db.models import (
ProjectMember,
)
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.event_tracking_task import auth_events
from plane.bgtasks.user_count_task import update_user_instance_user_count
def get_tokens_for_user(user):
@@ -46,14 +46,16 @@ class SignUpEndpoint(BaseAPIView):
permission_classes = (AllowAny,)
def post(self, request):
if not settings.ENABLE_SIGNUP:
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
instance_configuration = InstanceConfiguration.objects.values("key", "value")
email = request.data.get("email", False)
password = request.data.get("password", False)
@@ -74,6 +76,25 @@ class SignUpEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# If the sign up is not enabled and the user does not have invite disallow him from creating the account
if (
get_configuration_value(
instance_configuration,
"ENABLE_SIGNUP",
os.environ.get("ENABLE_SIGNUP", "0"),
)
== "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the user already exists
if User.objects.filter(email=email).exists():
return Response(
@@ -92,93 +113,16 @@ class SignUpEndpoint(BaseAPIView):
user.token_updated_at = timezone.now()
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=workspace_member_invite.workspace_id,
member=user,
role=workspace_member_invite.role,
)
for workspace_member_invite in workspace_member_invites
],
ignore_conflicts=True,
)
# Check if user has any project invites
project_member_invites = ProjectMemberInvite.objects.filter(
email=user.email, accepted=True
)
# Add user to workspace
WorkspaceMember.objects.bulk_create(
[
WorkspaceMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Now add the users to project
ProjectMember.objects.bulk_create(
[
ProjectMember(
workspace_id=project_member_invite.workspace_id,
role=project_member_invite.role
if project_member_invite.role in [5, 10, 15]
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
# Update instance user count
update_user_instance_user_count.delay()
return Response(data, status=status.HTTP_200_OK)
@@ -186,6 +130,14 @@ class SignInEndpoint(BaseAPIView):
permission_classes = (AllowAny,)
def post(self, request):
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
email = request.data.get("email", False)
password = request.data.get("password", False)
@@ -206,8 +158,10 @@ class SignInEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Get the user
user = User.objects.filter(email=email).first()
# User is not present in db
if user is None:
return Response(
{
@@ -216,7 +170,7 @@ class SignInEndpoint(BaseAPIView):
status=status.HTTP_403_FORBIDDEN,
)
# Sign up Process
# Check user password
if not user.check_password(password):
return Response(
{
@@ -224,15 +178,9 @@ class SignInEndpoint(BaseAPIView):
},
status=status.HTTP_403_FORBIDDEN,
)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
# settings last active for the user
user.is_active = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -288,7 +236,8 @@ class SignInEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -296,30 +245,17 @@ class SignInEndpoint(BaseAPIView):
# Delete all the invites
workspace_member_invites.delete()
project_member_invites.delete()
try:
# Send Analytics
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "email",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="EMAIL",
first_time=False,
)
access_token, refresh_token = get_tokens_for_user(user)
data = {
@@ -352,77 +288,20 @@ class SignOutEndpoint(BaseAPIView):
return Response({"message": "success"}, status=status.HTTP_200_OK)
class MagicSignInGenerateEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email", False)
if not email:
return Response(
{"error": "Please provide a valid email address"},
status=status.HTTP_400_BAD_REQUEST,
)
# Clean up
email = email.strip().lower()
validate_email(email)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
ri = redis_instance()
key = "magic_" + str(email)
# Check if the key already exists in python
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
current_site = request.META.get('HTTP_ORIGIN')
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
class MagicSignInEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
# Check if the instance configuration is done
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
user_token = request.data.get("token", "").strip()
key = request.data.get("key", False).strip().lower()
@@ -441,71 +320,28 @@ class MagicSignInEndpoint(BaseAPIView):
email = data["email"]
if str(token) == str(user_token):
if User.objects.filter(email=email).exists():
user = User.objects.get(email=email)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
else:
user = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
user = User.objects.get(email=email)
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium="MAGIC_LINK",
first_time=False,
)
try:
# Send event to Jitsu for tracking
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": "code",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
user.is_active = True
user.is_email_verified = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -561,7 +397,8 @@ class MagicSignInEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)

View File

@@ -43,20 +43,25 @@ class TimezoneMixin:
class WebhookMixin:
webhook_event = None
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
# Check for the case should webhook be sent
if (
self.webhook_event
and self.request.method in ["POST", "PATCH", "DELETE"]
and response.status_code in [200, 201, 204]
):
# Push the object to delay
send_webhook.delay(
event=self.webhook_event,
event_data=json.dumps(response.data, cls=DjangoJSONEncoder),
payload=response.data,
kw=self.kwargs,
action=self.request.method,
slug=self.workspace_slug,
bulk=self.bulk,
)
return response

View File

@@ -45,22 +45,23 @@ class ConfigurationEndpoint(BaseAPIView):
get_configuration_value(
instance_configuration,
"EMAIL_HOST_USER",
os.environ.get("GITHUB_APP_NAME", None),
os.environ.get("EMAIL_HOST_USER", None),
),
)
and bool(
get_configuration_value(
instance_configuration,
"EMAIL_HOST_PASSWORD",
os.environ.get("GITHUB_APP_NAME", None),
os.environ.get("EMAIL_HOST_PASSWORD", None),
)
)
) and get_configuration_value(
instance_configuration, "ENABLE_MAGIC_LINK_LOGIN", "0"
instance_configuration, "ENABLE_MAGIC_LINK_LOGIN", "1"
) == "1"
data["email_password_login"] = (
get_configuration_value(
instance_configuration, "ENABLE_EMAIL_PASSWORD", "0"
instance_configuration, "ENABLE_EMAIL_PASSWORD", "1"
)
== "1"
)
@@ -101,4 +102,6 @@ class ConfigurationEndpoint(BaseAPIView):
)
)
data["file_size_limit"] = float(os.environ.get("FILE_SIZE_LIMIT", 5242880))
return Response(data, status=status.HTTP_200_OK)
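Both corrected lookups above pass an explicit environment fallback as the third argument to get_configuration_value, and the magic-link / email-password toggles now default to enabled. The helper's behaviour, inferred from these call sites rather than from code in this diff, amounts to:

    def get_configuration_value(instance_configuration, key, default=None):
        # sketch: prefer the stored InstanceConfiguration value, else the supplied default
        for item in instance_configuration:
            if item["key"] == key:
                return item["value"]
        return default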

View File

@@ -502,7 +502,10 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
class CycleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
webhook_event = "cycle"
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
@@ -536,9 +539,8 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, cycle_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
@@ -573,24 +575,9 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
)
)
issues_data = IssueStateSerializer(issues, many=True).data
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
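With server-side grouping removed, the rewritten list endpoints return a flat {issue_id: issue} map, so grouping presumably moves to the consumer; a minimal illustrative regroup (not from this changeset):

from collections import defaultdict

def group_issues(issue_map, group_by):
    # Illustrative only: rebuild grouped buckets from the {id: issue} payload.
    grouped = defaultdict(list)
    for issue in issue_map.values():
        grouped[issue.get(group_by)].append(issue)
    return dict(grouped)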
def create(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", [])
@@ -688,7 +675,6 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
)
issue_id = cycle_issue.issue_id
cycle_issue.delete()
issue_activity.delay(
type="cycle.activity.deleted",
requested_data=json.dumps(
@@ -698,64 +684,15 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
}
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
issue_id=str(cycle_issue.issue_id),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
cycle_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleIssueGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id, cycle_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_cycle__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.filter(**filters)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class CycleDateCheckEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,

View File

@@ -1,6 +1,6 @@
# Python imports
import requests
import os
# Third party imports
from openai import OpenAI
from rest_framework.response import Response
@@ -27,8 +27,8 @@ class GPTIntegrationEndpoint(BaseAPIView):
# Get the configuration value
instance_configuration = InstanceConfiguration.objects.values("key", "value")
api_key = get_configuration_value(instance_configuration, "OPENAI_API_KEY")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
api_key = get_configuration_value(instance_configuration, "OPENAI_API_KEY", os.environ.get("OPENAI_API_KEY"))
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE", os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"))
# Check the keys
if not api_key or not gpt_engine:
@@ -47,10 +47,6 @@ class GPTIntegrationEndpoint(BaseAPIView):
final_text = task + "\n" + prompt
instance_configuration = InstanceConfiguration.objects.values("key", "value")
gpt_engine = get_configuration_value(instance_configuration, "GPT_ENGINE")
client = OpenAI(
api_key=api_key,
)
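For reference, a request with the v1 OpenAI client created above would look roughly like this (a sketch, assuming the chat completions API is used downstream):

# Rough sketch of the call the endpoint is expected to make with the client above.
response = client.chat.completions.create(
    model=gpt_engine,
    messages=[{"role": "user", "content": final_text}],
)
generated_text = response.choices[0].message.content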
@@ -85,14 +81,22 @@ class ReleaseNotesEndpoint(BaseAPIView):
class UnsplashEndpoint(BaseAPIView):
def get(self, request):
instance_configuration = InstanceConfiguration.objects.values("key", "value")
unsplash_access_key = get_configuration_value(instance_configuration, "UNSPLASH_ACCESS_KEY", os.environ.get("UNSPLASH_ACCESS_KEY"))
# Check unsplash access key
if not unsplash_access_key:
return Response([], status=status.HTTP_200_OK)
# Query parameters
query = request.GET.get("query", False)
page = request.GET.get("page", 1)
per_page = request.GET.get("per_page", 20)
url = (
f"https://api.unsplash.com/search/photos/?client_id={settings.UNSPLASH_ACCESS_KEY}&query={query}&page=${page}&per_page={per_page}"
f"https://api.unsplash.com/search/photos/?client_id={unsplash_access_key}&query={query}&page=${page}&per_page={per_page}"
if query
else f"https://api.unsplash.com/photos/?client_id={settings.UNSPLASH_ACCESS_KEY}&page={page}&per_page={per_page}"
else f"https://api.unsplash.com/photos/?client_id={unsplash_access_key}&page={page}&per_page={per_page}"
)
headers = {

View File

@@ -4,6 +4,7 @@ import random
from itertools import chain
# Django imports
from django.db import models
from django.utils import timezone
from django.db.models import (
Prefetch,
@@ -132,6 +133,7 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
@@ -215,25 +217,9 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -311,104 +297,6 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class IssueListEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
serializer = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
class IssueListGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issue_queryset = (
Issue.objects.filter(workspace__slug=slug, project_id=project_id)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.distinct()
)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class UserWorkSpaceIssues(BaseAPIView):
@method_decorator(gzip_page)
def get(self, request, slug):
@@ -596,7 +484,7 @@ class IssueActivityEndpoint(BaseAPIView):
class IssueCommentViewSet(WebhookMixin, BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue-comment"
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
@@ -1083,6 +971,7 @@ class IssueArchiveViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
show_sub_issues = request.GET.get("show_sub_issues", "true")
@@ -1173,14 +1062,9 @@ class IssueArchiveViewSet(BaseViewSet):
else issue_queryset.filter(parent__isnull=True)
)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
return Response(group_results(issues, group_by), status=status.HTTP_200_OK)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def retrieve(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
@@ -1580,6 +1464,7 @@ class IssueDraftViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
@@ -1662,18 +1547,9 @@ class IssueDraftViewSet(BaseViewSet):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
grouped_results = group_results(issues, group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)

View File

@@ -283,9 +283,12 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueViewSet(BaseViewSet):
class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
serializer_class = ModuleIssueSerializer
model = ModuleIssue
webhook_event = "module_issue"
bulk = True
filterset_fields = [
"issue__labels__id",
@@ -321,9 +324,8 @@ class ModuleIssueViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, module_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
order_by = request.GET.get("order_by", "created_at")
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
@@ -357,24 +359,9 @@ class ModuleIssueViewSet(BaseViewSet):
.values("count")
)
)
issues_data = IssueStateSerializer(issues, many=True).data
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues_data, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(
issues_data, status=status.HTTP_200_OK
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
def create(self, request, slug, project_id, module_id):
issues = request.data.get("issues", [])
@@ -461,7 +448,6 @@ class ModuleIssueViewSet(BaseViewSet):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk
)
module_issue.delete()
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
@@ -471,63 +457,15 @@ class ModuleIssueViewSet(BaseViewSet):
}
),
actor_id=str(request.user.id),
issue_id=str(pk),
issue_id=str(module_issue.issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
module_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleIssueGroupedEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id, module_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_module__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
.select_related("workspace")
.select_related("state")
.select_related("parent")
.prefetch_related("assignees")
.prefetch_related("labels")
.filter(**filters)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class ModuleLinkViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,

View File

@@ -2,7 +2,6 @@
import uuid
import requests
import os
from requests.exceptions import RequestException
# Django imports
from django.utils import timezone
@@ -29,7 +28,11 @@ from plane.db.models import (
ProjectMemberInvite,
ProjectMember,
)
from plane.bgtasks.event_tracking_task import auth_events
from .base import BaseAPIView
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.user_count_task import update_user_instance_user_count
def get_tokens_for_user(user):
@@ -133,10 +136,35 @@ class OauthEndpoint(BaseAPIView):
def post(self, request):
try:
# Check if instance is registered or not
instance = Instance.objects.first()
if instance is None or not instance.is_setup_done:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
medium = request.data.get("medium", False)
id_token = request.data.get("credential", False)
client_id = request.data.get("clientId", False)
instance_configuration = InstanceConfiguration.objects.values(
"key", "value"
)
if not get_configuration_value(
instance_configuration,
"GOOGLE_CLIENT_ID",
os.environ.get("GOOGLE_CLIENT_ID"),
) or not get_configuration_value(
instance_configuration,
"GITHUB_CLIENT_ID",
os.environ.get("GITHUB_CLIENT_ID"),
):
return Response(
{"error": "Github or Google login is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
if not medium or not id_token:
return Response(
{
@@ -174,15 +202,7 @@ class OauthEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
## Login Case
if not user.is_active:
return Response(
{
"error": "Your account has been deactivated. Please contact your site administrator."
},
status=status.HTTP_403_FORBIDDEN,
)
user.is_active = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -239,7 +259,8 @@ class OauthEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -256,29 +277,18 @@ class OauthEndpoint(BaseAPIView):
"last_login_at": timezone.now(),
},
)
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_IN",
},
)
except RequestException as e:
capture_exception(e)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium=medium.upper(),
first_time=False,
)
access_token, refresh_token = get_tokens_for_user(user)
@@ -290,6 +300,26 @@ class OauthEndpoint(BaseAPIView):
except User.DoesNotExist:
## Signup Case
instance_configuration = InstanceConfiguration.objects.values(
"key", "value"
)
if (
get_configuration_value(
instance_configuration,
"ENABLE_SIGNUP",
os.environ.get("ENABLE_SIGNUP", "0"),
)
== "0"
and not WorkspaceMemberInvite.objects.filter(
email=email,
).exists()
):
return Response(
{
"error": "New account creation is disabled. Please contact your site administrator"
},
status=status.HTTP_400_BAD_REQUEST,
)
username = uuid.uuid4().hex
@@ -305,7 +335,7 @@ class OauthEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
user = User(
user = User.objects.create(
username=username,
email=email,
mobile_number=mobile_number,
@@ -316,7 +346,6 @@ class OauthEndpoint(BaseAPIView):
)
user.set_password(uuid.uuid4().hex)
user.is_password_autoset = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
@@ -373,7 +402,8 @@ class OauthEndpoint(BaseAPIView):
else 15,
member=user,
created_by_id=project_member_invite.created_by_id,
) for project_member_invite in project_member_invites
)
for project_member_invite in project_member_invites
],
ignore_conflicts=True,
)
@@ -381,29 +411,17 @@ class OauthEndpoint(BaseAPIView):
workspace_member_invites.delete()
project_member_invites.delete()
try:
if settings.ANALYTICS_BASE_API:
_ = requests.post(
settings.ANALYTICS_BASE_API,
headers={
"Content-Type": "application/json",
"X-Auth-Token": settings.ANALYTICS_SECRET_KEY,
},
json={
"event_id": uuid.uuid4().hex,
"event_data": {
"medium": f"oauth-{medium}",
},
"user": {"email": email, "id": str(user.id)},
"device_ctx": {
"ip": request.META.get("REMOTE_ADDR"),
"user_agent": request.META.get("HTTP_USER_AGENT"),
},
"event_type": "SIGN_UP",
},
)
except RequestException as e:
capture_exception(e)
# Send event
if settings.POSTHOG_API_KEY and settings.POSTHOG_HOST:
auth_events.delay(
user=user.id,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
event_name="SIGN_IN",
medium=medium.upper(),
first_time=True,
)
SocialLoginConnection.objects.update_or_create(
medium=medium,
@@ -420,4 +438,7 @@ class OauthEndpoint(BaseAPIView):
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_201_CREATED)
# Update the user count
update_user_instance_user_count.delay()
return Response(data, status=status.HTTP_201_CREATED)

View File

@@ -388,7 +388,7 @@ class ProjectInvitationsViewset(BaseViewSet):
{"error": "You cannot invite a user with higher role"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.get(slug=slug)
project_invitations = []
@@ -424,7 +424,7 @@ class ProjectInvitationsViewset(BaseViewSet):
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get('HTTP_ORIGIN')
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in project_invitations:
@@ -469,6 +469,13 @@ class UserProjectInvitationsViewset(BaseViewSet):
workspace_role = workspace_member.role
workspace = workspace_member.workspace
# If the user was already part of workspace
_ = ProjectMember.objects.filter(
workspace__slug=slug,
project_id__in=project_ids,
member=request.user,
).update(is_active=True)
ProjectMember.objects.bulk_create(
[
ProjectMember(
@@ -1040,4 +1047,4 @@ class ProjectDeployBoardViewSet(BaseViewSet):
project_deploy_board.save()
serializer = ProjectDeployBoardSerializer(project_deploy_board)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.data, status=status.HTTP_200_OK)

View File

@@ -12,11 +12,14 @@ from plane.app.serializers import (
)
from plane.app.views.base import BaseViewSet, BaseAPIView
from plane.db.models import User, IssueActivity, WorkspaceMember
from plane.db.models import User, IssueActivity, WorkspaceMember, ProjectMember
from plane.license.models import Instance, InstanceAdmin
from plane.utils.paginator import BasePaginator
from django.db.models import Q, F, Count, Case, When, IntegerField
class UserEndpoint(BaseViewSet):
serializer_class = UserSerializer
model = User
@@ -45,16 +48,79 @@ class UserEndpoint(BaseViewSet):
def deactivate(self, request):
# Check all workspace user is active
user = self.get_object()
if WorkspaceMember.objects.filter(member=request.user, is_active=True).exists():
return Response(
{
"error": "User cannot deactivate account as user is active in some workspaces"
},
status=status.HTTP_400_BAD_REQUEST,
)
projects_to_deactivate = []
workspaces_to_deactivate = []
projects = ProjectMember.objects.filter(
member=request.user, is_active=True
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
default=0,
output_field=IntegerField(),
)
),
total_members=Count("id"),
)
for project in projects:
if project.other_admin_exists > 0 or (project.total_members == 1):
project.is_active = False
projects_to_deactivate.append(project)
else:
return Response(
{
"error": "You cannot deactivate account as you are the only admin in some projects."
},
status=status.HTTP_400_BAD_REQUEST,
)
workspaces = WorkspaceMember.objects.filter(
member=request.user, is_active=True
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
default=0,
output_field=IntegerField(),
)
),
total_members=Count("id"),
)
for workspace in workspaces:
if workspace.other_admin_exists > 0 or (workspace.total_members == 1):
workspace.is_active = False
workspaces_to_deactivate.append(workspace)
else:
return Response(
{
"error": "You cannot deactivate account as you are the only admin in some workspaces."
},
status=status.HTTP_400_BAD_REQUEST,
)
ProjectMember.objects.bulk_update(
projects_to_deactivate, ["is_active"], batch_size=100
)
WorkspaceMember.objects.bulk_update(
workspaces_to_deactivate, ["is_active"], batch_size=100
)
# Deactivate the user
user.is_active = False
user.last_workspace_id = None
user.is_tour_completed = False
user.is_onboarded = False
user.onboarding_step = {
"workspace_join": False,
"profile_complete": False,
"workspace_create": False,
"workspace_invite": False,
}
user.save()
return Response(status=status.HTTP_204_NO_CONTENT)
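Read as a rule, the checks above allow deactivation only when every project and workspace membership either has another admin or contains the user alone; a compact, illustrative restatement:

def can_deactivate(memberships):
    # memberships: iterable of (other_admin_exists, total_members) pairs,
    # mirroring the annotations computed above. Illustrative only.
    return all(other_admins > 0 or total == 1 for other_admins, total in memberships)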
@@ -76,10 +142,10 @@ class UpdateUserTourCompletedEndpoint(BaseAPIView):
class UserActivityEndpoint(BaseAPIView, BasePaginator):
def get(self, request, slug):
queryset = IssueActivity.objects.filter(
actor=request.user, workspace__slug=slug
).select_related("actor", "workspace", "issue", "project")
def get(self, request):
queryset = IssueActivity.objects.filter(actor=request.user).select_related(
"actor", "workspace", "issue", "project"
)
return self.paginate(
request=request,
@@ -88,3 +154,4 @@ class UserActivityEndpoint(BaseAPIView, BasePaginator):
issue_activities, many=True
).data,
)

View File

@@ -20,7 +20,7 @@ from rest_framework.response import Response
from rest_framework import status
# Module imports
from . import BaseViewSet
from . import BaseViewSet, BaseAPIView
from plane.app.serializers import (
GlobalViewSerializer,
IssueViewSerializer,
@@ -95,6 +95,7 @@ class GlobalViewIssuesViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
@@ -177,25 +178,13 @@ class GlobalViewIssuesViewSet(BaseViewSet):
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
if sub_group_by and sub_group_by == group_by:
return Response(
{"error": "Group by and sub group by cannot be same"},
status=status.HTTP_400_BAD_REQUEST,
)
if group_by:
grouped_results = group_results(issues, group_by, sub_group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
)
class IssueViewViewSet(BaseViewSet):

View File

@@ -20,9 +20,10 @@ class WebhookEndpoint(BaseAPIView):
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
try:
serializer = WebhookSerializer(data=request.data)
serializer = WebhookSerializer(
data=request.data, context={"request": request}
)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
@@ -79,6 +80,7 @@ class WebhookEndpoint(BaseAPIView):
serializer = WebhookSerializer(
webhook,
data=request.data,
context={"request": request},
partial=True,
fields=(
"id",

View File

@@ -113,7 +113,10 @@ class WorkSpaceViewSet(BaseViewSet):
return (
self.filter_queryset(super().get_queryset().select_related("owner"))
.order_by("name")
.filter(workspace_member__member=self.request.user)
.filter(
workspace_member__member=self.request.user,
workspace_member__is_active=True,
)
.annotate(total_members=member_count)
.annotate(total_issues=issue_count)
.select_related("owner")
@@ -189,17 +192,21 @@ class UserWorkSpacesEndpoint(BaseAPIView):
)
workspace = (
(
Workspace.objects.prefetch_related(
Prefetch("workspace_member", queryset=WorkspaceMember.objects.all())
Workspace.objects.prefetch_related(
Prefetch(
"workspace_member",
queryset=WorkspaceMember.objects.filter(
member=request.user, is_active=True
),
)
.filter(
workspace_member__member=request.user,
)
.select_related("owner")
)
.select_related("owner")
.annotate(total_members=member_count)
.annotate(total_issues=issue_count)
.filter(
workspace_member__member=request.user, workspace_member__is_active=True
)
.distinct()
)
serializer = WorkSpaceSerializer(self.filter_queryset(workspace), many=True)
@@ -319,7 +326,7 @@ class WorkspaceInvitationsViewset(BaseViewSet):
workspace_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get('HTTP_ORIGIN')
current_site = request.META.get("HTTP_ORIGIN")
# Send invitations
for invitation in workspace_invitations:
@@ -418,7 +425,9 @@ class WorkspaceJoinEndpoint(BaseAPIView):
)
def get(self, request, slug, pk):
workspace_invitation = WorkspaceMemberInvite.objects.get(workspace__slug=slug, pk=pk)
workspace_invitation = WorkspaceMemberInvite.objects.get(
workspace__slug=slug, pk=pk
)
serializer = WorkSpaceMemberInviteSerializer(workspace_invitation)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -590,7 +599,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
member_with_role=Count(
"project_projectmember",
filter=Q(
project_projectmember__member_id=request.user.id,
project_projectmember__member_id=workspace_member.id,
project_projectmember__role=20,
),
),
@@ -600,7 +609,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
"error": "User is a part of some projects where they are the only admin, they should either leave that project or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -635,7 +644,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "You cannot leave the workspace as your the only admin of the workspace you will have to either delete the workspace or create an another admin"
"error": "You cannot leave the workspace as you are the only admin of the workspace you will have to either delete the workspace or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -656,7 +665,7 @@ class WorkSpaceMemberViewSet(BaseViewSet):
):
return Response(
{
"error": "User is part of some projects where they are the only admin you should leave that project first"
"error": "You are a part of some projects where you are the only admin, you should either leave the project or promote another user to admin."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -1198,6 +1207,7 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
]
def get(self, request, slug, user_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
@@ -1299,18 +1309,11 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True).data
## Grouping the results
group_by = request.GET.get("group_by", False)
if group_by:
grouped_results = group_results(issues, group_by)
return Response(
grouped_results,
status=status.HTTP_200_OK,
)
return Response(issues, status=status.HTTP_200_OK)
issues = IssueLiteSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):

View File

@@ -1,6 +1,8 @@
# Python imports
import csv
import io
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
@@ -16,8 +18,8 @@ from sentry_sdk import capture_exception
from plane.db.models import Issue
from plane.utils.analytics_plot import build_graph_plot
from plane.utils.issue_filters import issue_filters
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_email_configuration
row_mapping = {
"state__name": "State",
@@ -32,7 +34,7 @@ row_mapping = {
"priority": "Priority",
"estimate": "Estimate",
"issue_cycle__cycle_id": "Cycle",
"issue_module__module_id": "Module"
"issue_module__module_id": "Module",
}
ASSIGNEE_ID = "assignees__id"
@@ -42,7 +44,7 @@ CYCLE_ID = "issue_cycle__cycle_id"
MODULE_ID = "issue_module__module_id"
def send_export_email(email, slug, csv_buffer):
def send_export_email(email, slug, csv_buffer, rows):
"""Helper function to send export email."""
subject = "Your Export is ready"
html_content = render_to_string("emails/exports/analytics.html", {})
@@ -51,19 +53,61 @@ def send_export_email(email, slug, csv_buffer):
csv_buffer.seek(0)
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration(instance_configuration=instance_configuration)
# Send the email if the users don't have smtp configured
if not (EMAIL_HOST and EMAIL_HOST_USER and EMAIL_HOST_PASSWORD):
# Check the instance registration
instance = Instance.objects.first()
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
payload = {
"email": email,
"slug": slug,
"rows": rows,
}
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/users/analytics/",
headers=headers,
json=payload,
)
return
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
use_ssl=bool(get_configuration_value(instance_configuration, "EMAIL_USE_SSL", "0")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
msg.attach(f"{slug}-analytics.csv", csv_buffer.getvalue())
msg.send(fail_silently=False)
return
def get_assignee_details(slug, filters):
@@ -431,8 +475,11 @@ def analytic_export_task(email, data, slug):
)
csv_buffer = generate_csv_from_rows(rows)
send_export_email(email, slug, csv_buffer)
send_export_email(email, slug, csv_buffer, rows)
return
except Exception as e:
print(e)
if settings.DEBUG:
print(e)
capture_exception(e)
return
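get_email_configuration is not part of this diff; a plausible sketch of the six-value tuple the call sites above unpack (an assumption based purely on how it is used):

import os

def get_email_configuration(instance_configuration):
    # Assumed helper: read each EMAIL_* key from InstanceConfiguration,
    # falling back to environment variables and sensible defaults.
    def value(key, default=None):
        for item in instance_configuration:
            if item.get("key") == key:
                return item.get("value")
        return os.environ.get(key, default)

    return (
        value("EMAIL_HOST"),
        value("EMAIL_HOST_USER"),
        value("EMAIL_HOST_PASSWORD"),
        value("EMAIL_PORT", "587"),
        value("EMAIL_USE_TLS", "1"),
        value("EMAIL_FROM", "Team Plane <team@mailer.plane.so>"),
    )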

View File

@@ -1,57 +0,0 @@
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
@shared_task
def email_verification(first_name, email, token, current_site):
try:
realtivelink = "/request-email-verification/" + "?token=" + str(token)
abs_url = current_site + realtivelink
from_email_string = settings.EMAIL_FROM
subject = "Verify your Email!"
context = {
"first_name": first_name,
"verification_url": abs_url,
}
html_content = render_to_string("emails/auth/email_verification.html", context)
text_content = strip_tags(html_content)
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return
except Exception as e:
# Print logs if in DEBUG mode
if settings.DEBUG:
print(e)
capture_exception(e)
return

View File

@@ -0,0 +1,30 @@
import uuid
from posthog import Posthog
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
@shared_task
def auth_events(user, email, user_agent, ip, event_name, medium, first_time):
try:
posthog = Posthog(settings.POSTHOG_API_KEY, host=settings.POSTHOG_HOST)
posthog.capture(
email,
event=event_name,
properties={
"event_id": uuid.uuid4().hex,
"user": {"email": email, "id": str(user)},
"device_ctx": {
"ip": ip,
"user_agent": user_agent,
},
"medium": medium,
"first_time": first_time
}
)
except Exception as e:
capture_exception(e)

View File

@@ -1,3 +1,8 @@
# Python import
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -9,39 +14,87 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def forgot_password(first_name, email, uidb64, token, current_site):
try:
realtivelink = f"/accounts/reset-password/?uidb64={uidb64}&token={token}"
abs_url = current_site + realtivelink
relative_link = (
f"/accounts/password/?uidb64={uidb64}&token={token}&email={email}"
)
abs_url = str(current_site) + relative_link
from_email_string = settings.EMAIL_FROM
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
subject = "Reset Your Password - Plane"
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration(instance_configuration=instance_configuration)
# Send the email if the users don't have smtp configured
if not (EMAIL_HOST and EMAIL_HOST_USER and EMAIL_HOST_PASSWORD):
# Check the instance registration
instance = Instance.objects.first()
# headers
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
payload = {
"abs_url": abs_url,
"first_name": first_name,
"email": email,
}
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/users/forgot-password/",
headers=headers,
data=json.dumps(payload),
)
return
subject = "A new password to your Plane account has been requested"
context = {
"first_name": first_name,
"forgot_password_url": abs_url,
"email": email,
}
html_content = render_to_string("emails/auth/forgot_password.html", context)
text_content = strip_tags(html_content)
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -26,6 +26,7 @@ from plane.db.models import (
IssueProperty,
)
from plane.bgtasks.user_welcome_task import send_welcome_slack
from plane.bgtasks.user_count_task import update_user_instance_user_count
@shared_task
@@ -120,6 +121,9 @@ def service_importer(service, importer_id):
batch_size=100,
ignore_conflicts=True,
)
# Update instance user count
update_user_instance_user_count.delay()
# Check if sync config is on for github importers
if service == "github" and importer.config.get("sync", False):

View File

@@ -1,3 +1,8 @@
# Python imports
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -9,38 +14,77 @@ from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def magic_link(email, key, token, current_site):
try:
realtivelink = f"/magic-sign-in/?password={token}&key={key}"
abs_url = current_site + realtivelink
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
subject = "Login for Plane"
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration(instance_configuration=instance_configuration)
context = {"magic_url": abs_url, "code": token}
# Send the email if the users don't have smtp configured
if not (EMAIL_HOST and EMAIL_HOST_USER and EMAIL_HOST_PASSWORD):
# Check the instance registration
instance = Instance.objects.first()
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
payload = {
"token": token,
"email": email,
}
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/users/magic-code/",
headers=headers,
data=json.dumps(payload),
)
return
# Send the mail
subject = f"Your unique Plane login code is {token}"
context = {"code": token, "email": email}
html_content = render_to_string("emails/auth/magic_signin.html", context)
text_content = strip_tags(html_content)
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
msg.attach_alternative(html_content, "text/html")
msg.send()
return
except Exception as e:
print(e)
capture_exception(e)
# Print logs if in DEBUG mode
if settings.DEBUG:

View File

@@ -190,6 +190,7 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
issue_activities_created) if issue_activities_created is not None else None
)
if type not in [
"issue.activity.deleted",
"cycle.activity.created",
"cycle.activity.deleted",
"module.activity.created",

View File

@@ -1,3 +1,6 @@
# Python import
import os
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -25,8 +28,6 @@ def project_invitation(email, project_id, token, current_site, invitor):
relativelink = f"/project-invitations/?invitation_id={project_member_invite.id}&email={email}&slug={project.workspace.slug}&project_id={str(project_id)}"
abs_url = current_site + relativelink
from_email_string = settings.EMAIL_FROM
subject = f"{user.first_name or user.display_name or user.email} invited you to join {project.name} on Plane"
context = {
@@ -48,14 +49,45 @@ def project_invitation(email, project_id, token, current_site, invitor):
# Configure email connection from the database
instance_configuration = InstanceConfiguration.objects.filter(key__startswith='EMAIL_').values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(get_configuration_value(instance_configuration, "EMAIL_PORT", "587")),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(instance_configuration, "EMAIL_HOST_PASSWORD"),
use_tls=bool(get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")),
host=get_configuration_value(
instance_configuration, "EMAIL_HOST", os.environ.get("EMAIL_HOST")
),
port=int(
get_configuration_value(
instance_configuration, "EMAIL_PORT", os.environ.get("EMAIL_PORT")
)
),
username=get_configuration_value(
instance_configuration,
"EMAIL_HOST_USER",
os.environ.get("EMAIL_HOST_USER"),
),
password=get_configuration_value(
instance_configuration,
"EMAIL_HOST_PASSWORD",
os.environ.get("EMAIL_HOST_PASSWORD"),
),
use_tls=bool(
get_configuration_value(
instance_configuration,
"EMAIL_USE_TLS",
os.environ.get("EMAIL_USE_TLS", "1"),
)
),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(subject=subject, body=text_content, from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"), to=[email], connection=connection)
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=get_configuration_value(
instance_configuration,
"EMAIL_FROM",
os.environ.get("EMAIL_FROM", "Team Plane <team@mailer.plane.so>"),
),
to=[email],
connection=connection,
)
msg.attach_alternative(html_content, "text/html")
msg.send()
return

View File

@@ -0,0 +1,45 @@
# Python imports
import json
import requests
import os
# django imports
from django.conf import settings
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
# Module imports
from plane.db.models import User
from plane.license.models import Instance
@shared_task
def update_user_instance_user_count():
try:
instance_users = User.objects.filter(is_bot=False).count()
# .update() returns a row count, so reload the instance record for its credentials
Instance.objects.update(user_count=instance_users)
instance = Instance.objects.first()
# Update the count in the license engine
payload = {
"user_count": User.objects.count(),
}
# Save the user in control center
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
# Update the license engine
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps(payload),
)
except Exception as e:
if settings.DEBUG:
print(e)
capture_exception(e)

View File

@@ -2,15 +2,67 @@ import requests
import uuid
import hashlib
import json
import hmac
# Django imports
from django.conf import settings
from django.core.serializers.json import DjangoJSONEncoder
# Third party imports
from celery import shared_task
from sentry_sdk import capture_exception
from plane.db.models import Webhook, WebhookLog
from plane.db.models import (
Webhook,
WebhookLog,
Project,
Issue,
Cycle,
Module,
ModuleIssue,
CycleIssue,
IssueComment,
)
from plane.api.serializers import (
ProjectSerializer,
IssueSerializer,
CycleSerializer,
ModuleSerializer,
CycleIssueSerializer,
ModuleIssueSerializer,
IssueCommentSerializer,
IssueExpandSerializer,
)
SERIALIZER_MAPPER = {
"project": ProjectSerializer,
"issue": IssueExpandSerializer,
"cycle": CycleSerializer,
"module": ModuleSerializer,
"cycle_issue": CycleIssueSerializer,
"module_issue": ModuleIssueSerializer,
"issue_comment": IssueCommentSerializer,
}
MODEL_MAPPER = {
"project": Project,
"issue": Issue,
"cycle": Cycle,
"module": Module,
"cycle_issue": CycleIssue,
"module_issue": ModuleIssue,
"issue_comment": IssueComment,
}
def get_model_data(event, event_id, many=False):
model = MODEL_MAPPER.get(event)
if many:
queryset = model.objects.filter(pk__in=event_id)
else:
queryset = model.objects.get(pk=event_id)
serializer = SERIALIZER_MAPPER.get(event)
return serializer(queryset, many=many).data
@shared_task(
@@ -31,18 +83,12 @@ def webhook_task(self, webhook, slug, event, event_data, action):
"X-Plane-Event": event,
}
# Your secret key
if webhook.secret_key:
# Concatenate the data and the secret key
message = event_data + webhook.secret_key
# Create a SHA-256 hash of the message
sha256 = hashlib.sha256()
sha256.update(message.encode("utf-8"))
signature = sha256.hexdigest()
headers["X-Plane-Signature"] = signature
event_data = json.loads(event_data) if event_data is not None else None
# # Your secret key
event_data = (
json.loads(json.dumps(event_data, cls=DjangoJSONEncoder))
if event_data is not None
else None
)
action = {
"POST": "create",
@@ -59,6 +105,16 @@ def webhook_task(self, webhook, slug, event, event_data, action):
"data": event_data,
}
# Use HMAC for generating signature
if webhook.secret_key:
hmac_signature = hmac.new(
webhook.secret_key.encode("utf-8"),
json.dumps(payload, sort_keys=True).encode("utf-8"),
hashlib.sha256,
)
signature = hmac_signature.hexdigest()
headers["X-Plane-Signature"] = signature
# Send the webhook event
response = requests.post(
webhook.url,
@@ -96,10 +152,6 @@ def webhook_task(self, webhook, slug, event, event_data, action):
retry_count=str(self.request.retries),
)
# Retry logic
if self.request.retries >= self.max_retries:
Webhook.objects.filter(pk=webhook.id).update(is_active=False)
return
raise requests.RequestException()
except Exception as e:
@@ -110,7 +162,7 @@ def webhook_task(self, webhook, slug, event, event_data, action):
@shared_task()
def send_webhook(event, event_data, action, slug):
def send_webhook(event, payload, kw, action, slug, bulk):
try:
webhooks = Webhook.objects.filter(workspace__slug=slug, is_active=True)
@@ -120,17 +172,48 @@ def send_webhook(event, event_data, action, slug):
if event == "issue":
webhooks = webhooks.filter(issue=True)
if event == "module":
if event == "module" or event == "module_issue":
webhooks = webhooks.filter(module=True)
if event == "cycle":
if event == "cycle" or event == "cycle_issue":
webhooks = webhooks.filter(cycle=True)
if event == "issue-comment":
if event == "issue_comment":
webhooks = webhooks.filter(issue_comment=True)
for webhook in webhooks:
webhook_task.delay(webhook.id, slug, event, event_data, action)
if webhooks:
if action in ["POST", "PATCH"]:
if bulk and event in ["cycle_issue", "module_issue"]:
event_data = IssueExpandSerializer(
Issue.objects.filter(
pk__in=[
str(event.get("issue")) for event in payload
]
).prefetch_related("issue_cycle", "issue_module"), many=True
).data
event = "issue"
action = "PATCH"
else:
event_data = [
get_model_data(
event=event,
event_id=payload.get("id") if isinstance(payload, dict) else None,
many=False,
)
]
if action == "DELETE":
event_data = [{"id": kw.get("pk")}]
for webhook in webhooks:
for data in event_data:
webhook_task.delay(
webhook=webhook.id,
slug=slug,
event=event,
event_data=data,
action=action,
)
except Exception as e:
if settings.DEBUG:
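On the receiving end, the X-Plane-Signature header set above can be verified by recomputing the HMAC over the payload serialized the same way (sort_keys=True); an illustrative sketch, not part of this changeset:

import hashlib
import hmac
import json

def verify_plane_signature(raw_body: bytes, received_signature: str, secret_key: str) -> bool:
    # Illustrative check mirroring the signing above: HMAC-SHA256 over the
    # JSON payload re-serialized with sort_keys=True.
    payload = json.loads(raw_body)
    expected = hmac.new(
        secret_key.encode("utf-8"),
        json.dumps(payload, sort_keys=True).encode("utf-8"),
        hashlib.sha256,
    ).hexdigest()
    return hmac.compare_digest(expected, received_signature)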

View File

@@ -1,3 +1,8 @@
# Python imports
import os
import requests
import json
# Django imports
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
@@ -12,8 +17,8 @@ from slack_sdk.errors import SlackApiError
# Module imports
from plane.db.models import Workspace, WorkspaceMemberInvite, User
from plane.license.models import InstanceConfiguration
from plane.license.utils.instance_value import get_configuration_value
from plane.license.models import InstanceConfiguration, Instance
from plane.license.utils.instance_value import get_email_configuration
@shared_task
@@ -30,19 +35,54 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
relative_link = f"/workspace-invitations/?invitation_id={workspace_member_invite.id}&email={email}&slug={workspace.slug}"
# The complete url including the domain
abs_url = current_site + relative_link
abs_url = str(current_site) + relative_link
# The email from
from_email_string = settings.EMAIL_FROM
instance_configuration = InstanceConfiguration.objects.filter(
key__startswith="EMAIL_"
).values("key", "value")
(
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
) = get_email_configuration(instance_configuration=instance_configuration)
# Send the email if the users don't have smtp configured
if not (EMAIL_HOST and EMAIL_HOST_USER and EMAIL_HOST_PASSWORD):
# Check the instance registration
instance = Instance.objects.first()
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
payload = {
"user": user.first_name or user.display_name or user.email,
"workspace_name": workspace.name,
"invitation_url": abs_url,
"email": email,
}
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/users/workspace-invitation/",
headers=headers,
data=json.dumps(payload),
)
return
# Subject of the email
subject = f"{user.first_name or user.display_name or user.email} invited you to join {workspace.name} on Plane"
subject = f"{user.first_name or user.display_name or user.email} has invited you to join them in {workspace.name} on Plane"
context = {
"email": email,
"first_name": invitor,
"first_name": user.first_name or user.display_name or user.email,
"workspace_name": workspace.name,
"invitation_url": abs_url,
"abs_url": abs_url,
}
html_content = render_to_string(
@@ -58,23 +98,17 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
key__startswith="EMAIL_"
).values("key", "value")
connection = get_connection(
host=get_configuration_value(instance_configuration, "EMAIL_HOST"),
port=int(
get_configuration_value(instance_configuration, "EMAIL_PORT", "587")
),
username=get_configuration_value(instance_configuration, "EMAIL_HOST_USER"),
password=get_configuration_value(
instance_configuration, "EMAIL_HOST_PASSWORD"
),
use_tls=bool(
get_configuration_value(instance_configuration, "EMAIL_USE_TLS", "1")
),
host=EMAIL_HOST,
port=int(EMAIL_PORT),
username=EMAIL_HOST_USER,
password=EMAIL_HOST_PASSWORD,
use_tls=bool(EMAIL_USE_TLS),
)
# Initiate email alternatives
msg = EmailMultiAlternatives(
subject=subject,
body=text_content,
from_email=get_configuration_value(instance_configuration, "EMAIL_FROM"),
from_email=EMAIL_FROM,
to=[email],
connection=connection,
)
@@ -94,6 +128,7 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
return
except (Workspace.DoesNotExist, WorkspaceMemberInvite.DoesNotExist) as e:
print("Workspace or WorkspaceMember Invite Does not exists")
return
except Exception as e:
# Print logs if in DEBUG mode

View File

@@ -21,7 +21,7 @@ class Migration(migrations.Migration):
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('transaction', models.UUIDField(default=uuid.uuid4)),
('entity_identifier', models.UUIDField(null=True)),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('mention', 'Mention')], max_length=30, verbose_name='Transaction Type')),
('entity_name', models.CharField(choices=[('to_do', 'To Do'), ('issue', 'issue'), ('image', 'Image'), ('video', 'Video'), ('file', 'File'), ('link', 'Link'), ('cycle', 'Cycle'), ('module', 'Module'), ('back_link', 'Back Link'), ('forward_link', 'Forward Link'), ('page_mention', 'Page Mention'), ('user_mention', 'User Mention')], max_length=30, verbose_name='Transaction Type')),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_log', to='db.page')),
('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),

View File

@@ -1,6 +1,11 @@
# Generated by Django 4.2.5 on 2023-11-17 08:48
from django.db import migrations, models
import plane.db.models.workspace
def user_password_autoset(apps, schema_editor):
User = apps.get_model("db", "User")
User.objects.update(is_password_autoset=True)
class Migration(migrations.Migration):
@@ -20,4 +25,15 @@ class Migration(migrations.Migration):
name='organization_size',
field=models.CharField(blank=True, max_length=20, null=True),
),
migrations.AddField(
model_name='fileasset',
name='is_deleted',
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name='workspace',
name='slug',
field=models.SlugField(max_length=48, unique=True, validators=[plane.db.models.workspace.slug_validator]),
),
migrations.RunPython(user_password_autoset),
]

View File

@@ -1,18 +0,0 @@
# Generated by Django 4.2.7 on 2023-11-20 08:26
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('db', '0050_user_use_case_alter_workspace_organization_size'),
]
operations = [
migrations.AddField(
model_name='fileasset',
name='is_deleted',
field=models.BooleanField(default=False),
),
]

View File

@@ -7,7 +7,6 @@ from django.db import models
from django.conf import settings
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.utils import timezone
from django.core.validators import MinValueValidator, MaxValueValidator
from django.core.exceptions import ValidationError
@@ -141,8 +140,10 @@ class Issue(ProjectBaseModel):
)["largest"]
# aggregate can return None! Check it first.
# If it isn't None, use the last ID specified (which should be the greatest) and add one to it
if last_id is not None:
if last_id:
self.sequence_id = last_id + 1
else:
self.sequence_id = 1
largest_sort_order = Issue.objects.filter(
project=self.project, state=self.state

View File

@@ -51,9 +51,9 @@ class Module(ProjectBaseModel):
def save(self, *args, **kwargs):
if self._state.adding:
smallest_sort_order = Module.objects.filter(
project=self.project
).aggregate(smallest=models.Min("sort_order"))["smallest"]
smallest_sort_order = Module.objects.filter(project=self.project).aggregate(
smallest=models.Min("sort_order")
)["smallest"]
if smallest_sort_order is not None:
self.sort_order = smallest_sort_order - 10000

View File

@@ -57,7 +57,8 @@ class PageLog(ProjectBaseModel):
("module", "Module"),
("back_link", "Back Link"),
("forward_link", "Forward Link"),
("mention", "Mention"),
("page_mention", "Page Mention"),
("user_mention", "User Mention"),
)
transaction = models.UUIDField(default=uuid.uuid4)
page = models.ForeignKey(

View File

@@ -16,7 +16,6 @@ def generate_token():
def validate_schema(value):
parsed_url = urlparse(value)
print(parsed_url)
if parsed_url.scheme not in ["http", "https"]:
raise ValidationError("Invalid schema. Only HTTP and HTTPS are allowed.")

View File

@@ -1,6 +1,7 @@
# Django imports
from django.db import models
from django.conf import settings
from django.core.exceptions import ValidationError
# Module imports
from . import BaseModel
@@ -50,7 +51,7 @@ def get_default_props():
"state": True,
"sub_issue_count": True,
"updated_on": True,
}
},
}
@@ -63,6 +64,23 @@ def get_issue_props():
}
def slug_validator(value):
if value in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
]:
raise ValidationError("Slug is not valid")
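# Illustration (not part of the diff): the validator rejects slugs that collide with reserved app routes.
# slug_validator("god-mode")  -> raises ValidationError("Slug is not valid")
# slug_validator("acme-inc")  -> returns None, so the slug is accepted ("acme-inc" is an example value)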
class Workspace(BaseModel):
name = models.CharField(max_length=80, verbose_name="Workspace Name")
logo = models.URLField(verbose_name="Logo", blank=True, null=True)
@@ -71,7 +89,7 @@ class Workspace(BaseModel):
on_delete=models.CASCADE,
related_name="owner_workspace",
)
slug = models.SlugField(max_length=48, db_index=True, unique=True)
slug = models.SlugField(max_length=48, db_index=True, unique=True, validators=[slug_validator,])
organization_size = models.CharField(max_length=20, blank=True, null=True)
def __str__(self):

View File

@@ -1 +1 @@
from .instance import InstanceOwnerPermission, InstanceAdminPermission
from .instance import InstanceAdminPermission

View File

@@ -5,20 +5,6 @@ from rest_framework.permissions import BasePermission
from plane.license.models import Instance, InstanceAdmin
class InstanceOwnerPermission(BasePermission):
def has_permission(self, request, view):
if request.user.is_anonymous:
return False
instance = Instance.objects.first()
return InstanceAdmin.objects.filter(
role=20,
instance=instance,
user=request.user,
).exists()
class InstanceAdminPermission(BasePermission):
def has_permission(self, request, view):

View File

@@ -2,7 +2,7 @@
from plane.license.models import Instance, InstanceAdmin, InstanceConfiguration
from plane.app.serializers import BaseSerializer
from plane.app.serializers import UserAdminLiteSerializer
from plane.license.utils.encryption import decrypt_data
class InstanceSerializer(BaseSerializer):
primary_owner_details = UserAdminLiteSerializer(source="primary_owner", read_only=True)
@@ -12,14 +12,13 @@ class InstanceSerializer(BaseSerializer):
fields = "__all__"
read_only_fields = [
"id",
"primary_owner",
"primary_email",
"instance_id",
"license_key",
"api_key",
"version",
"email",
"last_checked_at",
"is_setup_done",
]
@@ -40,3 +39,11 @@ class InstanceConfigurationSerializer(BaseSerializer):
class Meta:
model = InstanceConfiguration
fields = "__all__"
def to_representation(self, instance):
data = super().to_representation(instance)
# Decrypt secrets value
if instance.key in ["OPENAI_API_KEY", "GITHUB_CLIENT_SECRET", "EMAIL_HOST_PASSWORD", "UNSPLASH_ACESS_KEY"] and instance.value is not None:
data["value"] = decrypt_data(instance.value)
return data

View File

@@ -1,6 +1,9 @@
from .instance import (
InstanceEndpoint,
TransferPrimaryOwnerEndpoint,
InstanceAdminEndpoint,
InstanceConfigurationEndpoint,
AdminSetupMagicSignInEndpoint,
SignUpScreenVisitedEndpoint,
AdminMagicSignInGenerateEndpoint,
AdminSetUserPasswordEndpoint,
)

View File

@@ -2,13 +2,22 @@
import json
import os
import requests
import uuid
import random
import string
# Django imports
from django.utils import timezone
from django.contrib.auth.hashers import make_password
from django.core.validators import validate_email
from django.core.exceptions import ValidationError
from django.conf import settings
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework_simplejwt.tokens import RefreshToken
# Module imports
from plane.app.views import BaseAPIView
@@ -18,24 +27,25 @@ from plane.license.api.serializers import (
InstanceAdminSerializer,
InstanceConfigurationSerializer,
)
from plane.app.serializers import UserSerializer
from plane.license.api.permissions import (
InstanceOwnerPermission,
InstanceAdminPermission,
)
from plane.db.models import User
from plane.license.utils.encryption import encrypt_data
from plane.settings.redis import redis_instance
from plane.bgtasks.magic_link_code_task import magic_link
class InstanceEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "PATCH"]:
self.permission_classes = [
InstanceOwnerPermission,
if self.request.method == "PATCH":
return [
InstanceAdminPermission(),
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceEndpoint, self).get_permissions()
return [
AllowAny(),
]
def post(self, request):
# Check if the instance is registered
@@ -47,23 +57,17 @@ class InstanceEndpoint(BaseAPIView):
# Load JSON content from the file
data = json.load(file)
license_engine_base_url = os.environ.get("LICENSE_ENGINE_BASE_URL")
if not license_engine_base_url:
raise Response(
{"error": "LICENSE_ENGINE_BASE_URL is required"},
status=status.HTTP_400_BAD_REQUEST,
)
headers = {"Content-Type": "application/json"}
payload = {
"email": request.user.email,
"instance_key":settings.INSTANCE_KEY,
"version": data.get("version", 0.1),
"machine_signature": os.environ.get("MACHINE_SIGNATURE"),
"user_count": User.objects.filter(is_bot=False).count(),
}
response = requests.post(
f"{license_engine_base_url}/api/instances",
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps(payload),
)
@@ -77,21 +81,15 @@ class InstanceEndpoint(BaseAPIView):
license_key=data.get("license_key"),
api_key=data.get("api_key"),
version=data.get("version"),
primary_email=data.get("email"),
primary_owner=request.user,
last_checked_at=timezone.now(),
)
# Create instance admin
_ = InstanceAdmin.objects.create(
user=request.user,
instance=instance,
role=20,
user_count=data.get("user_count", 0),
)
serializer = InstanceSerializer(instance)
data = serializer.data
data["is_activated"] = True
return Response(
{
"message": f"Instance succesfully registered with owner: {instance.primary_owner.email}"
},
data,
status=status.HTTP_201_CREATED,
)
return Response(
@@ -100,9 +98,7 @@ class InstanceEndpoint(BaseAPIView):
)
else:
return Response(
{
"message": f"Instance already registered with instance owner: {instance.primary_owner.email}"
},
{"message": "Instance already registered"},
status=status.HTTP_200_OK,
)
@@ -110,11 +106,15 @@ class InstanceEndpoint(BaseAPIView):
instance = Instance.objects.first()
# get the instance
if instance is None:
return Response({"activated": False}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"is_activated": False, "is_setup_done": False},
status=status.HTTP_200_OK,
)
# Return instance
serializer = InstanceSerializer(instance)
serializer.data["activated"] = True
return Response(serializer.data, status=status.HTTP_200_OK)
data = serializer.data
data["is_activated"] = True
return Response(data, status=status.HTTP_200_OK)
def patch(self, request):
# Get the instance
@@ -122,62 +122,37 @@ class InstanceEndpoint(BaseAPIView):
serializer = InstanceSerializer(instance, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
# Save the user in control center
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
# Update instance settings in the license engine
_ = requests.patch(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps(
{
"is_support_required": serializer.data["is_support_required"],
"is_telemetry_enabled": serializer.data["is_telemetry_enabled"],
"version": serializer.data["version"],
}
),
)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
class TransferPrimaryOwnerEndpoint(BaseAPIView):
permission_classes = [
InstanceOwnerPermission,
]
# Transfer the owner of the instance
def post(self, request):
instance = Instance.objects.first()
# Get the email of the new user
email = request.data.get("email", False)
if not email:
return Response(
{"error": "User is required"}, status=status.HTTP_400_BAD_REQUEST
)
# Get users
user = User.objects.get(email=email)
# Save the instance user
instance.primary_owner = user
instance.primary_email = user.email
instance.save(update_fields=["owner", "email"])
# Add the user to admin
_ = InstanceAdmin.objects.get_or_create(
instance=instance,
user=user,
role=20,
)
return Response(
{"message": "Owner successfully updated"}, status=status.HTTP_200_OK
)
class InstanceAdminEndpoint(BaseAPIView):
def get_permissions(self):
if self.request.method in ["POST", "DELETE"]:
self.permission_classes = [
InstanceOwnerPermission,
]
else:
self.permission_classes = [
InstanceAdminPermission,
]
return super(InstanceAdminEndpoint, self).get_permissions()
permission_classes = [
InstanceAdminPermission,
]
# Create an instance admin
def post(self, request):
email = request.data.get("email", False)
role = request.data.get("role", 15)
role = request.data.get("role", 20)
if not email:
return Response(
@@ -230,18 +205,289 @@ class InstanceConfigurationEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request):
configurations = InstanceConfiguration.objects.filter(key__in=request.data.keys())
configurations = InstanceConfiguration.objects.filter(
key__in=request.data.keys()
)
bulk_configurations = []
for configuration in configurations:
configuration.value = request.data.get(configuration.key, configuration.value)
value = request.data.get(configuration.key, configuration.value)
if value is not None and configuration.key in [
"OPENAI_API_KEY",
"GITHUB_CLIENT_SECRET",
"EMAIL_HOST_PASSWORD",
"UNSPLASH_ACESS_KEY",
]:
configuration.value = encrypt_data(value)
else:
configuration.value = value
bulk_configurations.append(configuration)
InstanceConfiguration.objects.bulk_update(
bulk_configurations,
["value"],
batch_size=100
bulk_configurations, ["value"], batch_size=100
)
serializer = InstanceConfigurationSerializer(configurations, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
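# Illustrative request body for this PATCH endpoint (key values are examples; the exact URL is defined in urls.py):
#   {"EMAIL_HOST": "smtp.example.com", "EMAIL_HOST_PASSWORD": "s3cret"}
# EMAIL_HOST is stored as plain text, while EMAIL_HOST_PASSWORD is run through encrypt_data() before saving.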
def get_tokens_for_user(user):
refresh = RefreshToken.for_user(user)
return (
str(refresh.access_token),
str(refresh),
)
class AdminMagicSignInGenerateEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
email = request.data.get("email", False)
# Check the instance registration
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
if InstanceAdmin.objects.first():
return Response(
{"error": "Admin for this instance is already registered"},
status=status.HTTP_400_BAD_REQUEST,
)
if not email:
return Response(
{"error": "Please provide a valid email address"},
status=status.HTTP_400_BAD_REQUEST,
)
# Clean up
email = email.strip().lower()
validate_email(email)
# check if the email exists
if not User.objects.filter(email=email).exists():
# Create a user
_ = User.objects.create(
email=email,
username=uuid.uuid4().hex,
password=make_password(uuid.uuid4().hex),
is_password_autoset=True,
)
## Generate a random token
token = (
"".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
+ "-"
+ "".join(random.choices(string.ascii_lowercase, k=4))
)
ri = redis_instance()
key = "magic_" + str(email)
# Check if the key already exists in redis
if ri.exists(key):
data = json.loads(ri.get(key))
current_attempt = data["current_attempt"] + 1
if data["current_attempt"] > 2:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
status=status.HTTP_400_BAD_REQUEST,
)
value = {
"current_attempt": current_attempt,
"email": email,
"token": token,
}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
else:
value = {"current_attempt": 0, "email": email, "token": token}
expiry = 600
ri.set(key, json.dumps(value), ex=expiry)
# If the smtp is configured send through here
current_site = request.META.get("HTTP_ORIGIN")
magic_link.delay(email, key, token, current_site)
return Response({"key": key}, status=status.HTTP_200_OK)
class AdminSetupMagicSignInEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
user_token = request.data.get("token", "").strip()
key = request.data.get("key", "").strip().lower()
if not key or user_token == "":
return Response(
{"error": "User token and key are required"},
status=status.HTTP_400_BAD_REQUEST,
)
if InstanceAdmin.objects.first():
return Response(
{"error": "Admin for this instance is already registered"},
status=status.HTTP_400_BAD_REQUEST,
)
ri = redis_instance()
if ri.exists(key):
data = json.loads(ri.get(key))
token = data["token"]
email = data["email"]
if str(token) == str(user_token):
# get the user
user = User.objects.get(email=email)
# get the email
user.is_active = True
user.is_email_verified = True
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = request.META.get("REMOTE_ADDR")
user.last_login_uagent = request.META.get("HTTP_USER_AGENT")
user.token_updated_at = timezone.now()
user.save()
access_token, refresh_token = get_tokens_for_user(user)
data = {
"access_token": access_token,
"refresh_token": refresh_token,
}
return Response(data, status=status.HTTP_200_OK)
else:
return Response(
{"error": "Your login code was incorrect. Please try again."},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return Response(
{"error": "The magic code/link has expired please try again"},
status=status.HTTP_400_BAD_REQUEST,
)
class AdminSetUserPasswordEndpoint(BaseAPIView):
def post(self, request):
user = User.objects.get(pk=request.user.id)
password = request.data.get("password", False)
# If the user password is not autoset then return error
if not user.is_password_autoset:
return Response(
{
"error": "Your password is already set please change your password from profile"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check password validation
if not password or len(str(password)) < 8:
return Response(
{"error": "Password is not valid"}, status=status.HTTP_400_BAD_REQUEST
)
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
# Save the user in control center
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
_ = requests.patch(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps({"is_setup_done": True}),
)
# Also register the user as admin
_ = requests.post(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/users/register/",
headers=headers,
data=json.dumps(
{
"email": str(user.email),
"signup_mode": "MAGIC_CODE",
"is_admin": True,
}
),
)
# Register the user as an instance admin
_ = InstanceAdmin.objects.create(
user=user,
instance=instance,
)
# Make the setup flag True
instance.is_setup_done = True
instance.save()
# Set the user password
user.set_password(password)
user.is_password_autoset = False
user.save()
serializer = UserSerializer(user)
return Response(serializer.data, status=status.HTTP_200_OK)
class SignUpScreenVisitedEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def post(self, request):
instance = Instance.objects.first()
if instance is None:
return Response(
{"error": "Instance is not configured"},
status=status.HTTP_400_BAD_REQUEST,
)
if not instance.is_signup_screen_visited:
instance.is_signup_screen_visited = True
instance.save()
# set the headers
headers = {
"Content-Type": "application/json",
"x-instance-id": instance.instance_id,
"x-api-key": instance.api_key,
}
# create the payload
payload = {"is_signup_screen_visited": True}
_ = requests.patch(
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps(payload),
)
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -2,46 +2,120 @@
import os
# Django imports
from django.core.management.base import BaseCommand, CommandError
from django.utils import timezone
from django.core.management.base import BaseCommand
from django.conf import settings
# Module imports
from plane.license.models import InstanceConfiguration
class Command(BaseCommand):
help = "Configure instance variables"
def handle(self, *args, **options):
config_keys = {
# Authentication Settings
"GOOGLE_CLIENT_ID": os.environ.get("GOOGLE_CLIENT_ID"),
"GOOGLE_CLIENT_SECRET": os.environ.get("GOOGLE_CLIENT_SECRET"),
"GITHUB_CLIENT_ID": os.environ.get("GITHUB_CLIENT_ID"),
"GITHUB_CLIENT_SECRET": os.environ.get("GITHUB_CLIENT_SECRET"),
"ENABLE_SIGNUP": os.environ.get("ENABLE_SIGNUP", "1"),
"ENABLE_EMAIL_PASSWORD": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
"ENABLE_MAGIC_LINK_LOGIN": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0"),
# Email Settings
"EMAIL_HOST": os.environ.get("EMAIL_HOST", ""),
"EMAIL_HOST_USER": os.environ.get("EMAIL_HOST_USER", ""),
"EMAIL_HOST_PASSWORD": os.environ.get("EMAIL_HOST_PASSWORD"),
"EMAIL_PORT": os.environ.get("EMAIL_PORT", "587"),
"EMAIL_FROM": os.environ.get("EMAIL_FROM", ""),
"EMAIL_USE_TLS": os.environ.get("EMAIL_USE_TLS", "1"),
"EMAIL_USE_SSL": os.environ.get("EMAIL_USE_SSL", "0"),
# Open AI Settings
"OPENAI_API_BASE": os.environ.get("", "https://api.openai.com/v1"),
"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "sk-"),
"GPT_ENGINE": os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"),
}
from plane.license.utils.encryption import encrypt_data
for key, value in config_keys.items():
config_keys = [
# Authentication Settings
{
"key": "ENABLE_SIGNUP",
"value": os.environ.get("ENABLE_SIGNUP", "1"),
"category": "AUTHENTICATION",
},
{
"key": "ENABLE_EMAIL_PASSWORD",
"value": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
"category": "AUTHENTICATION",
},
{
"key": "ENABLE_MAGIC_LINK_LOGIN",
"value": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "0"),
"category": "AUTHENTICATION",
},
{
"key": "GOOGLE_CLIENT_ID",
"value": os.environ.get("GOOGLE_CLIENT_ID"),
"category": "GOOGLE",
},
{
"key": "GITHUB_CLIENT_ID",
"value": os.environ.get("GITHUB_CLIENT_ID"),
"category": "GITHUB",
},
{
"key": "GITHUB_CLIENT_SECRET",
"value": encrypt_data(os.environ.get("GITHUB_CLIENT_SECRET"))
if os.environ.get("GITHUB_CLIENT_SECRET")
else None,
"category": "GITHUB",
},
{
"key": "EMAIL_HOST",
"value": os.environ.get("EMAIL_HOST", ""),
"category": "SMTP",
},
{
"key": "EMAIL_HOST_USER",
"value": os.environ.get("EMAIL_HOST_USER", ""),
"category": "SMTP",
},
{
"key": "EMAIL_HOST_PASSWORD",
"value": encrypt_data(os.environ.get("EMAIL_HOST_PASSWORD"))
if os.environ.get("EMAIL_HOST_PASSWORD")
else None,
"category": "SMTP",
},
{
"key": "EMAIL_PORT",
"value": os.environ.get("EMAIL_PORT", "587"),
"category": "SMTP",
},
{
"key": "EMAIL_FROM",
"value": os.environ.get("EMAIL_FROM", ""),
"category": "SMTP",
},
{
"key": "EMAIL_USE_TLS",
"value": os.environ.get("EMAIL_USE_TLS", "1"),
"category": "SMTP",
},
{
"key": "OPENAI_API_KEY",
"value": encrypt_data(os.environ.get("OPENAI_API_KEY"))
if os.environ.get("OPENAI_API_KEY")
else None,
"category": "OPENAI",
},
{
"key": "GPT_ENGINE",
"value": os.environ.get("GPT_ENGINE", "gpt-3.5-turbo"),
"category": "SMTP",
},
{
"key": "UNSPLASH_ACCESS_KEY",
"value": encrypt_data(os.environ.get("UNSPLASH_ACESS_KEY", ""))
if os.environ.get("UNSPLASH_ACESS_KEY")
else None,
"category": "UNSPLASH",
},
]
for item in config_keys:
obj, created = InstanceConfiguration.objects.get_or_create(
key=key
key=item.get("key")
)
if created:
obj.value = value
obj.value = item.get("value")
obj.category = item.get("category")
obj.save()
self.stdout.write(self.style.SUCCESS(f"{key} loaded with value from environment variable."))
self.stdout.write(
self.style.SUCCESS(
f"{obj.key} loaded with value from environment variable."
)
)
else:
self.stdout.write(self.style.WARNING(f"{key} configuration already exists"))
self.stdout.write(
self.style.WARNING(f"{obj.key} configuration already exists")
)

View File

@@ -2,22 +2,24 @@
import json
import os
import requests
import uuid
# Django imports
from django.core.management.base import BaseCommand, CommandError
from django.utils import timezone
from django.core.exceptions import ValidationError
from django.core.validators import validate_email
from django.conf import settings
# Module imports
from plane.license.models import Instance
from plane.db.models import User
from plane.license.models import Instance, InstanceAdmin
class Command(BaseCommand):
help = "Check if instance in registered else register"
def add_arguments(self, parser):
# Positional argument
parser.add_argument('machine_signature', type=str, help='Machine signature')
def handle(self, *args, **options):
# Check if the instance is registered
instance = Instance.objects.first()
@@ -28,40 +30,23 @@ class Command(BaseCommand):
# Load JSON content from the file
data = json.load(file)
admin_email = os.environ.get("ADMIN_EMAIL")
machine_signature = options.get("machine_signature", False)
try:
validate_email(admin_email)
except ValidationError:
CommandError(f"{admin_email} is not a valid ADMIN_EMAIL")
# Raise an exception if the admin email is not provided
if not admin_email:
raise CommandError("ADMIN_EMAIL is required")
# Check if the admin email user exists
user = User.objects.filter(email=admin_email).first()
# If the user does not exist create the user and add him to the database
if user is None:
user = User.objects.create(email=admin_email, username=uuid.uuid4().hex)
user.set_password(uuid.uuid4().hex)
user.save()
license_engine_base_url = os.environ.get("LICENSE_ENGINE_BASE_URL")
if not license_engine_base_url:
raise CommandError("LICENSE_ENGINE_BASE_URL is required")
if not machine_signature:
raise CommandError("Machine signature is required")
headers = {"Content-Type": "application/json"}
payload = {
"email": user.email,
"instance_key": settings.INSTANCE_KEY,
"version": data.get("version", 0.1),
"machine_signature": machine_signature,
"user_count": User.objects.filter(is_bot=False).count(),
}
response = requests.post(
f"{license_engine_base_url}/api/instances/",
f"{settings.LICENSE_ENGINE_BASE_URL}/api/instances/",
headers=headers,
data=json.dumps(payload),
)
@@ -75,30 +60,21 @@ class Command(BaseCommand):
license_key=data.get("license_key"),
api_key=data.get("api_key"),
version=data.get("version"),
primary_email=data.get("email"),
primary_owner=user,
last_checked_at=timezone.now(),
)
# Create instance admin
_ = InstanceAdmin.objects.create(
user=user,
instance=instance,
role=20,
user_count=data.get("user_count", 0),
)
self.stdout.write(
self.style.SUCCESS(
f"Instance succesfully registered with owner: {instance.primary_owner.email}"
f"Instance successfully registered"
)
)
return
self.stdout.write(self.style.WARNING("Instance could not be registered"))
return
raise CommandError("Instance could not be registered")
else:
self.stdout.write(
self.style.SUCCESS(
f"Instance already registered with instance owner: {instance.primary_owner.email}"
f"Instance already registered"
)
)
return

View File

@@ -1,4 +1,4 @@
# Generated by Django 4.2.5 on 2023-11-15 14:22
# Generated by Django 4.2.7 on 2023-11-29 14:39
from django.conf import settings
from django.db import migrations, models
@@ -27,13 +27,14 @@ class Migration(migrations.Migration):
('license_key', models.CharField(blank=True, max_length=256, null=True)),
('api_key', models.CharField(max_length=16)),
('version', models.CharField(max_length=10)),
('primary_email', models.CharField(max_length=256)),
('last_checked_at', models.DateTimeField()),
('namespace', models.CharField(blank=True, max_length=50, null=True)),
('is_telemetry_enabled', models.BooleanField(default=True)),
('is_support_required', models.BooleanField(default=True)),
('is_setup_done', models.BooleanField(default=False)),
('is_signup_screen_visited', models.BooleanField(default=False)),
('user_count', models.PositiveBigIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('primary_owner', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='instance_primary_owner', to=settings.AUTH_USER_MODEL)),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
options={
@@ -51,6 +52,7 @@ class Migration(migrations.Migration):
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('key', models.CharField(max_length=100, unique=True)),
('value', models.TextField(blank=True, default=None, null=True)),
('category', models.TextField()),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
],
@@ -67,7 +69,7 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('role', models.PositiveIntegerField(choices=[(20, 'Owner'), (15, 'Admin')], default=15)),
('role', models.PositiveIntegerField(choices=[(20, 'Admin')], default=20)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('instance', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='admins', to='license.instance')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
@@ -78,6 +80,7 @@ class Migration(migrations.Migration):
'verbose_name_plural': 'Instance Admins',
'db_table': 'instance_admins',
'ordering': ('-created_at',),
'unique_together': {('instance', 'user')},
},
),
]

View File

@@ -1,19 +0,0 @@
# Generated by Django 4.2.5 on 2023-11-16 09:45
from django.conf import settings
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('license', '0001_initial'),
]
operations = [
migrations.AlterUniqueTogether(
name='instanceadmin',
unique_together={('instance', 'user')},
),
]

View File

@@ -4,11 +4,9 @@ from django.conf import settings
# Module imports
from plane.db.models import BaseModel
from plane.db.mixins import AuditModel
ROLE_CHOICES = (
(20, "Owner"),
(15, "Admin"),
(20, "Admin"),
)
@@ -20,20 +18,18 @@ class Instance(BaseModel):
license_key = models.CharField(max_length=256, null=True, blank=True)
api_key = models.CharField(max_length=16)
version = models.CharField(max_length=10)
# User information
primary_email = models.CharField(max_length=256)
primary_owner = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.SET_NULL,
null=True,
related_name="instance_primary_owner",
)
# Instance specifics
last_checked_at = models.DateTimeField()
namespace = models.CharField(max_length=50, blank=True, null=True)
# telemetry and support
is_telemetry_enabled = models.BooleanField(default=True)
is_support_required = models.BooleanField(default=True)
# is setup done
is_setup_done = models.BooleanField(default=False)
# signup screen
is_signup_screen_visited = models.BooleanField(default=False)
# users
user_count = models.PositiveBigIntegerField(default=0)
class Meta:
verbose_name = "Instance"
@@ -50,7 +46,7 @@ class InstanceAdmin(BaseModel):
related_name="instance_owner",
)
instance = models.ForeignKey(Instance, on_delete=models.CASCADE, related_name="admins")
role = models.PositiveIntegerField(choices=ROLE_CHOICES, default=15)
role = models.PositiveIntegerField(choices=ROLE_CHOICES, default=20)
class Meta:
unique_together = ["instance", "user"]
@@ -64,6 +60,7 @@ class InstanceConfiguration(BaseModel):
# The instance configuration variables
key = models.CharField(max_length=100, unique=True)
value = models.TextField(null=True, blank=True, default=None)
category = models.TextField()
class Meta:
verbose_name = "Instance Configuration"

View File

@@ -2,9 +2,12 @@ from django.urls import path
from plane.license.api.views import (
InstanceEndpoint,
TransferPrimaryOwnerEndpoint,
InstanceAdminEndpoint,
InstanceConfigurationEndpoint,
AdminMagicSignInGenerateEndpoint,
AdminSetupMagicSignInEndpoint,
AdminSetUserPasswordEndpoint,
SignUpScreenVisitedEndpoint,
)
urlpatterns = [
@@ -13,11 +16,6 @@ urlpatterns = [
InstanceEndpoint.as_view(),
name="instance",
),
path(
"instances/transfer-primary-owner/",
TransferPrimaryOwnerEndpoint.as_view(),
name="instance",
),
path(
"instances/admins/",
InstanceAdminEndpoint.as_view(),
@@ -33,4 +31,24 @@ urlpatterns = [
InstanceConfigurationEndpoint.as_view(),
name="instance-configuration",
),
path(
"instances/admins/magic-generate/",
AdminMagicSignInGenerateEndpoint.as_view(),
name="instance-admins",
),
path(
"instances/admins/magic-sign-in/",
AdminSetupMagicSignInEndpoint.as_view(),
name="instance-admins",
),
path(
"instances/admins/set-password/",
AdminSetUserPasswordEndpoint.as_view(),
name="instance-admins",
),
path(
"instances/admins/sign-up-screen-visited/",
SignUpScreenVisitedEndpoint.as_view(),
name="instance-sign-up",
),
]

View File

@@ -0,0 +1,22 @@
import base64
import hashlib
from django.conf import settings
from cryptography.fernet import Fernet
def derive_key(secret_key):
# Use a key derivation function to get a suitable encryption key
dk = hashlib.pbkdf2_hmac('sha256', secret_key.encode(), b'salt', 100000)
return base64.urlsafe_b64encode(dk)
def encrypt_data(data):
cipher_suite = Fernet(derive_key(settings.SECRET_KEY))
encrypted_data = cipher_suite.encrypt(data.encode())
return encrypted_data.decode() # Convert bytes to string
def decrypt_data(encrypted_data):
cipher_suite = Fernet(derive_key(settings.SECRET_KEY))
decrypted_data = cipher_suite.decrypt(encrypted_data.encode()) # Convert string back to bytes
return decrypted_data.decode()
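# Minimal usage sketch (not part of the diff); it assumes Django settings with a SECRET_KEY are loaded.
# "smtp-password-example" is an illustrative value only.
stored = encrypt_data("smtp-password-example")   # Fernet token, safe to persist as text in InstanceConfiguration.value
assert decrypt_data(stored) == "smtp-password-example"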

View File

@@ -1,6 +1,63 @@
import os
# Helper function to return the value for the passed key
def get_configuration_value(query, key, default=None):
for item in query:
if item['key'] == key:
if item["key"] == key:
return item.get("value", default)
return default
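# Illustration (not part of the diff): values resolve from the instance configuration rows first,
# then fall back to the supplied default (typically an environment variable). Names below are examples.
example_configuration = [{"key": "EMAIL_HOST", "value": "smtp.example.com"}]
host = get_configuration_value(example_configuration, "EMAIL_HOST", os.environ.get("EMAIL_HOST", ""))
# host == "smtp.example.com"; with no matching row, the environment variable (or "") would be returned.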
def get_email_configuration(instance_configuration):
# Get the configuration variables
EMAIL_HOST_USER = get_configuration_value(
instance_configuration,
"EMAIL_HOST_USER",
os.environ.get("EMAIL_HOST_USER", None),
)
EMAIL_HOST_PASSWORD = get_configuration_value(
instance_configuration,
"EMAIL_HOST_PASSWORD",
os.environ.get("EMAIL_HOST_PASSWORD", None),
)
EMAIL_HOST = get_configuration_value(
instance_configuration,
"EMAIL_HOST",
os.environ.get("EMAIL_HOST", None),
)
EMAIL_FROM = get_configuration_value(
instance_configuration,
"EMAIL_FROM",
os.environ.get("EMAIL_FROM", None),
)
EMAIL_USE_TLS = get_configuration_value(
instance_configuration,
"EMAIL_USE_TLS",
os.environ.get("EMAIL_USE_TLS", "1"),
)
EMAIL_PORT = get_configuration_value(
instance_configuration,
"EMAIL_PORT",
587,
)
EMAIL_FROM = get_configuration_value(
instance_configuration,
"EMAIL_FROM",
os.environ.get("EMAIL_FROM", "Team Plane <team@mailer.plane.so>"),
)
return (
EMAIL_HOST,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
EMAIL_PORT,
EMAIL_USE_TLS,
EMAIL_FROM,
)

View File

@@ -75,10 +75,6 @@ REST_FRAMEWORK = {
"DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
"DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
"DEFAULT_FILTER_BACKENDS": ("django_filters.rest_framework.DjangoFilterBackend",),
"DEFAULT_THROTTLE_CLASSES": ("plane.api.rate_limit.ApiKeyRateThrottle",),
"DEFAULT_THROTTLE_RATES": {
"api_key": "60/minute",
},
}
# Django Auth Backend
@@ -189,6 +185,9 @@ AUTH_PASSWORD_VALIDATORS = [
},
]
# Password reset time the number of seconds the uniquely generated uid will be valid
PASSWORD_RESET_TIMEOUT = 3600
# Static files (CSS, JavaScript, Images)
STATIC_URL = "/static/"
STATIC_ROOT = os.path.join(BASE_DIR, "static-assets", "collected-static")
@@ -240,7 +239,7 @@ if AWS_S3_ENDPOINT_URL:
# JWT Auth Configuration
SIMPLE_JWT = {
"ACCESS_TOKEN_LIFETIME": timedelta(minutes=10080),
"ACCESS_TOKEN_LIFETIME": timedelta(minutes=43200),
"REFRESH_TOKEN_LIFETIME": timedelta(days=43200),
"ROTATE_REFRESH_TOKENS": False,
"BLACKLIST_AFTER_ROTATION": False,
@@ -310,7 +309,6 @@ if bool(os.environ.get("SENTRY_DSN", False)):
PROXY_BASE_URL = os.environ.get("PROXY_BASE_URL", False) # For External
SLACK_BOT_TOKEN = os.environ.get("SLACK_BOT_TOKEN", False)
FILE_SIZE_LIMIT = int(os.environ.get("FILE_SIZE_LIMIT", 5242880))
ENABLE_SIGNUP = os.environ.get("ENABLE_SIGNUP", "1") == "1"
# Unsplash Access key
UNSPLASH_ACCESS_KEY = os.environ.get("UNSPLASH_ACCESS_KEY")
@@ -322,4 +320,14 @@ ANALYTICS_SECRET_KEY = os.environ.get("ANALYTICS_SECRET_KEY", False)
ANALYTICS_BASE_API = os.environ.get("ANALYTICS_BASE_API", False)
# Use Minio settings
USE_MINIO = int(os.environ.get("USE_MINIO", 0)) == 1
# Posthog settings
POSTHOG_API_KEY = os.environ.get("POSTHOG_API_KEY", False)
POSTHOG_HOST = os.environ.get("POSTHOG_HOST", False)
# License engine base url
LICENSE_ENGINE_BASE_URL = os.environ.get("LICENSE_ENGINE_BASE_URL", "https://control-center.plane.so")
# instance key
INSTANCE_KEY = os.environ.get("INSTANCE_KEY", "ae6517d563dfc13d8270bd45cf17b08f70b37d989128a9dab46ff687603333c3")

View File

@@ -4,6 +4,7 @@ from django.urls import path
from plane.space.views import (
ProjectDeployBoardPublicSettingsEndpoint,
ProjectIssuesPublicEndpoint,
ProjectIssuesPublicGroupedEndpoint,
)
urlpatterns = [
@@ -17,4 +18,9 @@ urlpatterns = [
ProjectIssuesPublicEndpoint.as_view(),
name="project-deploy-board",
),
path(
"v3/workspaces/<str:slug>/project-boards/<uuid:project_id>/issues/",
ProjectIssuesPublicGroupedEndpoint.as_view(),
name="project-deploy-board",
),
]

View File

@@ -10,6 +10,7 @@ from .issue import (
IssueVotePublicViewSet,
IssueRetrievePublicEndpoint,
ProjectIssuesPublicEndpoint,
ProjectIssuesPublicGroupedEndpoint,
)
from .inbox import InboxIssuePublicViewSet

View File

@@ -653,4 +653,165 @@ class ProjectIssuesPublicEndpoint(BaseAPIView):
"labels": labels,
},
status=status.HTTP_200_OK,
)
class ProjectIssuesPublicGroupedEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, slug, project_id):
project_deploy_board = ProjectDeployBoard.objects.get(
workspace__slug=slug, project_id=project_id
)
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project", "workspace", "state", "parent")
.prefetch_related("assignees", "labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.prefetch_related(
Prefetch(
"votes",
queryset=IssueVote.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
state_group_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
states = (
State.objects.filter(
~Q(name="Triage"),
workspace__slug=slug,
project_id=project_id,
)
.annotate(
custom_order=Case(
*[
When(group=value, then=Value(index))
for index, value in enumerate(state_group_order)
],
default=Value(len(state_group_order)),
output_field=IntegerField(),
),
)
.values("name", "group", "color", "id")
.order_by("custom_order", "sequence")
)
labels = Label.objects.filter(
workspace__slug=slug, project_id=project_id
).values("id", "name", "color", "parent")
issues = IssuePublicSerializer(issue_queryset, many=True).data
issue_dict = {str(issue["id"]): issue for issue in issues}
state_dict = {str(state["id"]): state for state in states}
label_dict = {str(label["id"]): label for label in labels}
return Response(
{
"issues": issue_dict,
"states": state_dict,
"labels": label_dict,
},
status=status.HTTP_200_OK,
)
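# Illustration (not part of the diff): the priority/state ordering above uses a common Django pattern,
# annotating each row with its index in a hand-ordered list and sorting by that annotation.
# Model and field names below are examples; the endpoint itself uses CharField for the annotation.
from django.db.models import Case, IntegerField, Value, When

PRIORITY_ORDER = ["urgent", "high", "medium", "low", "none"]

def order_by_priority(queryset):
    # Map each priority label to its list position and sort ascending by that position.
    return queryset.annotate(
        priority_rank=Case(
            *[When(priority=p, then=Value(i)) for i, p in enumerate(PRIORITY_ORDER)],
            default=Value(len(PRIORITY_ORDER)),
            output_field=IntegerField(),
        )
    ).order_by("priority_rank")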

View File

@@ -36,3 +36,5 @@ scout-apm==2.26.1
openpyxl==3.1.2
beautifulsoup4==4.12.2
dj-database-url==2.1.0
posthog==3.0.2
cryptography==41.0.5

View File

@@ -1,11 +0,0 @@
<!DOCTYPE html>
<html>
<p>
Dear {{first_name}},<br /><br />
Welcome! Your account has been created.
Verify your email by clicking on the link below <br />
{{verification_url}}
successfully.<br /><br />
</p>
</html>

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -1,9 +0,0 @@
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html>
Dear {{username}},<br/>
Your requested Issue's data has been successfully exported from Plane. The export includes all relevant information about issues you requested from your selected projects.</br>
Please find the attachment and download the CSV file. If you have any questions or need further assistance, please don't hesitate to contact our support team at <a href = "mailto: engineering@plane.com">engineering@plane.so</a>. We're here to help!</br>
Thank you for using Plane. We hope this export will aid you in effectively managing your projects.</br>
Regards,
Team Plane
</html>

deploy/coolify/README.md Normal file
View File

@@ -0,0 +1,8 @@
## Coolify Setup
Access the `coolify-docker-compose` file [here](https://raw.githubusercontent.com/makeplane/plane/master/deploy/coolify/coolify-docker-compose.yml) or download it using the command below
```
curl -fsSL https://raw.githubusercontent.com/makeplane/plane/master/deploy/coolify/coolify-docker-compose.yml
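# Note: -fsSL streams the file to stdout; add -o coolify-docker-compose.yml (or -O) to save it locally.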
```

Some files were not shown because too many files have changed in this diff Show More