Compare commits

...

255 Commits

Author SHA1 Message Date
JayashTripathy
87de831559 feat: add Link icon to issue link item and update favicon handling 2025-05-28 19:37:03 +05:30
JayashTripathy
8b7c061d15 Merge remote-tracking branch 'origin/fix-project-joining-date' into chore-link-metadata 2025-05-28 19:21:00 +05:30
sangeethailango
9c83fbfc58 fix: set created_at as read_only_fields 2025-05-28 16:42:24 +05:30
gakshita
0f82be1bdd fix: added project's joining date 2025-05-27 19:27:36 +05:30
sangeethailango
64165695bb fix: return project joining date 2025-05-27 18:59:36 +05:30
JayashTripathy
83128c24a9 chore: added favicon and title of links 2025-05-26 20:31:35 +05:30
dependabot[bot]
04c7c53e09 chore(deps): bump requests (#7120)
Bumps the pip group with 1 update in the /apiserver/requirements directory: [requests](https://github.com/psf/requests).


Updates `requests` from 2.31.0 to 2.32.2
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.31.0...v2.32.2)

---
updated-dependencies:
- dependency-name: requests
  dependency-version: 2.32.2
  dependency-type: direct:production
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-05-26 19:45:15 +05:30
sangeethailango
fa63779fb9 fix: remove print statementsg 2025-05-26 17:04:04 +05:30
sangeethailango
aea4c320f2 fix: Handle None 2025-05-26 16:58:58 +05:30
sangeethailango
e7bbedf5d3 chore: type hints 2025-05-26 16:56:55 +05:30
sangeethailango
4ebbe00001 refactor: call find_favicon_url inside fetch_and_encode_favicon function 2025-05-26 16:37:41 +05:30
sangeethailango
cadaf86542 fix: handle exception by returning None 2025-05-26 16:15:49 +05:30
sangeethailango
c75e92c79c fix: remove json.dumps 2025-05-26 16:09:08 +05:30
sangeethailango
4e6958f186 fix: add validation for accessing IP ranges 2025-05-26 16:01:09 +05:30
sangeethailango
9d097d77f7 fix: return meta_data in the response 2025-05-26 15:44:44 +05:30
Dheeraj Kumar Ketireddy
78cc32765b [WEB-3707] pytest based test suite for apiserver (#7010)
* pytest bases tests for apiserver

* Trimmed spaces

* Updated .gitignore for pytest local files
2025-05-26 15:26:26 +05:30
JayashTripathy
4e485d6402 [WEB-4160] fix: close the context menu after select #7113 2025-05-26 15:24:13 +05:30
JayashTripathy
5a208cb1b9 [WEB-2403] fix: alignment of project states in collapsed view #7114 2025-05-26 15:23:39 +05:30
JayashTripathy
0eafbb698a [WEB-3494] fix: size of created at value #7112 2025-05-26 15:22:16 +05:30
sriram veeraghanta
193ae9bfc8 fix: yarn lock file 2025-05-26 14:58:26 +05:30
Vamsi Krishna
7cb5a9120a [WEB-4173]fix: fixed layout overflow issue #7119 2025-05-26 14:28:56 +05:30
sriram veeraghanta
efbccead12 feat: added a python bg task to crawl work item links for title and description 2025-05-25 20:57:48 +05:30
Vamsi Krishna
84fc81dd98 [WEB-4118]fix: adjusted sub work item properties for a better visibility (#7079)
* fix: adjusted sub work item properties for a better visibility

* fix: removed projects from sub work item filters
2025-05-23 16:14:35 +05:30
JayashTripathy
2d0c0c7f8a [WEB-4115] fix: update issue count status query to handle null values #7080 2025-05-23 16:13:48 +05:30
JayashTripathy
5c9bdb1cea [WEB-4133] fix: analytics release bugs (#7086)
* fix: header text of insight table search

* fix: made the active project list scrollable

* chore: added xAxis label to table header

* chore: removed the intake issues

* fix: made the headerText necessary

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2025-05-23 16:13:09 +05:30
Aaron Heckmann
f8ca1e46b1 [WEB-4098] feat: noindex/nofollow (#7088)
* feat: noindex/nofollow

- On login: nofollow
- On app pages: noindex, nofollow

https://app.plane.so/plane/browse/WEB-4098/

- https://nextjs.org/docs/app/api-reference/file-conventions/layout
- https://nextjs.org/docs/app/building-your-application/routing/route-groups#creating-multiple-root-layouts
- https://nextjs.org/docs/app/api-reference/functions/generate-metadata#link-relpreload

* chore: address PR feedback
2025-05-23 16:12:04 +05:30
Vamsi Krishna
a3b9152a9b [WEB-4123]feat: language support for sub-work item empty states #7092 2025-05-23 15:36:47 +05:30
Aaryan Khandelwal
5223bd01e8 [WEB-4153] chore: extend custom font family in tailwind config (#7093)
* chore: remove unwanted font family

* chore: add font family to extend object
2025-05-23 15:35:47 +05:30
Aaryan Khandelwal
6eb0b5ddb0 [WEB-4137] chore: restrict SVG file selection (#7095)
* chore: update accepted file mime types

* chore: update accepted file mime types
2025-05-23 15:33:56 +05:30
Anmol Singh Bhatia
cd200169b6 [WEB-4107] chore: redirect user to the newly created project view after creation #7098 2025-05-23 15:32:41 +05:30
Nikhil
037bb88b53 [WEB-4144] fix: api logger to handle content decode errors #7099 2025-05-23 15:31:40 +05:30
Bavisetti Narayan
643390e723 [WEB-4145] chore: added validation for project deletion #7101 2025-05-23 15:30:42 +05:30
Aaryan Khandelwal
731c4e8fcd [WEB-4161] fix: eslint config for library config file #7103 2025-05-23 15:29:37 +05:30
Prateek Shourya
6216ad77f4 [WEB-4146] fix: AI environment variables configuration in GodMode (#7104)
* [WEB-4146] fix: artificial intelligence environment variables configuration

* chore: update llm configuration keys
2025-05-23 15:06:58 +05:30
Bavisetti Narayan
9812129ad3 [WEB-4133] chore: optimised the analytics endpoints (#7105)
* chore: optimised the analytics endpoints

* chore: segregated peek view endpoints

* chore: added analytics values validation

* chore: added project validation

* chore: reverted the changes

---------

Co-authored-by: JayashTripathy <jayashtripathy371@gmail.com>
2025-05-23 15:05:57 +05:30
JayashTripathy
5226b17f90 [WEB-4159] feat: add 'restricted_entity' translation key across multiple languages #7106 2025-05-23 15:05:37 +05:30
Vamsi Krishna
b376e5300a [WEB-3155]fix: email notification comments overflow #7110 2025-05-23 15:04:50 +05:30
Prateek Shourya
4460529b37 [WEB-4154] fix: dropdown container classname (#7085)
* fix: dropdown container classname

* improvement: update string utils for joinWithConjunction

* improvement: add more string utils
2025-05-23 13:53:16 +05:30
Nikhil
0a8cc24da5 chore: add validation fields in users (#7102)
* chore: add validation fields in users

* chore: make is email valid default value False
2025-05-21 20:34:52 +05:30
Sangeetha
2f4aa843fc [WEB-4122] fix: estimate in project export #7091 2025-05-20 12:56:30 +05:30
sriram veeraghanta
cfac8ce350 fix: ruff file formatting based on config file pyproject (#7082) 2025-05-19 17:34:46 +05:30
sriram veeraghanta
75a11ba31a fix: polynomial regular expression used on uncontrolled data (#7083)
* fix: polynomial regular expression used on uncontrolled data

* fix: optimize the function to handle both operations
2025-05-19 17:14:26 +05:30
sriram veeraghanta
1fc3709731 chore: Strict Null Check in Admin app (#7081)
* chore: upgrade to latest version of turbo repo

* fix: tsconfig changes

* chore: adding format script to package json

* fix: formatting of files
2025-05-19 16:25:46 +05:30
Akshita Goyal
7e21618762 [WEB-3461] fix: profile activity rendering issue (#7059)
* fix: profile activity

* fix: icon

* fix: handled conversion case

* fix: handled conversion case
2025-05-19 15:20:57 +05:30
Aaryan Khandelwal
2d475491e9 [WEB-4117] refactor: work item widgets code split (#7078)
* refactor: work item widget code split

* fix: types
2025-05-19 15:20:40 +05:30
Aaryan Khandelwal
2a2feaf88e [WIKI-181] chore: editor extension storage utility code split (#7071)
* chore: storage extension code split

* chore: use storage extension utility
2025-05-19 13:12:52 +05:30
Anmol Singh Bhatia
e48b2da623 [WEB-4056] fix: archived work item validation #7060 2025-05-18 15:28:47 +05:30
Anmol Singh Bhatia
9c9952a823 [WEB-3866] fix: work item attachment activity #7062 2025-05-18 15:28:00 +05:30
Akshita Goyal
906ce8b500 [WEB-4104] fix: project loading state #7065 2025-05-18 15:19:05 +05:30
Anmol Singh Bhatia
6c483fad2f [WEB-4041] chore: modal outside click behaviour #7072 2025-05-18 15:18:09 +05:30
Bavisetti Narayan
5b776392bd chore: revamped the analytics for cycle and module in peek view. (#7075)
* chore: added cycles and modules in analytics peek view

* chore: added cycles and modules analytics

* chore: added project filter for work items

* chore: added a peekview flag and based on that table columns

* chore: added peek view

* chore: added check for display name

* chore: cleaned up some code

* chore: fixed export csv data

* chore: added distinct work items

* chore: assignee in peek view

* updated csv fields

* chore: updated workitems peek with assignee

* fix: removed type assersions for workspaceslug

* chore: added day wise filter in cycles and modules

* chore: added extra validations

---------

Co-authored-by: JayashTripathy <jayashtripathy371@gmail.com>
2025-05-17 17:11:26 +05:30
Aaryan Khandelwal
ba158d5d6e [WEB-4109] chore: remove analytics duration filter (#7073)
* chore: remove analytics duration filter

* removed subtitle from title and date_filter from service call

* chore: removed the date filter

* bottom text of insight trend card

* chore: changed issue manager

* fix: limited items in table

* fix: removed unnecessary props from data-table

---------

Co-authored-by: JayashTripathy <jayashtripathy371@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2025-05-16 19:16:30 +05:30
JayashTripathy
084cc75726 [WEB-4092] fix:broken detailed empty state layout #7056 2025-05-14 18:01:36 +05:30
Nikhil
534f5c7dd0 [WEB-4088] fix: issue exports when cycles are not present (#7057)
* fix: issue exports when cycles are not present

* fix: type check
2025-05-14 18:00:49 +05:30
Manish Gupta
080cf70e3f refactor: Enhance backup and restore scripts for container data (#7055)
* refactor: enhance backup and restore scripts for container data management

* fix: ensure proper quoting in backup script to handle paths with spaces

* fix: ensure backup directory is only removed if tar command succeeds

* CodeRabbit fixes
2025-05-14 12:33:53 +05:30
Manish Gupta
4c3f7f27a5 fix: update API service startup check to use HTTP request instead of logs (#7054) 2025-05-14 10:02:21 +05:30
sriram veeraghanta
803f6cc62a chore: yarn lock file updates 2025-05-13 16:20:08 +05:30
Vamsi Krishna
3a6d0c11fb fix: set accordion to expand by default (#7053) 2025-05-13 16:18:13 +05:30
JayashTripathy
75d81f9e95 [WEB-3781] Analytics page enhancements (#7005)
* chore: analytics endpoint

* added anlytics v2

* updated status icons

* added area chart in workitems and en translations

* active projects

* chore: created analytics chart

* chore: validation errors

* improved radar-chart , added empty states , added projects summary

* chore: added a new graph in advance analytics

* integrated priority chart

* chore: added csv exporter

* added priority dropdown

* integrated created vs resolved chart

* custom x and y axis label in bar and area chart

* added wrapper styles to legends

* added filter components

* fixed temp data imports

* integrated filters in priority charts

* added label to priority chart and updated duration filter

* refactor

* reverted to void onchange

* fixed some contant exports

* fixed type issues

* fixed some type and build issues

* chore: updated the filtering logic for analytics

* updated default value to last_30_days

* percentage value whole number and added some rules for axis options

* fixed some translations

* added - custom tick for radar, calc of insight cards, filter labels

* chore: opitmised the analytics endpoint

* replace old analytics path with new , updated labels of insight card, done some store fixes

* chore: updated the export request

* Enhanced ProjectSelect to support multi-select, improved state management, and optimized data fetching and component structure.

* fix: round completion percentage calculation in ActiveProjectItem

* added empty states in project insights

* Added loader and empty state in created/resolved chart

* added loaders

* added icons in filters

* added custom colors in customised charts

* cleaned up some code

* added some responsiveness

* updated translations

* updated serrchbar for the table

* added work item modal in project analytics

* fixed some of the layput issues in the peek view

* chore: updated the base function for viewsets

* synced tab to url

* code cleanup

* chore: updated the export logic

* fixed project_ids filter

* added icon in projectdropdown

* updated export button position

* export csv and emptystates icons

* refactor

* code refactor

* updated loaders, moved color pallete to contants, added nullish collasece operator in neccessary places

* removed uneccessary cn

* fixed formatting issues

* fixed empty project_ids in payload

* improved null checks

* optimized charts

* modified relevant variables to observable.ref

* fixed the duration type

* optimized some code

* updated query key in project-insight

* updated query key in project-insight

* updated formatting

* chore: replaced analytics route with new one and done some optimizations

* removed the old analytics

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2025-05-12 20:50:33 +05:30
Aaryan Khandelwal
0d5c7c6653 [WEB-4051] regression: update font size of comment editor #7048 2025-05-12 19:47:44 +05:30
Anmol Singh Bhatia
079c3a3a99 [WEB-3978] chore: cmd k search result redirection improvements (#7012)
* fix: work item tab highlight

* chore: projectListOpen state and toggle method added to command palette store

* chore: openProjectAndScrollToSidebar helper function and highlight keyframes added

* chore: SidebarProjectsListItem updated

* chore: openProjectAndScrollToSidebar implementation

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor
2025-05-12 19:15:39 +05:30
Sangeetha
5f8d5ea388 [WEB-4054] chore: search-issues endpoint code refactoring (#7029)
* chore: moved some code to seperate function

* fix: function name typo
2025-05-12 19:14:10 +05:30
Anmol Singh Bhatia
8613a80b16 [WEB-3523] feat: start of week preference (#7033)
* chore: startOfWeek constant and types updated

* chore: startOfWeek updated in profile store

* chore: StartOfWeekPreference added to profile appearance settings

* chore: calendar layout startOfWeek implementation

* chore: date picker startOfWeek implementation

* chore: gantt layout startOfWeek implementation

* chore: code refactor

* chore: code refactor

* chore: code refactor
2025-05-12 19:13:39 +05:30
Aaryan Khandelwal
dc16f2862e [WIKI-181] refactor: make file handling generic in editor (#7046)
* refactor: make file handling generic

* fix: useeffect dependency array

* chore: remove mime type to extension conversion
2025-05-12 18:37:36 +05:30
Vamsi Krishna
e68d344410 [WEB-4074]fix: removed sub-work item filters at nested levels #7047 2025-05-12 18:21:05 +05:30
Aaron Heckmann
26c8cba322 [WEB-4008] fix: handle when settings are None #7016
https://app.plane.so/plane/browse/WEB-4008/
2025-05-12 13:16:30 +05:30
Bavisetti Narayan
b435ceedfc [WEB-3782] chore: analytics endpoints (#6973)
* chore: analytics endpoint

* chore: created analytics chart

* chore: validation errors

* chore: added a new graph in advance analytics

* chore: added csv exporter

* chore: updated the filtering logic for analytics

* chore: opitmised the analytics endpoint

* chore: updated the base function for viewsets

* chore: updated the export logic

* chore: added type hints

* chore: added type hints
2025-05-12 13:15:17 +05:30
Sangeetha
13c46e0fdf [WEB-3987] chore: project export funtionality enhancement (#7002)
* chore: comment details of work item

* chore: attachment count and attachment name

* chore: issue link and subscriber count

* chore: list of assignees

* chore: asset_url as attachment_links

* chore: code refactor

* fix: cannot export Excel

* chore: remove print statements

* fix: filtering in list

* chore: optimize attachment_count and attachment_link query

* chore: optimize fetching issue details for multiple select

* chore: use Prefetch to avoid duplicates
2025-05-09 21:09:13 +05:30
sriram veeraghanta
02bccb44d6 chore: adding robots txt file for not indexing the server 2025-05-09 21:07:24 +05:30
Surya Prashanth
b5634f5fa1 chore: add disable_auto_set_user flag on base model save method (#7041)
- when disable_auto_set_user flag is set, user fields like created_by
are derived from payload instead of crum
2025-05-09 21:05:05 +05:30
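The commit body above describes the flag only in prose. As a rough illustration of the pattern it names, a base-model `save()` override can gate the django-crum lookup behind such a flag; this is a hedged sketch under assumptions — the model name `AuditModelSketch`, the field names, and the keyword-argument shape are invented for illustration and are not Plane's actual base model.

```python
# Illustrative sketch only; names and signature are assumptions, not Plane's code.
from crum import get_current_user  # django-crum: user attached to the current request
from django.db import models


class AuditModelSketch(models.Model):
    """Base model that auto-fills audit fields unless explicitly told not to."""

    created_by = models.ForeignKey(
        "auth.User", null=True, blank=True, related_name="+", on_delete=models.SET_NULL
    )
    updated_by = models.ForeignKey(
        "auth.User", null=True, blank=True, related_name="+", on_delete=models.SET_NULL
    )

    class Meta:
        abstract = True

    def save(self, *args, disable_auto_set_user=False, **kwargs):
        if not disable_auto_set_user:
            # Default behaviour: derive the user from the request via django-crum.
            user = get_current_user()
            if user and user.is_authenticated:
                if self._state.adding and self.created_by_id is None:
                    self.created_by = user
                self.updated_by = user
        # When the flag is set, the created_by/updated_by values supplied in the
        # payload are left untouched.
        return super().save(*args, **kwargs)
```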
Aaryan Khandelwal
64aae0a2ac [WEB-4051] fix: comment editor list items font size #7034 2025-05-09 18:49:43 +05:30
Henit Chobisa
a263bfc01f chore: added external id and source to page model (#7040)
* chore: added external id and source to page model

* chore: added migration

* fix: added blank field
2025-05-09 17:23:49 +05:30
Anmol Singh Bhatia
50082f0843 [WEB-4002] fix: sidebar tab highlight (#7011)
* fix: work item tab highlight

* chore: code refactor

* chore: code refactor

* chore: code refactor
2025-05-09 16:53:51 +05:30
Prateek Shourya
30db59534d [WEB-3985] feat: common postcss config and local fonts across all plane applications (#6998)
* [WEB-3985] feat: common postcss config and local fonts across all plane applications

* improvement: split fonts into a separate exports
2025-05-09 14:26:29 +05:30
Vamsi Krishna
e401c9d6e4 [WEB-4028] feat: sub work item filters and grouping (#6997)
* feat: added filters for sub issues

* feat: added list groups for sub issues

* chore: updated order for sub work item properties

* feat: filters for sub work items

* feat: added filtering and ordering at frontend

* chore: reverted backend filters

* feat: added empty states

* chore: code improvemnt

---------

Co-authored-by: sangeethailango <sangeethailango21@gmail.com>
2025-05-09 14:24:06 +05:30
Bavisetti Narayan
39b5736c83 [WEB-4057] chore: updated the logger for bgtasks #7025 2025-05-09 14:23:23 +05:30
Vamsi Krishna
2785419d12 [WEB-4052]fix: sub work item copy link (#7036)
* fix: sub work item copy link

* fix: copy url to clipboard
2025-05-09 14:22:34 +05:30
sriram veeraghanta
ac5b974d67 chore: Upgrade Django version to 4.2.21 2025-05-08 21:29:26 +05:30
Anmol Singh Bhatia
14ebaf0799 [WEB-3942] chore: intake url pattern (#7006)
* chore: intake url pattern updated

* chore: code refactor

* chore: removed unused components

---------

Co-authored-by: vamsikrishnamathala <matalav55@gmail.com>
2025-05-07 21:19:24 +05:30
Sangeetha
7cdb622663 [WEB-3930] chore: change source in-app to IN_APP #7008 2025-05-07 18:46:10 +05:30
JayashTripathy
855e4a3218 [WEB-4016] updated project and workitem form (#7019)
* updated project and workitem form

* added translation for other languages also

* Update packages/i18n/src/locales/zh-CN/translations.json

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2025-05-07 18:44:04 +05:30
Anmol Singh Bhatia
d456767492 [WEB-3955] chore: work item parent select modal params #7021 2025-05-07 18:41:28 +05:30
Bavisetti Narayan
6faff1d556 [WEB-3877] fix: changed logic to calculate cycle duration (#7024)
* chore: cycle running days

* chore: removed the module filter
2025-05-07 18:40:37 +05:30
Aaryan Khandelwal
bc2936dcd3 [WEB-3906] fix: page table of content overlap with the page content #7018 2025-05-07 00:51:51 +05:30
Aaryan Khandelwal
d366ac1581 [WEB-2508] fix: page favorite item title mutation (#7020)
* fix: remove page favorite item title fallback value

* refactor: use nullish coalescing operator
2025-05-07 00:28:43 +05:30
Nikhil
0a01e0eb41 [WEB-4013] chore: correct live url #7014 2025-05-06 01:21:53 +05:30
Nikhil
b4cc2d83fe [WEB-4014] fix: check access when duplicating pages #7015 2025-05-06 01:20:33 +05:30
Nikhil
42e2b787f0 [WEB-4013]chore: publish login and standardize urls in common settings (#7013)
* chore: handling base path and urls

* chore: uniformize urls in common settings

* correct live url

* chore: use url join to correctly join urls

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2025-05-05 18:58:24 +05:30
Anmol Singh Bhatia
fbca9d9a7a [WEB-3996] fix: attachment icon rendering and added support for rar and zip icons (#7007)
* chore: zip and rar file icon

* chore: zip and rar file icon

* fix: attachment icon

* chore: application/x-rar type added

* fix: compressed file extensions

* chore: updated file upload extensions

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2025-05-02 16:53:06 +05:30
Sangeetha
dbc00e4add [WEB-3992] chore: support for x-zip-compressed type #7001 2025-05-01 19:22:00 +05:30
Aaron Heckmann
28f9733d1b [WEB-3991] chore: local dev improvements (#6991)
* chore: local dev improvements

* chore: pr feedback

* chore: fix setup

* fix: env variables updated in .env.example files

* fix(local): sign in to admin and web

* chore: update minio deployment to create an bucket automatically on startup.

* chore: resolve merge conflict

* chore: updated api env with live base path

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2025-04-30 21:46:59 +05:30
Sangeetha
1e46290727 [WEB-3958] chore: allow members and admins to create api tokens (#6979)
* chore: allow members and admins to create api tokens

* chore: change permission for service api token
2025-04-30 19:51:04 +05:30
Anmol Singh Bhatia
5a1df8b496 [WEB-3560] chore: work item modal code refactor #6996 2025-04-30 14:56:38 +05:30
Anmol Singh Bhatia
f23a2f0780 [WEB-3973] chore: space app state icon size #6995 2025-04-29 20:13:55 +05:30
sriram veeraghanta
d10bb0b638 chore: yarn lock updates 2025-04-29 15:49:14 +05:30
sriram veeraghanta
c4ddff5419 chore: nextjs dependencies upgrade 2025-04-29 15:48:52 +05:30
sriram veeraghanta
10f5b4e9b8 fix: turbo repo upgrade 2025-04-29 15:34:12 +05:30
sriram veeraghanta
cdca5a4126 chore: build fixes 2025-04-29 15:33:03 +05:30
Vamsi Krishna
14dc6a56bc [WEB-3838]feat:sub work items sorting (#6967)
* refactor: sub-work items components, hooks and types

* feat: added orderby and display properties toggle for sub work items

* fix: build errors

* chore: removed issue type from filters

* chore: added null check

* fix: added null check

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2025-04-29 15:23:10 +05:30
Sangeetha
55340f9f48 [WEB-3957] chore: IntakeIssues with iexact 'in-app' changed to 'IN_APP' (#6977)
* migration: data with iexact 'in-app' changed to 'IN_APP'

* chore: add start_of_week field in profile

* chore: define variables for choices

* chore: merge migration files
2025-04-29 15:22:42 +05:30
Prateek Shourya
efa64fc4b8 [WEB-3968] improvement: added few missing translation keys #6993 2025-04-29 15:14:31 +05:30
Anmol Singh Bhatia
f5449c8f93 [WEB-3751] chore: work item state icon improvement (#6960)
* chore: return order based on group

* chore: order for workspace stats endpoint

* chore: state response updated

* chore: state icon types updated

* chore: state icon updated

* chore: state settings new icon implementation

* chore: icon implementation

* chore: code refactor

* chore: code refactor

* chore: code refactor

* fix: order field type

---------

Co-authored-by: sangeethailango <sangeethailango21@gmail.com>
2025-04-29 14:33:53 +05:30
Bavisetti Narayan
baabb82669 [WEB-3926] chore: removed the duplicated webhook task and updated the webhook task to handle exceptions correctly (#6951)
* chore: removed the duplicated webhook function

* chore: update webhook send task to handle errors

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2025-04-29 14:04:00 +05:30
Nikhil
298e3dc9ca [WEB-3945] chore: update workspace onboarding to add default project (#6964)
* chore: add json files and initial job to push data to workspace

* chore: update seed data location

* chore: update seed data to use assets from static urls

* chore: update seed data to use updated labels

* chore: add logging and update label name

* chore: add created_by for project member

* chore: add created_by_id for issue user property

* chore: add workspace seed task logs

* chore: update log message to return task name

* chore: add warning log for workspace seed task

* chore: add validation for issue seed data
2025-04-29 14:01:22 +05:30
Bavisetti Narayan
190300bc6c [WEB-3877] chore: changed the logic to end cycle (#6971)
* chore: changed the logic to end cycle

* chore: added issue deleted filter

* chore: added check for progress snapshot
2025-04-29 14:00:54 +05:30
Dheeraj Kumar Ketireddy
550fe547e2 [WEB-3967] feat: Optimized module patch endpoint to reduce duplicate db calls (#6983) 2025-04-29 13:51:46 +05:30
Akshat Jain
f278a284c4 chore: comment out APP_RELEASE variable update in build-branch workflow (#6989) 2025-04-28 17:45:44 +05:30
sriram veeraghanta
2bcf6c76cd chore: remove dockerhub user varible from compose 2025-04-28 16:28:48 +05:30
Akshat Jain
fb3e022042 [INFRA-134] fix: Setup and Swarm scripts for DOCKERHUB_USERNAME #6988 2025-04-28 14:37:51 +05:30
Akshat Jain
e3fbb7b073 [INFRA-134]: Update Docker images to use new artifact repository path #6978 2025-04-25 18:09:43 +05:30
Anmol Singh Bhatia
cce6dd581c [WEB-3948] chore: recent work item improvement (#6976)
* chore: issue entity data type updated

* chore: HomePeekOverviewsRoot component added

* chore: recent work item improvement and code refactor
2025-04-25 15:08:10 +05:30
Akshita Goyal
d86ac368a4 [WEB-3863] fix: handled error handling for link editing #6968 2025-04-25 14:31:35 +05:30
Akshita Goyal
101994840a [WEB-3944] fix: Error Toast message content update while uploading images (#6969)
* fix: handled svg uploads

* chore: proper error message with all allowed types

---------

Co-authored-by: sangeethailango <sangeethailango21@gmail.com>
2025-04-25 14:30:12 +05:30
Anmol Singh Bhatia
f60f57ef11 [WEB-3494] chore: platform ux copy changes (#6970)
* chore: project quick action option ux copy updated

* chore: project tour copy updated
2025-04-25 14:29:09 +05:30
Prateek Shourya
546217f09b [WEB-3953] fix: issue description assets upload when project id is switched (#6975) 2025-04-25 14:27:40 +05:30
sriram veeraghanta
6df8323665 fix: add gzip upload support 2025-04-24 17:50:37 +05:30
Sangeetha
77d022df71 [WEB-3919] chore: support .sql file attachment #6966 2025-04-24 17:39:16 +05:30
M. Palanikannan
797f150ec4 [WIKI-331] fix: editor ref issues while locking/unlocking page #6965 2025-04-24 17:38:41 +05:30
sriram veeraghanta
b54f54999e chore: bump up the package version 2025-04-24 17:37:50 +05:30
Sangeetha
dff176be8f [WEB-3930] chore: set IN_APP as default source value for intake issue (#6963)
* chore: chore: only allow intake issues with source IN_APP to be created

* chore: set IN_APP as default intake issue
2025-04-24 16:25:15 +05:30
Sangeetha
2bbaaed3ea [WEB-3918] fix: api tokens is_active (#6941)
* fix: is_active always returning true
chore: formate expired_at to iso date

* Display exact expiration timestamp for API tokens

* chore: remove conversion to iso

* chore: remove unwanted imports

* fix: added timestamp for api token expiry

* fix: handle none value in expired_at

* fix: fix: handle none value in expired_at

* chore: add type hints

* fix: refactor

---------

Co-authored-by: Alaaeddine bousselmi <alaaeddine.bousselmi@medtech.tn>
Co-authored-by: gakshita <akshitagoyal1516@gmail.com>
Co-authored-by: Akshita Goyal <36129505+gakshita@users.noreply.github.com>
2025-04-24 01:28:29 +05:30
Prateek Shourya
b5ceb94fb2 [WEB-3930] fix: application crash on accessing intake work items (#6958) 2025-04-23 15:12:54 +05:30
alaabousselmi
feb6243065 docs: document minimum RAM requirement and issue naming conventions (#6954) 2025-04-22 18:00:19 +05:30
Anmol Singh Bhatia
5dacba74c9 [WEB-3923] fix: applied filters list #6957 2025-04-22 17:58:16 +05:30
bIaO
0efb0c239c feat: improve setup.sh script with better error handling and user feedback (#6758) 2025-04-22 17:56:34 +05:30
Vamsi Krishna
c8be836d6c [WEB-3920]fix: estimate activity #6950 2025-04-22 17:45:15 +05:30
Nikhil
833b82e247 [WEB-3927] chore: add logging to support json logging (#6955)
* chore: update logging to json based logging

* chore: add logging to file
2025-04-22 17:41:58 +05:30
Akshita Goyal
280aa7f671 [WEB-3399] fix: progress data for cycle list item #6956 2025-04-22 17:41:06 +05:30
Aaryan Khandelwal
eac1115566 [WIKI-320] refactor: page header actions (#6946)
* refactor: page header actions

* chore: update toolbar component

* chore: update archived and lock badge colors

* chore: added observer to favorite control
2025-04-17 20:52:33 +05:30
sriram veeraghanta
8166a757a7 fix: removed @plane alias from ui package 2025-04-17 20:51:52 +05:30
Anmol Singh Bhatia
be5d77d978 [WEB-3892] chore: link item improvements (#6944)
* chore: code refactor

* chore: global link block component added

* chore: link item improvement and code refactor
2025-04-17 20:08:53 +05:30
Anmol Singh Bhatia
18fb3b8450 [WEB-3904] fix: sub work item fetching #6945 2025-04-17 20:07:13 +05:30
sriram veeraghanta
ef5616905e chore: upgrade turbo repo version 2025-04-17 17:51:59 +05:30
Sangeetha
aeb41e603c [WEB-3826] feat: estimate activitites #6937 2025-04-17 17:16:57 +05:30
Vamsi Krishna
55eea1a8b7 [WEB-3872]chore: header switcher enhancements (#6935)
* * chore: alignment and size for header
* fix: switcher close on click

* chore: moved acces icon component to components
2025-04-17 17:15:53 +05:30
Aaryan Khandelwal
fa87ff14b7 [WIKI-319] chore: remove bottom border when toolbar is hidden (#6943)
* chore: remove border when toolbar is hidden

* chore: add stricter conditions
2025-04-17 17:13:21 +05:30
khalilzitouni2058
7d91b5f8df [WEB-3892] feat: add icon to Quicklinks (#6927)
* [feature]: add icon to Quicklinks

* fix: moving  getIconForLink to utils packages
2025-04-17 17:11:57 +05:30
Anmol Singh Bhatia
3ce40dfa2f [WIKI-316] fix: list item overflow #6942 2025-04-17 17:08:13 +05:30
Anmol Singh Bhatia
f65253c994 [WEB-2561] chore: favicon icon updated #6938 (#6940)
* chore: favicon icon updated

* chore: code refactor
2025-04-17 15:38:42 +05:30
Anmol Singh Bhatia
97fcfaa653 [WEB-2561] chore: favicon icon updated #6938 2025-04-16 20:34:12 +05:30
Anmol Singh Bhatia
0e1ebff978 [WEB-3871] fix: sidebar label property #6934 2025-04-15 19:42:02 +05:30
Anmol Singh Bhatia
642dabfe35 [WEB-3870] fix: sidebar comment scroll #6932 2025-04-15 17:47:22 +05:30
Aaryan Khandelwal
48557cb670 [WEB-3868] fix: issue detail widget modals #6933 2025-04-15 17:46:45 +05:30
Bavisetti Narayan
608da1465c [WEB-3860] chore: added deleted filter in the grouper (#6931)
* chore: added deleted filter in the grouper

* chore: added type hints for the function
2025-04-15 17:42:45 +05:30
Anmol Singh Bhatia
dbcc7bedb4 [WEB-3855] feat: Turkish language support (#6922)
* add Turkish language support (#6874)

* add turkish language support

* fix indentation

* chore: extended core translation added

* chore: code refactor

---------

Co-authored-by: Farahat Abdrabouh <88924701+fasdjkherig@users.noreply.github.com>
2025-04-15 16:36:02 +05:30
Vamsi Krishna
c401b26dd4 [WEB-3856]chore: refactor work item activity (#6923)
* chore: refactor work item activity

* chore: added estimate render for notifications
2025-04-15 16:35:28 +05:30
Aaryan Khandelwal
a4bca0c39c [WEB-3859] fix: work item links #6930 2025-04-15 13:46:29 +05:30
Saurabh Kumar
24899887b2 chore: Add workspace slug to should render setting link method (#6886)
* add workspace slug to setting link function

* add params in the function
2025-04-14 17:41:47 +05:30
sriram veeraghanta
c6953ff878 fix: db modeling changes in pages 2025-04-12 16:22:13 +05:30
Prateek Shourya
06be9ab81b [WEB-3854] feat: billing and plans new design (#6920)
* [WEB-3854] feat: billing and plans new design

* chore: add missing styles
2025-04-11 20:37:25 +05:30
Akshita Goyal
ed8d00acb1 [WEB-3849] chore: added intake source in the list (#6919)
* chore: added intake source in the list

* fix: refactor
2025-04-11 19:49:35 +05:30
Aaryan Khandelwal
915e374485 [WIKI-307]chore: update page icon placement #6916 2025-04-11 18:07:03 +05:30
Vamsi Krishna
1d5b93cebd [WEB-3853] fix: untitled page name issue #6918 2025-04-11 18:06:26 +05:30
sriram veeraghanta
df65b8c34a fix: adding request logger middleware 2025-04-11 17:59:19 +05:30
Akshita Goyal
4c688b1d25 [WEB-3529] fix: fixed the comment create box position in common comments component (#6915) 2025-04-11 14:00:54 +05:30
Nikhil
bfc6ed839f fix: uuid validation, status and webhook errors (#6896)
* fix: uuid validation and function parameter handling for external apis

* chore: update status 410 Gone to 409 conflicts

* chore: add webhook trigger for issue created through apis

* chore: remove pks from post

* chore: remove issue id from module post
2025-04-11 01:47:00 +05:30
Surya Prashanth
b68396a4b2 [WEB-3831] chore: add validation for project_id in cycle serializer #6908 2025-04-11 01:42:53 +05:30
Vamsi Krishna
b4fc715aba [WEB-3826] fix: estimate dropdown formatting (#6906)
* * fix: time conversion for estimate dropdown in browse
* chore: updated puncutations for estimates.

* chore: estimate activiy formatting

* chore: estimate activity refactor
2025-04-11 01:41:43 +05:30
Anmol Singh Bhatia
33a1b916cb [WEB-3837] fix: mutation of child work item added via Cmd+K with parent context #6910 2025-04-11 01:40:29 +05:30
Akshita Goyal
2818310619 [WEB-3529] fix: comment reset + edit comment font size + comment box position (#6909)
* fix: comment reset + edit comment font size

* fix: dynamically setting the position of the comment box

* fix: refactor

* fix: nomenclature
2025-04-11 01:40:05 +05:30
Anmol Singh Bhatia
882520b3c7 [WEB-3841] fix: create issue modal now correctly uses current project context #6911 2025-04-11 01:35:27 +05:30
Aaryan Khandelwal
20132e7544 [WEB-3839] fix: peek overview description version history (#6912)
* fix: handle undefined created_at

* chore: add created_by, updated_by updated_at and created_at fields in
relation apis

* chore: handle undefined date

* fix: project typo

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2025-04-10 16:22:26 +05:30
Aaryan Khandelwal
0ae57b49d2 [WEB-3829]fix: update workspace store action (#6905) 2025-04-09 20:31:52 +05:30
M. Palanikannan
d347269afb [WEB-3819] fix: images now restore in read only mode as well (#6904) 2025-04-09 20:06:15 +05:30
Aaryan Khandelwal
a3fd616ec4 [WEB-3827] refactor: work item widget components (#6902) 2025-04-09 19:58:16 +05:30
Akshita Goyal
9eeff158d5 [WEB-3811] fix: cycle charts issues (#6901) 2025-04-09 19:57:47 +05:30
Aaryan Khandelwal
ef20b5814e [WEB-3792, 3823] fix: intake form version history (#6898)
* chore: intake form version history

* fix: remove autofocus from the copy markdown button

* chore: add logic to display deactivated user
2025-04-09 19:56:59 +05:30
Vamsi Krishna
14914e8716 [WEB-3759]chore: updated module and pages detail header (#6903)
* chore: added panel collapse and quick action menu for module detail header

* fix: updated pages header swithcer
2025-04-09 19:36:15 +05:30
Vamsi Krishna
b738e39a4a [WEB-3798]chore: updated language support to estimates (#6900) 2025-04-09 19:34:01 +05:30
Vamsi Krishna
993c7899b6 [WEB-3759] chore: header revamp for cycles, modules, pages and views (#6875)
* chore: header revamp for cycles, modules, pages and views

* chore: moved list fetch to layout level
2025-04-09 14:56:57 +05:30
Vipin Chaudhary
2b411de1e3 [WIKI-306] fix: handle editor click behavior on the last node #6879 2025-04-09 14:51:58 +05:30
Prateek Shourya
1f9222065e [WEB-3788] improvement: enhance project properties related components modularity (#6882)
* improvement: work item modal data preload and parent work item details

* improvement: collapsible button title

* improvement: project creation form and modal

* improvement: emoji helper

* improvement: enhance labels component modularity

* improvement: enable state group and state list components modularity

* improvement: project settings feature list

* improvement: common utils
2025-04-09 14:50:43 +05:30
Akshita Goyal
670134562f [WEB-3808] fix: replaced the profile charts with propel components #6892 2025-04-09 14:50:23 +05:30
Akshita Goyal
144c793e9e [WEB-3803] fix: duplicate comments issue (#6893)
* fix: duplicate comments issue

* fix: refactor
2025-04-09 14:49:54 +05:30
Anmol Singh Bhatia
0a924e4824 [WEB-3693] chore: cmd-k work item actions improvements (#6891) 2025-04-09 09:25:57 +05:30
Aaryan Khandelwal
08702a5381 [WEB-3766] fix: user avatar in description version history dropdown item (#6888)
* fix: avatar url

* chore: update version modal width
2025-04-08 18:05:14 +05:30
sriram veeraghanta
270f282c3c fix: copy url util build error 2025-04-08 15:44:07 +05:30
Aaryan Khandelwal
37699362ad [WEB-3797] fix: remove leading slash from URL to copy (#6890)
* fix: remove prefix slash if present

* chore: make use of URL class to generate a valid URL
2025-04-08 15:22:23 +05:30
Vamsi Krishna
27cec64c56 [WEB-3794]chore: set project states to expand by default #6885 2025-04-08 14:38:08 +05:30
Akshita Goyal
782b09eeaf [WEB-3711] fix: relations delete issue (#6887)
* fix: relations delete issue

* fix: removed unnecessary type casting
2025-04-08 14:37:00 +05:30
Akshita Goyal
5ac5892fe5 [WEB-3586] fix: recents dropdown in home #6889 2025-04-08 14:32:08 +05:30
Bavisetti Narayan
96c403ff0b chore: changed inbox to intake (#6884) 2025-04-08 12:46:20 +05:30
Nikhil
543552f492 [WEB-3786] fix: issue date update when converting when dates are passed as string for comparison #6880
for comparison
2025-04-07 19:08:19 +05:30
Akshita Goyal
c3cfcc1b92 [WEB-3753] fix: font size for comment box changed #6881 2025-04-07 19:06:04 +05:30
Anmol Singh Bhatia
ac84d6ecf0 [WEB-3540] chore: icon color picker enhancements #6878 2025-04-07 15:53:02 +05:30
Vamsi Krishna
475b7a8396 [WEB-3737]chore: updated translations for estimates #6871 2025-04-07 15:50:15 +05:30
Nikhil
00f78bd6a1 [WEB-3728] fix: duplicate sequence ids being generated due to race condition (#6877)
* fix: race condition which is creating duplicate sequence ids

* chore: add management command to fix duplicate sequences

* chore: update command to take a lock and optimize the script to use dict
instead of loops

* chore: update the script to use transaction
2025-04-07 15:48:43 +05:30
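The bullets above outline the fix only in prose (take a lock, use a transaction). As a hedged sketch of that general pattern — the `ProjectSequence` model and its fields are hypothetical stand-ins, not the code this commit actually changed — a per-project counter can be incremented under a row lock so concurrent requests cannot allocate the same sequence id:

```python
# Illustrative sketch only; model and field names are assumptions.
from django.db import models, transaction


class ProjectSequence(models.Model):
    """Hypothetical per-project counter row used only for this illustration."""

    project_id = models.UUIDField(unique=True)
    last_value = models.BigIntegerField(default=0)


def next_sequence_id(project_id):
    """Allocate the next sequence id for a project without racing other requests."""
    with transaction.atomic():
        # select_for_update() holds a row lock until the transaction commits, so two
        # concurrent callers cannot both read the same last_value and duplicate an id.
        counter = ProjectSequence.objects.select_for_update().get(project_id=project_id)
        counter.last_value += 1
        counter.save(update_fields=["last_value"])
        return counter.last_value
```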
Aaryan Khandelwal
34337f90c1 [WEB-3748, 3749] feat: work item description version history (#6863)
* chore: work item description versions

* chore: intake issue description

* chore: intake work item description versions

* chore: add missing translations

* chore: endpoint for intake description version

* chore: renamed key to work item

* chore: changed the paginator class

* chore: authorization added

* chore: added the enum validation

* chore: removed extra validations

* chore: added extra validations

* chore: modal position

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Bavisetti Narayan <72156168+NarayanBavisetti@users.noreply.github.com>
2025-04-04 20:09:02 +05:30
Prateek Shourya
4f68aaafa6 fix: web build (#6870) 2025-04-04 20:07:12 +05:30
Vamsi Krishna
9c10235fca [WEB-3737]chore: estimates code refactor and translations (#6857)
* * chore: refactored estimates components.
* chore: added translations for estimates components.

* fix: translation key update
2025-04-04 16:59:12 +05:30
Lorenzo Palaia
9c1b158291 feat: hide create account button on ENABLE_SIGNUP=0 (#6841) 2025-04-04 16:52:59 +05:30
Prateek Shourya
2d0a15efd6 [WEB-3762] improvement: redirect logged in user to the workspace after accepting the invitation (#6869) 2025-04-04 16:52:09 +05:30
dependabot[bot]
d62ac6269b chore(deps): bump next in the npm_and_yarn group across 1 directory (#6865)
Bumps the npm_and_yarn group with 1 update in the / directory: [next](https://github.com/vercel/next.js).


Updates `next` from 14.2.25 to 14.2.26
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/compare/v14.2.25...v14.2.26)

---
updated-dependencies:
- dependency-name: next
  dependency-version: 14.2.26
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-03 20:23:05 +05:30
Nikhil
d9e3405f5a [WEB-3700] chore: improve authentication redirections (#6836)
* chore: update redirections to be from allowed hosts

* chore: update redirection logic

* chore: add web url in settings

* chore: add next path validation

* chore: update typings

* chore: update typings

* chore: update types

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2025-04-02 23:09:27 +05:30
Vamsi Krishna
adee686ea3 [WEB-3699]fix: create link modal text flicker (#6860) 2025-04-02 23:08:18 +05:30
Dheeraj Kumar Ketireddy
81fae36c23 [WEB-3744] Append the deleted_at timestamp to workspcace slug when it's soft deleted (#6862) 2025-04-02 23:07:26 +05:30
Akshita Goyal
3f652ba44e [WEB-3746] fix: intake form css (#6864) 2025-04-02 23:06:32 +05:30
Vamsi Krishna
16aa1d7034 [WEB-3273]fix: editor bubble menu z-index #6858 2025-04-02 17:35:30 +05:30
Anmol Singh Bhatia
0db581509c [WEB-3745] fix: color picker event propagation (#6859) 2025-04-02 17:35:04 +05:30
M. Palanikannan
523ab3f4a1 [WEB-3747] regression: readonly mode with fragments (#6861) 2025-04-02 17:34:28 +05:30
M. Palanikannan
a57c37c26c [PE-304] feat: make floating link generic and use it for all editors (#6552)
* fix: make floating link generic and use it for all editors

* fix: link component behaviour with selected text fixed and storage is now typed

* chore: link view seperated

* fix: editor link edit view across multiple links resets now

* fix: link view container

* fix: cleaning up

* fix: url validation
2025-04-02 13:42:34 +05:30
Sangeetha
65a0530cfe [WEB-2804] fix: subscribed issue count (#6845) 2025-04-01 20:48:25 +05:30
Prateek Shourya
7bb291408d [WEB-3712] improvement: create draft work item logic (#6847) 2025-04-01 20:47:44 +05:30
Anmol Singh Bhatia
4be94adaca [WEB-2597] fix: handle favorite entity data causing application error (#6853) 2025-04-01 20:47:01 +05:30
sriram veeraghanta
2d1b3fb39e [WEB-3732 | WEB-3731] feat: Vietnamese and Portuguese language support #6854 2025-04-01 16:43:16 +05:30
Anmol Singh Bhatia
585432824f chore: portuguese translation updated 2025-04-01 15:33:49 +05:30
Anmol Singh Bhatia
fe9640533c chore: vietnamese translation updated 2025-04-01 15:30:21 +05:30
Trần Huy Duẫn
5ec817ba37 feat: add Vietnamese language support and translations (#6842)
- Added Vietnamese (Tiếng việt) to the list of supported languages.
- Created a new translations file for Vietnamese with comprehensive translations for various UI elements.
- Updated the TranslationStore to include the new Vietnamese language option.
2025-04-01 15:17:21 +05:30
Henrique
9279b5f1fb feat(i18n): add Brazilian Portuguese (pt-BR) translations (#6840)
Updated TranslationStore to include support for Brazilian Portuguese by importing the corresponding translations file.
Extended TLanguage type to include "pt-BR" as a valid language option.
2025-04-01 15:16:58 +05:30
Anmol Singh Bhatia
921dfe3222 [WEB-3704] chore: work item store optimization and code refactor (#6846)
* chore: work item store optimization and code refactor

* chore: code refactor
2025-03-28 18:38:44 +05:30
Anmol Singh Bhatia
8216785b27 [WEB-3704] fix: sub work item #6844 2025-03-28 17:02:17 +05:30
sriram veeraghanta
2bfe4d6a6e fix: tsup version upgrade 2025-03-28 15:52:56 +05:30
Prateek Shourya
691cbef1f2 [WEB-3701] fix: use getCycleById to ensure null handling for cycle access (#6838)
* [WEB-3701] fix: use `getCycleById` to ensure null handling for cycle access

* fix: cycle sidebar storage values
2025-03-28 15:12:40 +05:30
Prateek Shourya
fed0ef6185 [WEB-3705] improvement: clear local db on version change (#6843)
* [WEB-3705] improvement: clear local db on version change

* chore: remove console.log
2025-03-28 15:12:03 +05:30
Akshita Goyal
e8779511ad [WEB-3673] fix: password change form (#6839)
* fix: change password

* fix: added store action for change password

* fix: type

* fix: store refactor
2025-03-28 13:35:42 +05:30
Anmol Singh Bhatia
99dba80d19 [WEB-3540] dev: color picker component (#6823)
* dev: color picker component added

* chore: helper function added

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor
2025-03-27 17:48:39 +05:30
Anmol Singh Bhatia
471fefce8b [WEB-3697] chore: chart components (#6835) 2025-03-27 17:46:43 +05:30
Akshita Goyal
869c755065 [WEB-3698] fix: comments refactor (#6759)
* fix: comments refactor

* fix: add edited at

* chore: add edited_at validation at issue comment update

* fix: comment mentions

* fix: edited at

* fix: css

* fix: added bulk asset upload api

* fix: projectId prop fixed

* fix: css

* fix: refactor

* fix: translation

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2025-03-27 17:28:52 +05:30
Vamsi Krishna
a5ffbffed9 [WEB-3694] feat: added dates display in user timezone in analytics sidebar (#6834) 2025-03-27 17:18:09 +05:30
Dheeraj Kumar Ketireddy
784d651c5b fix: Removed hardcoded timezone offsets and reduced the cache to 2 hours (#6837) 2025-03-27 16:53:20 +05:30
Dancia
b19bca3b50 docs: updated links in README.md file (#6833) 2025-03-27 15:11:43 +05:30
Vamsi Krishna
1121c58ada fix: label update for date dropdown (#6832) 2025-03-27 13:52:30 +05:30
sriram veeraghanta
fb2987e9ef chore: updated gitignore 2025-03-27 12:53:29 +05:30
Aaryan Khandelwal
a25cd426a9 style: page editor width and layout updates (#6826) 2025-03-26 21:10:44 +05:30
M. Palanikannan
993713925a feat: express decorators for rest apis and websocket (#6818)
* feat: express decorators for rest apis and websocket

* fix: added package dependency

* fix: refactor decorators
2025-03-26 20:24:05 +05:30
Vamsi Krishna
ae6e5a48fa [WEB-3681]feat: added user timezone dates for cycle (#6820)
* feat: added user timezone dates for cycle

* *chore: added translations
*chore: refactored user timezone functions
2025-03-26 20:23:19 +05:30
Anmol Singh Bhatia
c125bc54ba [WEB-3686] feat: romanian and indonesian language support (#6825)
* Add ro Romanian Language locale (#6809)

* feat: add Indonesian language support (#6794)

Co-authored-by: Anmol Singh Bhatia <121005188+anmolsinghbhatia@users.noreply.github.com>

* chore: core translation added and code refactor

---------

Co-authored-by: mnbro <107358316+mnbro@users.noreply.github.com>
Co-authored-by: Rasyid Ridho <rasyid@sekeco.id>
2025-03-26 20:10:20 +05:30
Akshita Goyal
41447e566a [WEB-3600] fix: private project join issue (#6799)
* fix: private project join issue

* chore: return network value

* fix: refactor

* fix: refactor

* fix: type

* chore: added restricition for private projects

* chore: removed extra validations

* chore: added value to access enum

---------

Co-authored-by: sangeethailango <sangeethailango21@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2025-03-25 20:17:16 +05:30
Prateek Shourya
cebd0b3599 [RANTS-68] fix: z-index for image picker popover button (change project cover) (#6812) 2025-03-25 20:16:06 +05:30
Prateek Shourya
caa522118d [RANTS-29] fix: enter key does not work in the workspace member invite modal (#6816) 2025-03-25 20:15:35 +05:30
Prateek Shourya
aea5f39059 [RANTS-31] improvement: optimistic update for home widget reordering (#6817) 2025-03-25 20:15:00 +05:30
sriram veeraghanta
f29867968a chore: removed sentry instrumentation dependencies 2025-03-25 15:54:44 +05:30
sriram veeraghanta
c91972cc0a chore: removing sentry instrumentation 2025-03-25 15:45:31 +05:30
Anmol Singh Bhatia
84c7375d2a [WEB-3601] chore: content updated (#6811) 2025-03-24 19:57:13 +05:30
Anmol Singh Bhatia
5cb37a0b9c [WEB-3560] fix: table layout issue block and code refactor (#6805) 2025-03-24 19:06:36 +05:30
Anmol Singh Bhatia
c347dd7dcd [WEB-3614] chore: list layout display filters (#6801) 2025-03-24 19:05:53 +05:30
sriram veeraghanta
0ec206b75d fix: transpile packages update on space and admin apps 2025-03-24 18:55:59 +05:30
Vamsi Krishna
e8718a84fe chore: issue detail refactor (#6803) 2025-03-24 18:33:22 +05:30
Anmol Singh Bhatia
983e0fa081 [WEB-3438] fix: transfer completed cycle issue modal (#6802) 2025-03-24 18:30:31 +05:30
Aaryan Khandelwal
ef108839c4 [RANTS-57] chore: replace target date with due date in work item filters dropdown (#6806) 2025-03-24 18:24:10 +05:30
Bavisetti Narayan
fe04e5a292 [WEB-3658] fix: remove cycles and modules when issues are bulk deleted (#6807) 2025-03-24 18:23:09 +05:30
Aaryan Khandelwal
50e0cb7ffd [RANTS-75] chore: update profile sidebar icons and copy for consistency (#6808) 2025-03-24 18:21:12 +05:30
Akshita Goyal
d37d210921 [WEB-3677] fix: settings dynamic pages permissions (#6804)
* fix: settings dynamic pages permissions

* fix: refactor
2025-03-24 18:15:43 +05:30
Anmol Singh Bhatia
ab3eadf767 [WEB-3614] fix: cmd-k item focus state (#6800) 2025-03-24 18:13:49 +05:30
sriram veeraghanta
dbdf2f001a fix: transpile packages for web application 2025-03-24 13:47:00 +05:30
Prateek Shourya
0d069bf46e [RANTS-65] fix: undefined work item sequence in bulk delete work item modal (#6797) 2025-03-24 13:41:02 +05:30
Prateek Shourya
962923ff4f fix: admin build (#6798) 2025-03-24 13:40:07 +05:30
Samuel Torres
f720a9afb2 feat: validate github organization during OAuth login (#6700)
* feat: add GITHUB_ORGANIZATION_ID support for GitHub OAuth integration

* fix: remove debug print statements from InstanceConfigurationEndpoint
2025-03-24 12:55:20 +05:30
Akshita Goyal
4032aa62c5 [WEB-3551] fix: role improvements (#6763)
* Return Cycle start and end dates in project's timezone

* fix: role improvements

* chore: role updates

* chore: update role endpoint to update workspace admin permissions

* fix: conditions

* chore: update member role for workspace members

* chore: update workspace permission role

* fix: currentAdmin permissions

---------

Co-authored-by: Dheeraj Kumar Ketireddy <dheeru0198@gmail.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2025-03-24 12:52:57 +05:30
dependabot[bot]
cbe248591e chore(deps): bump next in the npm_and_yarn group across 1 directory (#6796)
Bumps the npm_and_yarn group with 1 update in the / directory: [next](https://github.com/vercel/next.js).


Updates `next` from 14.2.24 to 14.2.25
- [Release notes](https://github.com/vercel/next.js/releases)
- [Changelog](https://github.com/vercel/next.js/blob/canary/release.js)
- [Commits](https://github.com/vercel/next.js/compare/v14.2.24...v14.2.25)

---
updated-dependencies:
- dependency-name: next
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-24 12:52:16 +05:30
Dheeraj Kumar Ketireddy
75a9b71edb [WEB-3513] fix: return cycle start and end dates in project's timezone 2025-03-24 12:51:44 +05:30
dependabot[bot]
ef42ce04a4 chore(deps): bump gunicorn (#6793)
Bumps the pip group with 1 update in the /apiserver/requirements directory: [gunicorn](https://github.com/benoitc/gunicorn).


Updates `gunicorn` from 22.0.0 to 23.0.0
- [Release notes](https://github.com/benoitc/gunicorn/releases)
- [Commits](https://github.com/benoitc/gunicorn/compare/22.0.0...23.0.0)

---
updated-dependencies:
- dependency-name: gunicorn
  dependency-type: direct:production
  dependency-group: pip
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-24 12:48:05 +05:30
Vipin Chaudhary
6bafdb6dd8 [PE-298] Fix: Copy markdown to clipboard (#6675)
* fix: markdown for mentions fixed

* fix: copying text in mentions

* fix: refactored the component to use the same function

* chore: renamed funcion name

* add the new copy extension

* init working fix

* remove useless code

* improve readibility

* update node import

* better smaller logic

* remove log

* add open close end handler

* update readabliity

* handle tables

* handle triple click in cell

* triple tap select current line

* handle block and list

* lists fixed

* handle all possible cases of copy in table

* update the min elements

* handle multi types in table

* handle table seletion cases

* handle whole table handler

* feat: all case converd

* update markdown handling code

* update return statement

* handle using group block

* handle param

* handle multple cell in table

* handle using recursion

* add types

* fix code rabbit  suggestions

* fix root node bug

* update recursion with loop

* update transform copied to false

* refactor clipboard extension: remove options and integrate MarkdownClipboard into core extensions

* fix: header and code handler

* fix: store hooks fixed

* fix: mention id

---------

Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2025-03-24 12:32:11 +05:30
sriram veeraghanta
72307ec100 chore: update package version 2025-03-21 17:00:14 +05:30
sriram veeraghanta
2eb1d03c20 fix: transpile and optimize package imports 2025-03-21 01:51:50 +05:30
1021 changed files with 38303 additions and 9764 deletions


@@ -273,7 +273,7 @@ jobs:
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
sed -i 's/${APP_RELEASE:-stable}/${APP_RELEASE:-'${REL_VERSION}'}/g' deploy/selfhost/docker-compose.yml
sed -i 's/APP_RELEASE=stable/APP_RELEASE='${REL_VERSION}'/g' deploy/selfhost/variables.env
# sed -i 's/APP_RELEASE=stable/APP_RELEASE='${REL_VERSION}'/g' deploy/selfhost/variables.env
- name: Create Release
id: create_release

.gitignore

@@ -1,5 +1,6 @@
node_modules
.next
.yarn
### NextJS ###
# Dependencies
@@ -52,6 +53,8 @@ mediafiles
.env
.DS_Store
logs/
htmlcov/
.coverage
node_modules/
assets/dist/
@@ -78,10 +81,17 @@ pnpm-workspace.yaml
.npmrc
.secrets
tmp/
## packages
dist
.temp/
deploy/selfhost/plane-app/
## Storybook
*storybook.log
output.css
dev-editor
# Redis
*.rdb
*.rdb.gz

.yarnrc.yml

@@ -0,0 +1 @@
nodeLinker: node-modules


@@ -15,14 +15,33 @@ Without said minimal reproduction, we won't be able to investigate all [issues](
You can open a new issue with this [issue form](https://github.com/makeplane/plane/issues/new).
### Naming conventions for issues
When opening a new issue, please use a clear and concise title that follows this format:
- For bugs: `🐛 Bug: [short description]`
- For features: `🚀 Feature: [short description]`
- For improvements: `🛠️ Improvement: [short description]`
- For documentation: `📘 Docs: [short description]`
**Examples:**
- `🐛 Bug: API token expiry time not saving correctly`
- `📘 Docs: Clarify RAM requirement for local setup`
- `🚀 Feature: Allow custom time selection for token expiration`
This helps us triage and manage issues more efficiently.
## Projects setup and Architecture
### Requirements
- Node.js version v16.18.0
- Docker Engine installed and running
- Node.js version 20+ [LTS version](https://nodejs.org/en/about/previous-releases)
- Python version 3.8+
- Postgres version v14
- Redis version v6.2.7
- **Memory**: Minimum **12 GB RAM** recommended
> ⚠️ Running the project on a system with only 8 GB RAM may lead to setup failures or memory crashes (especially during Docker container build/start or dependency install). Use cloud environments like GitHub Codespaces or upgrade local RAM if possible.
### Setup the project
@@ -50,6 +69,17 @@ chmod +x setup.sh
docker compose -f docker-compose-local.yml up
```
5. Start web apps:
```bash
yarn dev
```
6. Open your browser to http://localhost:3001/god-mode/ and register yourself as instance admin
7. Open up your browser to http://localhost:3000 then log in using the same credentials from the previous step
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
## Missing a Feature?
If a feature is missing, you can directly _request_ a new one [here](https://github.com/makeplane/plane/issues/new?assignees=&labels=feature&template=feature_request.yml&title=%F0%9F%9A%80+Feature%3A+). You also can do the same by choosing "🚀 Feature" when raising a [New Issue](https://github.com/makeplane/plane/issues/new/choose) on our GitHub Repository.
@@ -75,7 +105,7 @@ To ensure consistency throughout the source code, please keep these rules in min
- **Improve documentation** - fix incomplete or missing [docs](https://docs.plane.so/), bad wording, examples or explanations.
## Contributing to language support
This guide is designed to help contributors understand how to add or update translations in the application.
This guide is designed to help contributors understand how to add or update translations in the application.
### Understanding translation structure
@@ -90,7 +120,7 @@ packages/i18n/src/locales/
├── fr/
│ └── translations.json
└── [language]/
└── translations.json
└── translations.json
```
#### Nested structure
To keep translations organized, we use a nested structure for keys. This makes it easier to manage and locate specific translations. For example:
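As a purely hypothetical illustration of this nesting (the repository's actual keys in `packages/i18n/src/locales/` may differ), keys are grouped by feature and referenced by their dotted path:
```typescript
// Hypothetical illustration only — the real keys in
// packages/i18n/src/locales/<language>/translations.json may differ.
const translations = {
  project: {
    create: {
      title: "Create project",
      success: "Project created successfully",
    },
  },
} as const;

// A nested key is then referenced by its dotted path, e.g. "project.create.title".
```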
@@ -110,14 +140,14 @@ To keep translations organized, we use a nested structure for keys. This makes i
We use [IntlMessageFormat](https://formatjs.github.io/docs/intl-messageformat/) to handle dynamic content, such as variables and pluralization. Here's how to format your translations:
#### Examples
- **Simple variables**
- **Simple variables**
```json
{
"greeting": "Hello, {name}!"
}
```
- **Pluralization**
- **Pluralization**
```json
{
"items": "{count, plural, one {Work item} other {Work items}}"
@@ -142,15 +172,15 @@ We use [IntlMessageFormat](https://formatjs.github.io/docs/intl-messageformat/)
### Adding new languages
Adding a new language involves several steps to ensure it integrates seamlessly with the project. Follow these instructions carefully:
1. **Update type definitions**
1. **Update type definitions**
Add the new language to the TLanguage type in the language definitions file:
```typescript
// types/language.ts
export type TLanguage = "en" | "fr" | "your-lang";
```
```
2. **Add language configuration**
2. **Add language configuration**
Include the new language in the list of supported languages:
```typescript
@@ -161,14 +191,14 @@ Include the new language in the list of supported languages:
];
```
3. **Create translation files**
3. **Create translation files**
1. Create a new folder for your language under locales (e.g., `locales/your-lang/`).
2. Add a `translations.json` file inside the folder.
3. Copy the structure from an existing translation file and translate all keys.
4. **Update import logic**
4. **Update import logic**
Modify the language import logic to include your new language:
```typescript
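// A minimal, hypothetical sketch of what such import logic could look like —
// the actual implementation in the i18n package may be structured differently
// (file paths and loading strategy below are assumptions, not the real code).
const importMap: Record<TLanguage, () => Promise<unknown>> = {
  en: () => import("../locales/en/translations.json"),
  fr: () => import("../locales/fr/translations.json"),
  "your-lang": () => import("../locales/your-lang/translations.json"),
};
```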

View File

@@ -43,9 +43,6 @@ NGINX_PORT=80
# Debug value for api server use it as 0 for production use
DEBUG=0
CORS_ALLOWED_ORIGINS="http://localhost"
# Error logs
SENTRY_DSN=""
SENTRY_ENVIRONMENT="development"
# Database Settings
POSTGRES_USER="plane"
POSTGRES_PASSWORD="plane"

View File

@@ -16,10 +16,10 @@
</p>
<p align="center">
<a href="https://dub.sh/plane-website-readme"><b>Website</b></a>
<a href="https://git.new/releases"><b>Releases</b></a>
<a href="https://dub.sh/planepowershq"><b>Twitter</b></a>
<a href="https://dub.sh/planedocs"><b>Documentation</b></a>
<a href="https://plane.so/"><b>Website</b></a>
<a href="https://github.com/makeplane/plane/releases"><b>Releases</b></a>
<a href="https://twitter.com/planepowers"><b>Twitter</b></a>
<a href="https://docs.plane.so/"><b>Documentation</b></a>
</p>
<p>
@@ -39,7 +39,7 @@
</a>
</p>
Meet [Plane](https://dub.sh/plane-website-readme), an open-source project management tool to track issues, run ~sprints~ cycles, and manage product roadmaps without the chaos of managing the tool itself. 🧘‍♀️
Meet [Plane](https://plane.so/), an open-source project management tool to track issues, run ~sprints~ cycles, and manage product roadmaps without the chaos of managing the tool itself. 🧘‍♀️
> Plane is evolving every day. Your suggestions, ideas, and reported bugs help us immensely. Do not hesitate to join in the conversation on [Discord](https://discord.com/invite/A92xrEGCge) or raise a GitHub issue. We read everything and respond to most.
@@ -47,10 +47,10 @@ Meet [Plane](https://dub.sh/plane-website-readme), an open-source project manage
Getting started with Plane is simple. Choose the setup that works best for you:
- **Plane Cloud**
- **Plane Cloud**
Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
- **Self-host Plane**
- **Self-host Plane**
Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
| Installation methods | Docs link |
@@ -62,22 +62,22 @@ Prefer full control over your data and infrastructure? Install and run Plane on
## 🌟 Features
- **Issues**
- **Issues**
Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
- **Cycles**
- **Cycles**
Maintain your team's momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Views**
- **Views**
Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
- **Pages**
- **Pages**
Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
- **Analytics**
- **Analytics**
Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
@@ -85,38 +85,7 @@ Access real-time insights across all your Plane data. Visualize trends, remove b
## 🛠️ Local development
### Pre-requisites
- Ensure Docker Engine is installed and running.
### Development setup
Setting up your local environment is simple and straightforward. Follow these steps to get started:
1. Clone the repository:
```
git clone https://github.com/makeplane/plane.git
```
2. Navigate to the project folder:
```
cd plane
```
3. Create a new branch for your feature or fix:
```
git checkout -b <feature-branch-name>
```
4. Run the setup script in the terminal:
```
./setup.sh
```
5. Open the project in an IDE such as VS Code.
6. Review the `.env` files in the relevant folders. Refer to [Environment Setup](./ENV_SETUP.md) for details on the environment variables used.
7. Start the services using Docker:
```
docker compose -f docker-compose-local.yml up -d
```
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
See [CONTRIBUTING](./CONTRIBUTING.md)
## ⚙️ Built with
[![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)
@@ -194,7 +163,7 @@ Feel free to ask questions, report bugs, participate in discussions, share ideas
If you discover a security vulnerability in Plane, please report it responsibly instead of opening a public issue. We take all legitimate reports seriously and will investigate them promptly. See [Security policy](https://github.com/makeplane/plane/blob/master/SECURITY.md) for more info.
To disclose any security issues, please email us at security@plane.so.
To disclose any security issues, please email us at security@plane.so.
## 🤝 Contributing
@@ -219,4 +188,4 @@ Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CON
## License
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).

View File

@@ -1,3 +1,12 @@
NEXT_PUBLIC_API_BASE_URL=""
NEXT_PUBLIC_API_BASE_URL="http://localhost:8000"
NEXT_PUBLIC_WEB_BASE_URL="http://localhost:3000"
NEXT_PUBLIC_ADMIN_BASE_URL="http://localhost:3001"
NEXT_PUBLIC_ADMIN_BASE_PATH="/god-mode"
NEXT_PUBLIC_WEB_BASE_URL=""
NEXT_PUBLIC_SPACE_BASE_URL="http://localhost:3002"
NEXT_PUBLIC_SPACE_BASE_PATH="/spaces"
NEXT_PUBLIC_LIVE_BASE_URL="http://localhost:3100"
NEXT_PUBLIC_LIVE_BASE_PATH="/live"

View File

@@ -26,16 +26,16 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
formState: { errors, isSubmitting },
} = useForm<AIFormValues>({
defaultValues: {
OPENAI_API_KEY: config["OPENAI_API_KEY"],
GPT_ENGINE: config["GPT_ENGINE"],
LLM_API_KEY: config["LLM_API_KEY"],
LLM_MODEL: config["LLM_MODEL"],
},
});
const aiFormFields: TControllerInputFormField[] = [
{
key: "GPT_ENGINE",
key: "LLM_MODEL",
type: "text",
label: "GPT_ENGINE",
label: "LLM Model",
description: (
<>
Choose an OpenAI engine.{" "}
@@ -49,12 +49,12 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
</a>
</>
),
placeholder: "gpt-3.5-turbo",
error: Boolean(errors.GPT_ENGINE),
placeholder: "gpt-4o-mini",
error: Boolean(errors.LLM_MODEL),
required: false,
},
{
key: "OPENAI_API_KEY",
key: "LLM_API_KEY",
type: "password",
label: "API key",
description: (
@@ -71,7 +71,7 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
</>
),
placeholder: "sk-asddassdfasdefqsdfasd23das3dasdcasd",
error: Boolean(errors.OPENAI_API_KEY),
error: Boolean(errors.LLM_API_KEY),
required: false,
},
];

View File

@@ -43,6 +43,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
defaultValues: {
GITHUB_CLIENT_ID: config["GITHUB_CLIENT_ID"],
GITHUB_CLIENT_SECRET: config["GITHUB_CLIENT_SECRET"],
GITHUB_ORGANIZATION_ID: config["GITHUB_ORGANIZATION_ID"],
},
});
@@ -93,6 +94,15 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
error: Boolean(errors.GITHUB_CLIENT_SECRET),
required: true,
},
{
key: "GITHUB_ORGANIZATION_ID",
type: "text",
label: "Organization ID",
description: <>The organization github ID.</>,
placeholder: "123456789",
error: Boolean(errors.GITHUB_ORGANIZATION_ID),
required: false,
},
];
const GITHUB_SERVICE_FIELD: TCopyField[] = [
@@ -150,6 +160,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
reset({
GITHUB_CLIENT_ID: response.find((item) => item.key === "GITHUB_CLIENT_ID")?.value,
GITHUB_CLIENT_SECRET: response.find((item) => item.key === "GITHUB_CLIENT_SECRET")?.value,
GITHUB_ORGANIZATION_ID: response.find((item) => item.key === "GITHUB_ORGANIZATION_ID")?.value,
});
})
.catch((err) => console.error(err));

View File

@@ -3,18 +3,16 @@
import { ReactNode } from "react";
import { ThemeProvider, useTheme } from "next-themes";
import { SWRConfig } from "swr";
// ui
// plane imports
import { ADMIN_BASE_PATH, DEFAULT_SWR_CONFIG } from "@plane/constants";
import { Toast } from "@plane/ui";
import { resolveGeneralTheme } from "@plane/utils";
// constants
// helpers
// lib
import { InstanceProvider } from "@/lib/instance-provider";
import { StoreProvider } from "@/lib/store-provider";
import { UserProvider } from "@/lib/user-provider";
// styles
import "./globals.css";
import "@/styles/globals.css";
const ToastWithTheme = () => {
const { resolvedTheme } = useTheme();

View File

@@ -7,7 +7,7 @@ import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import {AuthService } from "@plane/services";
import { AuthService } from "@plane/services";
import { Avatar } from "@plane/ui";
import { getFileURL, cn } from "@plane/utils";
// hooks

View File

@@ -2,7 +2,7 @@ import set from "lodash/set";
import { observable, action, computed, makeObservable, runInAction } from "mobx";
// plane internal packages
import { EInstanceStatus, TInstanceStatus } from "@plane/constants";
import {InstanceService} from "@plane/services";
import { InstanceService } from "@plane/services";
import {
IInstance,
IInstanceAdmin,

View File

@@ -1 +1 @@
export * from "ce/components/authentication/authentication-modes";
export * from "ce/components/authentication/authentication-modes";

View File

@@ -9,6 +9,19 @@ const nextConfig = {
unoptimized: true,
},
basePath: process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "",
transpilePackages: [
"@plane/constants",
"@plane/editor",
"@plane/hooks",
"@plane/i18n",
"@plane/logger",
"@plane/propel",
"@plane/services",
"@plane/shared-state",
"@plane/types",
"@plane/ui",
"@plane/utils",
],
};
module.exports = nextConfig;

View File

@@ -1,7 +1,7 @@
{
"name": "admin",
"description": "Admin UI for Plane",
"version": "0.25.2",
"version": "0.26.0",
"license": "AGPL-3.0",
"private": true,
"scripts": {
@@ -10,6 +10,7 @@
"build": "next build",
"preview": "next build && next start",
"start": "next start",
"format": "prettier --write .",
"lint": "eslint . --ext .ts,.tsx",
"lint:errors": "eslint . --ext .ts,.tsx --quiet"
},
@@ -17,11 +18,11 @@
"@headlessui/react": "^1.7.19",
"@plane/constants": "*",
"@plane/hooks": "*",
"@plane/propel": "*",
"@plane/services": "*",
"@plane/types": "*",
"@plane/ui": "*",
"@plane/utils": "*",
"@plane/services": "*",
"@sentry/nextjs": "^8.54.0",
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
@@ -30,7 +31,7 @@
"lucide-react": "^0.469.0",
"mobx": "^6.12.0",
"mobx-react": "^9.1.1",
"next": "^14.2.20",
"next": "^14.2.28",
"next-themes": "^0.2.1",
"postcss": "^8.4.38",
"react": "^18.3.1",

View File

@@ -1,8 +1,2 @@
module.exports = {
plugins: {
"postcss-import": {},
"tailwindcss/nesting": {},
tailwindcss: {},
autoprefixer: {},
},
};
// eslint-disable-next-line @typescript-eslint/no-require-imports
module.exports = require("@plane/tailwind-config/postcss.config.js");

View File

@@ -1,5 +1,4 @@
@import url("https://fonts.googleapis.com/css2?family=Inter:wght@200;300;400;500;600;700;800&display=swap");
@import url("https://fonts.googleapis.com/css2?family=Material+Symbols+Rounded:opsz,wght,FILL,GRAD@48,400,0,0&display=swap");
@import "@plane/propel/styles/fonts";
@tailwind base;
@tailwind components;
@@ -60,23 +59,31 @@
--color-border-300: 212, 212, 212; /* strong border- 1 */
--color-border-400: 185, 185, 185; /* strong border- 2 */
--color-shadow-2xs: 0px 0px 1px 0px rgba(23, 23, 23, 0.06), 0px 1px 2px 0px rgba(23, 23, 23, 0.06),
--color-shadow-2xs:
0px 0px 1px 0px rgba(23, 23, 23, 0.06), 0px 1px 2px 0px rgba(23, 23, 23, 0.06),
0px 1px 2px 0px rgba(23, 23, 23, 0.14);
--color-shadow-xs: 0px 1px 2px 0px rgba(0, 0, 0, 0.16), 0px 2px 4px 0px rgba(16, 24, 40, 0.12),
--color-shadow-xs:
0px 1px 2px 0px rgba(0, 0, 0, 0.16), 0px 2px 4px 0px rgba(16, 24, 40, 0.12),
0px 1px 8px -1px rgba(16, 24, 40, 0.1);
--color-shadow-sm: 0px 1px 4px 0px rgba(0, 0, 0, 0.01), 0px 4px 8px 0px rgba(0, 0, 0, 0.02),
0px 1px 12px 0px rgba(0, 0, 0, 0.12);
--color-shadow-rg: 0px 3px 6px 0px rgba(0, 0, 0, 0.1), 0px 4px 4px 0px rgba(16, 24, 40, 0.08),
--color-shadow-sm:
0px 1px 4px 0px rgba(0, 0, 0, 0.01), 0px 4px 8px 0px rgba(0, 0, 0, 0.02), 0px 1px 12px 0px rgba(0, 0, 0, 0.12);
--color-shadow-rg:
0px 3px 6px 0px rgba(0, 0, 0, 0.1), 0px 4px 4px 0px rgba(16, 24, 40, 0.08),
0px 1px 12px 0px rgba(16, 24, 40, 0.04);
--color-shadow-md: 0px 4px 8px 0px rgba(0, 0, 0, 0.12), 0px 6px 12px 0px rgba(16, 24, 40, 0.12),
--color-shadow-md:
0px 4px 8px 0px rgba(0, 0, 0, 0.12), 0px 6px 12px 0px rgba(16, 24, 40, 0.12),
0px 1px 16px 0px rgba(16, 24, 40, 0.12);
--color-shadow-lg: 0px 6px 12px 0px rgba(0, 0, 0, 0.12), 0px 8px 16px 0px rgba(0, 0, 0, 0.12),
--color-shadow-lg:
0px 6px 12px 0px rgba(0, 0, 0, 0.12), 0px 8px 16px 0px rgba(0, 0, 0, 0.12),
0px 1px 24px 0px rgba(16, 24, 40, 0.12);
--color-shadow-xl: 0px 0px 18px 0px rgba(0, 0, 0, 0.16), 0px 0px 24px 0px rgba(16, 24, 40, 0.16),
--color-shadow-xl:
0px 0px 18px 0px rgba(0, 0, 0, 0.16), 0px 0px 24px 0px rgba(16, 24, 40, 0.16),
0px 0px 52px 0px rgba(16, 24, 40, 0.16);
--color-shadow-2xl: 0px 8px 16px 0px rgba(0, 0, 0, 0.12), 0px 12px 24px 0px rgba(16, 24, 40, 0.12),
--color-shadow-2xl:
0px 8px 16px 0px rgba(0, 0, 0, 0.12), 0px 12px 24px 0px rgba(16, 24, 40, 0.12),
0px 1px 32px 0px rgba(16, 24, 40, 0.12);
--color-shadow-3xl: 0px 12px 24px 0px rgba(0, 0, 0, 0.12), 0px 16px 32px 0px rgba(0, 0, 0, 0.12),
--color-shadow-3xl:
0px 12px 24px 0px rgba(0, 0, 0, 0.12), 0px 16px 32px 0px rgba(0, 0, 0, 0.12),
0px 1px 48px 0px rgba(16, 24, 40, 0.12);
--color-shadow-4xl: 0px 8px 40px 0px rgba(0, 0, 61, 0.05), 0px 12px 32px -16px rgba(0, 0, 0, 0.05);

View File

@@ -1,13 +1,19 @@
{
"extends": "@plane/typescript-config/nextjs.json",
"compilerOptions": {
"plugins": [{ "name": "next" }],
"plugins": [
{
"name": "next"
}
],
"baseUrl": ".",
"paths": {
"@/*": ["core/*"],
"@/public/*": ["public/*"],
"@/plane-admin/*": ["ce/*"]
}
"@/plane-admin/*": ["ce/*"],
"@/styles/*": ["styles/*"]
},
"strictNullChecks": true
},
"include": ["next-env.d.ts", "next.config.js", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
"exclude": ["node_modules"]

View File

@@ -145,11 +145,8 @@ RUN chmod +x /app/pg-setup.sh
# APPLICATION ENVIRONMENT SETTINGS
# *****************************************************************************
ENV APP_DOMAIN=localhost
ENV WEB_URL=http://${APP_DOMAIN}
ENV DEBUG=0
ENV SENTRY_DSN=
ENV SENTRY_ENVIRONMENT=production
ENV CORS_ALLOWED_ORIGINS=http://${APP_DOMAIN},https://${APP_DOMAIN}
# Secret Key
ENV SECRET_KEY=60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5

25
apiserver/.coveragerc Normal file
View File

@@ -0,0 +1,25 @@
[run]
source = plane
omit =
*/tests/*
*/migrations/*
*/settings/*
*/wsgi.py
*/asgi.py
*/urls.py
manage.py
*/admin.py
*/apps.py
[report]
exclude_lines =
pragma: no cover
def __repr__
if self.debug:
raise NotImplementedError
if __name__ == .__main__.
pass
raise ImportError
[html]
directory = htmlcov

View File

@@ -1,11 +1,7 @@
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
CORS_ALLOWED_ORIGINS="http://localhost"
# Error logs
SENTRY_DSN=""
SENTRY_ENVIRONMENT="development"
CORS_ALLOWED_ORIGINS="http://localhost:3000,http://localhost:3001,http://localhost:3002,http://localhost:3100"
# Database Settings
POSTGRES_USER="plane"
@@ -31,7 +27,7 @@ RABBITMQ_VHOST="plane"
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
AWS_SECRET_ACCESS_KEY="secret-key"
AWS_S3_ENDPOINT_URL="http://plane-minio:9000"
AWS_S3_ENDPOINT_URL="http://localhost:9000"
# Changing this requires change in the nginx.conf for uploads if using minio setup
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
@@ -41,22 +37,31 @@ FILE_SIZE_LIMIT=5242880
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
USE_MINIO=0
# Nginx Configuration
NGINX_PORT=80
# Email redirections and minio domain settings
WEB_URL="http://localhost"
WEB_URL="http://localhost:8000"
# Gunicorn Workers
GUNICORN_WORKERS=2
# Base URLs
ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
ADMIN_BASE_URL="http://localhost:3001"
ADMIN_BASE_PATH="/god-mode"
SPACE_BASE_URL="http://localhost:3002"
SPACE_BASE_PATH="/spaces"
APP_BASE_URL="http://localhost:3000"
APP_BASE_PATH=""
LIVE_BASE_URL="http://localhost:3100"
LIVE_BASE_PATH="/live"
LIVE_SERVER_SECRET_KEY="secret-key"
# Hard delete files after days
HARD_DELETE_AFTER_DAYS=60

View File

@@ -1,6 +1,6 @@
{
"name": "plane-api",
"version": "0.25.2",
"version": "0.26.0",
"license": "AGPL-3.0",
"private": true,
"description": "API server powering Plane's backend"

View File

@@ -15,4 +15,4 @@ from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .intake import IntakeIssueSerializer
from .estimate import EstimatePointSerializer
from .estimate import EstimatePointSerializer

View File

@@ -1,4 +1,5 @@
# Third party imports
import pytz
from rest_framework import serializers
# Module imports
@@ -18,6 +19,14 @@ class CycleSerializer(BaseSerializer):
completed_estimates = serializers.FloatField(read_only=True)
started_estimates = serializers.FloatField(read_only=True)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
project = self.context.get("project")
if project and project.timezone:
project_timezone = pytz.timezone(project.timezone)
self.fields["start_date"].timezone = project_timezone
self.fields["end_date"].timezone = project_timezone
def validate(self, data):
if (
data.get("start_date", None) is not None
@@ -30,12 +39,15 @@ class CycleSerializer(BaseSerializer):
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
):
project_id = self.initial_data.get("project_id") or self.instance.project_id
is_start_date_end_date_equal = (
True
if str(data.get("start_date")) == str(data.get("end_date"))
else False
project_id = self.initial_data.get("project_id") or (
self.instance.project_id
if self.instance and hasattr(self.instance, "project_id")
else None
)
if not project_id:
raise serializers.ValidationError("Project ID is required")
data["start_date"] = convert_to_utc(
date=str(data.get("start_date").date()),
project_id=project_id,
@@ -44,7 +56,6 @@ class CycleSerializer(BaseSerializer):
data["end_date"] = convert_to_utc(
date=str(data.get("end_date", None).date()),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
return data

View File

@@ -160,12 +160,15 @@ class IssueSerializer(BaseSerializer):
else:
try:
# Then assign it to default assignee, if it is a valid assignee
if default_assignee_id is not None and ProjectMember.objects.filter(
member_id=default_assignee_id,
project_id=project_id,
role__gte=15,
is_active=True
).exists():
if (
default_assignee_id is not None
and ProjectMember.objects.filter(
member_id=default_assignee_id,
project_id=project_id,
role__gte=15,
is_active=True,
).exists()
):
IssueAssignee.objects.create(
assignee_id=default_assignee_id,
issue=issue,

View File

@@ -16,7 +16,6 @@ class ProjectSerializer(BaseSerializer):
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
cover_image_url = serializers.CharField(read_only=True)
inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
class Meta:
model = Project

View File

@@ -4,16 +4,6 @@ from plane.api.views import IntakeIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/",
IntakeIssueAPIEndpoint.as_view(),

View File

@@ -39,7 +39,7 @@ from plane.db.models import (
UserFavorite,
)
from plane.utils.analytics_plot import burndown_plot
from plane.utils.host import base_host
from .base import BaseAPIView
from plane.bgtasks.webhook_task import model_activity
@@ -137,10 +137,14 @@ class CycleAPIEndpoint(BaseAPIView):
)
def get(self, request, slug, project_id, pk=None):
project = Project.objects.get(workspace__slug=slug, pk=project_id)
if pk:
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = CycleSerializer(
queryset, fields=self.fields, expand=self.expand
queryset,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data
return Response(data, status=status.HTTP_200_OK)
queryset = self.get_queryset().filter(archived_at__isnull=True)
@@ -152,7 +156,11 @@ class CycleAPIEndpoint(BaseAPIView):
start_date__lte=timezone.now(), end_date__gte=timezone.now()
)
data = CycleSerializer(
queryset, many=True, fields=self.fields, expand=self.expand
queryset,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data
return Response(data, status=status.HTTP_200_OK)
@@ -163,7 +171,11 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles, many=True, fields=self.fields, expand=self.expand
cycles,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data,
)
@@ -174,7 +186,11 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles, many=True, fields=self.fields, expand=self.expand
cycles,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data,
)
@@ -185,7 +201,11 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles, many=True, fields=self.fields, expand=self.expand
cycles,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data,
)
@@ -198,14 +218,22 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles, many=True, fields=self.fields, expand=self.expand
cycles,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data,
)
return self.paginate(
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles, many=True, fields=self.fields, expand=self.expand
cycles,
many=True,
fields=self.fields,
expand=self.expand,
context={"project": project},
).data,
)
@@ -251,7 +279,7 @@ class CycleAPIEndpoint(BaseAPIView):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -323,7 +351,7 @@ class CycleAPIEndpoint(BaseAPIView):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -694,7 +722,7 @@ class CycleIssueAPIEndpoint(BaseAPIView):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# Return all Cycle Issues
return Response(
@@ -760,6 +788,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -771,6 +800,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -819,6 +849,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
)
)
old_cycle = old_cycle.first()
estimate_type = Project.objects.filter(
workspace__slug=slug,
@@ -938,7 +969,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
estimate_completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="points",
@@ -1086,7 +1117,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
# Pass the new_cycle queryset to burndown_plot
completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="issues",
@@ -1098,12 +1129,12 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
).first()
current_cycle.progress_snapshot = {
"total_issues": old_cycle.first().total_issues,
"completed_issues": old_cycle.first().completed_issues,
"cancelled_issues": old_cycle.first().cancelled_issues,
"started_issues": old_cycle.first().started_issues,
"unstarted_issues": old_cycle.first().unstarted_issues,
"backlog_issues": old_cycle.first().backlog_issues,
"total_issues": old_cycle.total_issues,
"completed_issues": old_cycle.completed_issues,
"cancelled_issues": old_cycle.cancelled_issues,
"started_issues": old_cycle.started_issues,
"unstarted_issues": old_cycle.unstarted_issues,
"backlog_issues": old_cycle.backlog_issues,
"distribution": {
"labels": label_distribution_data,
"assignees": assignee_distribution_data,
@@ -1168,7 +1199,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response({"message": "Success"}, status=status.HTTP_200_OK)

View File

@@ -18,8 +18,9 @@ from plane.api.serializers import IntakeIssueSerializer, IssueSerializer
from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import Intake, IntakeIssue, Issue, Project, ProjectMember, State
from plane.utils.host import base_host
from .base import BaseAPIView
from plane.db.models.intake import SourceType
class IntakeIssueAPIEndpoint(BaseAPIView):
@@ -125,7 +126,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
intake_id=intake.id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "IN-APP"),
source=SourceType.IN_APP,
)
# Create an Issue Activity
issue_activity.delay(
@@ -297,7 +298,7 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=False,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
intake=str(intake_issue.id),
)

View File

@@ -56,6 +56,8 @@ from plane.db.models import (
from plane.settings.storage import S3Storage
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
from .base import BaseAPIView
from plane.utils.host import base_host
from plane.bgtasks.webhook_task import model_activity
class WorkspaceIssueAPIEndpoint(BaseAPIView):
@@ -321,6 +323,17 @@ class IssueAPIEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
# Send the model activity
model_activity.delay(
model_name="issue",
model_id=str(serializer.data["id"]),
requested_data=request.data,
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -1048,7 +1061,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# Get the storage metadata
@@ -1108,7 +1121,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
current_instance=json.dumps(serializer.data, cls=DjangoJSONEncoder),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# Update the attachment

View File

@@ -33,6 +33,7 @@ from plane.db.models import (
from .base import BaseAPIView
from plane.bgtasks.webhook_task import model_activity
from plane.utils.host import base_host
class ModuleAPIEndpoint(BaseAPIView):
@@ -174,7 +175,7 @@ class ModuleAPIEndpoint(BaseAPIView):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
module = Module.objects.get(pk=serializer.data["id"])
serializer = ModuleSerializer(module)
@@ -226,7 +227,7 @@ class ModuleAPIEndpoint(BaseAPIView):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -280,6 +281,7 @@ class ModuleAPIEndpoint(BaseAPIView):
project_id=str(project_id),
current_instance=json.dumps({"module_name": str(module.name)}),
epoch=int(timezone.now().timestamp()),
origin=base_host(request=request, is_app=True),
)
module.delete()
# Delete the module issues
@@ -449,6 +451,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
}
),
epoch=int(timezone.now().timestamp()),
origin=base_host(request=request, is_app=True),
)
return Response(

View File

@@ -30,6 +30,7 @@ from plane.db.models import (
)
from plane.bgtasks.webhook_task import model_activity, webhook_activity
from .base import BaseAPIView
from plane.utils.host import base_host
class ProjectAPIEndpoint(BaseAPIView):
@@ -171,14 +172,14 @@ class ProjectAPIEndpoint(BaseAPIView):
states = [
{
"name": "Backlog",
"color": "#A3A3A3",
"color": "#60646C",
"sequence": 15000,
"group": "backlog",
"default": True,
},
{
"name": "Todo",
"color": "#3A3A3A",
"color": "#60646C",
"sequence": 25000,
"group": "unstarted",
},
@@ -190,13 +191,13 @@ class ProjectAPIEndpoint(BaseAPIView):
},
{
"name": "Done",
"color": "#16A34A",
"color": "#46A758",
"sequence": 45000,
"group": "completed",
},
{
"name": "Cancelled",
"color": "#EF4444",
"color": "#9AA4BC",
"sequence": 55000,
"group": "cancelled",
},
@@ -228,7 +229,7 @@ class ProjectAPIEndpoint(BaseAPIView):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
serializer = ProjectSerializer(project)
@@ -238,7 +239,7 @@ class ProjectAPIEndpoint(BaseAPIView):
if "already exists" in str(e):
return Response(
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
except Workspace.DoesNotExist:
return Response(
@@ -247,7 +248,7 @@ class ProjectAPIEndpoint(BaseAPIView):
except ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
def patch(self, request, slug, pk):
@@ -258,9 +259,7 @@ class ProjectAPIEndpoint(BaseAPIView):
ProjectSerializer(project).data, cls=DjangoJSONEncoder
)
intake_view = request.data.get(
"inbox_view", request.data.get("intake_view", project.intake_view)
)
intake_view = request.data.get("intake_view", project.intake_view)
if project.archived_at:
return Response(
@@ -297,7 +296,7 @@ class ProjectAPIEndpoint(BaseAPIView):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
serializer = ProjectSerializer(project)
@@ -307,7 +306,7 @@ class ProjectAPIEndpoint(BaseAPIView):
if "already exists" in str(e):
return Response(
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
@@ -316,7 +315,7 @@ class ProjectAPIEndpoint(BaseAPIView):
except ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
def delete(self, request, slug, pk):
@@ -334,7 +333,7 @@ class ProjectAPIEndpoint(BaseAPIView):
new_value=None,
actor_id=request.user.id,
slug=slug,
current_site=request.META.get("HTTP_ORIGIN"),
current_site=base_host(request=request, is_app=True),
event_id=project.id,
old_identifier=None,
new_identifier=None,

View File

@@ -1,5 +1,7 @@
from .base import BaseSerializer
from plane.db.models import APIToken, APIActivityLog
from rest_framework import serializers
from django.utils import timezone
class APITokenSerializer(BaseSerializer):
@@ -17,10 +19,17 @@ class APITokenSerializer(BaseSerializer):
class APITokenReadSerializer(BaseSerializer):
is_active = serializers.SerializerMethodField()
class Meta:
model = APIToken
exclude = ("token",)
def get_is_active(self, obj: APIToken) -> bool:
if obj.expired_at is None:
return True
return timezone.now() < obj.expired_at
class APIActivityLogSerializer(BaseSerializer):
class Meta:

View File

@@ -25,11 +25,6 @@ class CycleWriteSerializer(BaseSerializer):
or (self.instance and self.instance.project_id)
or self.context.get("project_id", None)
)
is_start_date_end_date_equal = (
True
if str(data.get("start_date")) == str(data.get("end_date"))
else False
)
data["start_date"] = convert_to_utc(
date=str(data.get("start_date").date()),
project_id=project_id,
@@ -38,7 +33,6 @@ class CycleWriteSerializer(BaseSerializer):
data["end_date"] = convert_to_utc(
date=str(data.get("end_date", None).date()),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
return data

View File

@@ -53,6 +53,7 @@ def get_entity_model_and_serializer(entity_type):
}
return entity_map.get(entity_type, (None, None))
class UserFavoriteSerializer(serializers.ModelSerializer):
entity_data = serializers.SerializerMethodField()

View File

@@ -352,8 +352,19 @@ class IssueRelationSerializer(BaseSerializer):
"state_id",
"priority",
"assignee_ids",
"created_by",
"created_at",
"updated_at",
"updated_by",
]
read_only_fields = [
"workspace",
"project",
"created_by",
"created_at",
"updated_by",
"updated_at",
]
read_only_fields = ["workspace", "project"]
class RelatedIssueSerializer(BaseSerializer):
@@ -383,8 +394,19 @@ class RelatedIssueSerializer(BaseSerializer):
"state_id",
"priority",
"assignee_ids",
"created_by",
"created_at",
"updated_by",
"updated_at",
]
read_only_fields = [
"workspace",
"project",
"created_by",
"created_at",
"updated_by",
"updated_at",
]
read_only_fields = ["workspace", "project"]
class IssueAssigneeSerializer(BaseSerializer):

View File

@@ -151,7 +151,8 @@ class ProjectMemberAdminSerializer(BaseSerializer):
class ProjectMemberRoleSerializer(DynamicBaseSerializer):
class Meta:
model = ProjectMember
fields = ("id", "role", "member", "project")
fields = ("id", "role", "member", "project", "created_at")
read_only_fields = ["created_at"]
class ProjectMemberInviteSerializer(BaseSerializer):

View File

@@ -1,11 +1,13 @@
# Module imports
from .base import BaseSerializer
from rest_framework import serializers
from plane.db.models import State
class StateSerializer(BaseSerializer):
order = serializers.FloatField(required=False)
class Meta:
model = State
fields = [
@@ -18,6 +20,7 @@ class StateSerializer(BaseSerializer):
"default",
"description",
"sequence",
"order",
]
read_only_fields = ["workspace", "project"]

View File

@@ -148,7 +148,6 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
return value
def create(self, validated_data):
# Filtering the WorkspaceUserLink with the given url to check if the link already exists.
@@ -157,7 +156,7 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
workspace_user_link = WorkspaceUserLink.objects.filter(
url=url,
workspace_id=validated_data.get("workspace_id"),
owner_id=validated_data.get("owner_id")
owner_id=validated_data.get("owner_id"),
)
if workspace_user_link.exists():
@@ -173,10 +172,8 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
url = validated_data.get("url")
workspace_user_link = WorkspaceUserLink.objects.filter(
url=url,
workspace_id=instance.workspace_id,
owner=instance.owner
)
url=url, workspace_id=instance.workspace_id, owner=instance.owner
)
if workspace_user_link.exclude(pk=instance.id).exists():
raise serializers.ValidationError(
@@ -185,6 +182,7 @@ class WorkspaceUserLinkSerializer(BaseSerializer):
return super().update(instance, validated_data)
class IssueRecentVisitSerializer(serializers.ModelSerializer):
project_identifier = serializers.SerializerMethodField()

View File

@@ -6,8 +6,14 @@ from plane.app.views import (
AnalyticViewViewset,
SavedAnalyticEndpoint,
ExportAnalyticsEndpoint,
AdvanceAnalyticsEndpoint,
AdvanceAnalyticsStatsEndpoint,
AdvanceAnalyticsChartEndpoint,
DefaultAnalyticsEndpoint,
ProjectStatsEndpoint,
ProjectAdvanceAnalyticsEndpoint,
ProjectAdvanceAnalyticsStatsEndpoint,
ProjectAdvanceAnalyticsChartEndpoint,
)
@@ -49,4 +55,34 @@ urlpatterns = [
ProjectStatsEndpoint.as_view(),
name="project-analytics",
),
path(
"workspaces/<str:slug>/advance-analytics/",
AdvanceAnalyticsEndpoint.as_view(),
name="advance-analytics",
),
path(
"workspaces/<str:slug>/advance-analytics-stats/",
AdvanceAnalyticsStatsEndpoint.as_view(),
name="advance-analytics-stats",
),
path(
"workspaces/<str:slug>/advance-analytics-charts/",
AdvanceAnalyticsChartEndpoint.as_view(),
name="advance-analytics-chart",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics/",
ProjectAdvanceAnalyticsEndpoint.as_view(),
name="project-advance-analytics",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics-stats/",
ProjectAdvanceAnalyticsStatsEndpoint.as_view(),
name="project-advance-analytics-stats",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/advance-analytics-charts/",
ProjectAdvanceAnalyticsChartEndpoint.as_view(),
name="project-advance-analytics-chart",
),
]

View File

@@ -1,7 +1,11 @@
from django.urls import path
from plane.app.views import IntakeViewSet, IntakeIssueViewSet
from plane.app.views import (
IntakeViewSet,
IntakeIssueViewSet,
IntakeWorkItemDescriptionVersionEndpoint,
)
urlpatterns = [
@@ -53,4 +57,14 @@ urlpatterns = [
),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-work-items/<uuid:work_item_id>/description-versions/",
IntakeWorkItemDescriptionVersionEndpoint.as_view(),
name="intake-work-item-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-work-items/<uuid:work_item_id>/description-versions/<uuid:pk>/",
IntakeWorkItemDescriptionVersionEndpoint.as_view(),
name="intake-work-item-versions",
),
]

View File

@@ -25,7 +25,7 @@ from plane.app.views import (
IssueAttachmentV2Endpoint,
IssueBulkUpdateDateEndpoint,
IssueVersionEndpoint,
IssueDescriptionVersionEndpoint,
WorkItemDescriptionVersionEndpoint,
IssueMetaEndpoint,
IssueDetailIdentifierEndpoint,
)
@@ -263,22 +263,22 @@ urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/versions/",
IssueVersionEndpoint.as_view(),
name="page-versions",
name="issue-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/versions/<uuid:pk>/",
IssueVersionEndpoint.as_view(),
name="page-versions",
name="issue-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/description-versions/",
IssueDescriptionVersionEndpoint.as_view(),
name="page-versions",
"workspaces/<str:slug>/projects/<uuid:project_id>/work-items/<uuid:work_item_id>/description-versions/",
WorkItemDescriptionVersionEndpoint.as_view(),
name="work-item-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/description-versions/<uuid:pk>/",
IssueDescriptionVersionEndpoint.as_view(),
name="page-versions",
"workspaces/<str:slug>/projects/<uuid:project_id>/work-items/<uuid:work_item_id>/description-versions/<uuid:pk>/",
WorkItemDescriptionVersionEndpoint.as_view(),
name="work-item-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/meta/",

View File

@@ -144,7 +144,7 @@ from .issue.sub_issue import SubIssuesEndpoint
from .issue.subscriber import IssueSubscriberViewSet
from .issue.version import IssueVersionEndpoint, IssueDescriptionVersionEndpoint
from .issue.version import IssueVersionEndpoint, WorkItemDescriptionVersionEndpoint
from .module.base import (
ModuleViewSet,
@@ -184,7 +184,11 @@ from .estimate.base import (
EstimatePointEndpoint,
)
from .intake.base import IntakeViewSet, IntakeIssueViewSet
from .intake.base import (
IntakeViewSet,
IntakeIssueViewSet,
IntakeWorkItemDescriptionVersionEndpoint,
)
from .analytic.base import (
AnalyticsEndpoint,
@@ -195,6 +199,18 @@ from .analytic.base import (
ProjectStatsEndpoint,
)
from .analytic.advance import (
AdvanceAnalyticsEndpoint,
AdvanceAnalyticsStatsEndpoint,
AdvanceAnalyticsChartEndpoint,
)
from .analytic.project_analytics import (
ProjectAdvanceAnalyticsEndpoint,
ProjectAdvanceAnalyticsStatsEndpoint,
ProjectAdvanceAnalyticsChartEndpoint,
)
from .notification.base import (
NotificationViewSet,
UnreadNotificationEndpoint,

View File

@@ -0,0 +1,366 @@
from rest_framework.response import Response
from rest_framework import status
from typing import Dict, List, Any
from django.db.models import QuerySet, Q, Count
from django.http import HttpRequest
from django.db.models.functions import TruncMonth
from django.utils import timezone
from plane.app.views.base import BaseAPIView
from plane.app.permissions import ROLE, allow_permission
from plane.db.models import (
WorkspaceMember,
Project,
Issue,
Cycle,
Module,
IssueView,
ProjectPage,
Workspace,
CycleIssue,
ModuleIssue,
ProjectMember,
)
from plane.utils.build_chart import build_analytics_chart
from plane.utils.date_utils import (
get_analytics_filters,
)
class AdvanceAnalyticsBaseView(BaseAPIView):
def initialize_workspace(self, slug: str, type: str) -> None:
self._workspace_slug = slug
self.filters = get_analytics_filters(
slug=slug,
type=type,
user=self.request.user,
date_filter=self.request.GET.get("date_filter", None),
project_ids=self.request.GET.get("project_ids", None),
)
class AdvanceAnalyticsEndpoint(AdvanceAnalyticsBaseView):
def get_filtered_counts(self, queryset: QuerySet) -> Dict[str, int]:
def get_filtered_count() -> int:
if self.filters["analytics_date_range"]:
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["current"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["current"][
"lte"
],
).count()
return queryset.count()
def get_previous_count() -> int:
if self.filters["analytics_date_range"] and self.filters[
"analytics_date_range"
].get("previous"):
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["previous"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["previous"][
"lte"
],
).count()
return 0
return {
"count": get_filtered_count(),
# "filter_count": get_previous_count(),
}
def get_overview_data(self) -> Dict[str, Dict[str, int]]:
members_query = WorkspaceMember.objects.filter(
workspace__slug=self._workspace_slug, is_active=True
)
if self.request.GET.get("project_ids", None):
project_ids = self.request.GET.get("project_ids", None)
project_ids = [str(project_id) for project_id in project_ids.split(",")]
members_query = ProjectMember.objects.filter(
project_id__in=project_ids, is_active=True
)
return {
"total_users": self.get_filtered_counts(members_query),
"total_admins": self.get_filtered_counts(
members_query.filter(role=ROLE.ADMIN.value)
),
"total_members": self.get_filtered_counts(
members_query.filter(role=ROLE.MEMBER.value)
),
"total_guests": self.get_filtered_counts(
members_query.filter(role=ROLE.GUEST.value)
),
"total_projects": self.get_filtered_counts(
Project.objects.filter(**self.filters["project_filters"])
),
"total_work_items": self.get_filtered_counts(
Issue.issue_objects.filter(**self.filters["base_filters"])
),
"total_cycles": self.get_filtered_counts(
Cycle.objects.filter(**self.filters["base_filters"])
),
"total_intake": self.get_filtered_counts(
Issue.objects.filter(**self.filters["base_filters"]).filter(
issue_intake__status__in=["-2", "0"]
)
),
}
def get_work_items_stats(self) -> Dict[str, Dict[str, int]]:
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
return {
"total_work_items": self.get_filtered_counts(base_queryset),
"started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="started")
),
"backlog_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="backlog")
),
"un_started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="unstarted")
),
"completed_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="completed")
),
}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="analytics")
tab = request.GET.get("tab", "overview")
if tab == "overview":
return Response(
self.get_overview_data(),
status=status.HTTP_200_OK,
)
elif tab == "work-items":
return Response(
self.get_work_items_stats(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid tab"}, status=status.HTTP_400_BAD_REQUEST)
class AdvanceAnalyticsStatsEndpoint(AdvanceAnalyticsBaseView):
def get_project_issues_stats(self) -> QuerySet:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
base_queryset = base_queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return (
base_queryset.values("project_id", "project__name").annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
def get_work_items_stats(self) -> Dict[str, Dict[str, int]]:
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
return (
base_queryset
.values("project_id", "project__name")
.annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "work-items")
if type == "work-items":
return Response(
self.get_work_items_stats(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)
class AdvanceAnalyticsChartEndpoint(AdvanceAnalyticsBaseView):
def project_chart(self) -> List[Dict[str, Any]]:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
date_filter = {}
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
date_filter = {
"created_at__date__gte": start_date,
"created_at__date__lte": end_date,
}
total_work_items = base_queryset.filter(**date_filter).count()
total_cycles = Cycle.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_modules = Module.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_intake = Issue.objects.filter(
issue_intake__isnull=False, **self.filters["base_filters"], **date_filter
).count()
total_members = WorkspaceMember.objects.filter(
workspace__slug=self._workspace_slug, is_active=True, **date_filter
).count()
total_pages = ProjectPage.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
total_views = IssueView.objects.filter(
**self.filters["base_filters"], **date_filter
).count()
data = {
"work_items": total_work_items,
"cycles": total_cycles,
"modules": total_modules,
"intake": total_intake,
"members": total_members,
"pages": total_pages,
"views": total_views,
}
return [
{
"key": key,
"name": key.replace("_", " ").title(),
"count": value or 0,
}
for key, value in data.items()
]
def work_item_completion_chart(self) -> Dict[str, Any]:
# Get the base queryset
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
workspace = Workspace.objects.get(slug=self._workspace_slug)
start_date = workspace.created_at.date().replace(day=1)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
# Annotate by month and count
monthly_stats = (
queryset.annotate(month=TruncMonth("created_at"))
.values("month")
.annotate(
created_count=Count("id"),
completed_count=Count("id", filter=Q(state__group="completed")),
)
.order_by("month")
)
# Create dictionary of month -> counts
stats_dict = {
stat["month"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in monthly_stats
}
# Generate monthly data (ensure months with 0 count are included)
data = []
# include the current date at the end
end_date = timezone.now().date()
last_month = end_date.replace(day=1)
current_month = start_date
while current_month <= last_month:
date_str = current_month.strftime("%Y-%m-%d")
stats = stats_dict.get(date_str, {"created_count": 0, "completed_count": 0})
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
# Move to next month
if current_month.month == 12:
current_month = current_month.replace(
year=current_month.year + 1, month=1
)
else:
current_month = current_month.replace(month=current_month.month + 1)
schema = {
"completed_issues": "completed_issues",
"created_issues": "created_issues",
}
return {"data": data, "schema": schema}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def get(self, request: HttpRequest, slug: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "projects")
group_by = request.GET.get("group_by", None)
x_axis = request.GET.get("x_axis", "PRIORITY")
if type == "projects":
return Response(self.project_chart(), status=status.HTTP_200_OK)
elif type == "custom-work-items":
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return Response(
build_analytics_chart(queryset, x_axis, group_by),
status=status.HTTP_200_OK,
)
elif type == "work-items":
return Response(
self.work_item_completion_chart(),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)

View File

@@ -0,0 +1,421 @@
from rest_framework.response import Response
from rest_framework import status
from typing import Dict, Any
from django.db.models import QuerySet, Q, Count
from django.http import HttpRequest
from django.db.models.functions import TruncMonth
from django.utils import timezone
from datetime import timedelta
from plane.app.views.base import BaseAPIView
from plane.app.permissions import ROLE, allow_permission
from plane.db.models import (
Project,
Issue,
Cycle,
Module,
CycleIssue,
ModuleIssue,
)
from django.db import models
from django.db.models import F, Case, When, Value
from django.db.models.functions import Concat
from plane.utils.build_chart import build_analytics_chart
from plane.utils.date_utils import (
get_analytics_filters,
)
class ProjectAdvanceAnalyticsBaseView(BaseAPIView):
def initialize_workspace(self, slug: str, type: str) -> None:
self._workspace_slug = slug
self.filters = get_analytics_filters(
slug=slug,
type=type,
user=self.request.user,
date_filter=self.request.GET.get("date_filter", None),
project_ids=self.request.GET.get("project_ids", None),
)
class ProjectAdvanceAnalyticsEndpoint(ProjectAdvanceAnalyticsBaseView):
def get_filtered_counts(self, queryset: QuerySet) -> Dict[str, int]:
def get_filtered_count() -> int:
if self.filters["analytics_date_range"]:
return queryset.filter(
created_at__gte=self.filters["analytics_date_range"]["current"][
"gte"
],
created_at__lte=self.filters["analytics_date_range"]["current"][
"lte"
],
).count()
return queryset.count()
return {
"count": get_filtered_count(),
}
def get_work_items_stats(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Dict[str, int]]:
"""
Returns work item stats for the workspace, or filtered by cycle_id or module_id if provided.
"""
base_queryset = None
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=module_issues)
else:
base_queryset = Issue.issue_objects.filter(
**self.filters["base_filters"], project_id=project_id
)
return {
"total_work_items": self.get_filtered_counts(base_queryset),
"started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="started")
),
"backlog_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="backlog")
),
"un_started_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="unstarted")
),
"completed_work_items": self.get_filtered_counts(
base_queryset.filter(state__group="completed")
),
}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="analytics")
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.get_work_items_stats(
cycle_id=cycle_id, module_id=module_id, project_id=project_id
),
status=status.HTTP_200_OK,
)
class ProjectAdvanceAnalyticsStatsEndpoint(ProjectAdvanceAnalyticsBaseView):
def get_project_issues_stats(self) -> QuerySet:
# Get the base queryset with workspace and project filters
base_queryset = Issue.issue_objects.filter(**self.filters["base_filters"])
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
base_queryset = base_queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return (
base_queryset.values("project_id", "project__name")
.annotate(
cancelled_work_items=Count("id", filter=Q(state__group="cancelled")),
completed_work_items=Count("id", filter=Q(state__group="completed")),
backlog_work_items=Count("id", filter=Q(state__group="backlog")),
un_started_work_items=Count("id", filter=Q(state__group="unstarted")),
started_work_items=Count("id", filter=Q(state__group="started")),
)
.order_by("project_id")
)
def get_work_items_stats(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Dict[str, int]]:
base_queryset = None
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
base_queryset = Issue.issue_objects.filter(id__in=module_issues)
else:
base_queryset = Issue.issue_objects.filter(
**self.filters["base_filters"], project_id=project_id
)
return (
base_queryset.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True, then="assignees__avatar"
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
cancelled_work_items=Count(
"id", filter=Q(state__group="cancelled"), distinct=True
),
completed_work_items=Count(
"id", filter=Q(state__group="completed"), distinct=True
),
backlog_work_items=Count(
"id", filter=Q(state__group="backlog"), distinct=True
),
un_started_work_items=Count(
"id", filter=Q(state__group="unstarted"), distinct=True
),
started_work_items=Count(
"id", filter=Q(state__group="started"), distinct=True
),
)
.order_by("display_name")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "work-items")
if type == "work-items":
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.get_work_items_stats(
project_id=project_id, cycle_id=cycle_id, module_id=module_id
),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)
class ProjectAdvanceAnalyticsChartEndpoint(ProjectAdvanceAnalyticsBaseView):
def work_item_completion_chart(
self, project_id, cycle_id=None, module_id=None
) -> Dict[str, Any]:
# Get the base queryset
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.filter(project_id=project_id)
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
cycle = Cycle.objects.filter(id=cycle_id).first()
if cycle and cycle.start_date:
start_date = cycle.start_date.date()
end_date = cycle.end_date.date()
else:
return {"data": [], "schema": {}}
queryset = cycle_issues
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
module = Module.objects.filter(id=module_id).first()
if module and module.start_date:
start_date = module.start_date
end_date = module.target_date
else:
return {"data": [], "schema": {}}
queryset = module_issues
else:
project = Project.objects.filter(id=project_id).first()
if project.created_at:
start_date = project.created_at.date().replace(day=1)
else:
return {"data": [], "schema": {}}
if cycle_id or module_id:
# Get daily stats with optimized query
daily_stats = (
queryset.values("created_at__date")
.annotate(
created_count=Count("id"),
completed_count=Count(
"id", filter=Q(issue__state__group="completed")
),
)
.order_by("created_at__date")
)
# Create a dictionary of existing stats with summed counts
stats_dict = {
stat["created_at__date"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in daily_stats
}
# Generate data for all days in the range
data = []
current_date = start_date
while current_date <= end_date:
date_str = current_date.strftime("%Y-%m-%d")
stats = stats_dict.get(
date_str, {"created_count": 0, "completed_count": 0}
)
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"] + stats["completed_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
current_date += timedelta(days=1)
else:
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
# Annotate by month and count
monthly_stats = (
queryset.annotate(month=TruncMonth("created_at"))
.values("month")
.annotate(
created_count=Count("id"),
completed_count=Count("id", filter=Q(state__group="completed")),
)
.order_by("month")
)
# Create dictionary of month -> counts
stats_dict = {
stat["month"].strftime("%Y-%m-%d"): {
"created_count": stat["created_count"],
"completed_count": stat["completed_count"],
}
for stat in monthly_stats
}
# Generate monthly data (ensure months with 0 count are included)
data = []
# include the current date at the end
end_date = timezone.now().date()
last_month = end_date.replace(day=1)
current_month = start_date
while current_month <= last_month:
date_str = current_month.strftime("%Y-%m-%d")
stats = stats_dict.get(
date_str, {"created_count": 0, "completed_count": 0}
)
data.append(
{
"key": date_str,
"name": date_str,
"count": stats["created_count"],
"completed_issues": stats["completed_count"],
"created_issues": stats["created_count"],
}
)
# Move to next month
if current_month.month == 12:
current_month = current_month.replace(
year=current_month.year + 1, month=1
)
else:
current_month = current_month.replace(month=current_month.month + 1)
schema = {
"completed_issues": "completed_issues",
"created_issues": "created_issues",
}
return {"data": data, "schema": schema}
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request: HttpRequest, slug: str, project_id: str) -> Response:
self.initialize_workspace(slug, type="chart")
type = request.GET.get("type", "projects")
group_by = request.GET.get("group_by", None)
x_axis = request.GET.get("x_axis", "PRIORITY")
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
if type == "custom-work-items":
queryset = (
Issue.issue_objects.filter(**self.filters["base_filters"])
.filter(project_id=project_id)
.select_related("workspace", "state", "parent")
.prefetch_related(
"assignees", "labels", "issue_module__module", "issue_cycle__cycle"
)
)
# Apply cycle/module filters if present
if cycle_id is not None:
cycle_issues = CycleIssue.objects.filter(
**self.filters["base_filters"], cycle_id=cycle_id
).values_list("issue_id", flat=True)
queryset = queryset.filter(id__in=cycle_issues)
elif module_id is not None:
module_issues = ModuleIssue.objects.filter(
**self.filters["base_filters"], module_id=module_id
).values_list("issue_id", flat=True)
queryset = queryset.filter(id__in=module_issues)
# Apply date range filter if available
if self.filters["chart_period_range"]:
start_date, end_date = self.filters["chart_period_range"]
queryset = queryset.filter(
created_at__date__gte=start_date, created_at__date__lte=end_date
)
return Response(
build_analytics_chart(queryset, x_axis, group_by),
status=status.HTTP_200_OK,
)
elif type == "work-items":
# Optionally accept cycle_id or module_id as query params
cycle_id = request.GET.get("cycle_id", None)
module_id = request.GET.get("module_id", None)
return Response(
self.work_item_completion_chart(
project_id=project_id, cycle_id=cycle_id, module_id=module_id
),
status=status.HTTP_200_OK,
)
return Response({"message": "Invalid type"}, status=status.HTTP_400_BAD_REQUEST)

View File

@@ -9,11 +9,11 @@ from rest_framework import status
from .base import BaseAPIView
from plane.db.models import APIToken, Workspace
from plane.app.serializers import APITokenSerializer, APITokenReadSerializer
from plane.app.permissions import WorkspaceOwnerPermission
from plane.app.permissions import WorkspaceEntityPermission
class ApiTokenEndpoint(BaseAPIView):
permission_classes = [WorkspaceOwnerPermission]
permission_classes = [WorkspaceEntityPermission]
def post(self, request, slug):
label = request.data.get("label", str(uuid4().hex))
@@ -68,7 +68,7 @@ class ApiTokenEndpoint(BaseAPIView):
class ServiceApiTokenEndpoint(BaseAPIView):
permission_classes = [WorkspaceOwnerPermission]
permission_classes = [WorkspaceEntityPermission]
def post(self, request, slug):
workspace = Workspace.objects.get(slug=slug)

View File

@@ -137,7 +137,7 @@ class UserAssetsV2Endpoint(BaseAPIView):
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"error": "Invalid file type. Only JPEG, PNG, WebP, JPG and GIF files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
@@ -351,7 +351,7 @@ class WorkspaceFileAssetEndpoint(BaseAPIView):
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"error": "Invalid file type. Only JPEG, PNG, WebP, JPG and GIF files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
@@ -552,7 +552,7 @@ class ProjectAssetEndpoint(BaseAPIView):
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"error": "Invalid file type. Only JPEG, PNG, WebP, JPG and GIF files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
@@ -683,7 +683,7 @@ class ProjectBulkAssetEndpoint(BaseAPIView):
# For some cases, the bulk api is called after the issue is deleted creating
# an integrity error
try:
assets.update(issue_id=entity_id)
assets.update(issue_id=entity_id, project_id=project_id)
except IntegrityError:
pass
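The asset endpoints above now advertise JPEG, PNG, WebP, JPG and GIF as accepted types. A minimal sketch of an allow-list check in that spirit; the exact MIME strings and the allowed_types definition are assumptions, not taken from this diff.
# Assumed MIME allow-list; the real endpoints define their own allowed_types.
ALLOWED_IMAGE_TYPES = {"image/jpeg", "image/jpg", "image/png", "image/webp", "image/gif"}
def is_allowed_image(content_type: str) -> bool:
    return content_type in ALLOWED_IMAGE_TYPES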

View File

@@ -51,8 +51,7 @@ from plane.db.models import (
)
from plane.utils.analytics_plot import burndown_plot
from plane.bgtasks.recent_visited_task import recent_visited_task
# Module imports
from plane.utils.host import base_host
from .. import BaseAPIView, BaseViewSet
from plane.bgtasks.webhook_task import model_activity
from plane.utils.timezone_converter import convert_to_utc, user_timezone_converter
@@ -118,6 +117,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -130,6 +130,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -142,6 +143,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -267,9 +269,7 @@ class CycleViewSet(BaseViewSet):
"created_by",
)
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(
data, datetime_fields, request.user.user_timezone
)
data = user_timezone_converter(data, datetime_fields, project_timezone)
return Response(data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
@@ -318,9 +318,13 @@ class CycleViewSet(BaseViewSet):
.first()
)
# Fetch the project timezone
project = Project.objects.get(id=self.kwargs.get("project_id"))
project_timezone = project.timezone
datetime_fields = ["start_date", "end_date"]
cycle = user_timezone_converter(
cycle, datetime_fields, request.user.user_timezone
cycle, datetime_fields, project_timezone
)
# Send the model activity
@@ -331,7 +335,7 @@ class CycleViewSet(BaseViewSet):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(cycle, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -407,10 +411,12 @@ class CycleViewSet(BaseViewSet):
"created_by",
).first()
# Fetch the project timezone
project = Project.objects.get(id=self.kwargs.get("project_id"))
project_timezone = project.timezone
datetime_fields = ["start_date", "end_date"]
cycle = user_timezone_converter(
cycle, datetime_fields, request.user.user_timezone
)
cycle = user_timezone_converter(cycle, datetime_fields, project_timezone)
# Send the model activity
model_activity.delay(
@@ -420,7 +426,7 @@ class CycleViewSet(BaseViewSet):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(cycle, status=status.HTTP_200_OK)
@@ -480,10 +486,11 @@ class CycleViewSet(BaseViewSet):
)
queryset = queryset.first()
# Fetch the project timezone
project = Project.objects.get(id=self.kwargs.get("project_id"))
project_timezone = project.timezone
datetime_fields = ["start_date", "end_date"]
data = user_timezone_converter(
data, datetime_fields, request.user.user_timezone
)
data = user_timezone_converter(data, datetime_fields, project_timezone)
recent_visited_task.delay(
slug=slug,
@@ -532,7 +539,7 @@ class CycleViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# TODO: Soft delete the cycle and break the one-to-one relationship with cycle issue
cycle.delete()
@@ -566,16 +573,12 @@ class CycleDateCheckEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
is_start_date_end_date_equal = (
True if str(start_date) == str(end_date) else False
)
start_date = convert_to_utc(
date=str(start_date), project_id=project_id, is_start_date=True
)
end_date = convert_to_utc(
date=str(end_date),
project_id=project_id,
is_start_date_end_date_equal=is_start_date_end_date_equal,
)
# Check if any cycle intersects in the given interval
@@ -660,6 +663,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -724,6 +728,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
)
)
old_cycle = old_cycle.first()
estimate_type = Project.objects.filter(
workspace__slug=slug,
@@ -842,7 +847,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
estimate_completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="points",
@@ -989,7 +994,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
# Pass the new_cycle queryset to burndown_plot
completion_chart = burndown_plot(
queryset=old_cycle.first(),
queryset=old_cycle,
slug=slug,
project_id=project_id,
plot_type="issues",
@@ -1001,12 +1006,12 @@ class TransferCycleIssueEndpoint(BaseAPIView):
).first()
current_cycle.progress_snapshot = {
"total_issues": old_cycle.first().total_issues,
"completed_issues": old_cycle.first().completed_issues,
"cancelled_issues": old_cycle.first().cancelled_issues,
"started_issues": old_cycle.first().started_issues,
"unstarted_issues": old_cycle.first().unstarted_issues,
"backlog_issues": old_cycle.first().backlog_issues,
"total_issues": old_cycle.total_issues,
"completed_issues": old_cycle.completed_issues,
"cancelled_issues": old_cycle.cancelled_issues,
"started_issues": old_cycle.started_issues,
"unstarted_issues": old_cycle.unstarted_issues,
"backlog_issues": old_cycle.backlog_issues,
"distribution": {
"labels": label_distribution_data,
"assignees": assignee_distribution_data,
@@ -1071,7 +1076,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response({"message": "Success"}, status=status.HTTP_200_OK)
@@ -1114,6 +1119,13 @@ class CycleUserPropertiesEndpoint(BaseAPIView):
class CycleProgressEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, cycle_id):
cycle = Cycle.objects.filter(
workspace__slug=slug, project_id=project_id, id=cycle_id
).first()
if not cycle:
return Response(
{"error": "Cycle not found"}, status=status.HTTP_404_NOT_FOUND
)
aggregate_estimates = (
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
@@ -1164,53 +1176,60 @@ class CycleProgressEndpoint(BaseAPIView):
),
)
)
if cycle.progress_snapshot:
backlog_issues = cycle.progress_snapshot.get("backlog_issues", 0)
unstarted_issues = cycle.progress_snapshot.get("unstarted_issues", 0)
started_issues = cycle.progress_snapshot.get("started_issues", 0)
cancelled_issues = cycle.progress_snapshot.get("cancelled_issues", 0)
completed_issues = cycle.progress_snapshot.get("completed_issues", 0)
total_issues = cycle.progress_snapshot.get("total_issues", 0)
else:
backlog_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="backlog",
).count()
backlog_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="backlog",
).count()
unstarted_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="unstarted",
).count()
unstarted_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="unstarted",
).count()
started_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="started",
).count()
started_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="started",
).count()
cancelled_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="cancelled",
).count()
cancelled_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="cancelled",
).count()
completed_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="completed",
).count()
completed_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="completed",
).count()
total_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
).count()
total_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
).count()
return Response(
{
@@ -1271,6 +1290,25 @@ class CycleAnalyticsEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
# Check whether the issues were transferred to a new cycle
"""
If the issues were transferred to a new cycle, the progress_snapshot will be present:
return the progress_snapshot data in the analytics for each date.
Otherwise the issues were not transferred, so generate the stats from the cycle issue bridge tables.
"""
if cycle.progress_snapshot:
distribution = cycle.progress_snapshot.get("distribution", {})
return Response(
{
"labels": distribution.get("labels", []),
"assignees": distribution.get("assignees", []),
"completion_chart": distribution.get("completion_chart", {}),
},
status=status.HTTP_200_OK,
)
estimate_type = Project.objects.filter(
workspace__slug=slug,
pk=project_id,
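The cycle progress and analytics endpoints above prefer a transferred cycle's frozen progress_snapshot and only fall back to live counts when no snapshot exists. A distilled sketch of that fallback; live_counts stands in for the per-state-group queries shown in the diff.
def cycle_progress(cycle, live_counts):
    # Snapshot first: issues transferred out of the cycle leave a frozen summary behind.
    if cycle.progress_snapshot:
        snapshot = cycle.progress_snapshot
        return {
            key: snapshot.get(key, 0)
            for key in (
                "backlog_issues",
                "unstarted_issues",
                "started_issues",
                "cancelled_issues",
                "completed_issues",
                "total_issues",
            )
        }
    # No snapshot: count from the cycle-issue bridge table instead.
    return live_counts()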

View File

@@ -27,6 +27,7 @@ from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPaginator
from plane.app.permissions import allow_permission, ROLE
from plane.utils.host import base_host
class CycleIssueViewSet(BaseViewSet):
@@ -291,7 +292,7 @@ class CycleIssueViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response({"message": "success"}, status=status.HTTP_201_CREATED)
@@ -317,7 +318,7 @@ class CycleIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
cycle_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -11,8 +11,7 @@ from rest_framework.response import Response
# Module import
from plane.app.permissions import ROLE, allow_permission
from plane.app.serializers import (ProjectLiteSerializer,
WorkspaceLiteSerializer)
from plane.app.serializers import ProjectLiteSerializer, WorkspaceLiteSerializer
from plane.db.models import Project, Workspace
from plane.license.utils.instance_value import get_configuration_value
from plane.utils.exception_logger import log_exception
@@ -22,6 +21,7 @@ from ..base import BaseAPIView
class LLMProvider:
"""Base class for LLM provider configurations"""
name: str = ""
models: List[str] = []
default_model: str = ""
@@ -34,11 +34,13 @@ class LLMProvider:
"default_model": cls.default_model,
}
class OpenAIProvider(LLMProvider):
name = "OpenAI"
models = ["gpt-3.5-turbo", "gpt-4o-mini", "gpt-4o", "o1-mini", "o1-preview"]
default_model = "gpt-4o-mini"
class AnthropicProvider(LLMProvider):
name = "Anthropic"
models = [
@@ -49,40 +51,45 @@ class AnthropicProvider(LLMProvider):
"claude-2.1",
"claude-2",
"claude-instant-1.2",
"claude-instant-1"
"claude-instant-1",
]
default_model = "claude-3-sonnet-20240229"
class GeminiProvider(LLMProvider):
name = "Gemini"
models = ["gemini-pro", "gemini-1.5-pro-latest", "gemini-pro-vision"]
default_model = "gemini-pro"
SUPPORTED_PROVIDERS = {
"openai": OpenAIProvider,
"anthropic": AnthropicProvider,
"gemini": GeminiProvider,
}
def get_llm_config() -> Tuple[str | None, str | None, str | None]:
"""
Helper to get LLM configuration values, returns:
- api_key, model, provider
"""
api_key, provider_key, model = get_configuration_value([
{
"key": "LLM_API_KEY",
"default": os.environ.get("LLM_API_KEY", None),
},
{
"key": "LLM_PROVIDER",
"default": os.environ.get("LLM_PROVIDER", "openai"),
},
{
"key": "LLM_MODEL",
"default": os.environ.get("LLM_MODEL", None),
},
])
api_key, provider_key, model = get_configuration_value(
[
{
"key": "LLM_API_KEY",
"default": os.environ.get("LLM_API_KEY", None),
},
{
"key": "LLM_PROVIDER",
"default": os.environ.get("LLM_PROVIDER", "openai"),
},
{
"key": "LLM_MODEL",
"default": os.environ.get("LLM_MODEL", None),
},
]
)
provider = SUPPORTED_PROVIDERS.get(provider_key.lower())
if not provider:
@@ -99,16 +106,20 @@ def get_llm_config() -> Tuple[str | None, str | None, str | None]:
# Validate model is supported by provider
if model not in provider.models:
log_exception(ValueError(
f"Model {model} not supported by {provider.name}. "
f"Supported models: {', '.join(provider.models)}"
))
log_exception(
ValueError(
f"Model {model} not supported by {provider.name}. "
f"Supported models: {', '.join(provider.models)}"
)
)
return None, None, None
return api_key, model, provider_key
def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> Tuple[str | None, str | None]:
def get_llm_response(
task, prompt, api_key: str, model: str, provider: str
) -> Tuple[str | None, str | None]:
"""Helper to get LLM completion response"""
final_text = task + "\n" + prompt
try:
@@ -118,10 +129,7 @@ def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> T
client = OpenAI(api_key=api_key)
chat_completion = client.chat.completions.create(
model=model,
messages=[
{"role": "user", "content": final_text}
]
model=model, messages=[{"role": "user", "content": final_text}]
)
text = chat_completion.choices[0].message.content
return text, None
@@ -135,6 +143,7 @@ def get_llm_response(task, prompt, api_key: str, model: str, provider: str) -> T
else:
return None, f"Error occurred while generating response from {provider}"
class GPTIntegrationEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id):
@@ -152,7 +161,9 @@ class GPTIntegrationEndpoint(BaseAPIView):
{"error": "Task is required"}, status=status.HTTP_400_BAD_REQUEST
)
text, error = get_llm_response(task, request.data.get("prompt", False), api_key, model, provider)
text, error = get_llm_response(
task, request.data.get("prompt", False), api_key, model, provider
)
if not text and error:
return Response(
{"error": "An internal error has occurred."},
@@ -190,7 +201,9 @@ class WorkspaceGPTIntegrationEndpoint(BaseAPIView):
{"error": "Task is required"}, status=status.HTTP_400_BAD_REQUEST
)
text, error = get_llm_response(task, request.data.get("prompt", False), api_key, model, provider)
text, error = get_llm_response(
task, request.data.get("prompt", False), api_key, model, provider
)
if not text and error:
return Response(
{"error": "An internal error has occurred."},

View File

@@ -27,16 +27,24 @@ from plane.db.models import (
Project,
ProjectMember,
CycleIssue,
IssueDescriptionVersion,
)
from plane.app.serializers import (
IssueCreateSerializer,
IssueSerializer,
IssueDetailSerializer,
IntakeSerializer,
IntakeIssueSerializer,
IntakeIssueDetailSerializer,
IssueDescriptionVersionDetailSerializer,
)
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.issue_activities_task import issue_activity
from plane.bgtasks.issue_description_version_task import issue_description_version_task
from plane.app.views.base import BaseAPIView
from plane.utils.timezone_converter import user_timezone_converter
from plane.utils.global_paginator import paginate
from plane.utils.host import base_host
from plane.db.models.intake import SourceType
class IntakeViewSet(BaseViewSet):
@@ -87,7 +95,7 @@ class IntakeIssueViewSet(BaseViewSet):
serializer_class = IntakeIssueSerializer
model = IntakeIssue
filterset_fields = ["statulls"]
filterset_fields = ["status"]
def get_queryset(self):
return (
@@ -218,7 +226,7 @@ class IntakeIssueViewSet(BaseViewSet):
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
role=ROLE.GUEST.value,
is_active=True,
).exists()
and not project.guest_view_all_features
@@ -271,7 +279,7 @@ class IntakeIssueViewSet(BaseViewSet):
intake_id=intake_id.id,
project_id=project_id,
issue_id=serializer.data["id"],
source=request.data.get("source", "IN-APP"),
source=SourceType.IN_APP,
)
# Create an Issue Activity
issue_activity.delay(
@@ -283,9 +291,16 @@ class IntakeIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
intake=str(intake_issue.id),
)
# updated issue description version
issue_description_version_task.delay(
updated_issue=json.dumps(request.data, cls=DjangoJSONEncoder),
issue_id=str(serializer.data["id"]),
user_id=request.user.id,
is_creating=True,
)
intake_issue = (
IntakeIssue.objects.select_related("issue")
.prefetch_related("issue__labels", "issue__assignees")
@@ -385,13 +400,15 @@ class IntakeIssueViewSet(BaseViewSet):
),
"description": issue_data.get("description", issue.description),
}
current_instance = json.dumps(
IssueDetailSerializer(issue).data, cls=DjangoJSONEncoder
)
issue_serializer = IssueCreateSerializer(
issue, data=issue_data, partial=True, context={"project_id": project_id}
)
if issue_serializer.is_valid():
current_instance = issue
# Log all the updates
requested_data = json.dumps(issue_data, cls=DjangoJSONEncoder)
if issue is not None:
@@ -401,15 +418,18 @@ class IntakeIssueViewSet(BaseViewSet):
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=json.dumps(
IssueSerializer(current_instance).data,
cls=DjangoJSONEncoder,
),
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
intake=str(intake_issue.id),
)
# updated issue description version
issue_description_version_task.delay(
updated_issue=current_instance,
issue_id=str(pk),
user_id=request.user.id,
)
issue_serializer.save()
else:
return Response(
@@ -467,7 +487,7 @@ class IntakeIssueViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=False,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
intake=(intake_issue.id),
)
@@ -549,7 +569,7 @@ class IntakeIssueViewSet(BaseViewSet):
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
role=ROLE.GUEST.value,
is_active=True,
).exists()
and not project.guest_view_all_features
@@ -557,7 +577,7 @@ class IntakeIssueViewSet(BaseViewSet):
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
status=status.HTTP_403_FORBIDDEN,
)
issue = IntakeIssueDetailSerializer(intake_issue).data
return Response(issue, status=status.HTTP_200_OK)
@@ -584,3 +604,80 @@ class IntakeIssueViewSet(BaseViewSet):
intake_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class IntakeWorkItemDescriptionVersionEndpoint(BaseAPIView):
def process_paginated_result(self, fields, results, timezone):
paginated_data = results.values(*fields)
datetime_fields = ["created_at", "updated_at"]
paginated_data = user_timezone_converter(
paginated_data, datetime_fields, timezone
)
return paginated_data
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, work_item_id, pk=None):
project = Project.objects.get(pk=project_id)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=work_item_id
)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=ROLE.GUEST.value,
is_active=True,
).exists()
and not project.guest_view_all_features
and not issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_403_FORBIDDEN,
)
if pk:
issue_description_version = IssueDescriptionVersion.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=work_item_id,
pk=pk,
)
serializer = IssueDescriptionVersionDetailSerializer(
issue_description_version
)
return Response(serializer.data, status=status.HTTP_200_OK)
cursor = request.GET.get("cursor", None)
required_fields = [
"id",
"workspace",
"project",
"issue",
"last_saved_at",
"owned_by",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
issue_description_versions_queryset = IssueDescriptionVersion.objects.filter(
workspace__slug=slug, project_id=project_id, issue_id=work_item_id
)
paginated_data = paginate(
base_queryset=issue_description_versions_queryset,
queryset=issue_description_versions_queryset,
cursor=cursor,
on_result=lambda results: self.process_paginated_result(
required_fields, results, request.user.user_timezone
),
)
return Response(paginated_data, status=status.HTTP_200_OK)

View File

@@ -37,6 +37,7 @@ from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPaginator
from plane.app.permissions import allow_permission, ROLE
from plane.utils.error_codes import ERROR_CODES
from plane.utils.host import base_host
# Module imports
from .. import BaseViewSet, BaseAPIView
@@ -259,7 +260,7 @@ class IssueArchiveViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue.archived_at = timezone.now().date()
issue.save()
@@ -287,7 +288,7 @@ class IssueArchiveViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue.archived_at = None
issue.save()
@@ -333,7 +334,7 @@ class BulkArchiveIssuesEndpoint(BaseAPIView):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue.archived_at = timezone.now().date()
bulk_archive_issues.append(issue)

View File

@@ -21,6 +21,7 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.app.permissions import allow_permission, ROLE
from plane.settings.storage import S3Storage
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
from plane.utils.host import base_host
class IssueAttachmentEndpoint(BaseAPIView):
@@ -48,7 +49,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
current_instance=json.dumps(serializer.data, cls=DjangoJSONEncoder),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -67,7 +68,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -155,7 +156,7 @@ class IssueAttachmentV2Endpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -213,7 +214,7 @@ class IssueAttachmentV2Endpoint(BaseAPIView):
current_instance=json.dumps(serializer.data, cls=DjangoJSONEncoder),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# Update the attachment

View File

@@ -45,6 +45,7 @@ from plane.db.models import (
ProjectMember,
CycleIssue,
UserRecentVisit,
ModuleIssue,
)
from plane.utils.grouper import (
issue_group_values,
@@ -60,6 +61,7 @@ from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.global_paginator import paginate
from plane.bgtasks.webhook_task import model_activity
from plane.bgtasks.issue_description_version_task import issue_description_version_task
from plane.utils.host import base_host
class IssueListEndpoint(BaseAPIView):
@@ -378,7 +380,7 @@ class IssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue = (
issue_queryset_grouper(
@@ -428,7 +430,7 @@ class IssueViewSet(BaseViewSet):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# updated issue description version
issue_description_version_task.delay(
@@ -564,7 +566,7 @@ class IssueViewSet(BaseViewSet):
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
status=status.HTTP_403_FORBIDDEN,
)
recent_visited_task.delay(
@@ -631,7 +633,7 @@ class IssueViewSet(BaseViewSet):
)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
IssueDetailSerializer(issue).data, cls=DjangoJSONEncoder
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
@@ -649,7 +651,7 @@ class IssueViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
model_activity.delay(
model_name="issue",
@@ -658,7 +660,7 @@ class IssueViewSet(BaseViewSet):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
# updated issue description version
issue_description_version_task.delay(
@@ -690,7 +692,8 @@ class IssueViewSet(BaseViewSet):
current_instance={},
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
subscriber=False,
)
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -738,6 +741,13 @@ class BulkDeleteIssuesEndpoint(BaseAPIView):
total_issues = len(issues)
# First, delete all related cycle issues
CycleIssue.objects.filter(issue_id__in=issue_ids).delete()
# Then, delete all related module issues
ModuleIssue.objects.filter(issue_id__in=issue_ids).delete()
# Finally, delete the issues themselves
issues.delete()
return Response(
@@ -1025,9 +1035,17 @@ class IssueBulkUpdateDateEndpoint(BaseAPIView):
"""
Validate that the start date is not later than the target date.
"""
from datetime import datetime
start = new_start or current_start
target = new_target or current_target
# Convert string dates to datetime objects if they're strings
if isinstance(start, str):
start = datetime.strptime(start, "%Y-%m-%d").date()
if isinstance(target, str):
target = datetime.strptime(target, "%Y-%m-%d").date()
if start and target and start > target:
return False
return True
@@ -1269,7 +1287,7 @@ class IssueDetailIdentifierEndpoint(BaseAPIView):
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
status=status.HTTP_403_FORBIDDEN,
)
recent_visited_task.delay(

View File

@@ -17,6 +17,7 @@ from plane.app.serializers import IssueCommentSerializer, CommentReactionSeriali
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import IssueComment, ProjectMember, CommentReaction, Project, Issue
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.host import base_host
class IssueCommentViewSet(BaseViewSet):
@@ -87,7 +88,7 @@ class IssueCommentViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -105,7 +106,13 @@ class IssueCommentViewSet(BaseViewSet):
issue_comment, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
if (
"comment_html" in request.data
and request.data["comment_html"] != issue_comment.comment_html
):
serializer.save(edited_at=timezone.now())
else:
serializer.save()
issue_activity.delay(
type="comment.activity.updated",
requested_data=requested_data,
@@ -115,7 +122,7 @@ class IssueCommentViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -138,7 +145,7 @@ class IssueCommentViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -182,7 +189,7 @@ class CommentReactionViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -216,7 +223,7 @@ class CommentReactionViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
comment_reaction.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -15,6 +15,8 @@ from plane.app.serializers import IssueLinkSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import IssueLink
from plane.bgtasks.issue_activities_task import issue_activity
from plane.bgtasks.work_item_link_task import crawl_work_item_link_title
from plane.utils.host import base_host
class IssueLinkViewSet(BaseViewSet):
@@ -43,6 +45,9 @@ class IssueLinkViewSet(BaseViewSet):
serializer = IssueLinkSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id, issue_id=issue_id)
crawl_work_item_link_title(
serializer.data.get("id"), serializer.data.get("url")
)
issue_activity.delay(
type="link.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
@@ -52,8 +57,12 @@ class IssueLinkViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue_link = self.get_queryset().get(id=serializer.data.get("id"))
serializer = IssueLinkSerializer(issue_link)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -65,9 +74,14 @@ class IssueLinkViewSet(BaseViewSet):
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data, cls=DjangoJSONEncoder
)
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
crawl_work_item_link_title(
serializer.data.get("id"), serializer.data.get("url")
)
issue_activity.delay(
type="link.activity.updated",
requested_data=requested_data,
@@ -77,8 +91,11 @@ class IssueLinkViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue_link = self.get_queryset().get(id=serializer.data.get("id"))
serializer = IssueLinkSerializer(issue_link)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -98,7 +115,7 @@ class IssueLinkViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue_link.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
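The link viewset above now calls crawl_work_item_link_title after create and update to populate link metadata in the background. A purely illustrative sketch of what such a crawler might do, using requests and the stdlib HTML parser; this is not the actual plane.bgtasks.work_item_link_task implementation.
import requests
from html.parser import HTMLParser
class _TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data
def fetch_link_title(url: str) -> str | None:
    # Best-effort fetch; return None on any network or HTTP error.
    try:
        response = requests.get(url, timeout=5)
        response.raise_for_status()
    except requests.RequestException:
        return None
    parser = _TitleParser()
    parser.feed(response.text)
    return parser.title.strip() or None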

View File

@@ -15,6 +15,7 @@ from plane.app.serializers import IssueReactionSerializer
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import IssueReaction
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.host import base_host
class IssueReactionViewSet(BaseViewSet):
@@ -53,7 +54,7 @@ class IssueReactionViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -78,7 +79,7 @@ class IssueReactionViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
issue_reaction.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -27,6 +27,7 @@ from plane.db.models import (
)
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_relation_mapper import get_actual_relation
from plane.utils.host import base_host
class IssueRelationViewSet(BaseViewSet):
@@ -253,7 +254,7 @@ class IssueRelationViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
if relation_type in ["blocking", "start_after", "finish_after"]:
@@ -290,6 +291,6 @@ class IssueRelationViewSet(BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -22,6 +22,8 @@ from plane.db.models import Issue, IssueLink, FileAsset, CycleIssue
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.timezone_converter import user_timezone_converter
from collections import defaultdict
from plane.utils.host import base_host
from plane.utils.order_queryset import order_issue_queryset
class SubIssuesEndpoint(BaseAPIView):
@@ -102,6 +104,15 @@ class SubIssuesEndpoint(BaseAPIView):
.order_by("-created_at")
)
# Ordering
order_by_param = request.GET.get("order_by", "-created_at")
group_by = request.GET.get("group_by", False)
if order_by_param:
sub_issues, order_by_param = order_issue_queryset(
sub_issues, order_by_param
)
# creates a dict keyed by state group name with the group's issue ids
result = defaultdict(list)
for sub_issue in sub_issues:
@@ -138,6 +149,26 @@ class SubIssuesEndpoint(BaseAPIView):
sub_issues = user_timezone_converter(
sub_issues, datetime_fields, request.user.user_timezone
)
# Grouping
if group_by:
result_dict = defaultdict(list)
for issue in sub_issues:
if group_by == "assignees__ids":
if issue["assignee_ids"]:
assignee_ids = issue["assignee_ids"]
for assignee_id in assignee_ids:
result_dict[str(assignee_id)].append(issue)
elif issue["assignee_ids"] == []:
result_dict["None"].append(issue)
elif group_by:
result_dict[str(issue[group_by])].append(issue)
return Response(
{"sub_issues": result_dict, "state_distribution": result},
status=status.HTTP_200_OK,
)
return Response(
{"sub_issues": sub_issues, "state_distribution": result},
status=status.HTTP_200_OK,
@@ -176,7 +207,7 @@ class SubIssuesEndpoint(BaseAPIView):
current_instance=json.dumps({"parent": str(sub_issue_id)}),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
for sub_issue_id in sub_issue_ids
]
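The sub-issue endpoint above can now group results, for example by assignee ids: an issue with several assignees is fanned out into each assignee's bucket and unassigned issues land under "None". A distilled sketch of that grouping over serialized issue dicts:
from collections import defaultdict
def group_by_assignee(issues):
    grouped = defaultdict(list)
    for issue in issues:
        assignee_ids = issue.get("assignee_ids") or []
        if assignee_ids:
            for assignee_id in assignee_ids:
                grouped[str(assignee_id)].append(issue)
        else:
            grouped["None"].append(issue)
    return grouped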

View File

@@ -3,7 +3,13 @@ from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import IssueVersion, IssueDescriptionVersion
from plane.db.models import (
IssueVersion,
IssueDescriptionVersion,
Project,
ProjectMember,
Issue,
)
from ..base import BaseAPIView
from plane.app.serializers import (
IssueVersionDetailSerializer,
@@ -66,7 +72,7 @@ class IssueVersionEndpoint(BaseAPIView):
return Response(paginated_data, status=status.HTTP_200_OK)
class IssueDescriptionVersionEndpoint(BaseAPIView):
class WorkItemDescriptionVersionEndpoint(BaseAPIView):
def process_paginated_result(self, fields, results, timezone):
paginated_data = results.values(*fields)
@@ -78,10 +84,34 @@ class IssueDescriptionVersionEndpoint(BaseAPIView):
return paginated_data
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, issue_id, pk=None):
def get(self, request, slug, project_id, work_item_id, pk=None):
project = Project.objects.get(pk=project_id)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=work_item_id
)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=ROLE.GUEST.value,
is_active=True,
).exists()
and not project.guest_view_all_features
and not issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_403_FORBIDDEN,
)
if pk:
issue_description_version = IssueDescriptionVersion.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=work_item_id,
pk=pk,
)
serializer = IssueDescriptionVersionDetailSerializer(
@@ -105,8 +135,8 @@ class IssueDescriptionVersionEndpoint(BaseAPIView):
]
issue_description_versions_queryset = IssueDescriptionVersion.objects.filter(
workspace__slug=slug, project_id=project_id, issue_id=issue_id
)
workspace__slug=slug, project_id=project_id, issue_id=work_item_id
).order_by("-created_at")
paginated_data = paginate(
base_queryset=issue_description_versions_queryset,
queryset=issue_description_versions_queryset,

View File

@@ -61,6 +61,7 @@ from plane.utils.timezone_converter import user_timezone_converter
from plane.bgtasks.webhook_task import model_activity
from .. import BaseAPIView, BaseViewSet
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.host import base_host
class ModuleViewSet(BaseViewSet):
@@ -376,7 +377,7 @@ class ModuleViewSet(BaseViewSet):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
datetime_fields = ["created_at", "updated_at"]
module = user_timezone_converter(
@@ -710,23 +711,31 @@ class ModuleViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def partial_update(self, request, slug, project_id, pk):
module = self.get_queryset().filter(pk=pk)
module_queryset = self.get_queryset().filter(pk=pk)
if module.first().archived_at:
current_module = module_queryset.first()
if not current_module:
return Response(
{"error": "Module not found"},
status=status.HTTP_404_NOT_FOUND,
)
if current_module.archived_at:
return Response(
{"error": "Archived module cannot be updated"},
status=status.HTTP_400_BAD_REQUEST,
)
current_instance = json.dumps(
ModuleSerializer(module.first()).data, cls=DjangoJSONEncoder
ModuleSerializer(current_module).data, cls=DjangoJSONEncoder
)
serializer = ModuleWriteSerializer(
module.first(), data=request.data, partial=True
current_module, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
module = module.values(
module = module_queryset.values(
# Required fields
"id",
"workspace_id",
@@ -768,7 +777,7 @@ class ModuleViewSet(BaseViewSet):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
datetime_fields = ["created_at", "updated_at"]
@@ -795,7 +804,7 @@ class ModuleViewSet(BaseViewSet):
current_instance=json.dumps({"module_name": str(module.name)}),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
for issue in module_issues
]

View File

@@ -34,6 +34,7 @@ from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPagina
# Module imports
from .. import BaseViewSet
from plane.utils.host import base_host
class ModuleIssueViewSet(BaseViewSet):
@@ -221,7 +222,7 @@ class ModuleIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
for issue in issues
]
@@ -261,7 +262,7 @@ class ModuleIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
for module in modules
]
@@ -280,11 +281,15 @@ class ModuleIssueViewSet(BaseViewSet):
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=json.dumps(
{"module_name": module_issue.first().module.name if (module_issue.first() and module_issue.first().module) else None}
{
"module_name": module_issue.first().module.name
if (module_issue.first() and module_issue.first().module)
else None
}
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
module_issue.delete()
@@ -309,7 +314,7 @@ class ModuleIssueViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
module_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -42,6 +42,7 @@ from plane.bgtasks.page_version_task import page_version
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.bgtasks.copy_s3_object import copy_s3_objects
def unarchive_archive_page_and_descendants(page_id, archived_at):
# Your SQL query
sql = """
@@ -198,7 +199,7 @@ class PageViewSet(BaseViewSet):
project = Project.objects.get(pk=project_id)
"""
if the role is guest and guest_view_all_features is false and owned by is not
the requesting user then don't show the page
"""
@@ -572,6 +573,12 @@ class PageDuplicateEndpoint(BaseAPIView):
pk=page_id, workspace__slug=slug, projects__id=project_id
).first()
# check for permission
if page.access == Page.PRIVATE_ACCESS and page.owned_by_id != request.user.id:
return Response(
{"error": "Permission denied"}, status=status.HTTP_403_FORBIDDEN
)
# get all the project ids where page is present
project_ids = ProjectPage.objects.filter(page_id=page_id).values_list(
"project_id", flat=True

View File

@@ -39,6 +39,7 @@ from plane.utils.cache import cache_response
from plane.bgtasks.webhook_task import model_activity, webhook_activity
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.exception_logger import log_exception
from plane.utils.host import base_host
class ProjectViewSet(BaseViewSet):
@@ -179,6 +180,7 @@ class ProjectViewSet(BaseViewSet):
"inbox_view",
"guest_view_all_features",
"project_lead",
"network",
"created_at",
"updated_at",
"created_by",
@@ -273,14 +275,14 @@ class ProjectViewSet(BaseViewSet):
states = [
{
"name": "Backlog",
"color": "#A3A3A3",
"color": "#60646C",
"sequence": 15000,
"group": "backlog",
"default": True,
},
{
"name": "Todo",
"color": "#3A3A3A",
"color": "#60646C",
"sequence": 25000,
"group": "unstarted",
},
@@ -292,13 +294,13 @@ class ProjectViewSet(BaseViewSet):
},
{
"name": "Done",
"color": "#16A34A",
"color": "#46A758",
"sequence": 45000,
"group": "completed",
},
{
"name": "Cancelled",
"color": "#EF4444",
"color": "#9AA4BC",
"sequence": 55000,
"group": "cancelled",
},
@@ -330,7 +332,7 @@ class ProjectViewSet(BaseViewSet):
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
serializer = ProjectListSerializer(project)
@@ -340,7 +342,7 @@ class ProjectViewSet(BaseViewSet):
if "already exists" in str(e):
return Response(
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
except Workspace.DoesNotExist:
return Response(
@@ -349,7 +351,7 @@ class ProjectViewSet(BaseViewSet):
except serializers.ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
def partial_update(self, request, slug, pk=None):
@@ -408,7 +410,7 @@ class ProjectViewSet(BaseViewSet):
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
serializer = ProjectListSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -418,7 +420,7 @@ class ProjectViewSet(BaseViewSet):
if "already exists" in str(e):
return Response(
{"name": "The project name is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
@@ -427,7 +429,7 @@ class ProjectViewSet(BaseViewSet):
except serializers.ValidationError:
return Response(
{"identifier": "The project identifier is already taken"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
def destroy(self, request, slug, pk):
@@ -443,7 +445,7 @@ class ProjectViewSet(BaseViewSet):
is_active=True,
).exists()
):
project = Project.objects.get(pk=pk)
project = Project.objects.get(pk=pk, workspace__slug=slug)
project.delete()
webhook_activity.delay(
event="project",
@@ -453,7 +455,7 @@ class ProjectViewSet(BaseViewSet):
new_value=None,
actor_id=request.user.id,
slug=slug,
current_site=request.META.get("HTTP_ORIGIN"),
current_site=base_host(request=request, is_app=True),
event_id=project.id,
old_identifier=None,
new_identifier=None,

View File

@@ -16,17 +16,18 @@ from rest_framework.permissions import AllowAny
# Module imports
from .base import BaseViewSet, BaseAPIView
from plane.app.serializers import ProjectMemberInviteSerializer
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import (
ProjectMember,
Workspace,
ProjectMemberInvite,
User,
WorkspaceMember,
Project,
IssueUserProperty,
)
from plane.db.models.project import ProjectNetwork
from plane.utils.host import base_host
class ProjectInvitationsViewset(BaseViewSet):
@@ -99,7 +100,7 @@ class ProjectInvitationsViewset(BaseViewSet):
project_invitations = ProjectMemberInvite.objects.bulk_create(
project_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get("HTTP_ORIGIN")
current_site = base_host(request=request, is_app=True)
# Send invitations
for invitation in project_invitations:
@@ -128,6 +129,7 @@ class UserProjectInvitationsViewset(BaseViewSet):
.select_related("workspace", "workspace__owner", "project")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def create(self, request, slug):
project_ids = request.data.get("project_ids", [])
@@ -136,11 +138,20 @@ class UserProjectInvitationsViewset(BaseViewSet):
member=request.user, workspace__slug=slug, is_active=True
)
if workspace_member.role not in [ROLE.ADMIN.value, ROLE.MEMBER.value]:
return Response(
{"error": "You do not have permission to join the project"},
status=status.HTTP_403_FORBIDDEN,
)
# Get all the projects
projects = Project.objects.filter(
id__in=project_ids, workspace__slug=slug
).only("id", "network")
# Check if user has permission to join each project
for project in projects:
if (
project.network == ProjectNetwork.SECRET.value
and workspace_member.role != ROLE.ADMIN.value
):
return Response(
{"error": "Only workspace admins can join private project"},
status=status.HTTP_403_FORBIDDEN,
)
workspace_role = workspace_member.role
workspace = workspace_member.workspace

View File

@@ -10,11 +10,7 @@ from plane.app.serializers import (
ProjectMemberRoleSerializer,
)
from plane.app.permissions import (
ProjectMemberPermission,
ProjectLitePermission,
WorkspaceUserPermission,
)
from plane.app.permissions import WorkspaceUserPermission
from plane.db.models import Project, ProjectMember, IssueUserProperty, WorkspaceMember
from plane.bgtasks.project_add_user_email_task import project_add_user_email
@@ -26,14 +22,6 @@ class ProjectMemberViewSet(BaseViewSet):
serializer_class = ProjectMemberAdminSerializer
model = ProjectMember
def get_permissions(self):
if self.action == "leave":
self.permission_classes = [ProjectLitePermission]
else:
self.permission_classes = [ProjectMemberPermission]
return super(ProjectMemberViewSet, self).get_permissions()
search_fields = ["member__display_name", "member__first_name"]
def get_queryset(self):
@@ -187,12 +175,20 @@ class ProjectMemberViewSet(BaseViewSet):
)
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN])
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def partial_update(self, request, slug, project_id, pk):
project_member = ProjectMember.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, is_active=True
)
if request.user.id == project_member.member_id:
# Fetch the workspace role of the project member
workspace_role = WorkspaceMember.objects.get(
workspace__slug=slug, member=project_member.member, is_active=True
).role
is_workspace_admin = workspace_role == ROLE.ADMIN.value
# Check if the user is not editing their own role if they are not an admin
if request.user.id == project_member.member_id and not is_workspace_admin:
return Response(
{"error": "You cannot update your own role"},
status=status.HTTP_400_BAD_REQUEST,
@@ -205,9 +201,6 @@ class ProjectMemberViewSet(BaseViewSet):
is_active=True,
)
workspace_role = WorkspaceMember.objects.get(
workspace__slug=slug, member=project_member.member, is_active=True
).role
if workspace_role in [5] and int(
request.data.get("role", project_member.role)
) in [15, 20]:
@@ -222,6 +215,7 @@ class ProjectMemberViewSet(BaseViewSet):
"role" in request.data
and int(request.data.get("role", project_member.role))
> requested_project_member.role
and not is_workspace_admin
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},

View File

@@ -1,5 +1,5 @@
# Django imports
from django.db.models import Q
from django.db.models import Q, QuerySet
# Third party imports
from rest_framework import status
@@ -12,6 +12,95 @@ from plane.utils.issue_search import search_issues
class IssueSearchEndpoint(BaseAPIView):
def filter_issues_by_project(self, project_id: int, issues: QuerySet) -> QuerySet:
"""
Filter issues by project
"""
issues = issues.filter(project_id=project_id)
return issues
def search_issues_by_query(self, query: str, issues: QuerySet) -> QuerySet:
"""
Search issues by query
"""
issues = search_issues(query, issues)
return issues
def search_issues_and_excluding_parent(
self, issues: QuerySet, issue_id: str
) -> QuerySet:
"""
Search issues and epics by query excluding the parent
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id), ~Q(pk=issue.parent_id), ~Q(parent_id=issue_id)
)
return issues
def filter_issues_excluding_related_issues(
self, issue_id: str, issues: QuerySet
) -> QuerySet:
"""
Filter issues excluding related issues
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
related_issue_ids = (
IssueRelation.objects.filter(Q(related_issue=issue) | Q(issue=issue))
.values_list("issue_id", "related_issue_id")
.distinct()
)
related_issue_ids = [item for sublist in related_issue_ids for item in sublist]
if issue:
issues = issues.filter(~Q(pk=issue_id), ~Q(pk__in=related_issue_ids))
return issues
def filter_root_issues_only(self, issue_id: str, issues: QuerySet) -> QuerySet:
"""
Filter root issues only
"""
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(~Q(pk=issue_id), parent__isnull=True)
if issue.parent:
issues = issues.filter(~Q(pk=issue.parent_id))
return issues
def exclude_issues_in_cycles(self, issues: QuerySet) -> QuerySet:
"""
Exclude issues in cycles
"""
issues = issues.exclude(
Q(issue_cycle__isnull=False) & Q(issue_cycle__deleted_at__isnull=True)
)
return issues
def exclude_issues_in_module(self, issues: QuerySet, module: str) -> QuerySet:
"""
Exclude issues in a module
"""
issues = issues.exclude(
Q(issue_module__module=module) & Q(issue_module__deleted_at__isnull=True)
)
return issues
def filter_issues_without_target_date(self, issues: QuerySet) -> QuerySet:
"""
Filter issues without a target date
"""
issues = issues.filter(target_date__isnull=True)
return issues
def get(self, request, slug, project_id):
query = request.query_params.get("search", False)
workspace_search = request.query_params.get("workspace_search", "false")
@@ -21,7 +110,6 @@ class IssueSearchEndpoint(BaseAPIView):
module = request.query_params.get("module", False)
sub_issue = request.query_params.get("sub_issue", "false")
target_date = request.query_params.get("target_date", True)
issue_id = request.query_params.get("issue_id", False)
issues = Issue.issue_objects.filter(
@@ -32,52 +120,28 @@ class IssueSearchEndpoint(BaseAPIView):
)
if workspace_search == "false":
issues = issues.filter(project_id=project_id)
issues = self.filter_issues_by_project(project_id, issues)
if query:
issues = search_issues(query, issues)
issues = self.search_issues_by_query(query, issues)
if parent == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id), ~Q(pk=issue.parent_id), ~Q(parent_id=issue_id)
)
issues = self.search_issues_and_excluding_parent(issues, issue_id)
if issue_relation == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
related_issue_ids = IssueRelation.objects.filter(
Q(related_issue=issue) | Q(issue=issue)
).values_list(
"issue_id", "related_issue_id"
).distinct()
issues = self.filter_issues_excluding_related_issues(issue_id, issues)
related_issue_ids = [item for sublist in related_issue_ids for item in sublist]
if issue:
issues = issues.filter(
~Q(pk=issue_id),
~Q(pk__in=related_issue_ids),
)
if sub_issue == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(~Q(pk=issue_id), parent__isnull=True)
if issue.parent:
issues = issues.filter(~Q(pk=issue.parent_id))
issues = self.filter_root_issues_only(issue_id, issues)
if cycle == "true":
issues = issues.exclude(
Q(issue_cycle__isnull=False) & Q(issue_cycle__deleted_at__isnull=True)
)
issues = self.exclude_issues_in_cycles(issues)
if module:
issues = issues.exclude(
Q(issue_module__module=module)
& Q(issue_module__deleted_at__isnull=True)
)
issues = self.exclude_issues_in_module(issues, module)
if target_date == "none":
issues = issues.filter(target_date__isnull=True)
issues = self.filter_issues_without_target_date(issues)
if ProjectMember.objects.filter(
project_id=project_id, member=self.request.user, is_active=True, role=5

View File

@@ -1,5 +1,6 @@
# Python imports
from itertools import groupby
from collections import defaultdict
# Django imports
from django.db.utils import IntegrityError
@@ -74,7 +75,19 @@ class StateViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
states = StateSerializer(self.get_queryset(), many=True).data
grouped_states = defaultdict(list)
for state in states:
grouped_states[state["group"]].append(state)
for group, group_states in grouped_states.items():
count = len(group_states)
for index, state in enumerate(group_states, start=1):
state["order"] = index / count
grouped = request.GET.get("grouped", False)
if grouped == "true":
state_dict = {}
for key, value in groupby(
@@ -83,6 +96,7 @@ class StateViewSet(BaseViewSet):
):
state_dict[str(key)] = list(value)
return Response(state_dict, status=status.HTTP_200_OK)
return Response(states, status=status.HTTP_200_OK)
@invalidate_cache(path="workspaces/:slug/states/", url_params=True, user=False)
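Note: the new grouping block assigns each state a fractional order within its group, so the ordering stays stable regardless of how many states a group holds. A minimal standalone sketch of the same arithmetic, using hypothetical state dicts instead of the serializer output:
from collections import defaultdict
# Hypothetical serialized states; only "group" matters for the ordering step.
states = [
    {"name": "Backlog", "group": "backlog"},
    {"name": "In Progress", "group": "started"},
    {"name": "In Review", "group": "started"},
]
grouped_states = defaultdict(list)
for state in states:
    grouped_states[state["group"]].append(state)
# Each state gets index / count within its group, so values fall in (0, 1].
for group, group_states in grouped_states.items():
    count = len(group_states)
    for index, state in enumerate(group_states, start=1):
        state["order"] = index / count
# The two "started" states now carry order 0.5 and 1.0; "backlog" carries 1.0.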

View File

@@ -21,221 +21,187 @@ class TimezoneEndpoint(APIView):
throttle_classes = [AuthenticationThrottle]
@method_decorator(cache_page(60 * 60 * 24))
@method_decorator(cache_page(60 * 60 * 2))
def get(self, request):
timezone_mapping = {
"-1100": [
("Midway Island", "Pacific/Midway"),
("American Samoa", "Pacific/Pago_Pago"),
],
"-1000": [
("Hawaii", "Pacific/Honolulu"),
("Aleutian Islands", "America/Adak"),
],
"-0930": [("Marquesas Islands", "Pacific/Marquesas")],
"-0900": [
("Alaska", "America/Anchorage"),
("Gambier Islands", "Pacific/Gambier"),
],
"-0800": [
("Pacific Time (US and Canada)", "America/Los_Angeles"),
("Baja California", "America/Tijuana"),
],
"-0700": [
("Mountain Time (US and Canada)", "America/Denver"),
("Arizona", "America/Phoenix"),
("Chihuahua, Mazatlan", "America/Chihuahua"),
],
"-0600": [
("Central Time (US and Canada)", "America/Chicago"),
("Saskatchewan", "America/Regina"),
("Guadalajara, Mexico City, Monterrey", "America/Mexico_City"),
("Tegucigalpa, Honduras", "America/Tegucigalpa"),
("Costa Rica", "America/Costa_Rica"),
],
"-0500": [
("Eastern Time (US and Canada)", "America/New_York"),
("Lima", "America/Lima"),
("Bogota", "America/Bogota"),
("Quito", "America/Guayaquil"),
("Chetumal", "America/Cancun"),
],
"-0430": [("Caracas (Old Venezuela Time)", "America/Caracas")],
"-0400": [
("Atlantic Time (Canada)", "America/Halifax"),
("Caracas", "America/Caracas"),
("Santiago", "America/Santiago"),
("La Paz", "America/La_Paz"),
("Manaus", "America/Manaus"),
("Georgetown", "America/Guyana"),
("Bermuda", "Atlantic/Bermuda"),
],
"-0330": [("Newfoundland Time (Canada)", "America/St_Johns")],
"-0300": [
("Buenos Aires", "America/Argentina/Buenos_Aires"),
("Brasilia", "America/Sao_Paulo"),
("Greenland", "America/Godthab"),
("Montevideo", "America/Montevideo"),
("Falkland Islands", "Atlantic/Stanley"),
],
"-0200": [
(
"South Georgia and the South Sandwich Islands",
"Atlantic/South_Georgia",
)
],
"-0100": [
("Azores", "Atlantic/Azores"),
("Cape Verde Islands", "Atlantic/Cape_Verde"),
],
"+0000": [
("Dublin", "Europe/Dublin"),
("Reykjavik", "Atlantic/Reykjavik"),
("Lisbon", "Europe/Lisbon"),
("Monrovia", "Africa/Monrovia"),
("Casablanca", "Africa/Casablanca"),
],
"+0100": [
("Central European Time (Berlin, Rome, Paris)", "Europe/Paris"),
("West Central Africa", "Africa/Lagos"),
("Algiers", "Africa/Algiers"),
("Lagos", "Africa/Lagos"),
("Tunis", "Africa/Tunis"),
],
"+0200": [
("Eastern European Time (Cairo, Helsinki, Kyiv)", "Europe/Kiev"),
("Athens", "Europe/Athens"),
("Jerusalem", "Asia/Jerusalem"),
("Johannesburg", "Africa/Johannesburg"),
("Harare, Pretoria", "Africa/Harare"),
],
"+0300": [
("Moscow Time", "Europe/Moscow"),
("Baghdad", "Asia/Baghdad"),
("Nairobi", "Africa/Nairobi"),
("Kuwait, Riyadh", "Asia/Riyadh"),
],
"+0330": [("Tehran", "Asia/Tehran")],
"+0400": [
("Abu Dhabi", "Asia/Dubai"),
("Baku", "Asia/Baku"),
("Yerevan", "Asia/Yerevan"),
("Astrakhan", "Europe/Astrakhan"),
("Tbilisi", "Asia/Tbilisi"),
("Mauritius", "Indian/Mauritius"),
],
"+0500": [
("Islamabad", "Asia/Karachi"),
("Karachi", "Asia/Karachi"),
("Tashkent", "Asia/Tashkent"),
("Yekaterinburg", "Asia/Yekaterinburg"),
("Maldives", "Indian/Maldives"),
("Chagos", "Indian/Chagos"),
],
"+0530": [
("Chennai", "Asia/Kolkata"),
("Kolkata", "Asia/Kolkata"),
("Mumbai", "Asia/Kolkata"),
("New Delhi", "Asia/Kolkata"),
("Sri Jayawardenepura", "Asia/Colombo"),
],
"+0545": [("Kathmandu", "Asia/Kathmandu")],
"+0600": [
("Dhaka", "Asia/Dhaka"),
("Almaty", "Asia/Almaty"),
("Bishkek", "Asia/Bishkek"),
("Thimphu", "Asia/Thimphu"),
],
"+0630": [
("Yangon (Rangoon)", "Asia/Yangon"),
("Cocos Islands", "Indian/Cocos"),
],
"+0700": [
("Bangkok", "Asia/Bangkok"),
("Hanoi", "Asia/Ho_Chi_Minh"),
("Jakarta", "Asia/Jakarta"),
("Novosibirsk", "Asia/Novosibirsk"),
("Krasnoyarsk", "Asia/Krasnoyarsk"),
],
"+0800": [
("Beijing", "Asia/Shanghai"),
("Singapore", "Asia/Singapore"),
("Perth", "Australia/Perth"),
("Hong Kong", "Asia/Hong_Kong"),
("Ulaanbaatar", "Asia/Ulaanbaatar"),
("Palau", "Pacific/Palau"),
],
"+0845": [("Eucla", "Australia/Eucla")],
"+0900": [
("Tokyo", "Asia/Tokyo"),
("Seoul", "Asia/Seoul"),
("Yakutsk", "Asia/Yakutsk"),
],
"+0930": [
("Adelaide", "Australia/Adelaide"),
("Darwin", "Australia/Darwin"),
],
"+1000": [
("Sydney", "Australia/Sydney"),
("Brisbane", "Australia/Brisbane"),
("Guam", "Pacific/Guam"),
("Vladivostok", "Asia/Vladivostok"),
("Tahiti", "Pacific/Tahiti"),
],
"+1030": [("Lord Howe Island", "Australia/Lord_Howe")],
"+1100": [
("Solomon Islands", "Pacific/Guadalcanal"),
("Magadan", "Asia/Magadan"),
("Norfolk Island", "Pacific/Norfolk"),
("Bougainville Island", "Pacific/Bougainville"),
("Chokurdakh", "Asia/Srednekolymsk"),
],
"+1200": [
("Auckland", "Pacific/Auckland"),
("Wellington", "Pacific/Auckland"),
("Fiji Islands", "Pacific/Fiji"),
("Anadyr", "Asia/Anadyr"),
],
"+1245": [("Chatham Islands", "Pacific/Chatham")],
"+1300": [("Nuku'alofa", "Pacific/Tongatapu"), ("Samoa", "Pacific/Apia")],
"+1400": [("Kiritimati Island", "Pacific/Kiritimati")],
}
timezone_locations = [
("Midway Island", "Pacific/Midway"), # UTC-11:00
("American Samoa", "Pacific/Pago_Pago"), # UTC-11:00
("Hawaii", "Pacific/Honolulu"), # UTC-10:00
("Aleutian Islands", "America/Adak"), # UTC-10:00 (DST: UTC-09:00)
("Marquesas Islands", "Pacific/Marquesas"), # UTC-09:30
("Alaska", "America/Anchorage"), # UTC-09:00 (DST: UTC-08:00)
("Gambier Islands", "Pacific/Gambier"), # UTC-09:00
(
"Pacific Time (US and Canada)",
"America/Los_Angeles",
), # UTC-08:00 (DST: UTC-07:00)
("Baja California", "America/Tijuana"), # UTC-08:00 (DST: UTC-07:00)
(
"Mountain Time (US and Canada)",
"America/Denver",
), # UTC-07:00 (DST: UTC-06:00)
("Arizona", "America/Phoenix"), # UTC-07:00
("Chihuahua, Mazatlan", "America/Chihuahua"), # UTC-07:00 (DST: UTC-06:00)
(
"Central Time (US and Canada)",
"America/Chicago",
), # UTC-06:00 (DST: UTC-05:00)
("Saskatchewan", "America/Regina"), # UTC-06:00
(
"Guadalajara, Mexico City, Monterrey",
"America/Mexico_City",
), # UTC-06:00 (DST: UTC-05:00)
("Tegucigalpa, Honduras", "America/Tegucigalpa"), # UTC-06:00
("Costa Rica", "America/Costa_Rica"), # UTC-06:00
(
"Eastern Time (US and Canada)",
"America/New_York",
), # UTC-05:00 (DST: UTC-04:00)
("Lima", "America/Lima"), # UTC-05:00
("Bogota", "America/Bogota"), # UTC-05:00
("Quito", "America/Guayaquil"), # UTC-05:00
("Chetumal", "America/Cancun"), # UTC-05:00 (DST: UTC-04:00)
("Caracas (Old Venezuela Time)", "America/Caracas"), # UTC-04:30
("Atlantic Time (Canada)", "America/Halifax"), # UTC-04:00 (DST: UTC-03:00)
("Caracas", "America/Caracas"), # UTC-04:00
("Santiago", "America/Santiago"), # UTC-04:00 (DST: UTC-03:00)
("La Paz", "America/La_Paz"), # UTC-04:00
("Manaus", "America/Manaus"), # UTC-04:00
("Georgetown", "America/Guyana"), # UTC-04:00
("Bermuda", "Atlantic/Bermuda"), # UTC-04:00 (DST: UTC-03:00)
(
"Newfoundland Time (Canada)",
"America/St_Johns",
), # UTC-03:30 (DST: UTC-02:30)
("Buenos Aires", "America/Argentina/Buenos_Aires"), # UTC-03:00
("Brasilia", "America/Sao_Paulo"), # UTC-03:00
("Greenland", "America/Godthab"), # UTC-03:00 (DST: UTC-02:00)
("Montevideo", "America/Montevideo"), # UTC-03:00
("Falkland Islands", "Atlantic/Stanley"), # UTC-03:00
(
"South Georgia and the South Sandwich Islands",
"Atlantic/South_Georgia",
), # UTC-02:00
("Azores", "Atlantic/Azores"), # UTC-01:00 (DST: UTC+00:00)
("Cape Verde Islands", "Atlantic/Cape_Verde"), # UTC-01:00
("Dublin", "Europe/Dublin"), # UTC+00:00 (DST: UTC+01:00)
("Reykjavik", "Atlantic/Reykjavik"), # UTC+00:00
("Lisbon", "Europe/Lisbon"), # UTC+00:00 (DST: UTC+01:00)
("Monrovia", "Africa/Monrovia"), # UTC+00:00
("Casablanca", "Africa/Casablanca"), # UTC+00:00 (DST: UTC+01:00)
(
"Central European Time (Berlin, Rome, Paris)",
"Europe/Paris",
), # UTC+01:00 (DST: UTC+02:00)
("West Central Africa", "Africa/Lagos"), # UTC+01:00
("Algiers", "Africa/Algiers"), # UTC+01:00
("Lagos", "Africa/Lagos"), # UTC+01:00
("Tunis", "Africa/Tunis"), # UTC+01:00
(
"Eastern European Time (Cairo, Helsinki, Kyiv)",
"Europe/Kiev",
), # UTC+02:00 (DST: UTC+03:00)
("Athens", "Europe/Athens"), # UTC+02:00 (DST: UTC+03:00)
("Jerusalem", "Asia/Jerusalem"), # UTC+02:00 (DST: UTC+03:00)
("Johannesburg", "Africa/Johannesburg"), # UTC+02:00
("Harare, Pretoria", "Africa/Harare"), # UTC+02:00
("Moscow Time", "Europe/Moscow"), # UTC+03:00
("Baghdad", "Asia/Baghdad"), # UTC+03:00
("Nairobi", "Africa/Nairobi"), # UTC+03:00
("Kuwait, Riyadh", "Asia/Riyadh"), # UTC+03:00
("Tehran", "Asia/Tehran"), # UTC+03:30 (DST: UTC+04:30)
("Abu Dhabi", "Asia/Dubai"), # UTC+04:00
("Baku", "Asia/Baku"), # UTC+04:00 (DST: UTC+05:00)
("Yerevan", "Asia/Yerevan"), # UTC+04:00 (DST: UTC+05:00)
("Astrakhan", "Europe/Astrakhan"), # UTC+04:00
("Tbilisi", "Asia/Tbilisi"), # UTC+04:00
("Mauritius", "Indian/Mauritius"), # UTC+04:00
("Islamabad", "Asia/Karachi"), # UTC+05:00
("Karachi", "Asia/Karachi"), # UTC+05:00
("Tashkent", "Asia/Tashkent"), # UTC+05:00
("Yekaterinburg", "Asia/Yekaterinburg"), # UTC+05:00
("Maldives", "Indian/Maldives"), # UTC+05:00
("Chagos", "Indian/Chagos"), # UTC+05:00
("Chennai", "Asia/Kolkata"), # UTC+05:30
("Kolkata", "Asia/Kolkata"), # UTC+05:30
("Mumbai", "Asia/Kolkata"), # UTC+05:30
("New Delhi", "Asia/Kolkata"), # UTC+05:30
("Sri Jayawardenepura", "Asia/Colombo"), # UTC+05:30
("Kathmandu", "Asia/Kathmandu"), # UTC+05:45
("Dhaka", "Asia/Dhaka"), # UTC+06:00
("Almaty", "Asia/Almaty"), # UTC+06:00
("Bishkek", "Asia/Bishkek"), # UTC+06:00
("Thimphu", "Asia/Thimphu"), # UTC+06:00
("Yangon (Rangoon)", "Asia/Yangon"), # UTC+06:30
("Cocos Islands", "Indian/Cocos"), # UTC+06:30
("Bangkok", "Asia/Bangkok"), # UTC+07:00
("Hanoi", "Asia/Ho_Chi_Minh"), # UTC+07:00
("Jakarta", "Asia/Jakarta"), # UTC+07:00
("Novosibirsk", "Asia/Novosibirsk"), # UTC+07:00
("Krasnoyarsk", "Asia/Krasnoyarsk"), # UTC+07:00
("Beijing", "Asia/Shanghai"), # UTC+08:00
("Singapore", "Asia/Singapore"), # UTC+08:00
("Perth", "Australia/Perth"), # UTC+08:00
("Hong Kong", "Asia/Hong_Kong"), # UTC+08:00
("Ulaanbaatar", "Asia/Ulaanbaatar"), # UTC+08:00
("Palau", "Pacific/Palau"), # UTC+08:00
("Eucla", "Australia/Eucla"), # UTC+08:45
("Tokyo", "Asia/Tokyo"), # UTC+09:00
("Seoul", "Asia/Seoul"), # UTC+09:00
("Yakutsk", "Asia/Yakutsk"), # UTC+09:00
("Adelaide", "Australia/Adelaide"), # UTC+09:30 (DST: UTC+10:30)
("Darwin", "Australia/Darwin"), # UTC+09:30
("Sydney", "Australia/Sydney"), # UTC+10:00 (DST: UTC+11:00)
("Brisbane", "Australia/Brisbane"), # UTC+10:00
("Guam", "Pacific/Guam"), # UTC+10:00
("Vladivostok", "Asia/Vladivostok"), # UTC+10:00
("Tahiti", "Pacific/Tahiti"), # UTC+10:00
("Lord Howe Island", "Australia/Lord_Howe"), # UTC+10:30 (DST: UTC+11:00)
("Solomon Islands", "Pacific/Guadalcanal"), # UTC+11:00
("Magadan", "Asia/Magadan"), # UTC+11:00
("Norfolk Island", "Pacific/Norfolk"), # UTC+11:00
("Bougainville Island", "Pacific/Bougainville"), # UTC+11:00
("Chokurdakh", "Asia/Srednekolymsk"), # UTC+11:00
("Auckland", "Pacific/Auckland"), # UTC+12:00 (DST: UTC+13:00)
("Wellington", "Pacific/Auckland"), # UTC+12:00 (DST: UTC+13:00)
("Fiji Islands", "Pacific/Fiji"), # UTC+12:00 (DST: UTC+13:00)
("Anadyr", "Asia/Anadyr"), # UTC+12:00
("Chatham Islands", "Pacific/Chatham"), # UTC+12:45 (DST: UTC+13:45)
("Nuku'alofa", "Pacific/Tongatapu"), # UTC+13:00
("Samoa", "Pacific/Apia"), # UTC+13:00 (DST: UTC+14:00)
("Kiritimati Island", "Pacific/Kiritimati"), # UTC+14:00
]
timezone_list = []
now = datetime.now()
# Process timezone mapping
for offset, locations in timezone_mapping.items():
sign = "-" if offset.startswith("-") else "+"
hours = offset[1:3]
minutes = offset[3:] if len(offset) > 3 else "00"
for friendly_name, tz_identifier in timezone_locations:
try:
tz = pytz.timezone(tz_identifier)
current_offset = now.astimezone(tz).strftime("%z")
for friendly_name, tz_identifier in locations:
try:
tz = pytz.timezone(tz_identifier)
current_offset = now.astimezone(tz).strftime("%z")
# converting and formatting UTC offset to GMT offset
current_utc_offset = now.astimezone(tz).utcoffset()
total_seconds = int(current_utc_offset.total_seconds())
hours_offset = total_seconds // 3600
minutes_offset = abs(total_seconds % 3600) // 60
offset = (
f"{'+' if hours_offset >= 0 else '-'}"
f"{abs(hours_offset):02}:{minutes_offset:02}"
)
# converting and formatting UTC offset to GMT offset
current_utc_offset = now.astimezone(tz).utcoffset()
total_seconds = int(current_utc_offset.total_seconds())
hours_offset = total_seconds // 3600
minutes_offset = abs(total_seconds % 3600) // 60
gmt_offset = (
f"GMT{'+' if hours_offset >= 0 else '-'}"
f"{abs(hours_offset):02}:{minutes_offset:02}"
)
timezone_value = {
"offset": int(current_offset),
"utc_offset": f"UTC{offset}",
"gmt_offset": f"GMT{offset}",
"value": tz_identifier,
"label": f"{friendly_name}",
}
timezone_value = {
"offset": int(current_offset),
"utc_offset": f"UTC{sign}{hours}:{minutes}",
"gmt_offset": gmt_offset,
"value": tz_identifier,
"label": f"{friendly_name}",
}
timezone_list.append(timezone_value)
except pytz.exceptions.UnknownTimeZoneError:
continue
timezone_list.append(timezone_value)
except pytz.exceptions.UnknownTimeZoneError:
continue
# Sort by offset and then by label
timezone_list.sort(key=lambda x: (x["offset"], x["label"]))
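Note: the rewritten endpoint derives the UTC/GMT labels from the tzinfo itself instead of the hard-coded offset keys. A worked sketch of that formatting for a single zone (assuming pytz is installed), using Asia/Kolkata as the example:
from datetime import datetime
import pytz
tz = pytz.timezone("Asia/Kolkata")
now = datetime.now()
# strftime("%z") yields "+0530"; the endpoint stores int("+0530") == 530 for sorting.
current_offset = now.astimezone(tz).strftime("%z")
# utcoffset() is a timedelta of 19800 seconds for this zone.
total_seconds = int(now.astimezone(tz).utcoffset().total_seconds())
hours_offset = total_seconds // 3600              # 5
minutes_offset = abs(total_seconds % 3600) // 60  # 30
offset = f"{'+' if hours_offset >= 0 else '-'}{abs(hours_offset):02}:{minutes_offset:02}"
print(int(current_offset), f"UTC{offset}", f"GMT{offset}")  # 530 UTC+05:30 GMT+05:30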

View File

@@ -432,7 +432,7 @@ class IssueViewViewSet(BaseViewSet):
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
status=status.HTTP_403_FORBIDDEN,
)
serializer = IssueViewSerializer(issue_view)

View File

@@ -29,7 +29,7 @@ class WebhookEndpoint(BaseAPIView):
if "already exists" in str(e):
return Response(
{"error": "URL already exists for the workspace"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
raise IntegrityError

View File

@@ -42,6 +42,7 @@ from django.views.decorators.cache import cache_control
from django.views.decorators.vary import vary_on_cookie
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
from plane.license.utils.instance_value import get_configuration_value
from plane.bgtasks.workspace_seed_task import workspace_seed
class WorkSpaceViewSet(BaseViewSet):
@@ -119,11 +120,15 @@ class WorkSpaceViewSet(BaseViewSet):
)
# Get total members and role
total_members=WorkspaceMember.objects.filter(workspace_id=serializer.data["id"]).count()
total_members = WorkspaceMember.objects.filter(
workspace_id=serializer.data["id"]
).count()
data = serializer.data
data["total_members"] = total_members
data["role"] = 20
workspace_seed.delay(serializer.data["id"])
return Response(data, status=status.HTTP_201_CREATED)
return Response(
[serializer.errors[error][0] for error in serializer.errors],
@@ -134,7 +139,7 @@ class WorkSpaceViewSet(BaseViewSet):
if "already exists" in str(e):
return Response(
{"slug": "The workspace with the slug already exists"},
status=status.HTTP_410_GONE,
status=status.HTTP_409_CONFLICT,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
@@ -167,10 +172,9 @@ class UserWorkSpacesEndpoint(BaseAPIView):
.values("count")
)
role = (
WorkspaceMember.objects.filter(workspace=OuterRef("id"), member=request.user, is_active=True)
.values("role")
)
role = WorkspaceMember.objects.filter(
workspace=OuterRef("id"), member=request.user, is_active=True
).values("role")
workspace = (
Workspace.objects.prefetch_related(

View File

@@ -12,6 +12,7 @@ from plane.app.permissions import WorkspaceViewerPermission
from plane.app.serializers.cycle import CycleSerializer
from plane.utils.timezone_converter import user_timezone_converter
class WorkspaceCyclesEndpoint(BaseAPIView):
permission_classes = [WorkspaceViewerPermission]
@@ -29,6 +30,7 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)

View File

@@ -36,6 +36,7 @@ from plane.db.models import (
from .. import BaseViewSet
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_filters import issue_filters
from plane.utils.host import base_host
class WorkspaceDraftIssueViewSet(BaseViewSet):
@@ -241,7 +242,7 @@ class WorkspaceDraftIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
if request.data.get("cycle_id", None):
@@ -270,7 +271,7 @@ class WorkspaceDraftIssueViewSet(BaseViewSet):
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
if request.data.get("module_ids", []):
@@ -300,7 +301,7 @@ class WorkspaceDraftIssueViewSet(BaseViewSet):
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
origin=base_host(request=request, is_app=True),
)
for module in request.data.get("module_ids", [])
]

View File

@@ -7,7 +7,6 @@ import jwt
from django.conf import settings
from django.core.exceptions import ValidationError
from django.core.validators import validate_email
from django.db.models import Count
from django.utils import timezone
# Third party modules
@@ -26,7 +25,8 @@ from plane.bgtasks.event_tracking_task import workspace_invite_event
from plane.bgtasks.workspace_invitation_task import workspace_invitation
from plane.db.models import User, Workspace, WorkspaceMember, WorkspaceMemberInvite
from plane.utils.cache import invalidate_cache, invalidate_cache_directly
from plane.utils.host import base_host
from plane.utils.ip_address import get_client_ip
from .. import BaseViewSet
@@ -122,7 +122,7 @@ class WorkspaceInvitationsViewset(BaseViewSet):
workspace_invitations, batch_size=10, ignore_conflicts=True
)
current_site = request.META.get("HTTP_ORIGIN")
current_site = base_host(request=request, is_app=True)
# Send invitations
for invitation in workspace_invitations:
@@ -213,7 +213,7 @@ class WorkspaceJoinEndpoint(BaseAPIView):
user=user.id if user is not None else None,
email=email,
user_agent=request.META.get("HTTP_USER_AGENT"),
ip=request.META.get("REMOTE_ADDR"),
ip=get_client_ip(request=request),
event_name="MEMBER_ACCEPTED",
accepted_from="EMAIL",
)

View File

@@ -68,10 +68,11 @@ class WorkSpaceMemberViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST,
)
if workspace_member.role > int(request.data.get("role")):
_ = ProjectMember.objects.filter(
# If a user is moved to a guest role, they can't have any other role in projects
if "role" in request.data and int(request.data.get("role")) == 5:
ProjectMember.objects.filter(
workspace__slug=slug, member_id=workspace_member.member_id
).update(role=int(request.data.get("role")))
).update(role=5)
serializer = WorkSpaceMemberSerializer(
workspace_member, data=request.data, partial=True

View File

@@ -8,6 +8,7 @@ from plane.app.views.base import BaseAPIView
from plane.db.models import State
from plane.app.permissions import WorkspaceEntityPermission
from plane.utils.cache import cache_response
from collections import defaultdict
class WorkspaceStatesEndpoint(BaseAPIView):
@@ -22,5 +23,16 @@ class WorkspaceStatesEndpoint(BaseAPIView):
project__archived_at__isnull=True,
is_triage=False,
)
grouped_states = defaultdict(list)
for state in states:
grouped_states[state.group].append(state)
for group, group_states in grouped_states.items():
count = len(group_states)
for index, state in enumerate(group_states, start=1):
state.order = index / count
serializer = StateSerializer(states, many=True).data
return Response(serializer, status=status.HTTP_200_OK)

View File

@@ -27,10 +27,7 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
create_preference_keys = []
keys = [
key
for key, _ in WorkspaceUserPreference.UserPreferenceKeys.choices
]
keys = [key for key, _ in WorkspaceUserPreference.UserPreferenceKeys.choices]
for preference in keys:
if preference not in get_preference.values_list("key", flat=True):
@@ -39,7 +36,10 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
preference = WorkspaceUserPreference.objects.bulk_create(
[
WorkspaceUserPreference(
key=key, user=request.user, workspace=workspace, sort_order=(65535 + (i*10000))
key=key,
user=request.user,
workspace=workspace,
sort_order=(65535 + (i * 10000)),
)
for i, key in enumerate(create_preference_keys)
],
@@ -47,10 +47,13 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
ignore_conflicts=True,
)
preferences = WorkspaceUserPreference.objects.filter(
user=request.user, workspace_id=workspace.id
).order_by("sort_order").values("key", "is_pinned", "sort_order")
preferences = (
WorkspaceUserPreference.objects.filter(
user=request.user, workspace_id=workspace.id
)
.order_by("sort_order")
.values("key", "is_pinned", "sort_order")
)
user_preferences = {}
@@ -58,7 +61,7 @@ class WorkspaceUserPreferenceViewSet(BaseAPIView):
user_preferences[(str(preference["key"]))] = {
"is_pinned": preference["is_pinned"],
"sort_order": preference["sort_order"],
}
}
return Response(
user_preferences,
status=status.HTTP_200_OK,

View File

@@ -15,7 +15,8 @@ from plane.db.models import Profile, User, WorkspaceMemberInvite
from plane.license.utils.instance_value import get_configuration_value
from .error import AuthenticationException, AUTHENTICATION_ERROR_CODES
from plane.bgtasks.user_activation_email_task import user_activation_email
from plane.authentication.utils.host import base_host
from plane.utils.host import base_host
from plane.utils.ip_address import get_client_ip
class Adapter:
@@ -108,7 +109,7 @@ class Adapter:
user.last_login_medium = self.provider
user.last_active = timezone.now()
user.last_login_time = timezone.now()
user.last_login_ip = self.request.META.get("REMOTE_ADDR")
user.last_login_ip = get_client_ip(request=self.request)
user.last_login_uagent = self.request.META.get("HTTP_USER_AGENT")
user.token_updated_at = timezone.now()
# If user is not active, send the activation email and set the user as active
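Note: plane.utils.ip_address.get_client_ip replaces the raw REMOTE_ADDR lookups here and in the workspace invite and session code above, but its body is not part of this diff. A hedged sketch of what such a helper typically looks like behind a reverse proxy (illustrative only, not the actual implementation):
def get_client_ip(request):
    # Behind a proxy the original client address usually arrives in
    # X-Forwarded-For as a comma-separated chain; the first entry is the client.
    forwarded_for = request.META.get("HTTP_X_FORWARDED_FOR")
    if forwarded_for:
        return forwarded_for.split(",")[0].strip()
    # Fall back to the direct connection address.
    return request.META.get("REMOTE_ADDR", "")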

View File

@@ -36,6 +36,7 @@ AUTHENTICATION_ERROR_CODES = {
"OAUTH_NOT_CONFIGURED": 5104,
"GOOGLE_NOT_CONFIGURED": 5105,
"GITHUB_NOT_CONFIGURED": 5110,
"GITHUB_USER_NOT_IN_ORG": 5122,
"GITLAB_NOT_CONFIGURED": 5111,
"GOOGLE_OAUTH_PROVIDER_ERROR": 5115,
"GITHUB_OAUTH_PROVIDER_ERROR": 5120,

View File

@@ -18,21 +18,31 @@ from plane.authentication.adapter.error import (
class GitHubOAuthProvider(OauthAdapter):
token_url = "https://github.com/login/oauth/access_token"
userinfo_url = "https://api.github.com/user"
org_membership_url = f"https://api.github.com/orgs"
provider = "github"
scope = "read:user user:email"
organization_scope = "read:org"
def __init__(self, request, code=None, state=None, callback=None):
GITHUB_CLIENT_ID, GITHUB_CLIENT_SECRET = get_configuration_value(
[
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID"),
},
{
"key": "GITHUB_CLIENT_SECRET",
"default": os.environ.get("GITHUB_CLIENT_SECRET"),
},
]
GITHUB_CLIENT_ID, GITHUB_CLIENT_SECRET, GITHUB_ORGANIZATION_ID = (
get_configuration_value(
[
{
"key": "GITHUB_CLIENT_ID",
"default": os.environ.get("GITHUB_CLIENT_ID"),
},
{
"key": "GITHUB_CLIENT_SECRET",
"default": os.environ.get("GITHUB_CLIENT_SECRET"),
},
{
"key": "GITHUB_ORGANIZATION_ID",
"default": os.environ.get("GITHUB_ORGANIZATION_ID"),
},
]
)
)
if not (GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET):
@@ -43,6 +53,10 @@ class GitHubOAuthProvider(OauthAdapter):
client_id = GITHUB_CLIENT_ID
client_secret = GITHUB_CLIENT_SECRET
self.organization_id = GITHUB_ORGANIZATION_ID
if self.organization_id:
self.scope += f" {self.organization_scope}"
redirect_uri = f"""{"https" if request.is_secure() else "http"}://{request.get_host()}/auth/github/callback/"""
url_params = {
@@ -113,12 +127,28 @@ class GitHubOAuthProvider(OauthAdapter):
error_message="GITHUB_OAUTH_PROVIDER_ERROR",
)
def is_user_in_organization(self, github_username):
headers = {"Authorization": f"Bearer {self.token_data.get('access_token')}"}
response = requests.get(
f"{self.org_membership_url}/{self.organization_id}/memberships/{github_username}",
headers=headers,
)
return response.status_code == 200 # 200 means the user is a member
def set_user_data(self):
user_info_response = self.get_user_response()
headers = {
"Authorization": f"Bearer {self.token_data.get('access_token')}",
"Accept": "application/json",
}
if self.organization_id:
if not self.is_user_in_organization(user_info_response.get("login")):
raise AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES["GITHUB_USER_NOT_IN_ORG"],
error_message="GITHUB_USER_NOT_IN_ORG",
)
email = self.__get_email(headers=headers)
super().set_user_data(
{

View File

@@ -42,11 +42,11 @@ urlpatterns = [
# credentials
path("sign-in/", SignInAuthEndpoint.as_view(), name="sign-in"),
path("sign-up/", SignUpAuthEndpoint.as_view(), name="sign-up"),
path("spaces/sign-in/", SignInAuthSpaceEndpoint.as_view(), name="sign-in"),
path("spaces/sign-up/", SignUpAuthSpaceEndpoint.as_view(), name="sign-in"),
path("spaces/sign-in/", SignInAuthSpaceEndpoint.as_view(), name="space-sign-in"),
path("spaces/sign-up/", SignUpAuthSpaceEndpoint.as_view(), name="space-sign-up"),
# signout
path("sign-out/", SignOutAuthEndpoint.as_view(), name="sign-out"),
path("spaces/sign-out/", SignOutAuthSpaceEndpoint.as_view(), name="sign-out"),
path("spaces/sign-out/", SignOutAuthSpaceEndpoint.as_view(), name="space-sign-out"),
# csrf token
path("get-csrf-token/", CSRFTokenEndpoint.as_view(), name="get_csrf_token"),
# Magic sign in
@@ -56,17 +56,17 @@ urlpatterns = [
path(
"spaces/magic-generate/",
MagicGenerateSpaceEndpoint.as_view(),
name="magic-generate",
name="space-magic-generate",
),
path(
"spaces/magic-sign-in/",
MagicSignInSpaceEndpoint.as_view(),
name="magic-sign-in",
name="space-magic-sign-in",
),
path(
"spaces/magic-sign-up/",
MagicSignUpSpaceEndpoint.as_view(),
name="magic-sign-up",
name="space-magic-sign-up",
),
## Google Oauth
path("google/", GoogleOauthInitiateEndpoint.as_view(), name="google-initiate"),
@@ -74,12 +74,12 @@ urlpatterns = [
path(
"spaces/google/",
GoogleOauthInitiateSpaceEndpoint.as_view(),
name="google-initiate",
name="space-google-initiate",
),
path(
"google/callback/",
"spaces/google/callback/",
GoogleCallbackSpaceEndpoint.as_view(),
name="google-callback",
name="space-google-callback",
),
## Github Oauth
path("github/", GitHubOauthInitiateEndpoint.as_view(), name="github-initiate"),
@@ -87,12 +87,12 @@ urlpatterns = [
path(
"spaces/github/",
GitHubOauthInitiateSpaceEndpoint.as_view(),
name="github-initiate",
name="space-github-initiate",
),
path(
"spaces/github/callback/",
GitHubCallbackSpaceEndpoint.as_view(),
name="github-callback",
name="space-github-callback",
),
## Gitlab Oauth
path("gitlab/", GitLabOauthInitiateEndpoint.as_view(), name="gitlab-initiate"),
@@ -100,12 +100,12 @@ urlpatterns = [
path(
"spaces/gitlab/",
GitLabOauthInitiateSpaceEndpoint.as_view(),
name="gitlab-initiate",
name="space-gitlab-initiate",
),
path(
"spaces/gitlab/callback/",
GitLabCallbackSpaceEndpoint.as_view(),
name="gitlab-callback",
name="space-gitlab-callback",
),
# Email Check
path("email-check/", EmailCheckEndpoint.as_view(), name="email-check"),
@@ -120,12 +120,12 @@ urlpatterns = [
path(
"spaces/forgot-password/",
ForgotPasswordSpaceEndpoint.as_view(),
name="forgot-password",
name="space-forgot-password",
),
path(
"spaces/reset-password/<uidb64>/<token>/",
ResetPasswordSpaceEndpoint.as_view(),
name="forgot-password",
name="space-forgot-password",
),
path("change-password/", ChangePasswordEndpoint.as_view(), name="forgot-password"),
path("set-password/", SetUserPasswordEndpoint.as_view(), name="set-password"),

View File

@@ -1,32 +1,53 @@
# Python imports
from urllib.parse import urlsplit
# Django imports
from django.conf import settings
from django.http import HttpRequest
# Third party imports
from rest_framework.request import Request
# Module imports
from plane.utils.ip_address import get_client_ip
def base_host(request, is_admin=False, is_space=False, is_app=False):
def base_host(
request: Request | HttpRequest,
is_admin: bool = False,
is_space: bool = False,
is_app: bool = False,
) -> str:
"""Utility function to return host / origin from the request"""
# Calculate the base origin from request
base_origin = str(
request.META.get("HTTP_ORIGIN")
or f"{urlsplit(request.META.get('HTTP_REFERER')).scheme}://{urlsplit(request.META.get('HTTP_REFERER')).netloc}"
or f"""{"https" if request.is_secure() else "http"}://{request.get_host()}"""
)
base_origin = settings.WEB_URL or settings.APP_BASE_URL
# Admin redirections
# Admin redirection
if is_admin:
if settings.ADMIN_BASE_URL:
return settings.ADMIN_BASE_URL
else:
return base_origin + "/god-mode/"
admin_base_path = getattr(settings, "ADMIN_BASE_PATH", None)
if not isinstance(admin_base_path, str):
admin_base_path = "/god-mode/"
if not admin_base_path.startswith("/"):
admin_base_path = "/" + admin_base_path
if not admin_base_path.endswith("/"):
admin_base_path += "/"
# Space redirections
if is_space:
if settings.SPACE_BASE_URL:
return settings.SPACE_BASE_URL
if settings.ADMIN_BASE_URL:
return settings.ADMIN_BASE_URL + admin_base_path
else:
return base_origin + "/spaces/"
return base_origin + admin_base_path
# Space redirection
if is_space:
space_base_path = getattr(settings, "SPACE_BASE_PATH", None)
if not isinstance(space_base_path, str):
space_base_path = "/spaces/"
if not space_base_path.startswith("/"):
space_base_path = "/" + space_base_path
if not space_base_path.endswith("/"):
space_base_path += "/"
if settings.SPACE_BASE_URL:
return settings.SPACE_BASE_URL + space_base_path
else:
return base_origin + space_base_path
# App Redirection
if is_app:
@@ -38,5 +59,5 @@ def base_host(request, is_admin=False, is_space=False, is_app=False):
return base_origin
def user_ip(request):
return str(request.META.get("REMOTE_ADDR"))
def user_ip(request: Request | HttpRequest) -> str:
return get_client_ip(request=request)
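Note: base_host now builds the admin and space URLs from settings plus a normalised base path rather than from request headers. The slash-normalisation step is repeated for both branches; a small standalone sketch of it (hypothetical helper name, not part of the PR):
def normalize_base_path(value, default):
    # Mirror the normalisation in base_host: fall back to the default,
    # then guarantee exactly one leading and one trailing slash.
    path = value if isinstance(value, str) else default
    if not path.startswith("/"):
        path = "/" + path
    if not path.endswith("/"):
        path += "/"
    return path
print(normalize_base_path(None, "/god-mode/"))    # /god-mode/
print(normalize_base_path("spaces", "/spaces/"))  # /spaces/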

View File

@@ -3,7 +3,8 @@ from django.contrib.auth import login
from django.conf import settings
# Module imports
from plane.authentication.utils.host import base_host
from plane.utils.host import base_host
from plane.utils.ip_address import get_client_ip
def user_login(request, user, is_app=False, is_admin=False, is_space=False):
@@ -15,7 +16,7 @@ def user_login(request, user, is_app=False, is_admin=False, is_space=False):
device_info = {
"user_agent": request.META.get("HTTP_USER_AGENT", ""),
"ip_address": request.META.get("REMOTE_ADDR", ""),
"ip_address": get_client_ip(request=request),
"domain": base_host(
request=request, is_app=is_app, is_admin=is_admin, is_space=is_space
),

View File

@@ -19,6 +19,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class SignInAuthEndpoint(View):
@@ -34,7 +35,7 @@ class SignInAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
# Base URL join
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
@@ -58,7 +59,7 @@ class SignInAuthEndpoint(View):
params = exc.get_error_dict()
# Next path
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -76,7 +77,7 @@ class SignInAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -92,7 +93,7 @@ class SignInAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -111,7 +112,7 @@ class SignInAuthEndpoint(View):
user_login(request=request, user=user, is_app=True)
# Get the redirection path
if next_path:
path = str(next_path)
path = str(validate_next_path(next_path))
else:
path = get_redirection_path(user=user)
@@ -121,7 +122,7 @@ class SignInAuthEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -141,7 +142,7 @@ class SignUpAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -161,7 +162,7 @@ class SignUpAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -179,7 +180,7 @@ class SignUpAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -197,7 +198,7 @@ class SignUpAuthEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -216,7 +217,7 @@ class SignUpAuthEndpoint(View):
user_login(request=request, user=user, is_app=True)
# Get the redirection path
if next_path:
path = next_path
path = str(validate_next_path(next_path))
else:
path = get_redirection_path(user=user)
# redirect to referer path
@@ -225,7 +226,7 @@ class SignUpAuthEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
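Note: every next_path echoed into a redirect in this and the following auth views now passes through plane.utils.path_validator.validate_next_path. The validator itself is not shown in this diff; an open-redirect guard of this shape typically looks roughly like the following (a hedged sketch, not the actual helper):
from urllib.parse import urlparse
def validate_next_path(next_path):
    # Reject anything carrying a scheme or host (e.g. "https://evil.example" or
    # "//evil.example") so only same-site, relative paths survive.
    parsed = urlparse(str(next_path))
    if parsed.scheme or parsed.netloc or not parsed.path.startswith("/"):
        return "/"
    return parsed.path + (f"?{parsed.query}" if parsed.query else "")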

View File

@@ -16,6 +16,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class GitHubOauthInitiateEndpoint(View):
@@ -35,7 +36,7 @@ class GitHubOauthInitiateEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -49,7 +50,7 @@ class GitHubOauthInitiateEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -70,7 +71,7 @@ class GitHubCallbackEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)
@@ -81,7 +82,7 @@ class GitHubCallbackEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)
@@ -94,7 +95,7 @@ class GitHubCallbackEndpoint(View):
user_login(request=request, user=user, is_app=True)
# Get the redirection path
if next_path:
path = next_path
path = str(validate_next_path(next_path))
else:
path = get_redirection_path(user=user)
# redirect to referer path
@@ -103,6 +104,6 @@ class GitHubCallbackEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)

View File

@@ -16,6 +16,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class GitLabOauthInitiateEndpoint(View):
@@ -24,7 +25,7 @@ class GitLabOauthInitiateEndpoint(View):
request.session["host"] = base_host(request=request, is_app=True)
next_path = request.GET.get("next_path")
if next_path:
request.session["next_path"] = str(next_path)
request.session["next_path"] = str(validate_next_path(next_path))
# Check instance configuration
instance = Instance.objects.first()
@@ -35,7 +36,7 @@ class GitLabOauthInitiateEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -49,7 +50,7 @@ class GitLabOauthInitiateEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -81,7 +82,7 @@ class GitLabCallbackEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)
@@ -94,7 +95,7 @@ class GitLabCallbackEndpoint(View):
user_login(request=request, user=user, is_app=True)
# Get the redirection path
if next_path:
path = next_path
path = str(validate_next_path(next_path))
else:
path = get_redirection_path(user=user)
# redirect to referer path
@@ -103,6 +104,6 @@ class GitLabCallbackEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)

View File

@@ -18,6 +18,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class GoogleOauthInitiateEndpoint(View):
@@ -36,7 +37,7 @@ class GoogleOauthInitiateEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -51,7 +52,7 @@ class GoogleOauthInitiateEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -72,7 +73,7 @@ class GoogleCallbackEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)
if not code:
@@ -82,7 +83,7 @@ class GoogleCallbackEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = next_path
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)
try:
@@ -95,11 +96,13 @@ class GoogleCallbackEndpoint(View):
# Get the redirection path
path = get_redirection_path(user=user)
# redirect to referer path
url = urljoin(base_host, str(next_path) if next_path else path)
url = urljoin(
base_host, str(validate_next_path(next_path)) if next_path else path
)
return HttpResponseRedirect(url)
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(base_host, "?" + urlencode(params))
return HttpResponseRedirect(url)

View File

@@ -26,6 +26,7 @@ from plane.authentication.adapter.error import (
AUTHENTICATION_ERROR_CODES,
)
from plane.authentication.rate_limit import AuthenticationThrottle
from plane.utils.path_validator import validate_next_path
class MagicGenerateEndpoint(APIView):
@@ -43,14 +44,13 @@ class MagicGenerateEndpoint(APIView):
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
origin = request.META.get("HTTP_ORIGIN", "/")
email = request.data.get("email", "").strip().lower()
try:
validate_email(email)
adapter = MagicCodeProvider(request=request, key=email)
key, token = adapter.initiate()
# If the smtp is configured send through here
magic_link.delay(email, key, token, origin)
magic_link.delay(email, key, token)
return Response({"key": str(key)}, status=status.HTTP_200_OK)
except AuthenticationException as e:
params = e.get_error_dict()
@@ -73,7 +73,7 @@ class MagicSignInEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -89,7 +89,7 @@ class MagicSignInEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -122,7 +122,7 @@ class MagicSignInEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "sign-in?" + urlencode(params)
)
@@ -145,7 +145,7 @@ class MagicSignUpEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -159,7 +159,7 @@ class MagicSignUpEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)
@@ -177,7 +177,7 @@ class MagicSignUpEndpoint(View):
user_login(request=request, user=user, is_app=True)
# Get the redirection path
if next_path:
path = str(next_path)
path = str(validate_next_path(next_path))
else:
path = get_redirection_path(user=user)
# redirect to referer path
@@ -187,7 +187,7 @@ class MagicSignUpEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = urljoin(
base_host(request=request, is_app=True), "?" + urlencode(params)
)

View File

@@ -80,7 +80,7 @@ class ForgotPasswordEndpoint(APIView):
if user:
# Get the reset token for user
uidb64, token = generate_password_token(user=user)
current_site = request.META.get("HTTP_ORIGIN")
current_site = base_host(request=request, is_app=True)
# send the forgot password email
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site

View File

@@ -44,10 +44,23 @@ class ChangePasswordEndpoint(APIView):
def post(self, request):
user = User.objects.get(pk=request.user.id)
old_password = request.data.get("old_password", False)
# If the user password is not autoset then we need to check the old passwords
if not user.is_password_autoset:
old_password = request.data.get("old_password", False)
if not old_password:
exc = AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES["MISSING_PASSWORD"],
error_message="MISSING_PASSWORD",
payload={"error": "Old password is missing"},
)
return Response(
exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST
)
# Get the new password
new_password = request.data.get("new_password", False)
if not old_password or not new_password:
if not new_password:
exc = AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES["MISSING_PASSWORD"],
error_message="MISSING_PASSWORD",
@@ -55,7 +68,8 @@ class ChangePasswordEndpoint(APIView):
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
if not user.check_password(old_password):
# If the user password is not autoset then we need to check the old passwords
if not user.is_password_autoset and not user.check_password(old_password):
exc = AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES["INCORRECT_OLD_PASSWORD"],
error_message="INCORRECT_OLD_PASSWORD",
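Read together, the hunks above relax the old-password requirement for accounts whose password was auto-set (for example, accounts provisioned through OAuth). A condensed sketch of the resulting validation order, assuming the same user fields as in the hunk:

# Condensed sketch of the post-change validation order; user.is_password_autoset
# and user.check_password come from the hunks above, and error handling is
# abbreviated to returned codes.
def password_change_error(user, old_password, new_password):
    if not user.is_password_autoset and not old_password:
        return "MISSING_PASSWORD"        # old password required for manually set passwords
    if not new_password:
        return "MISSING_PASSWORD"        # new password is always required
    if not user.is_password_autoset and not user.check_password(old_password):
        return "INCORRECT_OLD_PASSWORD"  # old password must match
    return None                          # no error: proceed with the change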

View File

@@ -17,6 +17,7 @@ from plane.authentication.adapter.error import (
AUTHENTICATION_ERROR_CODES,
AuthenticationException,
)
from plane.utils.path_validator import validate_next_path
class SignInAuthSpaceEndpoint(View):
@@ -32,7 +33,7 @@ class SignInAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -51,7 +52,7 @@ class SignInAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -67,7 +68,7 @@ class SignInAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -82,7 +83,7 @@ class SignInAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -99,7 +100,7 @@ class SignInAuthSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -117,7 +118,7 @@ class SignUpAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -135,7 +136,7 @@ class SignUpAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
# Validate the email
@@ -151,7 +152,7 @@ class SignUpAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -166,7 +167,7 @@ class SignUpAuthSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -183,6 +184,6 @@ class SignUpAuthSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)

View File

@@ -15,6 +15,7 @@ from plane.authentication.adapter.error import (
AUTHENTICATION_ERROR_CODES,
AuthenticationException,
)
from plane.utils.path_validator import validate_next_path
class GitHubOauthInitiateSpaceEndpoint(View):
@@ -34,7 +35,7 @@ class GitHubOauthInitiateSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -66,7 +67,7 @@ class GitHubCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -77,7 +78,7 @@ class GitHubCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -93,6 +94,6 @@ class GitHubCallbackSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)

View File

@@ -15,6 +15,7 @@ from plane.authentication.adapter.error import (
AUTHENTICATION_ERROR_CODES,
AuthenticationException,
)
from plane.utils.path_validator import validate_next_path
class GitLabOauthInitiateSpaceEndpoint(View):
@@ -34,7 +35,7 @@ class GitLabOauthInitiateSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -66,7 +67,7 @@ class GitLabCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -77,7 +78,7 @@ class GitLabCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -93,6 +94,6 @@ class GitLabCallbackSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)

View File

@@ -15,6 +15,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class GoogleOauthInitiateSpaceEndpoint(View):
@@ -33,7 +34,7 @@ class GoogleOauthInitiateSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -46,7 +47,7 @@ class GoogleOauthInitiateSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -65,7 +66,7 @@ class GoogleCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
if not code:
@@ -75,7 +76,7 @@ class GoogleCallbackSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = next_path
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
try:
@@ -89,6 +90,6 @@ class GoogleCallbackSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)

View File

@@ -23,6 +23,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.path_validator import validate_next_path
class MagicGenerateSpaceEndpoint(APIView):
@@ -38,14 +39,13 @@ class MagicGenerateSpaceEndpoint(APIView):
)
return Response(exc.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
origin = base_host(request=request, is_space=True)
email = request.data.get("email", "").strip().lower()
try:
validate_email(email)
adapter = MagicCodeProvider(request=request, key=email)
key, token = adapter.initiate()
# If the smtp is configured send through here
magic_link.delay(email, key, token, origin)
magic_link.delay(email, key, token)
return Response({"key": str(key)}, status=status.HTTP_200_OK)
except AuthenticationException as e:
return Response(e.get_error_dict(), status=status.HTTP_400_BAD_REQUEST)
@@ -67,7 +67,7 @@ class MagicSignInSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -80,7 +80,7 @@ class MagicSignInSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -121,7 +121,7 @@ class MagicSignUpSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
# Existing User
@@ -134,7 +134,7 @@ class MagicSignUpSpaceEndpoint(View):
)
params = exc.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)
@@ -152,6 +152,6 @@ class MagicSignUpSpaceEndpoint(View):
except AuthenticationException as e:
params = e.get_error_dict()
if next_path:
params["next_path"] = str(next_path)
params["next_path"] = str(validate_next_path(next_path))
url = f"{base_host(request=request, is_space=True)}?{urlencode(params)}"
return HttpResponseRedirect(url)

View File

@@ -90,7 +90,7 @@ class ForgotPasswordSpaceEndpoint(APIView):
if user:
# Get the reset token for user
uidb64, token = generate_password_token(user=user)
current_site = request.META.get("HTTP_ORIGIN")
current_site = base_host(request=request, is_space=True)
# send the forgot password email
forgot_password.delay(
user.first_name, user.email, uidb64, token, current_site

View File

@@ -7,6 +7,7 @@ from django.utils import timezone
# Module imports
from plane.authentication.utils.host import base_host, user_ip
from plane.db.models import User
from plane.utils.path_validator import validate_next_path
class SignOutAuthSpaceEndpoint(View):
@@ -21,8 +22,8 @@ class SignOutAuthSpaceEndpoint(View):
user.save()
# Log the user out
logout(request)
url = f"{base_host(request=request, is_space=True)}{next_path}"
url = f"{base_host(request=request, is_space=True)}{str(validate_next_path(next_path)) if next_path else ''}"
return HttpResponseRedirect(url)
except Exception:
url = f"{base_host(request=request, is_space=True)}{next_path}"
url = f"{base_host(request=request, is_space=True)}{str(validate_next_path(next_path)) if next_path else ''}"
return HttpResponseRedirect(url)

View File

@@ -459,8 +459,37 @@ def analytic_export_task(email, data, slug):
csv_buffer = generate_csv_from_rows(rows)
send_export_email(email, slug, csv_buffer, rows)
logging.getLogger("plane").info("Email sent succesfully.")
logging.getLogger("plane.worker").info("Email sent successfully.")
return
except Exception as e:
log_exception(e)
return
@shared_task
def export_analytics_to_csv_email(data, headers, keys, email, slug):
try:
"""
Prepares a CSV from data and sends it as an email attachment.
Parameters:
- data: List of dictionaries (e.g. from .values())
- headers: List of CSV column headers
- keys: Keys to extract from each data item (dict)
- email: Email address to send to
- slug: Used for the filename
"""
# Prepare rows: header + data rows
rows = [headers]
for item in data:
row = [item.get(key, "") for key in keys]
rows.append(row)
# Generate CSV buffer
csv_buffer = generate_csv_from_rows(rows)
# Send email with CSV attachment
send_export_email(email=email, slug=slug, csv_buffer=csv_buffer, rows=rows)
except Exception as e:
log_exception(e)
return
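The new export_analytics_to_csv_email task builds the CSV from row dictionaries plus an explicit header/key mapping before mailing it. A hypothetical call site (the field names, email, and slug below are illustrative and not taken from this diff):

# Illustrative enqueue only; the argument values are made up for the example.
export_analytics_to_csv_email.delay(
    data=[{"state": "Done", "count": 4}, {"state": "In Progress", "count": 2}],
    headers=["State", "Count"],
    keys=["state", "count"],
    email="member@example.com",
    slug="my-workspace",
)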

View File

@@ -12,6 +12,7 @@ from plane.db.models import FileAsset, Page, Issue
from plane.utils.exception_logger import log_exception
from plane.settings.storage import S3Storage
from celery import shared_task
from plane.utils.url import normalize_url_path
def get_entity_id_field(entity_type, entity_id):
@@ -67,11 +68,14 @@ def sync_with_external_service(entity_name, description_html):
"description_html": description_html,
"variant": "rich" if entity_name == "PAGE" else "document",
}
response = requests.post(
f"{settings.LIVE_BASE_URL}/convert-document/",
json=data,
headers=None,
)
live_url = settings.LIVE_URL
if not live_url:
return {}
url = normalize_url_path(f"{live_url}/convert-document/")
response = requests.post(url, json=data, headers=None)
if response.status_code == 200:
return response.json()
except requests.RequestException as e:
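The hunk above also stops calling the live service when LIVE_URL is unset and joins the path through normalize_url_path, presumably so a trailing slash on LIVE_URL cannot produce a "//convert-document/" path. The real plane.utils.url.normalize_url_path is not part of this diff; a minimal sketch under that assumption:

# Hypothetical sketch; the real plane.utils.url.normalize_url_path is not shown here.
import re
from urllib.parse import urlsplit, urlunsplit

def normalize_url_path(url):
    """Collapse duplicate slashes in the path component without touching the scheme."""
    parts = urlsplit(url)
    path = re.sub(r"/{2,}", "/", parts.path)
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))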

Some files were not shown because too many files have changed in this diff.