mirror of
https://github.com/makeplane/plane
synced 2025-08-07 19:59:33 +00:00
Compare commits
74 Commits
chore-issu ... feat-mobx-
| Author | SHA1 | Date |
|---|---|---|
| | 5546bd2305 | |
| | 94f421f27d | |
| | 8d7425a3b7 | |
| | 211d5e1cd0 | |
| | 3c6bbaef3c | |
| | 4159d12959 | |
| | 2f2f8dc5f4 | |
| | 756a71ca78 | |
| | 36b3328c5e | |
| | a5c1282e52 | |
| | ed64168ca7 | |
| | f54f3a6091 | |
| | 2d9464e841 | |
| | 70f72a2b0f | |
| | c0b5e0e766 | |
| | fedcdf0c84 | |
| | ff936887d2 | |
| | ea78c2bceb | |
| | ba1a314608 | |
| | 3a6a8e3a97 | |
| | 1735561ffd | |
| | b80a904bbf | |
| | 20260f0720 | |
| | 6070ed4d36 | |
| | ac47cc62ee | |
| | 1059fbbebf | |
| | 61478d1b6b | |
| | 88737b1072 | |
| | 34d114a4c5 | |
| | d54c1bae03 | |
| | 9f5def3a6a | |
| | 043f4eaa5e | |
| | 1ee0661ac1 | |
| | 60f7edcef8 | |
| | 23849789f9 | |
| | 33acb9e8ed | |
| | d2c0940f04 | |
| | 00624eafbd | |
| | e6bf57aa18 | |
| | 3c8c657ee0 | |
| | 119d343d5f | |
| | c10b875e2a | |
| | f10f9cbd41 | |
| | 9b71a702c7 | |
| | 0a320a8540 | |
| | 44d8de1169 | |
| | 6214c09170 | |
| | 51ca353577 | |
| | 881c744eb9 | |
| | ec41ae61b4 | |
| | 5773c2bde3 | |
| | e33bae2125 | |
| | 580c4b1930 | |
| | ddd4b51b4e | |
| | ede4aad55b | |
| | 1a715c98b2 | |
| | 8e6d885731 | |
| | 4507802aba | |
| | 438cc33046 | |
| | 442b0fd7e5 | |
| | 1119b9dc36 | |
| | 47a76f48b4 | |
| | a0f03d07f3 | |
| | 74b2ec03ff | |
| | 5908998127 | |
| | df6a80e7ae | |
| | 6ff258ceca | |
| | a8140a5f08 | |
| | 9234f21f26 | |
| | ab11e83535 | |
| | b4112358ac | |
| | 77239ebcd4 | |
| | 54f828cbfa | |
| | 9ad8b43408 | |
README.md
```diff
@@ -5,9 +5,7 @@
     <img src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_logo_.webp" alt="Plane Logo" width="70">
   </a>
 </p>
 
-<h3 align="center"><b>Plane</b></h3>
-<p align="center"><b>Open-source project management that unlocks customer value</b></p>
+<h1 align="center"><b>Plane</b></h1>
 
 <p align="center">
   <a href="https://discord.com/invite/A92xrEGCge">
```
````diff
@@ -44,79 +42,85 @@ Meet [Plane](https://dub.sh/plane-website-readme), an open-source project manage
 
 > Plane is evolving every day. Your suggestions, ideas, and reported bugs help us immensely. Do not hesitate to join in the conversation on [Discord](https://discord.com/invite/A92xrEGCge) or raise a GitHub issue. We read everything and respond to most.
 
-## ⚡ Installation
+## 🚀 Installation
 
-The easiest way to get started with Plane is by creating a [Plane Cloud](https://app.plane.so) account.
+Getting started with Plane is simple. Choose the setup that works best for you:
 
-If you would like to self-host Plane, please see our [deployment guide](https://docs.plane.so/docker-compose).
+- **Plane Cloud**
+  Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
+
+- **Self-host Plane**
+  Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
 
 | Installation methods | Docs link |
 | -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://docs.plane.so/self-hosting/methods/docker-compose) |
-| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://docs.plane.so/kubernetes) |
+| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://developers.plane.so/self-hosting/methods/docker-compose) |
+| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://developers.plane.so/self-hosting/methods/kubernetes) |
 
-`Instance admins` can configure instance settings with [God-mode](https://docs.plane.so/instance-admin).
+`Instance admins` can manage and customize settings using [God mode](https://developers.plane.so/self-hosting/govern/instance-admin).
 
-## 🚀 Features
+## 🌟 Features
 
-- **Issues**: Quickly create issues and add details using a powerful rich text editor that supports file uploads. Add sub-properties and references to problems for better organization and tracking.
+- **Issues**
+  Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
 
-- **Cycles**:
-  Keep up your team's momentum with Cycles. Gain insights into your project's progress with burn-down charts and other valuable features.
+- **Cycles**
+  Maintain your team’s momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
 
-- **Modules**: Break down your large projects into smaller, more manageable modules. Assign modules between teams to track and plan your project's progress easily.
+- **Modules**
+  Simplify complex projects by dividing them into smaller, manageable modules.
 
-- **Views**: Create custom filters to display only the issues that matter to you. Save and share your filters in just a few clicks.
+- **Views**
+  Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
 
-- **Pages**: Plane pages, equipped with AI and a rich text editor, let you jot down your thoughts on the fly. Format your text, upload images, hyperlink, or sync your existing ideas into an actionable item or issue.
+- **Pages**
+  Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
 
-- **Analytics**: Get insights into all your Plane data in real-time. Visualize issue data to spot trends, remove blockers, and progress your work.
+- **Analytics**
+  Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
 
-- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
-
-## 🛠️ Quick start for contributors
+## 🛠️ Local development
 
-> Development system must have docker engine installed and running.
+### Pre-requisites
+
+- Ensure Docker Engine is installed and running.
 
-Setting up local environment is extremely easy and straight forward. Follow the below step and you will be ready to contribute -
+### Development setup
+
+Setting up your local environment is simple and straightforward. Follow these steps to get started:
 
-1. Clone the code locally using:
+1. Clone the repository:
    ```
   git clone https://github.com/makeplane/plane.git
   ```
-2. Switch to the code folder:
+2. Navigate to the project folder:
   ```
  cd plane
  ```
-3. Create your feature or fix branch you plan to work on using:
+3. Create a new branch for your feature or fix:
  ```
  git checkout -b <feature-branch-name>
  ```
-4. Open terminal and run:
+4. Run the setup script in the terminal:
  ```
  ./setup.sh
  ```
-5. Open the code on VSCode or similar equivalent IDE.
-6. Review the `.env` files available in various folders.
-   Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system.
-7. Run the docker command to initiate services:
+5. Open the project in an IDE such as VS Code.
+
+6. Review the `.env` files in the relevant folders. Refer to [Environment Setup](./ENV_SETUP.md) for details on the environment variables used.
+
+7. Start the services using Docker:
  ```
  docker compose -f docker-compose-local.yml up -d
  ```
 
-You are ready to make changes to the code. Do not forget to refresh the browser (in case it does not auto-reload).
+That’s it! You’re all set to begin coding. Remember to refresh your browser if changes don’t auto-reload. Happy contributing! 🎉
 
-Thats it!
-
-## ❤️ Community
-
-The Plane community can be found on [GitHub Discussions](https://github.com/orgs/makeplane/discussions), and our [Discord server](https://discord.com/invite/A92xrEGCge). Our [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community chanels.
-
-Ask questions, report bugs, join discussions, voice ideas, make feature requests, or share your projects.
-
-### Repo Activity
-
-![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2524c57a404cf1723b1dbdfd85a73117a104c248.svg "Repobeats analytics image")
 ## Built with
 [![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)<br/>
 [![Django](https://img.shields.io/badge/Django-%23092E20.svg?style=for-the-badge&logo=django&logoColor=white)](https://www.djangoproject.com/)<br/>
 [![Node JS](https://img.shields.io/badge/node.js-6DA55F?style=for-the-badge&logo=node.js&logoColor=white)](https://nodejs.org/en)
 
 ## 📸 Screenshots
````
```diff
@@ -165,7 +169,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
     </a>
   </p>
-</p>
+<p>
 <p>
   <a href="https://plane.so" target="_blank">
     <img
       src="https://ik.imagekit.io/w2okwbtu2/Drive_LlfeY4xn3.png?updatedAt=1709298837917"
```
```diff
@@ -176,23 +180,42 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
   </p>
 </p>
 
-## ⛓️ Security
+## 📝 Documentation
+
+Explore Plane's [product documentation](https://docs.plane.so/) and [developer documentation](https://developers.plane.so/) to learn about features, setup, and usage.
 
-If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports.
+## ❤️ Community
 
-Email squawk@plane.so to disclose any security vulnerabilities.
+Join the Plane community on [GitHub Discussions](https://github.com/orgs/makeplane/discussions) and our [Discord server](https://discord.com/invite/A92xrEGCge). We follow a [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) in all our community channels.
 
-## ❤️ Contribute
+Feel free to ask questions, report bugs, participate in discussions, share ideas, request features, or showcase your projects. We’d love to hear from you!
 
-There are many ways to contribute to Plane, including:
+## 🛡️ Security
 
-- Submitting [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) and [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+) for various components.
-- Reviewing [the documentation](https://docs.plane.so/) and submitting [pull requests](https://github.com/makeplane/plane), from fixing typos to adding new features.
-- Speaking or writing about Plane or any other ecosystem integration and [letting us know](https://discord.com/invite/A92xrEGCge)!
-- Upvoting [popular feature requests](https://github.com/makeplane/plane/issues) to show your support.
+If you discover a security vulnerability in Plane, please report it responsibly instead of opening a public issue. We take all legitimate reports seriously and will investigate them promptly. See [Security policy](https://github.com/makeplane/plane/blob/master/SECURITY.md) for more info.
+
+To disclose any security issues, please email us at security@plane.so.
+
+## 🤝 Contributing
+
+There are many ways you can contribute to Plane:
+
+- Report [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) or submit [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+).
+- Review the [documentation](https://docs.plane.so/) and submit [pull requests](https://github.com/makeplane/docs) to improve it—whether it's fixing typos or adding new content.
+- Talk or write about Plane or any other ecosystem integration and [let us know](https://discord.com/invite/A92xrEGCge)!
+- Show your support by upvoting [popular feature requests](https://github.com/makeplane/plane/issues).
+
+Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CONTRIBUTING.md) for details on the process for submitting pull requests to us.
+
+### Repo activity
+
+![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2524c57a404cf1723b1dbdfd85a73117a104c248.svg "Repobeats analytics image")
 
 ### We couldn't have done this without you.
 
 <a href="https://github.com/makeplane/plane/graphs/contributors">
   <img src="https://contrib.rocks/image?repo=makeplane/plane" />
 </a>
 
+
+## License
+
+This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).
```
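Taken together, the contributor-setup steps in the updated README above amount to a short command sequence. A minimal sketch, assuming a Unix-like shell with Git and Docker available; the branch name is a placeholder, while `setup.sh` and `docker-compose-local.yml` are the files the README itself references:

```shell
# Clone the repository and enter it
git clone https://github.com/makeplane/plane.git
cd plane

# Work on a dedicated branch (placeholder name)
git checkout -b my-feature-branch

# Run the repository's bootstrap script, then review the generated .env files
./setup.sh

# Start all services in the background
docker compose -f docker-compose-local.yml up -d
```

After the services come up, changes should be visible in the browser, with a manual refresh if hot reload does not trigger.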
```diff
@@ -4,10 +4,11 @@ import { FC, useState } from "react";
 import isEmpty from "lodash/isEmpty";
 import Link from "next/link";
 import { useForm } from "react-hook-form";
-// types
+// plane internal packages
+import { API_BASE_URL } from "@plane/constants";
 import { IFormattedInstanceConfiguration, TInstanceGithubAuthenticationConfigurationKeys } from "@plane/types";
-// ui
 import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
+import { cn } from "@plane/utils";
 // components
 import {
   CodeBlock,
@@ -17,8 +18,6 @@ import {
   TControllerInputFormField,
   TCopyField,
 } from "@/components/common";
-// helpers
-import { API_BASE_URL, cn } from "@/helpers/common.helper";
 // hooks
 import { useInstance } from "@/hooks/store";
 
@@ -103,8 +102,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
   url: originURL,
   description: (
     <>
-      We will auto-generate this. Paste this into the{" "}
-      <CodeBlock darkerShade>Authorized origin URL</CodeBlock> field{" "}
+      We will auto-generate this. Paste this into the <CodeBlock darkerShade>Authorized origin URL</CodeBlock> field{" "}
       <a
         tabIndex={-1}
         href="https://github.com/settings/applications/new"
@@ -123,8 +121,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
   url: `${originURL}/auth/github/callback/`,
   description: (
     <>
-      We will auto-generate this. Paste this into your{" "}
-      <CodeBlock darkerShade>Authorized Callback URI</CodeBlock> field{" "}
+      We will auto-generate this. Paste this into your <CodeBlock darkerShade>Authorized Callback URI</CodeBlock>{" "}
+      field{" "}
       <a
         tabIndex={-1}
         href="https://github.com/settings/applications/new"
```
```diff
@@ -5,12 +5,12 @@ import { observer } from "mobx-react";
 import Image from "next/image";
 import { useTheme } from "next-themes";
 import useSWR from "swr";
+// plane internal packages
 import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
+import { resolveGeneralTheme } from "@plane/utils";
 // components
 import { AuthenticationMethodCard } from "@/components/authentication";
 import { PageHeader } from "@/components/common";
-// helpers
-import { resolveGeneralTheme } from "@/helpers/common.helper";
 // hooks
 import { useInstance } from "@/hooks/store";
 // icons
@@ -44,7 +44,7 @@ const InstanceGithubAuthenticationPage = observer(() => {
   loading: "Saving Configuration...",
   success: {
     title: "Configuration saved",
-    message: () => `Github authentication is now ${value ? "active" : "disabled"}.`,
+    message: () => `GitHub authentication is now ${value ? "active" : "disabled"}.`,
   },
   error: {
     title: "Error",
@@ -67,8 +67,8 @@ const InstanceGithubAuthenticationPage = observer(() => {
   <div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
     <div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
       <AuthenticationMethodCard
-        name="Github"
-        description="Allow members to login or sign up to plane with their Github accounts."
+        name="GitHub"
+        description="Allow members to login or sign up to plane with their GitHub accounts."
         icon={
           <Image
             src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
```
```diff
@@ -2,10 +2,11 @@ import { FC, useState } from "react";
 import isEmpty from "lodash/isEmpty";
 import Link from "next/link";
 import { useForm } from "react-hook-form";
-// types
+// plane internal packages
+import { API_BASE_URL } from "@plane/constants";
 import { IFormattedInstanceConfiguration, TInstanceGitlabAuthenticationConfigurationKeys } from "@plane/types";
-// ui
 import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
+import { cn } from "@plane/utils";
 // components
 import {
   CodeBlock,
@@ -15,8 +16,6 @@ import {
   TControllerInputFormField,
   TCopyField,
 } from "@/components/common";
-// helpers
-import { API_BASE_URL, cn } from "@/helpers/common.helper";
 // hooks
 import { useInstance } from "@/hooks/store";
 
@@ -117,8 +116,7 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
   url: `${originURL}/auth/gitlab/callback/`,
   description: (
     <>
-      We will auto-generate this. Paste this into the{" "}
-      <CodeBlock darkerShade>Redirect URI</CodeBlock> field of your{" "}
+      We will auto-generate this. Paste this into the <CodeBlock darkerShade>Redirect URI</CodeBlock> field of your{" "}
      <a
        tabIndex={-1}
        href="https://docs.gitlab.com/ee/integration/oauth_provider.html"
```
```diff
@@ -3,10 +3,11 @@ import { FC, useState } from "react";
 import isEmpty from "lodash/isEmpty";
 import Link from "next/link";
 import { useForm } from "react-hook-form";
-// types
+// plane internal packages
+import { API_BASE_URL } from "@plane/constants";
 import { IFormattedInstanceConfiguration, TInstanceGoogleAuthenticationConfigurationKeys } from "@plane/types";
-// ui
 import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
+import { cn } from "@plane/utils";
 // components
 import {
   CodeBlock,
@@ -16,8 +17,6 @@ import {
   TControllerInputFormField,
   TCopyField,
 } from "@/components/common";
-// helpers
-import { API_BASE_URL, cn } from "@/helpers/common.helper";
 // hooks
 import { useInstance } from "@/hooks/store";
```
```diff
@@ -3,10 +3,10 @@
 import { useState } from "react";
 import { observer } from "mobx-react";
 import useSWR from "swr";
+// plane internal packages
 import { TInstanceConfigurationKeys } from "@plane/types";
 import { Loader, ToggleSwitch, setPromiseToast } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 // hooks
 import { useInstance } from "@/hooks/store";
 // plane admin components
```
```diff
@@ -4,11 +4,11 @@ import { ReactNode } from "react";
 import { ThemeProvider, useTheme } from "next-themes";
 import { SWRConfig } from "swr";
-// ui
+// plane internal packages
+import { ADMIN_BASE_PATH, DEFAULT_SWR_CONFIG } from "@plane/constants";
 import { Toast } from "@plane/ui";
+import { resolveGeneralTheme } from "@plane/utils";
-// constants
-import { SWR_CONFIG } from "@/constants/swr-config";
-// helpers
-import { ASSET_PREFIX, resolveGeneralTheme } from "@/helpers/common.helper";
 // lib
 import { InstanceProvider } from "@/lib/instance-provider";
 import { StoreProvider } from "@/lib/store-provider";
@@ -22,6 +22,7 @@ const ToastWithTheme = () => {
 };
 
 export default function RootLayout({ children }: { children: ReactNode }) {
+  const ASSET_PREFIX = ADMIN_BASE_PATH;
   return (
     <html lang="en">
       <head>
@@ -34,7 +35,7 @@ export default function RootLayout({ children }: { children: ReactNode }) {
   <body className={`antialiased`}>
     <ThemeProvider themes={["light", "dark"]} defaultTheme="system" enableSystem>
       <ToastWithTheme />
-      <SWRConfig value={SWR_CONFIG}>
+      <SWRConfig value={DEFAULT_SWR_CONFIG}>
         <StoreProvider>
           <InstanceProvider>
             <UserProvider>{children}</UserProvider>
```
```diff
@@ -3,13 +3,11 @@ import Link from "next/link";
 import { useRouter } from "next/navigation";
 import { Controller, useForm } from "react-hook-form";
 // constants
-import { ORGANIZATION_SIZE, RESTRICTED_URLS } from "@plane/constants";
+import { WEB_BASE_URL, ORGANIZATION_SIZE, RESTRICTED_URLS } from "@plane/constants";
 // types
 import { IWorkspace } from "@plane/types";
 // components
 import { Button, CustomSelect, getButtonStyling, Input, setToast, TOAST_TYPE } from "@plane/ui";
-// helpers
-import { WEB_BASE_URL } from "@/helpers/common.helper";
 // hooks
 import { useWorkspace } from "@/hooks/store";
 // services
```
```diff
@@ -7,12 +7,10 @@ import useSWR from "swr";
 import { Loader as LoaderIcon } from "lucide-react";
 // types
 import { TInstanceConfigurationKeys } from "@plane/types";
 // ui
 import { Button, getButtonStyling, Loader, setPromiseToast, ToggleSwitch } from "@plane/ui";
+import { cn } from "@plane/utils";
 // components
 import { WorkspaceListItem } from "@/components/workspace";
-// helpers
-import { cn } from "@/helpers/common.helper";
 // hooks
 import { useInstance, useWorkspace } from "@/hooks/store";
```
```diff
@@ -10,7 +10,7 @@ import {
 // components
 import { AuthenticationMethodCard } from "@/components/authentication";
 // helpers
-import { getBaseAuthenticationModes } from "@/helpers/authentication.helper";
+import { getBaseAuthenticationModes } from "@/lib/auth-helpers";
 // plane admin components
 import { UpgradeButton } from "@/plane-admin/components/common";
 // images
```
```diff
@@ -3,10 +3,9 @@
 import React from "react";
 // icons
 import { SquareArrowOutUpRight } from "lucide-react";
-// ui
+// plane internal packages
 import { getButtonStyling } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 
 export const UpgradeButton: React.FC = () => (
   <a href="https://plane.so/pricing?mode=self-hosted" target="_blank" className={cn(getButtonStyling("primary", "sm"))}>
```
```diff
@@ -5,13 +5,14 @@ import { observer } from "mobx-react";
 import Link from "next/link";
 import { ExternalLink, FileText, HelpCircle, MoveLeft } from "lucide-react";
 import { Transition } from "@headlessui/react";
-// ui
+// plane internal packages
+import { WEB_BASE_URL } from "@plane/constants";
 import { DiscordIcon, GithubIcon, Tooltip } from "@plane/ui";
-// helpers
-import { WEB_BASE_URL, cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 // hooks
 import { useTheme } from "@/hooks/store";
 // assets
 // eslint-disable-next-line import/order
 import packageJson from "package.json";
 
 const helpOptions = [
```
```diff
@@ -5,11 +5,10 @@ import { observer } from "mobx-react";
 import { useTheme as useNextTheme } from "next-themes";
 import { LogOut, UserCog2, Palette } from "lucide-react";
 import { Menu, Transition } from "@headlessui/react";
-// plane ui
+// plane internal packages
+import { API_BASE_URL } from "@plane/constants";
 import { Avatar } from "@plane/ui";
-// helpers
-import { API_BASE_URL, cn } from "@/helpers/common.helper";
-import { getFileURL } from "@/helpers/file.helper";
+import { getFileURL, cn } from "@plane/utils";
 // hooks
 import { useTheme, useUser } from "@/hooks/store";
 // services
```
```diff
@@ -4,11 +4,11 @@ import { observer } from "mobx-react";
 import Link from "next/link";
 import { usePathname } from "next/navigation";
 import { Image, BrainCog, Cog, Lock, Mail } from "lucide-react";
+// plane internal packages
 import { Tooltip, WorkspaceIcon } from "@plane/ui";
+import { cn } from "@plane/utils";
 // hooks
-import { cn } from "@/helpers/common.helper";
 import { useTheme } from "@/hooks/store";
-// helpers
 
 const INSTANCE_ADMIN_LINKS = [
   {
```
```diff
@@ -30,7 +30,7 @@ export const InstanceHeader: FC = observer(() => {
   case "google":
     return "Google";
   case "github":
-    return "Github";
+    return "GitHub";
   case "gitlab":
     return "GitLab";
   case "workspace":
```
```diff
@@ -1,7 +1,7 @@
 import { FC } from "react";
 import { Info, X } from "lucide-react";
-// helpers
-import { TAuthErrorInfo } from "@/helpers/authentication.helper";
+// plane constants
+import { TAuthErrorInfo } from "@plane/constants";
 
 type TAuthBanner = {
   bannerData: TAuthErrorInfo | undefined;
```
```diff
@@ -2,7 +2,7 @@
 
 import { FC } from "react";
 // helpers
-import { cn } from "helpers/common.helper";
+import { cn } from "@plane/utils";
 
 type Props = {
   name: string;
```
```diff
@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
 import Link from "next/link";
 // icons
 import { Settings2 } from "lucide-react";
-// types
+// plane internal packages
 import { TInstanceAuthenticationMethodKeys } from "@plane/types";
-// ui
 import { ToggleSwitch, getButtonStyling } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 // hooks
 import { useInstance } from "@/hooks/store";
```
```diff
@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
 import Link from "next/link";
 // icons
 import { Settings2 } from "lucide-react";
-// types
+// plane internal packages
 import { TInstanceAuthenticationMethodKeys } from "@plane/types";
-// ui
 import { ToggleSwitch, getButtonStyling } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 // hooks
 import { useInstance } from "@/hooks/store";
```
```diff
@@ -5,12 +5,10 @@ import { observer } from "mobx-react";
 import Link from "next/link";
 // icons
 import { Settings2 } from "lucide-react";
-// types
+// plane internal packages
 import { TInstanceAuthenticationMethodKeys } from "@plane/types";
-// ui
 import { ToggleSwitch, getButtonStyling } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 // hooks
 import { useInstance } from "@/hooks/store";
```
```diff
@@ -1,4 +1,4 @@
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 
 type TProps = {
   children: React.ReactNode;
```
```diff
@@ -4,10 +4,9 @@ import React, { useState } from "react";
 import { Controller, Control } from "react-hook-form";
 // icons
 import { Eye, EyeOff } from "lucide-react";
-// ui
+// plane internal packages
 import { Input } from "@plane/ui";
-// helpers
-import { cn } from "@/helpers/common.helper";
+import { cn } from "@plane/utils";
 
 type Props = {
   control: Control<any>;
@@ -37,9 +36,7 @@ export const ControllerInput: React.FC<Props> = (props) => {
 
   return (
     <div className="flex flex-col gap-1">
-      <h4 className="text-sm text-custom-text-300">
-        {label}
-      </h4>
+      <h4 className="text-sm text-custom-text-300">{label}</h4>
       <div className="relative">
         <Controller
           control={control}
```
```diff
@@ -1,14 +1,9 @@
 "use client";
 
 import { FC, useMemo } from "react";
 // import { CircleCheck } from "lucide-react";
-// helpers
-import { cn } from "@/helpers/common.helper";
-import {
-  E_PASSWORD_STRENGTH,
-  // PASSWORD_CRITERIA,
-  getPasswordStrength,
-} from "@/helpers/password.helper";
+// plane internal packages
+import { E_PASSWORD_STRENGTH } from "@plane/constants";
+import { cn, getPasswordStrength } from "@plane/utils";
 
 type TPasswordStrengthMeter = {
   password: string;
```
```diff
@@ -4,13 +4,12 @@ import { FC, useEffect, useMemo, useState } from "react";
 import { useSearchParams } from "next/navigation";
 // icons
 import { Eye, EyeOff } from "lucide-react";
-// ui
+// plane internal packages
+import { API_BASE_URL, E_PASSWORD_STRENGTH } from "@plane/constants";
 import { Button, Checkbox, Input, Spinner } from "@plane/ui";
+import { getPasswordStrength } from "@plane/utils";
 // components
 import { Banner, PasswordStrengthMeter } from "@/components/common";
-// helpers
-import { API_BASE_URL } from "@/helpers/common.helper";
-import { E_PASSWORD_STRENGTH, getPasswordStrength } from "@/helpers/password.helper";
 // services
 import { AuthService } from "@/services/auth.service";
```
```diff
@@ -2,24 +2,18 @@
 
 import { FC, useEffect, useMemo, useState } from "react";
 import { useSearchParams } from "next/navigation";
-// services
+import { Eye, EyeOff } from "lucide-react";
+// plane internal packages
+import { API_BASE_URL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
 import { Button, Input, Spinner } from "@plane/ui";
 // components
 import { Banner } from "@/components/common";
-// helpers
-import {
-  authErrorHandler,
-  EAuthenticationErrorCodes,
-  EErrorAlertType,
-  TAuthErrorInfo,
-} from "@/helpers/authentication.helper";
-
-import { API_BASE_URL } from "@/helpers/common.helper";
+import { authErrorHandler } from "@/lib/auth-helpers";
 // services
 import { AuthService } from "@/services/auth.service";
 // local components
 import { AuthBanner } from "../authentication";
-// ui
-// icons
 
 // service initialization
 const authService = new AuthService();
@@ -102,7 +96,7 @@ export const InstanceSignInForm: FC = (props) => {
 
   useEffect(() => {
     if (errorCode) {
-      const errorDetail = authErrorHandler(errorCode?.toString() as EAuthenticationErrorCodes);
+      const errorDetail = authErrorHandler(errorCode?.toString() as EAdminAuthErrorCodes);
       if (errorDetail) {
         setErrorInfo(errorDetail);
       }
```
@@ -1,13 +1,13 @@
|
||||
"use client";
|
||||
|
||||
import React from "react";
|
||||
import { resolveGeneralTheme } from "helpers/common.helper";
|
||||
import { observer } from "mobx-react";
|
||||
import Image from "next/image";
|
||||
import Link from "next/link";
|
||||
import { useTheme as nextUseTheme } from "next-themes";
|
||||
// ui
|
||||
import { Button, getButtonStyling } from "@plane/ui";
|
||||
import { resolveGeneralTheme } from "@plane/utils";
|
||||
// hooks
|
||||
import { useTheme } from "@/hooks/store";
|
||||
// icons
|
||||
|
||||
@@ -1,9 +1,9 @@
|
||||
import { observer } from "mobx-react";
|
||||
import { ExternalLink } from "lucide-react";
|
||||
// helpers
|
||||
// plane internal packages
|
||||
import { WEB_BASE_URL } from "@plane/constants";
|
||||
import { Tooltip } from "@plane/ui";
|
||||
import { WEB_BASE_URL } from "@/helpers/common.helper";
|
||||
import { getFileURL } from "@/helpers/file.helper";
|
||||
import { getFileURL } from "@plane/utils";
|
||||
// hooks
|
||||
import { useWorkspace } from "@/hooks/store";
|
||||
|
||||
|
||||
@@ -1,8 +0,0 @@
|
||||
export const SITE_NAME = "Plane | Simple, extensible, open-source project management tool.";
|
||||
export const SITE_TITLE = "Plane | Simple, extensible, open-source project management tool.";
|
||||
export const SITE_DESCRIPTION =
|
||||
"Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.";
|
||||
export const SITE_KEYWORDS =
|
||||
"software development, plan, ship, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
|
||||
export const SITE_URL = "https://app.plane.so/";
|
||||
export const TWITTER_USER_NAME = "Plane | Simple, extensible, open-source project management tool.";
|
||||
164
admin/core/lib/auth-helpers.tsx
Normal file
@@ -0,0 +1,164 @@
import { ReactNode } from "react";
import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// plane packages
import { SUPPORT_EMAIL, EAdminAuthErrorCodes, TAuthErrorInfo } from "@plane/constants";
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
import { resolveGeneralTheme } from "@plane/utils";
// components
import {
  EmailCodesConfiguration,
  GithubConfiguration,
  GitlabConfiguration,
  GoogleConfiguration,
  PasswordLoginConfiguration,
} from "@/components/authentication";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";

export enum EErrorAlertType {
  BANNER_ALERT = "BANNER_ALERT",
  INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
  INLINE_EMAIL = "INLINE_EMAIL",
  INLINE_PASSWORD = "INLINE_PASSWORD",
  INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
}

const errorCodeMessages: {
  [key in EAdminAuthErrorCodes]: { title: string; message: (email?: string | undefined) => ReactNode };
} = {
  // admin
  [EAdminAuthErrorCodes.ADMIN_ALREADY_EXIST]: {
    title: `Admin already exists`,
    message: () => `Admin already exists. Please try again.`,
  },
  [EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME]: {
    title: `Email, password and first name required`,
    message: () => `Email, password and first name required. Please try again.`,
  },
  [EAdminAuthErrorCodes.INVALID_ADMIN_EMAIL]: {
    title: `Invalid admin email`,
    message: () => `Invalid admin email. Please try again.`,
  },
  [EAdminAuthErrorCodes.INVALID_ADMIN_PASSWORD]: {
    title: `Invalid admin password`,
    message: () => `Invalid admin password. Please try again.`,
  },
  [EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD]: {
    title: `Email and password required`,
    message: () => `Email and password required. Please try again.`,
  },
  [EAdminAuthErrorCodes.ADMIN_AUTHENTICATION_FAILED]: {
    title: `Authentication failed`,
    message: () => `Authentication failed. Please try again.`,
  },
  [EAdminAuthErrorCodes.ADMIN_USER_ALREADY_EXIST]: {
    title: `Admin user already exists`,
    message: () => (
      <div>
        Admin user already exists.
        <Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
          Sign In
        </Link>
        now.
      </div>
    ),
  },
  [EAdminAuthErrorCodes.ADMIN_USER_DOES_NOT_EXIST]: {
    title: `Admin user does not exist`,
    message: () => (
      <div>
        Admin user does not exist.
        <Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
          Sign In
        </Link>
        now.
      </div>
    ),
  },
  [EAdminAuthErrorCodes.ADMIN_USER_DEACTIVATED]: {
    title: `User account deactivated`,
    message: () => `User account deactivated. Please contact ${!!SUPPORT_EMAIL ? SUPPORT_EMAIL : "administrator"}.`,
  },
};

export const authErrorHandler = (
  errorCode: EAdminAuthErrorCodes,
  email?: string | undefined
): TAuthErrorInfo | undefined => {
  const bannerAlertErrorCodes = [
    EAdminAuthErrorCodes.ADMIN_ALREADY_EXIST,
    EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,
    EAdminAuthErrorCodes.INVALID_ADMIN_EMAIL,
    EAdminAuthErrorCodes.INVALID_ADMIN_PASSWORD,
    EAdminAuthErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD,
    EAdminAuthErrorCodes.ADMIN_AUTHENTICATION_FAILED,
    EAdminAuthErrorCodes.ADMIN_USER_ALREADY_EXIST,
    EAdminAuthErrorCodes.ADMIN_USER_DOES_NOT_EXIST,
    EAdminAuthErrorCodes.ADMIN_USER_DEACTIVATED,
  ];

  if (bannerAlertErrorCodes.includes(errorCode))
    return {
      type: EErrorAlertType.BANNER_ALERT,
      code: errorCode,
      title: errorCodeMessages[errorCode]?.title || "Error",
      message: errorCodeMessages[errorCode]?.message(email) || "Something went wrong. Please try again.",
    };

  return undefined;
};

export const getBaseAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
  disabled,
  updateConfig,
  resolvedTheme,
}) => [
  {
    key: "unique-codes",
    name: "Unique codes",
    description:
      "Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
    icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
    config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "passwords-login",
    name: "Passwords",
    description: "Allow members to create accounts with passwords and use them with their email addresses to sign in.",
    icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
    config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "google",
    name: "Google",
    description: "Allow members to log in or sign up for Plane with their Google accounts.",
    icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
    config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "github",
    name: "GitHub",
    description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
    icon: (
      <Image
        src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
        height={20}
        width={20}
        alt="GitHub Logo"
      />
    ),
    config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "gitlab",
    name: "GitLab",
    description: "Allow members to log in or sign up for Plane with their GitLab accounts.",
    icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
    config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
];
@@ -1,5 +1,4 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { API_BASE_URL } from "@plane/constants";
// services
import { APIService } from "@/services/api.service";

@@ -1,4 +1,5 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type {
  IFormattedInstanceConfiguration,
  IInstance,
@@ -7,7 +8,6 @@ import type {
  IInstanceInfo,
} from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
import { APIService } from "@/services/api.service";

export class InstanceService extends APIService {

@@ -1,7 +1,6 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type { IUser } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

@@ -1,7 +1,6 @@
// types
// plane internal packages
import { API_BASE_URL } from "@plane/constants";
import type { IWorkspace, TWorkspacePaginationInfo } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

@@ -1,5 +1,7 @@
import set from "lodash/set";
import { observable, action, computed, makeObservable, runInAction } from "mobx";
// plane internal packages
import { EInstanceStatus, TInstanceStatus } from "@plane/constants";
import {
  IInstance,
  IInstanceAdmin,
@@ -8,8 +10,6 @@ import {
  IInstanceInfo,
  IInstanceConfig,
} from "@plane/types";
// helpers
import { EInstanceStatus, TInstanceStatus } from "@/helpers/instance.helper";
// services
import { InstanceService } from "@/services/instance.service";
// root store

@@ -1,7 +1,7 @@
import { action, observable, runInAction, makeObservable } from "mobx";
// plane internal packages
import { EUserStatus, TUserStatus } from "@plane/constants";
import { IUser } from "@plane/types";
// helpers
import { EUserStatus, TUserStatus } from "@/helpers/user.helper";
// services
import { AuthService } from "@/services/auth.service";
import { UserService } from "@/services/user.service";

@@ -1,203 +0,0 @@
import { ReactNode } from "react";
import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// types
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
// components
import {
  EmailCodesConfiguration,
  GithubConfiguration,
  GitlabConfiguration,
  GoogleConfiguration,
  PasswordLoginConfiguration,
} from "@/components/authentication";
// helpers
import { SUPPORT_EMAIL, resolveGeneralTheme } from "@/helpers/common.helper";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";

export enum EPageTypes {
  PUBLIC = "PUBLIC",
  NON_AUTHENTICATED = "NON_AUTHENTICATED",
  SET_PASSWORD = "SET_PASSWORD",
  ONBOARDING = "ONBOARDING",
  AUTHENTICATED = "AUTHENTICATED",
}

export enum EAuthModes {
  SIGN_IN = "SIGN_IN",
  SIGN_UP = "SIGN_UP",
}

export enum EAuthSteps {
  EMAIL = "EMAIL",
  PASSWORD = "PASSWORD",
  UNIQUE_CODE = "UNIQUE_CODE",
}

export enum EErrorAlertType {
  BANNER_ALERT = "BANNER_ALERT",
  INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
  INLINE_EMAIL = "INLINE_EMAIL",
  INLINE_PASSWORD = "INLINE_PASSWORD",
  INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
}

export enum EAuthenticationErrorCodes {
  // Admin
  ADMIN_ALREADY_EXIST = "5150",
  REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME = "5155",
  INVALID_ADMIN_EMAIL = "5160",
  INVALID_ADMIN_PASSWORD = "5165",
  REQUIRED_ADMIN_EMAIL_PASSWORD = "5170",
  ADMIN_AUTHENTICATION_FAILED = "5175",
  ADMIN_USER_ALREADY_EXIST = "5180",
  ADMIN_USER_DOES_NOT_EXIST = "5185",
  ADMIN_USER_DEACTIVATED = "5190",
}

export type TAuthErrorInfo = {
  type: EErrorAlertType;
  code: EAuthenticationErrorCodes;
  title: string;
  message: ReactNode;
};

const errorCodeMessages: {
  [key in EAuthenticationErrorCodes]: { title: string; message: (email?: string | undefined) => ReactNode };
} = {
  // admin
  [EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST]: {
    title: `Admin already exists`,
    message: () => `Admin already exists. Please try again.`,
  },
  [EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME]: {
    title: `Email, password and first name required`,
    message: () => `Email, password and first name required. Please try again.`,
  },
  [EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL]: {
    title: `Invalid admin email`,
    message: () => `Invalid admin email. Please try again.`,
  },
  [EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD]: {
    title: `Invalid admin password`,
    message: () => `Invalid admin password. Please try again.`,
  },
  [EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD]: {
    title: `Email and password required`,
    message: () => `Email and password required. Please try again.`,
  },
  [EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED]: {
    title: `Authentication failed`,
    message: () => `Authentication failed. Please try again.`,
  },
  [EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST]: {
    title: `Admin user already exists`,
    message: () => (
      <div>
        Admin user already exists.
        <Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
          Sign In
        </Link>
        now.
      </div>
    ),
  },
  [EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST]: {
    title: `Admin user does not exist`,
    message: () => (
      <div>
        Admin user does not exist.
        <Link className="underline underline-offset-4 font-medium hover:font-bold transition-all" href={`/admin`}>
          Sign In
        </Link>
        now.
      </div>
    ),
  },
  [EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED]: {
    title: `User account deactivated`,
    message: () => `User account deactivated. Please contact ${!!SUPPORT_EMAIL ? SUPPORT_EMAIL : "administrator"}.`,
  },
};

export const authErrorHandler = (
  errorCode: EAuthenticationErrorCodes,
  email?: string | undefined
): TAuthErrorInfo | undefined => {
  const bannerAlertErrorCodes = [
    EAuthenticationErrorCodes.ADMIN_ALREADY_EXIST,
    EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME,
    EAuthenticationErrorCodes.INVALID_ADMIN_EMAIL,
    EAuthenticationErrorCodes.INVALID_ADMIN_PASSWORD,
    EAuthenticationErrorCodes.REQUIRED_ADMIN_EMAIL_PASSWORD,
    EAuthenticationErrorCodes.ADMIN_AUTHENTICATION_FAILED,
    EAuthenticationErrorCodes.ADMIN_USER_ALREADY_EXIST,
    EAuthenticationErrorCodes.ADMIN_USER_DOES_NOT_EXIST,
    EAuthenticationErrorCodes.ADMIN_USER_DEACTIVATED,
  ];

  if (bannerAlertErrorCodes.includes(errorCode))
    return {
      type: EErrorAlertType.BANNER_ALERT,
      code: errorCode,
      title: errorCodeMessages[errorCode]?.title || "Error",
      message: errorCodeMessages[errorCode]?.message(email) || "Something went wrong. Please try again.",
    };

  return undefined;
};

export const getBaseAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
  disabled,
  updateConfig,
  resolvedTheme,
}) => [
  {
    key: "unique-codes",
    name: "Unique codes",
    description:
      "Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
    icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
    config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "passwords-login",
    name: "Passwords",
    description: "Allow members to create accounts with passwords and use it with their email addresses to sign in.",
    icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
    config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "google",
    name: "Google",
    description: "Allow members to log in or sign up for Plane with their Google accounts.",
    icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
    config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "github",
    name: "GitHub",
    description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
    icon: (
      <Image
        src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
        height={20}
        width={20}
        alt="GitHub Logo"
      />
    ),
    config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
  {
    key: "gitlab",
    name: "GitLab",
    description: "Allow members to log in or sign up to plane with their GitLab accounts.",
    icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
    config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
  },
];

@@ -1,20 +0,0 @@
import { clsx, type ClassValue } from "clsx";
import { twMerge } from "tailwind-merge";

export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL || "";

export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "";

export const SPACE_BASE_URL = process.env.NEXT_PUBLIC_SPACE_BASE_URL || "";
export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "";

export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL || "";

export const SUPPORT_EMAIL = process.env.NEXT_PUBLIC_SUPPORT_EMAIL || "";

export const ASSET_PREFIX = ADMIN_BASE_PATH;

export const cn = (...inputs: ClassValue[]) => twMerge(clsx(inputs));

export const resolveGeneralTheme = (resolvedTheme: string | undefined) =>
  resolvedTheme?.includes("light") ? "light" : resolvedTheme?.includes("dark") ? "dark" : "system";

@@ -1,14 +0,0 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";

/**
 * @description combine the file path with the base URL
 * @param {string} path
 * @returns {string} final URL with the base URL
 */
export const getFileURL = (path: string): string | undefined => {
  if (!path) return undefined;
  const isValidURL = path.startsWith("http");
  if (isValidURL) return path;
  return `${API_BASE_URL}${path}`;
};
@@ -1,2 +0,0 @@
export * from "./instance.helper";
export * from "./user.helper";

@@ -1,67 +0,0 @@
import zxcvbn from "zxcvbn";

export enum E_PASSWORD_STRENGTH {
  EMPTY = "empty",
  LENGTH_NOT_VALID = "length_not_valid",
  STRENGTH_NOT_VALID = "strength_not_valid",
  STRENGTH_VALID = "strength_valid",
}

const PASSWORD_MIN_LENGTH = 8;
// const PASSWORD_NUMBER_REGEX = /\d/;
// const PASSWORD_CHAR_CAPS_REGEX = /[A-Z]/;
// const PASSWORD_SPECIAL_CHAR_REGEX = /[`!@#$%^&*()_\-+=\[\]{};':"\\|,.<>\/?~ ]/;

export const PASSWORD_CRITERIA = [
  {
    key: "min_8_char",
    label: "Min 8 characters",
    isCriteriaValid: (password: string) => password.length >= PASSWORD_MIN_LENGTH,
  },
  // {
  //   key: "min_1_upper_case",
  //   label: "Min 1 upper-case letter",
  //   isCriteriaValid: (password: string) => PASSWORD_NUMBER_REGEX.test(password),
  // },
  // {
  //   key: "min_1_number",
  //   label: "Min 1 number",
  //   isCriteriaValid: (password: string) => PASSWORD_CHAR_CAPS_REGEX.test(password),
  // },
  // {
  //   key: "min_1_special_char",
  //   label: "Min 1 special character",
  //   isCriteriaValid: (password: string) => PASSWORD_SPECIAL_CHAR_REGEX.test(password),
  // },
];

export const getPasswordStrength = (password: string): E_PASSWORD_STRENGTH => {
  let passwordStrength: E_PASSWORD_STRENGTH = E_PASSWORD_STRENGTH.EMPTY;

  if (!password || password === "" || password.length <= 0) {
    return passwordStrength;
  }

  if (password.length >= PASSWORD_MIN_LENGTH) {
    passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_NOT_VALID;
  } else {
    passwordStrength = E_PASSWORD_STRENGTH.LENGTH_NOT_VALID;
    return passwordStrength;
  }

  const passwordCriteriaValidation = PASSWORD_CRITERIA.map((criteria) => criteria.isCriteriaValid(password)).every(
    (criterion) => criterion
  );
  const passwordStrengthScore = zxcvbn(password).score;

  if (passwordCriteriaValidation === false || passwordStrengthScore <= 2) {
    passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_NOT_VALID;
    return passwordStrength;
  }

  if (passwordCriteriaValidation === true && passwordStrengthScore >= 3) {
    passwordStrength = E_PASSWORD_STRENGTH.STRENGTH_VALID;
  }

  return passwordStrength;
};
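The deleted password helper above classifies a password into four tiers: empty, too short, weak, and valid, gating the last two on a zxcvbn score of at least 3. A minimal Python sketch of the same tiering; since zxcvbn is an external dependency, a naive character-class counter stands in for its 0-4 score here (an assumption, not the real scorer):

```python
from enum import Enum


class PasswordStrength(Enum):
    EMPTY = "empty"
    LENGTH_NOT_VALID = "length_not_valid"
    STRENGTH_NOT_VALID = "strength_not_valid"
    STRENGTH_VALID = "strength_valid"


PASSWORD_MIN_LENGTH = 8


def naive_score(password: str) -> int:
    """Stand-in for zxcvbn's 0-4 score: counts character classes present."""
    return sum([
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(not c.isalnum() for c in password),
    ])


def get_password_strength(password: str) -> PasswordStrength:
    if not password:
        return PasswordStrength.EMPTY
    if len(password) < PASSWORD_MIN_LENGTH:
        return PasswordStrength.LENGTH_NOT_VALID
    # Like the original: length criterion passed, now gate on the score (>= 3 is valid).
    if naive_score(password) <= 2:
        return PasswordStrength.STRENGTH_NOT_VALID
    return PasswordStrength.STRENGTH_VALID
```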
@@ -1,21 +0,0 @@
/**
 * @description
 * This function tests whether a URL is valid or not.
 *
 * It accepts URLs with or without the protocol.
 * @param {string} url
 * @returns {boolean}
 * @example
 * checkURLValidity("https://example.com") => true
 * checkURLValidity("example.com") => true
 * checkURLValidity("example") => false
 */
export const checkURLValidity = (url: string): boolean => {
  if (!url) return false;

  // regex to support complex query parameters and fragments
  const urlPattern =
    /^(https?:\/\/)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))(:\d+)?(\/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$/i;

  return urlPattern.test(url);
};
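The `checkURLValidity` pattern above ports directly between regex flavors. A hedged Python sketch using the identical pattern (the real helper's new home elsewhere in the codebase may differ):

```python
import re

# Same pattern as checkURLValidity: optional scheme, domain or IPv4,
# optional port, path segments, query string, and fragment.
URL_PATTERN = re.compile(
    r"^(https?://)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))"
    r"(:\d+)?(/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$",
    re.IGNORECASE,
)


def check_url_validity(url: str) -> bool:
    return bool(url) and URL_PATTERN.match(url) is not None


assert check_url_validity("https://example.com")
assert check_url_validity("example.com")
assert not check_url_validity("example")
```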
@@ -5,7 +5,6 @@
    "baseUrl": ".",
    "paths": {
      "@/*": ["core/*"],
      "@/helpers/*": ["helpers/*"],
      "@/public/*": ["public/*"],
      "@/plane-admin/*": ["ce/*"]
    }

@@ -4,7 +4,7 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import Cycle, CycleIssue

from plane.utils.timezone_converter import convert_to_utc

class CycleSerializer(BaseSerializer):
    total_issues = serializers.IntegerField(read_only=True)
@@ -24,6 +24,18 @@ class CycleSerializer(BaseSerializer):
            and data.get("start_date", None) > data.get("end_date", None)
        ):
            raise serializers.ValidationError("Start date cannot exceed end date")

        if (
            data.get("start_date", None) is not None
            and data.get("end_date", None) is not None
        ):
            project_id = self.initial_data.get("project_id") or self.instance.project_id
            data["start_date"] = convert_to_utc(
                str(data.get("start_date").date()), project_id, is_start_date=True
            )
            data["end_date"] = convert_to_utc(
                str(data.get("end_date", None).date()), project_id
            )
        return data

    class Meta:
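The serializer now routes cycle dates through `convert_to_utc` from `plane.utils.timezone_converter`, which is defined elsewhere in the repo. A hedged sketch of what such a converter plausibly does: anchor a `YYYY-MM-DD` string to the start or end of that day in the project's timezone and return the corresponding UTC datetime. Note the assumptions: the real helper takes a `project_id` and looks up the project's timezone itself; here the timezone name is passed directly so the sketch is self-contained, and the end-of-day anchor is illustrative.

```python
from datetime import datetime, time, timezone
from zoneinfo import ZoneInfo


def convert_to_utc_sketch(date_str: str, project_tz: str, is_start_date: bool = False) -> datetime:
    """Illustrative stand-in for plane.utils.timezone_converter.convert_to_utc
    (the actual implementation may differ): anchor the date to the start
    (00:00) or end (23:59:59) of that day in the project's timezone, then
    convert to UTC."""
    day = datetime.strptime(date_str, "%Y-%m-%d").date()
    local_time = time.min if is_start_date else time(23, 59, 59)
    local_dt = datetime.combine(day, local_time, tzinfo=ZoneInfo(project_tz))
    return local_dt.astimezone(timezone.utc)


start = convert_to_utc_sketch("2024-06-01", "Asia/Kolkata", is_start_date=True)
# Asia/Kolkata is UTC+05:30, so local midnight lands at 18:30 UTC the previous day.
```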
@@ -237,17 +237,37 @@ class IssueSerializer(BaseSerializer):
                from .user import UserLiteSerializer

                data["assignees"] = UserLiteSerializer(
                    instance.assignees.all(), many=True
                    User.objects.filter(
                        pk__in=IssueAssignee.objects.filter(issue=instance).values_list(
                            "assignee_id", flat=True
                        )
                    ),
                    many=True,
                ).data
            else:
                data["assignees"] = [
                    str(assignee.id) for assignee in instance.assignees.all()
                    str(assignee)
                    for assignee in IssueAssignee.objects.filter(
                        issue=instance
                    ).values_list("assignee_id", flat=True)
                ]
        if "labels" in self.fields:
            if "labels" in self.expand:
                data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
                data["labels"] = LabelSerializer(
                    Label.objects.filter(
                        pk__in=IssueLabel.objects.filter(issue=instance).values_list(
                            "label_id", flat=True
                        )
                    ),
                    many=True,
                ).data
            else:
                data["labels"] = [str(label.id) for label in instance.labels.all()]
                data["labels"] = [
                    str(label)
                    for label in IssueLabel.objects.filter(issue=instance).values_list(
                        "label_id", flat=True
                    )
                ]

        return data

@@ -109,16 +109,6 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
                {"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
            )

        # Create or get state
        state, _ = State.objects.get_or_create(
            name="Triage",
            group="triage",
            description="Default state for managing all Intake Issues",
            project_id=project_id,
            color="#ff7700",
            is_triage=True,
        )

        # create an issue
        issue = Issue.objects.create(
            name=request.data.get("issue", {}).get("name"),
@@ -128,7 +118,6 @@ class IntakeIssueAPIEndpoint(BaseAPIView):
            ),
            priority=request.data.get("issue", {}).get("priority", "none"),
            project_id=project_id,
            state=state,
        )

        # create an intake issue

@@ -258,7 +258,9 @@ class ProjectAPIEndpoint(BaseAPIView):
            ProjectSerializer(project).data, cls=DjangoJSONEncoder
        )

        intake_view = request.data.get("inbox_view", project.intake_view)
        intake_view = request.data.get(
            "inbox_view", request.data.get("intake_view", project.intake_view)
        )

        if project.archived_at:
            return Response(
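The updated endpoint accepts either the legacy `inbox_view` key or the newer `intake_view` key, falling back to the project's stored value when neither is present. The nested `.get()` fallback chain can be sketched on plain dicts:

```python
def resolve_intake_view(payload: dict, current_value: bool) -> bool:
    # Prefer the legacy "inbox_view" key, then the newer "intake_view",
    # and finally keep the project's current setting unchanged.
    return payload.get("inbox_view", payload.get("intake_view", current_value))


assert resolve_intake_view({"inbox_view": True}, False) is True
assert resolve_intake_view({"intake_view": True}, False) is True
assert resolve_intake_view({}, False) is False
```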
@@ -5,6 +5,7 @@ from rest_framework import serializers
from .base import BaseSerializer
from .issue import IssueStateSerializer
from plane.db.models import Cycle, CycleIssue, CycleUserProperties
from plane.utils.timezone_converter import convert_to_utc


class CycleWriteSerializer(BaseSerializer):
@@ -15,6 +16,17 @@ class CycleWriteSerializer(BaseSerializer):
            and data.get("start_date", None) > data.get("end_date", None)
        ):
            raise serializers.ValidationError("Start date cannot exceed end date")
        if (
            data.get("start_date", None) is not None
            and data.get("end_date", None) is not None
        ):
            project_id = self.initial_data.get("project_id") or self.instance.project_id
            data["start_date"] = convert_to_utc(
                str(data.get("start_date").date()), project_id, is_start_date=True
            )
            data["end_date"] = convert_to_utc(
                str(data.get("end_date", None).date()), project_id
            )
        return data

    class Meta:

@@ -116,7 +116,7 @@ class WebhookSerializer(DynamicBaseSerializer):
    class Meta:
        model = Webhook
        fields = "__all__"
        read_only_fields = ["workspace", "secret_key"]
        read_only_fields = ["workspace", "secret_key", "deleted_at"]


class WebhookLogSerializer(DynamicBaseSerializer):

@@ -17,6 +17,7 @@ from .user import urlpatterns as user_urls
from .views import urlpatterns as view_urls
from .webhook import urlpatterns as webhook_urls
from .workspace import urlpatterns as workspace_urls
from .timezone import urlpatterns as timezone_urls

urlpatterns = [
    *analytic_urls,
@@ -38,4 +39,5 @@ urlpatterns = [
    *workspace_urls,
    *api_urls,
    *webhook_urls,
    *timezone_urls,
]

@@ -1,7 +1,7 @@
from django.urls import path


from plane.app.views import GlobalSearchEndpoint, IssueSearchEndpoint
from plane.app.views import GlobalSearchEndpoint, IssueSearchEndpoint, SearchEndpoint


urlpatterns = [
@@ -15,4 +15,9 @@ urlpatterns = [
        IssueSearchEndpoint.as_view(),
        name="project-issue-search",
    ),
    path(
        "workspaces/<str:slug>/entity-search/",
        SearchEndpoint.as_view(),
        name="entity-search",
    ),
]
8
apiserver/plane/app/urls/timezone.py
Normal file
@@ -0,0 +1,8 @@
|
||||
from django.urls import path
|
||||
|
||||
from plane.app.views import TimezoneEndpoint
|
||||
|
||||
urlpatterns = [
|
||||
# timezone endpoint
|
||||
path("timezones/", TimezoneEndpoint.as_view(), name="timezone-list")
|
||||
]
|
||||
@@ -158,7 +158,7 @@ from .page.base import (
|
||||
)
|
||||
from .page.version import PageVersionEndpoint
|
||||
|
||||
from .search.base import GlobalSearchEndpoint
|
||||
from .search.base import GlobalSearchEndpoint, SearchEndpoint
|
||||
from .search.issue import IssueSearchEndpoint
|
||||
|
||||
|
||||
@@ -204,3 +204,5 @@ from .error_404 import custom_404_view
|
||||
|
||||
from .notification.base import MarkAllReadNotificationViewSet
|
||||
from .user.base import AccountEndpoint, ProfileEndpoint, UserSessionEndpoint
|
||||
|
||||
from .timezone.base import TimezoneEndpoint
|
||||
|
||||
@@ -126,7 +126,13 @@ class UserAssetsV2Endpoint(BaseAPIView):
|
||||
)
|
||||
|
||||
# Check if the file type is allowed
|
||||
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
|
||||
allowed_types = [
|
||||
"image/jpeg",
|
||||
"image/png",
|
||||
"image/webp",
|
||||
"image/jpg",
|
||||
"image/gif",
|
||||
]
|
||||
if type not in allowed_types:
|
||||
return Response(
|
||||
{
|
||||
|
||||
@@ -1,5 +1,7 @@
 # Python imports
 import json
+import pytz
+
 # Django imports
 from django.contrib.postgres.aggregates import ArrayAgg
@@ -52,6 +54,11 @@ from plane.bgtasks.recent_visited_task import recent_visited_task
 # Module imports
 from .. import BaseAPIView, BaseViewSet
 from plane.bgtasks.webhook_task import model_activity
+from plane.utils.timezone_converter import (
+    convert_utc_to_project_timezone,
+    convert_to_utc,
+    user_timezone_converter,
+)


 class CycleViewSet(BaseViewSet):
@@ -67,6 +74,19 @@ class CycleViewSet(BaseViewSet):
             project_id=self.kwargs.get("project_id"),
             workspace__slug=self.kwargs.get("slug"),
         )
+
+        project = Project.objects.get(id=self.kwargs.get("project_id"))
+
+        # Fetch project for the specific record or pass project_id dynamically
+        project_timezone = project.timezone
+
+        # Convert the current time (timezone.now()) to the project's timezone
+        local_tz = pytz.timezone(project_timezone)
+        current_time_in_project_tz = timezone.now().astimezone(local_tz)
+
+        # Convert project local time back to UTC for comparison (start_date is stored in UTC)
+        current_time_in_utc = current_time_in_project_tz.astimezone(pytz.utc)
+
         return self.filter_queryset(
             super()
             .get_queryset()
@@ -119,12 +139,15 @@
             .annotate(
                 status=Case(
                     When(
-                        Q(start_date__lte=timezone.now())
-                        & Q(end_date__gte=timezone.now()),
+                        Q(start_date__lte=current_time_in_utc)
+                        & Q(end_date__gte=current_time_in_utc),
                         then=Value("CURRENT"),
                     ),
-                    When(start_date__gt=timezone.now(), then=Value("UPCOMING")),
-                    When(end_date__lt=timezone.now(), then=Value("COMPLETED")),
+                    When(
+                        start_date__gt=current_time_in_utc,
+                        then=Value("UPCOMING"),
+                    ),
+                    When(end_date__lt=current_time_in_utc, then=Value("COMPLETED")),
                     When(
                         Q(start_date__isnull=True) & Q(end_date__isnull=True),
                         then=Value("DRAFT"),
@@ -160,10 +183,22 @@
         # Update the order by
         queryset = queryset.order_by("-is_favorite", "-created_at")

+        project = Project.objects.get(id=self.kwargs.get("project_id"))
+
+        # Fetch project for the specific record or pass project_id dynamically
+        project_timezone = project.timezone
+
+        # Convert the current time (timezone.now()) to the project's timezone
+        local_tz = pytz.timezone(project_timezone)
+        current_time_in_project_tz = timezone.now().astimezone(local_tz)
+
+        # Convert project local time back to UTC for comparison (start_date is stored in UTC)
+        current_time_in_utc = current_time_in_project_tz.astimezone(pytz.utc)
+
         # Current Cycle
         if cycle_view == "current":
             queryset = queryset.filter(
-                start_date__lte=timezone.now(), end_date__gte=timezone.now()
+                start_date__lte=current_time_in_utc, end_date__gte=current_time_in_utc
             )

             data = queryset.values(
@@ -191,6 +226,8 @@
                 "version",
                 "created_by",
             )
+            datetime_fields = ["start_date", "end_date"]
+            data = user_timezone_converter(data, datetime_fields, project_timezone)

             if data:
                 return Response(data, status=status.HTTP_200_OK)
@@ -221,6 +258,8 @@
             "version",
             "created_by",
         )
+        datetime_fields = ["start_date", "end_date"]
+        data = user_timezone_converter(data, datetime_fields, request.user.user_timezone)
         return Response(data, status=status.HTTP_200_OK)

     @allow_permission([ROLE.ADMIN, ROLE.MEMBER])
@@ -417,6 +456,8 @@
         )

         queryset = queryset.first()
+        datetime_fields = ["start_date", "end_date"]
+        data = user_timezone_converter(data, datetime_fields, request.user.user_timezone)

         recent_visited_task.delay(
             slug=slug,
@@ -492,6 +533,9 @@ class CycleDateCheckEndpoint(BaseAPIView):
                 status=status.HTTP_400_BAD_REQUEST,
             )

+        start_date = convert_to_utc(str(start_date), project_id, is_start_date=True)
+        end_date = convert_to_utc(str(end_date), project_id)
+
         # Check if any cycle intersects in the given interval
         cycles = Cycle.objects.filter(
             Q(workspace__slug=slug)
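The timezone round-trip the diff adds to CycleViewSet can be sketched without Django. This is a minimal sketch that substitutes the stdlib zoneinfo for pytz (an assumption made for portability; the PR itself uses pytz):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def current_time_in_utc(project_timezone: str) -> datetime:
    # Mirror of the logic added to CycleViewSet.get_queryset():
    # take "now", view it in the project's timezone, then normalize
    # back to UTC before comparing with start_date/end_date
    # (which are stored in UTC).
    local_tz = ZoneInfo(project_timezone)
    current_time_in_project_tz = datetime.now(timezone.utc).astimezone(local_tz)
    return current_time_in_project_tz.astimezone(timezone.utc)
```

Note that astimezone never changes the instant, only its representation, so this round-trip yields the same UTC moment; the project timezone only becomes material once project-local calendar dates enter the picture, as in convert_to_utc.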
@@ -54,10 +54,11 @@ from plane.utils.issue_filters import issue_filters
 from plane.utils.order_queryset import order_issue_queryset
 from plane.utils.paginator import GroupedOffsetPaginator, SubGroupedOffsetPaginator
 from .. import BaseAPIView, BaseViewSet
-from plane.utils.user_timezone_converter import user_timezone_converter
+from plane.utils.timezone_converter import user_timezone_converter
 from plane.bgtasks.recent_visited_task import recent_visited_task
 from plane.utils.global_paginator import paginate
 from plane.bgtasks.webhook_task import model_activity
+from plane.bgtasks.issue_description_version_task import issue_description_version_task


 class IssueListEndpoint(BaseAPIView):
@@ -428,6 +429,13 @@ class IssueViewSet(BaseViewSet):
                 slug=slug,
                 origin=request.META.get("HTTP_ORIGIN"),
             )
+            # updated issue description version
+            issue_description_version_task.delay(
+                updated_issue=json.dumps(request.data, cls=DjangoJSONEncoder),
+                issue_id=str(serializer.data["id"]),
+                user_id=request.user.id,
+                is_creating=True,
+            )
             return Response(issue, status=status.HTTP_201_CREATED)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -649,6 +657,12 @@
                 slug=slug,
                 origin=request.META.get("HTTP_ORIGIN"),
             )
+            # updated issue description version
+            issue_description_version_task.delay(
+                updated_issue=current_instance,
+                issue_id=str(serializer.data.get("id", None)),
+                user_id=request.user.id,
+            )
             return Response(status=status.HTTP_204_NO_CONTENT)
         return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -20,7 +20,7 @@ from plane.app.serializers import IssueSerializer
 from plane.app.permissions import ProjectEntityPermission
 from plane.db.models import Issue, IssueLink, FileAsset, CycleIssue
 from plane.bgtasks.issue_activities_task import issue_activity
-from plane.utils.user_timezone_converter import user_timezone_converter
+from plane.utils.timezone_converter import user_timezone_converter
 from collections import defaultdict

@@ -28,7 +28,7 @@ from plane.app.permissions import ProjectEntityPermission
 from plane.app.serializers import ModuleDetailSerializer
 from plane.db.models import Issue, Module, ModuleLink, UserFavorite, Project
 from plane.utils.analytics_plot import burndown_plot
-from plane.utils.user_timezone_converter import user_timezone_converter
+from plane.utils.timezone_converter import user_timezone_converter


 # Module imports

@@ -56,7 +56,7 @@ from plane.db.models import (
     Project,
 )
 from plane.utils.analytics_plot import burndown_plot
-from plane.utils.user_timezone_converter import user_timezone_converter
+from plane.utils.timezone_converter import user_timezone_converter
 from plane.bgtasks.webhook_task import model_activity
 from .. import BaseAPIView, BaseViewSet
 from plane.bgtasks.recent_visited_task import recent_visited_task
@@ -2,10 +2,21 @@
 import re

 # Django imports
-from django.db.models import Q, OuterRef, Subquery, Value, UUIDField, CharField
+from django.db import models
+from django.db.models import (
+    Q,
+    OuterRef,
+    Subquery,
+    Value,
+    UUIDField,
+    CharField,
+    When,
+    Case,
+)
 from django.contrib.postgres.aggregates import ArrayAgg
 from django.contrib.postgres.fields import ArrayField
-from django.db.models.functions import Coalesce
+from django.db.models.functions import Coalesce, Concat
 from django.utils import timezone

 # Third party imports
 from rest_framework import status
@@ -21,7 +32,9 @@ from plane.db.models import (
     Module,
     Page,
     IssueView,
+    ProjectMember,
     ProjectPage,
+    WorkspaceMember,
 )

@@ -237,3 +250,459 @@ class GlobalSearchEndpoint(BaseAPIView):
             func = MODELS_MAPPER.get(model, None)
             results[model] = func(query, slug, project_id, workspace_search)
         return Response({"results": results}, status=status.HTTP_200_OK)

(the remainder of this hunk is entirely new code:)

class SearchEndpoint(BaseAPIView):
    def get(self, request, slug):
        query = request.query_params.get("query", False)
        query_types = request.query_params.get("query_type", "user_mention").split(",")
        query_types = [qt.strip() for qt in query_types]
        count = int(request.query_params.get("count", 5))
        project_id = request.query_params.get("project_id", None)
        issue_id = request.query_params.get("issue_id", None)

        response_data = {}

        if project_id:
            for query_type in query_types:
                if query_type == "user_mention":
                    fields = [
                        "member__first_name",
                        "member__last_name",
                        "member__display_name",
                    ]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    base_filters = Q(
                        q,
                        is_active=True,
                        workspace__slug=slug,
                        member__is_bot=False,
                        project_id=project_id,
                        role__gt=10,
                    )
                    if issue_id:
                        issue_created_by = (
                            Issue.objects.filter(id=issue_id)
                            .values_list("created_by_id", flat=True)
                            .first()
                        )
                        # Add condition to include `issue_created_by` in the query
                        filters = Q(member_id=issue_created_by) | base_filters
                    else:
                        filters = base_filters

                    # Query to fetch users
                    users = (
                        ProjectMember.objects.filter(filters)
                        .annotate(
                            member__avatar_url=Case(
                                When(
                                    member__avatar_asset__isnull=False,
                                    then=Concat(
                                        Value("/api/assets/v2/static/"),
                                        "member__avatar_asset",
                                        Value("/"),
                                    ),
                                ),
                                When(
                                    member__avatar_asset__isnull=True,
                                    then="member__avatar",
                                ),
                                default=Value(None),
                                output_field=CharField(),
                            )
                        )
                        .order_by("-created_at")
                        .values(
                            "member__avatar_url",
                            "member__display_name",
                            "member__id",
                        )[:count]
                    )

                    response_data["user_mention"] = list(users)

                elif query_type == "project":
                    fields = ["name", "identifier"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})
                    projects = (
                        Project.objects.filter(
                            q,
                            Q(project_projectmember__member=self.request.user)
                            | Q(network=2),
                            workspace__slug=slug,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name", "id", "identifier", "logo_props", "workspace__slug"
                        )[:count]
                    )
                    response_data["project"] = list(projects)

                elif query_type == "issue":
                    fields = ["name", "sequence_id", "project__identifier"]
                    q = Q()

                    if query:
                        for field in fields:
                            if field == "sequence_id":
                                sequences = re.findall(r"\b\d+\b", query)
                                for sequence_id in sequences:
                                    q |= Q(**{"sequence_id": sequence_id})
                            else:
                                q |= Q(**{f"{field}__icontains": query})

                    issues = (
                        Issue.issue_objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                            project_id=project_id,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "sequence_id",
                            "project__identifier",
                            "project_id",
                            "priority",
                            "state_id",
                            "type_id",
                        )[:count]
                    )
                    response_data["issue"] = list(issues)

                elif query_type == "cycle":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    cycles = (
                        Cycle.objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                            project_id=project_id,
                        )
                        .annotate(
                            status=Case(
                                When(
                                    Q(start_date__lte=timezone.now())
                                    & Q(end_date__gte=timezone.now()),
                                    then=Value("CURRENT"),
                                ),
                                When(
                                    start_date__gt=timezone.now(),
                                    then=Value("UPCOMING"),
                                ),
                                When(
                                    end_date__lt=timezone.now(), then=Value("COMPLETED")
                                ),
                                When(
                                    Q(start_date__isnull=True)
                                    & Q(end_date__isnull=True),
                                    then=Value("DRAFT"),
                                ),
                                default=Value("DRAFT"),
                                output_field=CharField(),
                            )
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "project_id",
                            "project__identifier",
                            "status",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["cycle"] = list(cycles)

                elif query_type == "module":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    modules = (
                        Module.objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                            project_id=project_id,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "project_id",
                            "project__identifier",
                            "status",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["module"] = list(modules)

                elif query_type == "page":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    pages = (
                        Page.objects.filter(
                            q,
                            projects__project_projectmember__member=self.request.user,
                            projects__project_projectmember__is_active=True,
                            projects__id=project_id,
                            workspace__slug=slug,
                            access=0,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "logo_props",
                            "projects__id",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["page"] = list(pages)
            return Response(response_data, status=status.HTTP_200_OK)

        else:
            for query_type in query_types:
                if query_type == "user_mention":
                    fields = [
                        "member__first_name",
                        "member__last_name",
                        "member__display_name",
                    ]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})
                    users = (
                        WorkspaceMember.objects.filter(
                            q,
                            is_active=True,
                            workspace__slug=slug,
                            member__is_bot=False,
                        )
                        .annotate(
                            member__avatar_url=Case(
                                When(
                                    member__avatar_asset__isnull=False,
                                    then=Concat(
                                        Value("/api/assets/v2/static/"),
                                        "member__avatar_asset",
                                        Value("/"),
                                    ),
                                ),
                                When(
                                    member__avatar_asset__isnull=True,
                                    then="member__avatar",
                                ),
                                default=Value(None),
                                output_field=models.CharField(),
                            )
                        )
                        .order_by("-created_at")
                        .values(
                            "member__avatar_url", "member__display_name", "member__id"
                        )[:count]
                    )
                    response_data["user_mention"] = list(users)

                elif query_type == "project":
                    fields = ["name", "identifier"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})
                    projects = (
                        Project.objects.filter(
                            q,
                            Q(project_projectmember__member=self.request.user)
                            | Q(network=2),
                            workspace__slug=slug,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name", "id", "identifier", "logo_props", "workspace__slug"
                        )[:count]
                    )
                    response_data["project"] = list(projects)

                elif query_type == "issue":
                    fields = ["name", "sequence_id", "project__identifier"]
                    q = Q()

                    if query:
                        for field in fields:
                            if field == "sequence_id":
                                sequences = re.findall(r"\b\d+\b", query)
                                for sequence_id in sequences:
                                    q |= Q(**{"sequence_id": sequence_id})
                            else:
                                q |= Q(**{f"{field}__icontains": query})

                    issues = (
                        Issue.issue_objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "sequence_id",
                            "project__identifier",
                            "project_id",
                            "priority",
                            "state_id",
                            "type_id",
                        )[:count]
                    )
                    response_data["issue"] = list(issues)

                elif query_type == "cycle":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    cycles = (
                        Cycle.objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                        )
                        .annotate(
                            status=Case(
                                When(
                                    Q(start_date__lte=timezone.now())
                                    & Q(end_date__gte=timezone.now()),
                                    then=Value("CURRENT"),
                                ),
                                When(
                                    start_date__gt=timezone.now(),
                                    then=Value("UPCOMING"),
                                ),
                                When(
                                    end_date__lt=timezone.now(), then=Value("COMPLETED")
                                ),
                                When(
                                    Q(start_date__isnull=True)
                                    & Q(end_date__isnull=True),
                                    then=Value("DRAFT"),
                                ),
                                default=Value("DRAFT"),
                                output_field=CharField(),
                            )
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "project_id",
                            "project__identifier",
                            "status",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["cycle"] = list(cycles)

                elif query_type == "module":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    modules = (
                        Module.objects.filter(
                            q,
                            project__project_projectmember__member=self.request.user,
                            project__project_projectmember__is_active=True,
                            workspace__slug=slug,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "project_id",
                            "project__identifier",
                            "status",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["module"] = list(modules)

                elif query_type == "page":
                    fields = ["name"]
                    q = Q()

                    if query:
                        for field in fields:
                            q |= Q(**{f"{field}__icontains": query})

                    pages = (
                        Page.objects.filter(
                            q,
                            projects__project_projectmember__member=self.request.user,
                            projects__project_projectmember__is_active=True,
                            workspace__slug=slug,
                            access=0,
                            is_global=True,
                        )
                        .order_by("-created_at")
                        .distinct()
                        .values(
                            "name",
                            "id",
                            "logo_props",
                            "projects__id",
                            "workspace__slug",
                        )[:count]
                    )
                    response_data["page"] = list(pages)
            return Response(response_data, status=status.HTTP_200_OK)
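Every branch of SearchEndpoint builds its filter the same way: an empty Q() OR-ed with one case-insensitive substring match per searchable field. The same pattern can be sketched in plain Python, without Django, to make the semantics explicit (a sketch only; `build_search_predicate` and the dict records are illustrative, not part of the PR):

```python
def build_search_predicate(fields, query):
    # Plain-Python analogue of the view's filter construction:
    #   q = Q()
    #   for field in fields:
    #       q |= Q(**{f"{field}__icontains": query})
    # An empty Q() matches everything, so a falsy query filters nothing;
    # otherwise each field contributes one OR branch of a
    # case-insensitive substring match.
    def predicate(record):
        if not query:
            return True
        return any(query.lower() in str(record.get(f, "")).lower() for f in fields)
    return predicate
```

Applied to a list of dicts, the predicate keeps any record where at least one field contains the query.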
apiserver/plane/app/views/timezone/base.py (new file, 247 lines)
@@ -0,0 +1,247 @@
# Python imports
import pytz
from datetime import datetime

# Django imports
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page

# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
from rest_framework.views import APIView

# Module imports
from plane.authentication.rate_limit import AuthenticationThrottle


class TimezoneEndpoint(APIView):
    permission_classes = [AllowAny]

    throttle_classes = [AuthenticationThrottle]

    @method_decorator(cache_page(60 * 60 * 24))
    def get(self, request):
        timezone_mapping = {
            "-1100": [
                ("Midway Island", "Pacific/Midway"),
                ("American Samoa", "Pacific/Pago_Pago"),
            ],
            "-1000": [
                ("Hawaii", "Pacific/Honolulu"),
                ("Aleutian Islands", "America/Adak"),
            ],
            "-0930": [("Marquesas Islands", "Pacific/Marquesas")],
            "-0900": [
                ("Alaska", "America/Anchorage"),
                ("Gambier Islands", "Pacific/Gambier"),
            ],
            "-0800": [
                ("Pacific Time (US and Canada)", "America/Los_Angeles"),
                ("Baja California", "America/Tijuana"),
            ],
            "-0700": [
                ("Mountain Time (US and Canada)", "America/Denver"),
                ("Arizona", "America/Phoenix"),
                ("Chihuahua, Mazatlan", "America/Chihuahua"),
            ],
            "-0600": [
                ("Central Time (US and Canada)", "America/Chicago"),
                ("Saskatchewan", "America/Regina"),
                ("Guadalajara, Mexico City, Monterrey", "America/Mexico_City"),
                ("Tegucigalpa, Honduras", "America/Tegucigalpa"),
                ("Costa Rica", "America/Costa_Rica"),
            ],
            "-0500": [
                ("Eastern Time (US and Canada)", "America/New_York"),
                ("Lima", "America/Lima"),
                ("Bogota", "America/Bogota"),
                ("Quito", "America/Guayaquil"),
                ("Chetumal", "America/Cancun"),
            ],
            "-0430": [("Caracas (Old Venezuela Time)", "America/Caracas")],
            "-0400": [
                ("Atlantic Time (Canada)", "America/Halifax"),
                ("Caracas", "America/Caracas"),
                ("Santiago", "America/Santiago"),
                ("La Paz", "America/La_Paz"),
                ("Manaus", "America/Manaus"),
                ("Georgetown", "America/Guyana"),
                ("Bermuda", "Atlantic/Bermuda"),
            ],
            "-0330": [("Newfoundland Time (Canada)", "America/St_Johns")],
            "-0300": [
                ("Buenos Aires", "America/Argentina/Buenos_Aires"),
                ("Brasilia", "America/Sao_Paulo"),
                ("Greenland", "America/Godthab"),
                ("Montevideo", "America/Montevideo"),
                ("Falkland Islands", "Atlantic/Stanley"),
            ],
            "-0200": [
                (
                    "South Georgia and the South Sandwich Islands",
                    "Atlantic/South_Georgia",
                )
            ],
            "-0100": [
                ("Azores", "Atlantic/Azores"),
                ("Cape Verde Islands", "Atlantic/Cape_Verde"),
            ],
            "+0000": [
                ("Dublin", "Europe/Dublin"),
                ("Reykjavik", "Atlantic/Reykjavik"),
                ("Lisbon", "Europe/Lisbon"),
                ("Monrovia", "Africa/Monrovia"),
                ("Casablanca", "Africa/Casablanca"),
            ],
            "+0100": [
                ("Central European Time (Berlin, Rome, Paris)", "Europe/Paris"),
                ("West Central Africa", "Africa/Lagos"),
                ("Algiers", "Africa/Algiers"),
                ("Lagos", "Africa/Lagos"),
                ("Tunis", "Africa/Tunis"),
            ],
            "+0200": [
                ("Eastern European Time (Cairo, Helsinki, Kyiv)", "Europe/Kiev"),
                ("Athens", "Europe/Athens"),
                ("Jerusalem", "Asia/Jerusalem"),
                ("Johannesburg", "Africa/Johannesburg"),
                ("Harare, Pretoria", "Africa/Harare"),
            ],
            "+0300": [
                ("Moscow Time", "Europe/Moscow"),
                ("Baghdad", "Asia/Baghdad"),
                ("Nairobi", "Africa/Nairobi"),
                ("Kuwait, Riyadh", "Asia/Riyadh"),
            ],
            "+0330": [("Tehran", "Asia/Tehran")],
            "+0400": [
                ("Abu Dhabi", "Asia/Dubai"),
                ("Baku", "Asia/Baku"),
                ("Yerevan", "Asia/Yerevan"),
                ("Astrakhan", "Europe/Astrakhan"),
                ("Tbilisi", "Asia/Tbilisi"),
                ("Mauritius", "Indian/Mauritius"),
            ],
            "+0500": [
                ("Islamabad", "Asia/Karachi"),
                ("Karachi", "Asia/Karachi"),
                ("Tashkent", "Asia/Tashkent"),
                ("Yekaterinburg", "Asia/Yekaterinburg"),
                ("Maldives", "Indian/Maldives"),
                ("Chagos", "Indian/Chagos"),
            ],
            "+0530": [
                ("Chennai", "Asia/Kolkata"),
                ("Kolkata", "Asia/Kolkata"),
                ("Mumbai", "Asia/Kolkata"),
                ("New Delhi", "Asia/Kolkata"),
                ("Sri Jayawardenepura", "Asia/Colombo"),
            ],
            "+0545": [("Kathmandu", "Asia/Kathmandu")],
            "+0600": [
                ("Dhaka", "Asia/Dhaka"),
                ("Almaty", "Asia/Almaty"),
                ("Bishkek", "Asia/Bishkek"),
                ("Thimphu", "Asia/Thimphu"),
            ],
            "+0630": [
                ("Yangon (Rangoon)", "Asia/Yangon"),
                ("Cocos Islands", "Indian/Cocos"),
            ],
            "+0700": [
                ("Bangkok", "Asia/Bangkok"),
                ("Hanoi", "Asia/Ho_Chi_Minh"),
                ("Jakarta", "Asia/Jakarta"),
                ("Novosibirsk", "Asia/Novosibirsk"),
                ("Krasnoyarsk", "Asia/Krasnoyarsk"),
            ],
            "+0800": [
                ("Beijing", "Asia/Shanghai"),
                ("Singapore", "Asia/Singapore"),
                ("Perth", "Australia/Perth"),
                ("Hong Kong", "Asia/Hong_Kong"),
                ("Ulaanbaatar", "Asia/Ulaanbaatar"),
                ("Palau", "Pacific/Palau"),
            ],
            "+0845": [("Eucla", "Australia/Eucla")],
            "+0900": [
                ("Tokyo", "Asia/Tokyo"),
                ("Seoul", "Asia/Seoul"),
                ("Yakutsk", "Asia/Yakutsk"),
            ],
            "+0930": [
                ("Adelaide", "Australia/Adelaide"),
                ("Darwin", "Australia/Darwin"),
            ],
            "+1000": [
                ("Sydney", "Australia/Sydney"),
                ("Brisbane", "Australia/Brisbane"),
                ("Guam", "Pacific/Guam"),
                ("Vladivostok", "Asia/Vladivostok"),
                ("Tahiti", "Pacific/Tahiti"),
            ],
            "+1030": [("Lord Howe Island", "Australia/Lord_Howe")],
            "+1100": [
                ("Solomon Islands", "Pacific/Guadalcanal"),
                ("Magadan", "Asia/Magadan"),
                ("Norfolk Island", "Pacific/Norfolk"),
                ("Bougainville Island", "Pacific/Bougainville"),
                ("Chokurdakh", "Asia/Srednekolymsk"),
            ],
            "+1200": [
                ("Auckland", "Pacific/Auckland"),
                ("Wellington", "Pacific/Auckland"),
                ("Fiji Islands", "Pacific/Fiji"),
                ("Anadyr", "Asia/Anadyr"),
            ],
            "+1245": [("Chatham Islands", "Pacific/Chatham")],
            "+1300": [("Nuku'alofa", "Pacific/Tongatapu"), ("Samoa", "Pacific/Apia")],
            "+1400": [("Kiritimati Island", "Pacific/Kiritimati")],
        }

        timezone_list = []
        now = datetime.now()

        # Process timezone mapping
        for offset, locations in timezone_mapping.items():
            sign = "-" if offset.startswith("-") else "+"
            hours = offset[1:3]
            minutes = offset[3:] if len(offset) > 3 else "00"

            for friendly_name, tz_identifier in locations:
                try:
                    tz = pytz.timezone(tz_identifier)
                    current_offset = now.astimezone(tz).strftime("%z")

                    # converting and formatting UTC offset to GMT offset
                    current_utc_offset = now.astimezone(tz).utcoffset()
                    total_seconds = int(current_utc_offset.total_seconds())
                    hours_offset = total_seconds // 3600
                    minutes_offset = abs(total_seconds % 3600) // 60
                    gmt_offset = (
                        f"GMT{'+' if hours_offset >= 0 else '-'}"
                        f"{abs(hours_offset):02}:{minutes_offset:02}"
                    )

                    timezone_value = {
                        "offset": int(current_offset),
                        "utc_offset": f"UTC{sign}{hours}:{minutes}",
                        "gmt_offset": gmt_offset,
                        "value": tz_identifier,
                        "label": f"{friendly_name}",
                    }

                    timezone_list.append(timezone_value)
                except pytz.exceptions.UnknownTimeZoneError:
                    continue

        # Sort by offset and then by label
        timezone_list.sort(key=lambda x: (x["offset"], x["label"]))

        # Remove offset from final output
        for tz in timezone_list:
            del tz["offset"]

        return Response({"timezones": timezone_list}, status=status.HTTP_200_OK)
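The "GMT+HH:MM" formatting step of the endpoint is worth isolating, because floor division on a negative total needs care: for a -09:30 offset, `total_seconds // 3600` floors to -10 while Python's modulo yields +30 minutes, which would label the zone GMT-10:30. This standalone sketch (my helper name, not the PR's) sidesteps that by working from the absolute value:

```python
from datetime import timedelta

def format_gmt_offset(utc_offset: timedelta) -> str:
    # Format a UTC offset as "GMT+HH:MM" / "GMT-HH:MM".
    # Taking divmod of the absolute seconds keeps hour and minute
    # parts consistent for negative half-hour offsets.
    total_seconds = int(utc_offset.total_seconds())
    sign = "+" if total_seconds >= 0 else "-"
    hours, rem = divmod(abs(total_seconds), 3600)
    minutes = rem // 60
    return f"GMT{sign}{hours:02}:{minutes:02}"
```

For whole-hour offsets this agrees with the endpoint's arithmetic; the two diverge only on negative fractional-hour zones such as Pacific/Marquesas.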
@@ -10,7 +10,7 @@ from plane.app.views.base import BaseAPIView
 from plane.db.models import Cycle
 from plane.app.permissions import WorkspaceViewerPermission
 from plane.app.serializers.cycle import CycleSerializer
-
+from plane.utils.timezone_converter import user_timezone_converter

 class WorkspaceCyclesEndpoint(BaseAPIView):
     permission_classes = [WorkspaceViewerPermission]
125
apiserver/plane/bgtasks/issue_description_version_sync.py
Normal file
125
apiserver/plane/bgtasks/issue_description_version_sync.py
Normal file
@@ -0,0 +1,125 @@
|
||||
# Python imports
|
||||
from typing import Optional
|
||||
import logging
|
||||
|
||||
# Django imports
|
||||
from django.utils import timezone
|
||||
from django.db import transaction
|
||||
|
||||
# Third party imports
|
||||
from celery import shared_task
|
||||
|
# Module imports
from plane.db.models import Issue, IssueDescriptionVersion, ProjectMember
from plane.utils.exception_logger import log_exception


def get_owner_id(issue: Issue) -> Optional[int]:
    """Get the owner ID of the issue"""

    if issue.updated_by_id:
        return issue.updated_by_id

    if issue.created_by_id:
        return issue.created_by_id

    # Find project admin as fallback
    project_member = ProjectMember.objects.filter(
        project_id=issue.project_id,
        role=20,  # Admin role
    ).first()

    return project_member.member_id if project_member else None


@shared_task
def sync_issue_description_version(batch_size=5000, offset=0, countdown=300):
    """Task to create IssueDescriptionVersion records for existing Issues in batches"""
    try:
        with transaction.atomic():
            base_query = Issue.objects
            total_issues_count = base_query.count()

            if total_issues_count == 0:
                return

            # Calculate batch range
            end_offset = min(offset + batch_size, total_issues_count)

            # Fetch issues with related data
            issues_batch = (
                base_query.order_by("created_at")
                .select_related("workspace", "project")
                .only(
                    "id",
                    "workspace_id",
                    "project_id",
                    "created_by_id",
                    "updated_by_id",
                    "description_binary",
                    "description_html",
                    "description_stripped",
                    "description",
                )[offset:end_offset]
            )

            if not issues_batch:
                return

            version_objects = []
            for issue in issues_batch:
                # Validate required fields
                if not issue.workspace_id or not issue.project_id:
                    logging.warning(
                        f"Skipping {issue.id} - missing workspace_id or project_id"
                    )
                    continue

                # Determine owned_by_id
                owned_by_id = get_owner_id(issue)
                if owned_by_id is None:
                    logging.warning(f"Skipping issue {issue.id} - missing owned_by")
                    continue

                # Create version object
                version_objects.append(
                    IssueDescriptionVersion(
                        workspace_id=issue.workspace_id,
                        project_id=issue.project_id,
                        created_by_id=issue.created_by_id,
                        updated_by_id=issue.updated_by_id,
                        owned_by_id=owned_by_id,
                        last_saved_at=timezone.now(),
                        issue_id=issue.id,
                        description_binary=issue.description_binary,
                        description_html=issue.description_html,
                        description_stripped=issue.description_stripped,
                        description_json=issue.description,
                    )
                )

            # Bulk create version objects
            if version_objects:
                IssueDescriptionVersion.objects.bulk_create(version_objects)

            # Schedule next batch if needed
            if end_offset < total_issues_count:
                sync_issue_description_version.apply_async(
                    kwargs={
                        "batch_size": batch_size,
                        "offset": end_offset,
                        "countdown": countdown,
                    },
                    countdown=countdown,
                )
            return
    except Exception as e:
        log_exception(e)
        return


@shared_task
def schedule_issue_description_version(batch_size=5000, countdown=300):
    sync_issue_description_version.delay(
        batch_size=int(batch_size), countdown=countdown
    )
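The self-rescheduling batching above always clamps the batch end with `end_offset = min(offset + batch_size, total)` and re-enqueues itself with `offset=end_offset`. The chaining can be sketched in isolation; `plan_batches` is a hypothetical helper (not part of the task) that just enumerates the `(offset, end_offset)` ranges the task would walk through:

```python
def plan_batches(total: int, batch_size: int) -> list:
    """Enumerate the (offset, end_offset) ranges the batched task processes."""
    batches = []
    offset = 0
    while offset < total:
        # Same clamp as the task: never read past the last issue
        end_offset = min(offset + batch_size, total)
        batches.append((offset, end_offset))
        # The task re-enqueues itself with offset=end_offset
        offset = end_offset
    return batches

print(plan_batches(12000, 5000))  # → [(0, 5000), (5000, 10000), (10000, 12000)]
```

Note the final batch is smaller than `batch_size`; the `countdown` between Celery enqueues spaces the batches out rather than changing the ranges.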
84
apiserver/plane/bgtasks/issue_description_version_task.py
Normal file
@@ -0,0 +1,84 @@
from celery import shared_task
from django.db import transaction
from django.utils import timezone
from typing import Optional, Dict
import json

from plane.db.models import Issue, IssueDescriptionVersion
from plane.utils.exception_logger import log_exception


def should_update_existing_version(
    version: IssueDescriptionVersion, user_id: str, max_time_difference: int = 600
) -> bool:
    if not version:
        return False

    time_difference = (timezone.now() - version.last_saved_at).total_seconds()
    return (
        str(version.owned_by_id) == str(user_id)
        and time_difference <= max_time_difference
    )


def update_existing_version(version: IssueDescriptionVersion, issue) -> None:
    version.description_json = issue.description
    version.description_html = issue.description_html
    version.description_binary = issue.description_binary
    version.description_stripped = issue.description_stripped
    version.last_saved_at = timezone.now()

    version.save(
        update_fields=[
            "description_json",
            "description_html",
            "description_binary",
            "description_stripped",
            "last_saved_at",
        ]
    )


@shared_task
def issue_description_version_task(
    updated_issue, issue_id, user_id, is_creating=False
) -> Optional[bool]:
    try:
        # Parse updated issue data
        current_issue: Dict = json.loads(updated_issue) if updated_issue else {}

        # Get current issue
        issue = Issue.objects.get(id=issue_id)

        # Skip if the description has not changed
        if (
            current_issue.get("description_html") == issue.description_html
            and not is_creating
        ):
            return

        with transaction.atomic():
            # Get latest version
            latest_version = (
                IssueDescriptionVersion.objects.filter(issue_id=issue_id)
                .order_by("-last_saved_at")
                .first()
            )

            # Determine whether to update existing or create new version
            if should_update_existing_version(version=latest_version, user_id=user_id):
                update_existing_version(latest_version, issue)
            else:
                IssueDescriptionVersion.log_issue_description_version(issue, user_id)

        return

    except Issue.DoesNotExist:
        # Issue no longer exists, skip processing
        return
    except json.JSONDecodeError as e:
        log_exception(f"Invalid JSON for updated_issue: {e}")
        return
    except Exception as e:
        log_exception(f"Error processing issue description version: {e}")
        return
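The update-vs-create decision in `should_update_existing_version` reduces to: same author and last save within 600 seconds. A standalone sketch of that rule, using plain datetimes instead of model instances (`collapses_into_latest` is an illustrative name, not from the codebase):

```python
from datetime import datetime, timedelta


def collapses_into_latest(last_saved_at: datetime, version_owner: str,
                          user_id: str, now: datetime,
                          max_time_difference: int = 600) -> bool:
    """Mirror of the task's rule: same owner and saved within the time window."""
    time_difference = (now - last_saved_at).total_seconds()
    return str(version_owner) == str(user_id) and time_difference <= max_time_difference


now = datetime(2025, 1, 1, 12, 0, 0)
# Saved 5 minutes ago by the same user → the latest version is updated in place
print(collapses_into_latest(now - timedelta(minutes=5), "u1", "u1", now))   # True
# Saved 11 minutes ago → a new version row is created instead
print(collapses_into_latest(now - timedelta(minutes=11), "u1", "u1", now))  # False
```

This is what coalesces rapid consecutive edits by one user into a single version row instead of one row per save.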
254
apiserver/plane/bgtasks/issue_version_sync.py
Normal file
@@ -0,0 +1,254 @@
# Python imports
import json
from typing import Optional, List, Dict
from uuid import UUID
from itertools import groupby
import logging

# Django imports
from django.utils import timezone
from django.db import transaction

# Third party imports
from celery import shared_task

# Module imports
from plane.db.models import (
    Issue,
    IssueVersion,
    ProjectMember,
    CycleIssue,
    ModuleIssue,
    IssueActivity,
    IssueAssignee,
    IssueLabel,
)
from plane.utils.exception_logger import log_exception


@shared_task
def issue_task(updated_issue, issue_id, user_id):
    try:
        current_issue = json.loads(updated_issue) if updated_issue else {}
        issue = Issue.objects.get(id=issue_id)

        updated_current_issue = {}
        for key, value in current_issue.items():
            if getattr(issue, key) != value:
                updated_current_issue[key] = value

        if updated_current_issue:
            issue_version = (
                IssueVersion.objects.filter(issue_id=issue_id)
                .order_by("-last_saved_at")
                .first()
            )

            if (
                issue_version
                and str(issue_version.owned_by) == str(user_id)
                and (timezone.now() - issue_version.last_saved_at).total_seconds()
                <= 600
            ):
                for key, value in updated_current_issue.items():
                    setattr(issue_version, key, value)
                issue_version.last_saved_at = timezone.now()
                issue_version.save(
                    update_fields=list(updated_current_issue.keys()) + ["last_saved_at"]
                )
            else:
                IssueVersion.log_issue_version(issue, user_id)

        return
    except Issue.DoesNotExist:
        return
    except Exception as e:
        log_exception(e)
        return


def get_owner_id(issue: Issue) -> Optional[int]:
    """Get the owner ID of the issue"""

    if issue.updated_by_id:
        return issue.updated_by_id

    if issue.created_by_id:
        return issue.created_by_id

    # Find project admin as fallback
    project_member = ProjectMember.objects.filter(
        project_id=issue.project_id,
        role=20,  # Admin role
    ).first()

    return project_member.member_id if project_member else None


def get_related_data(issue_ids: List[UUID]) -> Dict:
    """Get related data for the given issue IDs"""

    cycle_issues = {
        ci.issue_id: ci.cycle_id
        for ci in CycleIssue.objects.filter(issue_id__in=issue_ids)
    }

    # Get assignees with proper grouping
    assignee_records = list(
        IssueAssignee.objects.filter(issue_id__in=issue_ids)
        .values_list("issue_id", "assignee_id")
        .order_by("issue_id")
    )
    assignees = {}
    for issue_id, group in groupby(assignee_records, key=lambda x: x[0]):
        assignees[issue_id] = [str(g[1]) for g in group]

    # Get labels with proper grouping
    label_records = list(
        IssueLabel.objects.filter(issue_id__in=issue_ids)
        .values_list("issue_id", "label_id")
        .order_by("issue_id")
    )
    labels = {}
    for issue_id, group in groupby(label_records, key=lambda x: x[0]):
        labels[issue_id] = [str(g[1]) for g in group]

    # Get modules with proper grouping
    module_records = list(
        ModuleIssue.objects.filter(issue_id__in=issue_ids)
        .values_list("issue_id", "module_id")
        .order_by("issue_id")
    )
    modules = {}
    for issue_id, group in groupby(module_records, key=lambda x: x[0]):
        modules[issue_id] = [str(g[1]) for g in group]

    # Get latest activities
    latest_activities = {}
    activities = IssueActivity.objects.filter(issue_id__in=issue_ids).order_by(
        "issue_id", "-created_at"
    )
    for issue_id, activities_group in groupby(activities, key=lambda x: x.issue_id):
        first_activity = next(activities_group, None)
        if first_activity:
            latest_activities[issue_id] = first_activity.id

    return {
        "cycle_issues": cycle_issues,
        "assignees": assignees,
        "labels": labels,
        "modules": modules,
        "activities": latest_activities,
    }


def create_issue_version(issue: Issue, related_data: Dict) -> Optional[IssueVersion]:
    """Create IssueVersion object from the given issue and related data"""

    try:
        if not issue.workspace_id or not issue.project_id:
            logging.warning(
                f"Skipping issue {issue.id} - missing workspace_id or project_id"
            )
            return None

        owned_by_id = get_owner_id(issue)
        if owned_by_id is None:
            logging.warning(f"Skipping issue {issue.id} - missing owned_by")
            return None

        return IssueVersion(
            workspace_id=issue.workspace_id,
            project_id=issue.project_id,
            created_by_id=issue.created_by_id,
            updated_by_id=issue.updated_by_id,
            owned_by_id=owned_by_id,
            last_saved_at=timezone.now(),
            activity_id=related_data["activities"].get(issue.id),
            properties=getattr(issue, "properties", {}),
            meta=getattr(issue, "meta", {}),
            issue_id=issue.id,
            parent=issue.parent_id,
            state=issue.state_id,
            estimate_point=issue.estimate_point_id,
            name=issue.name,
            priority=issue.priority,
            start_date=issue.start_date,
            target_date=issue.target_date,
            assignees=related_data["assignees"].get(issue.id, []),
            sequence_id=issue.sequence_id,
            labels=related_data["labels"].get(issue.id, []),
            sort_order=issue.sort_order,
            completed_at=issue.completed_at,
            archived_at=issue.archived_at,
            is_draft=issue.is_draft,
            external_source=issue.external_source,
            external_id=issue.external_id,
            type=issue.type_id,
            cycle=related_data["cycle_issues"].get(issue.id),
            modules=related_data["modules"].get(issue.id, []),
        )
    except Exception as e:
        log_exception(e)
        return None


@shared_task
def sync_issue_version(batch_size=5000, offset=0, countdown=300):
    """Task to create IssueVersion records for existing Issues in batches"""

    try:
        with transaction.atomic():
            base_query = Issue.objects
            total_issues_count = base_query.count()

            if total_issues_count == 0:
                return

            end_offset = min(offset + batch_size, total_issues_count)

            # Get issues batch with optimized queries
            issues_batch = list(
                base_query.order_by("created_at")
                .select_related("workspace", "project")
                .all()[offset:end_offset]
            )

            if not issues_batch:
                return

            # Get all related data in bulk
            issue_ids = [issue.id for issue in issues_batch]
            related_data = get_related_data(issue_ids)

            issue_versions = []
            for issue in issues_batch:
                version = create_issue_version(issue, related_data)
                if version:
                    issue_versions.append(version)

            # Bulk create versions
            if issue_versions:
                IssueVersion.objects.bulk_create(issue_versions, batch_size=1000)

            # Schedule the next batch if there are more issues to process
            if end_offset < total_issues_count:
                sync_issue_version.apply_async(
                    kwargs={
                        "batch_size": batch_size,
                        "offset": end_offset,
                        "countdown": countdown,
                    },
                    countdown=countdown,
                )

            logging.info(f"Processed Issues: {end_offset}")
            return
    except Exception as e:
        log_exception(e)
        return


@shared_task
def schedule_issue_version(batch_size=5000, countdown=300):
    sync_issue_version.delay(batch_size=int(batch_size), countdown=countdown)
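`get_related_data` leans on `itertools.groupby`, which only groups *consecutive* equal keys — hence the `.order_by("issue_id")` before each grouping. A minimal stdlib sketch of the same pattern, with hypothetical IDs:

```python
from itertools import groupby

# (issue_id, assignee_id) pairs, already sorted by issue_id — the task's
# .order_by("issue_id") guarantees this before grouping
records = [("i1", "a1"), ("i1", "a2"), ("i2", "a3")]

assignees = {}
for issue_id, group in groupby(records, key=lambda x: x[0]):
    assignees[issue_id] = [str(g[1]) for g in group]

print(assignees)  # → {'i1': ['a1', 'a2'], 'i2': ['a3']}

# Without sorting, equal keys that are not adjacent land in separate groups:
unsorted = [("i1", "a1"), ("i2", "a3"), ("i1", "a2")]
print([k for k, _ in groupby(unsorted, key=lambda x: x[0])])  # → ['i1', 'i2', 'i1']
```

The "latest activities" loop uses the same trick with `order_by("issue_id", "-created_at")`, then takes only the first row of each group via `next(activities_group, None)`.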
@@ -32,7 +32,6 @@ from bs4 import BeautifulSoup

 def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
     aggregated_issue_mentions = []
-
     for mention_id in new_mentions:
         aggregated_issue_mentions.append(
             IssueMention(
@@ -125,7 +124,9 @@ def extract_mentions(issue_instance):
     data = json.loads(issue_instance)
     html = data.get("description_html")
     soup = BeautifulSoup(html, "html.parser")
-    mention_tags = soup.find_all("mention-component", attrs={"target": "users"})
+    mention_tags = soup.find_all(
+        "mention-component", attrs={"entity_name": "user_mention"}
+    )

     mentions = [mention_tag["entity_identifier"] for mention_tag in mention_tags]

@@ -139,7 +140,9 @@ def extract_comment_mentions(comment_value):
     try:
         mentions = []
         soup = BeautifulSoup(comment_value, "html.parser")
-        mentions_tags = soup.find_all("mention-component", attrs={"target": "users"})
+        mentions_tags = soup.find_all(
+            "mention-component", attrs={"entity_name": "user_mention"}
+        )
         for mention_tag in mentions_tags:
             mentions.append(mention_tag["entity_identifier"])
         return list(set(mentions))
@@ -255,12 +258,9 @@ def notifications(
     new_mentions = get_new_mentions(
         requested_instance=requested_data, current_instance=current_instance
     )
-
-    new_mentions = [
-        str(mention)
-        for mention in new_mentions
-        if mention in set(project_members)
-    ]
+    new_mentions = list(
+        set(new_mentions) & {str(member) for member in project_members}
+    )
     removed_mention = get_removed_mentions(
         requested_instance=requested_data, current_instance=current_instance
     )
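The hunks above switch the mention selector from `target="users"` to `entity_name="user_mention"`. The same extraction can be sketched with the stdlib `html.parser` instead of BeautifulSoup (illustrative only; the actual code uses `soup.find_all` as shown in the diff):

```python
from html.parser import HTMLParser


class MentionExtractor(HTMLParser):
    """Collect entity_identifier from <mention-component entity_name="user_mention">."""

    def __init__(self):
        super().__init__()
        self.mentions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "mention-component" and attrs.get("entity_name") == "user_mention":
            self.mentions.append(attrs.get("entity_identifier"))


parser = MentionExtractor()
parser.feed(
    '<p>Hi <mention-component entity_name="user_mention" '
    'entity_identifier="u-42"></mention-component>!</p>'
)
print(parser.mentions)  # → ['u-42']
```

Tags using the old `target="users"` attribute are simply ignored by the new selector, which is why both `extract_mentions` and `extract_comment_mentions` had to be updated together.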
@@ -0,0 +1,23 @@
# Django imports
from django.core.management.base import BaseCommand

# Module imports
from plane.bgtasks.issue_description_version_sync import (
    schedule_issue_description_version,
)


class Command(BaseCommand):
    help = "Creates IssueDescriptionVersion records for existing Issues in batches"

    def handle(self, *args, **options):
        batch_size = input("Enter the batch size: ")
        batch_countdown = input("Enter the batch countdown: ")

        schedule_issue_description_version.delay(
            batch_size=batch_size, countdown=int(batch_countdown)
        )

        self.stdout.write(
            self.style.SUCCESS("Successfully created issue description version task")
        )
19
apiserver/plane/db/management/commands/sync_issue_version.py
Normal file
@@ -0,0 +1,19 @@
# Django imports
from django.core.management.base import BaseCommand

# Module imports
from plane.bgtasks.issue_version_sync import schedule_issue_version


class Command(BaseCommand):
    help = "Creates IssueVersion records for existing Issues in batches"

    def handle(self, *args, **options):
        batch_size = input("Enter the batch size: ")
        batch_countdown = input("Enter the batch countdown: ")

        schedule_issue_version.delay(
            batch_size=batch_size, countdown=int(batch_countdown)
        )

        self.stdout.write(self.style.SUCCESS("Successfully created issue version task"))
@@ -0,0 +1,117 @@
# Generated by Django 4.2.17 on 2024-12-13 10:09

from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import plane.db.models.user
import uuid


class Migration(migrations.Migration):

    dependencies = [
        ('db', '0086_issueversion_alter_teampage_unique_together_and_more'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='issueversion',
            name='description',
        ),
        migrations.RemoveField(
            model_name='issueversion',
            name='description_binary',
        ),
        migrations.RemoveField(
            model_name='issueversion',
            name='description_html',
        ),
        migrations.RemoveField(
            model_name='issueversion',
            name='description_stripped',
        ),
        migrations.AddField(
            model_name='issueversion',
            name='activity',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='versions', to='db.issueactivity'),
        ),
        migrations.AddField(
            model_name='profile',
            name='is_mobile_onboarded',
            field=models.BooleanField(default=False),
        ),
        migrations.AddField(
            model_name='profile',
            name='mobile_onboarding_step',
            field=models.JSONField(default=plane.db.models.user.get_mobile_default_onboarding),
        ),
        migrations.AddField(
            model_name='profile',
            name='mobile_timezone_auto_set',
            field=models.BooleanField(default=False),
        ),
        migrations.AddField(
            model_name='profile',
            name='language',
            field=models.CharField(default='en', max_length=255),
        ),
        migrations.AlterField(
            model_name='issueversion',
            name='owned_by',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_versions', to=settings.AUTH_USER_MODEL),
        ),
        migrations.CreateModel(
            name='Sticky',
            fields=[
                ('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
                ('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
                ('deleted_at', models.DateTimeField(blank=True, null=True, verbose_name='Deleted At')),
                ('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
                ('name', models.TextField()),
                ('description', models.JSONField(blank=True, default=dict)),
                ('description_html', models.TextField(blank=True, default='<p></p>')),
                ('description_stripped', models.TextField(blank=True, null=True)),
                ('description_binary', models.BinaryField(null=True)),
                ('logo_props', models.JSONField(default=dict)),
                ('color', models.CharField(blank=True, max_length=255, null=True)),
                ('background_color', models.CharField(blank=True, max_length=255, null=True)),
                ('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
                ('owner', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='stickies', to=settings.AUTH_USER_MODEL)),
                ('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
                ('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='stickies', to='db.workspace')),
            ],
            options={
                'verbose_name': 'Sticky',
                'verbose_name_plural': 'Stickies',
                'db_table': 'stickies',
                'ordering': ('-created_at',),
            },
        ),
        migrations.CreateModel(
            name='IssueDescriptionVersion',
            fields=[
                ('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
                ('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
                ('deleted_at', models.DateTimeField(blank=True, null=True, verbose_name='Deleted At')),
                ('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
                ('description_binary', models.BinaryField(null=True)),
                ('description_html', models.TextField(blank=True, default='<p></p>')),
                ('description_stripped', models.TextField(blank=True, null=True)),
                ('description_json', models.JSONField(blank=True, default=dict)),
                ('last_saved_at', models.DateTimeField(default=django.utils.timezone.now)),
                ('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
                ('issue', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='description_versions', to='db.issue')),
                ('owned_by', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='issue_description_versions', to=settings.AUTH_USER_MODEL)),
                ('project', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
                ('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
                ('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
            ],
            options={
                'verbose_name': 'Issue Description Version',
                'verbose_name_plural': 'Issue Description Versions',
                'db_table': 'issue_description_versions',
            },
        ),
    ]
@@ -0,0 +1,124 @@
# Generated by Django 4.2.15 on 2024-12-24 14:57

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid


class Migration(migrations.Migration):

    dependencies = [
        ('db', '0087_remove_issueversion_description_and_more'),
    ]

    operations = [
        migrations.AddField(
            model_name="sticky",
            name="sort_order",
            field=models.FloatField(default=65535),
        ),
        migrations.CreateModel(
            name="WorkspaceUserLink",
            fields=[
                (
                    "created_at",
                    models.DateTimeField(auto_now_add=True, verbose_name="Created At"),
                ),
                (
                    "updated_at",
                    models.DateTimeField(
                        auto_now=True, verbose_name="Last Modified At"
                    ),
                ),
                (
                    "deleted_at",
                    models.DateTimeField(
                        blank=True, null=True, verbose_name="Deleted At"
                    ),
                ),
                (
                    "id",
                    models.UUIDField(
                        db_index=True,
                        default=uuid.uuid4,
                        editable=False,
                        primary_key=True,
                        serialize=False,
                        unique=True,
                    ),
                ),
                ("title", models.CharField(blank=True, max_length=255, null=True)),
                ("url", models.TextField()),
                ("metadata", models.JSONField(default=dict)),
                (
                    "created_by",
                    models.ForeignKey(
                        null=True,
                        on_delete=django.db.models.deletion.SET_NULL,
                        related_name="%(class)s_created_by",
                        to=settings.AUTH_USER_MODEL,
                        verbose_name="Created By",
                    ),
                ),
                (
                    "owner",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="owner_workspace_user_link",
                        to=settings.AUTH_USER_MODEL,
                    ),
                ),
                (
                    "project",
                    models.ForeignKey(
                        null=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="project_%(class)s",
                        to="db.project",
                    ),
                ),
                (
                    "updated_by",
                    models.ForeignKey(
                        null=True,
                        on_delete=django.db.models.deletion.SET_NULL,
                        related_name="%(class)s_updated_by",
                        to=settings.AUTH_USER_MODEL,
                        verbose_name="Last Modified By",
                    ),
                ),
                (
                    "workspace",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="workspace_%(class)s",
                        to="db.workspace",
                    ),
                ),
            ],
            options={
                "verbose_name": "Workspace User Link",
                "verbose_name_plural": "Workspace User Links",
                "db_table": "workspace_user_links",
                "ordering": ("-created_at",),
            },
        ),
        migrations.AlterField(
            model_name="pagelog",
            name="entity_name",
            field=models.CharField(max_length=30, verbose_name="Transaction Type"),
        ),
        migrations.AlterUniqueTogether(
            name="webhook",
            unique_together={("workspace", "url", "deleted_at")},
        ),
        migrations.AddConstraint(
            model_name="webhook",
            constraint=models.UniqueConstraint(
                condition=models.Q(("deleted_at__isnull", True)),
                fields=("workspace", "url"),
                name="webhook_url_unique_url_when_deleted_at_null",
            ),
        ),
    ]
@@ -41,6 +41,8 @@ from .issue import (
     IssueSequence,
     IssueSubscriber,
     IssueVote,
+    IssueVersion,
+    IssueDescriptionVersion,
 )
 from .module import Module, ModuleIssue, ModuleLink, ModuleMember, ModuleUserProperties
 from .notification import EmailNotificationLog, Notification, UserNotificationPreference
@@ -66,17 +68,9 @@ from .workspace import (
     WorkspaceMemberInvite,
     WorkspaceTheme,
     WorkspaceUserProperties,
+    WorkspaceUserLink,
 )
-
-
-
-
-
-
-
-
-
 from .favorite import UserFavorite

 from .issue_type import IssueType
@@ -86,3 +80,5 @@ from .recent_visit import UserRecentVisit
 from .label import Label

 from .device import Device, DeviceSession
+
+from .sticky import Sticky
@@ -15,6 +15,7 @@ from django import apps
 from plane.utils.html_processor import strip_tags
 from plane.db.mixins import SoftDeletionManager
+from plane.utils.exception_logger import log_exception
 from .base import BaseModel
 from .project import ProjectBaseModel


@@ -660,9 +661,6 @@ class IssueVote(ProjectBaseModel):


 class IssueVersion(ProjectBaseModel):
-    issue = models.ForeignKey(
-        "db.Issue", on_delete=models.CASCADE, related_name="versions"
-    )
     PRIORITY_CHOICES = (
         ("urgent", "Urgent"),
         ("high", "High"),
@@ -670,14 +668,11 @@ class IssueVersion(ProjectBaseModel):
         ("low", "Low"),
         ("none", "None"),
     )
+
     parent = models.UUIDField(blank=True, null=True)
     state = models.UUIDField(blank=True, null=True)
     estimate_point = models.UUIDField(blank=True, null=True)
     name = models.CharField(max_length=255, verbose_name="Issue Name")
-    description = models.JSONField(blank=True, default=dict)
-    description_html = models.TextField(blank=True, default="<p></p>")
-    description_stripped = models.TextField(blank=True, null=True)
-    description_binary = models.BinaryField(null=True)
     priority = models.CharField(
         max_length=30,
         choices=PRIORITY_CHOICES,
@@ -686,7 +681,9 @@ class IssueVersion(ProjectBaseModel):
     )
     start_date = models.DateField(null=True, blank=True)
     target_date = models.DateField(null=True, blank=True)
+    assignees = ArrayField(models.UUIDField(), blank=True, default=list)
     sequence_id = models.IntegerField(default=1, verbose_name="Issue Sequence ID")
+    labels = ArrayField(models.UUIDField(), blank=True, default=list)
     sort_order = models.FloatField(default=65535)
     completed_at = models.DateTimeField(null=True)
     archived_at = models.DateField(null=True)
@@ -694,14 +691,26 @@ class IssueVersion(ProjectBaseModel):
     external_source = models.CharField(max_length=255, null=True, blank=True)
     external_id = models.CharField(max_length=255, blank=True, null=True)
     type = models.UUIDField(blank=True, null=True)
-    last_saved_at = models.DateTimeField(default=timezone.now)
-    owned_by = models.UUIDField()
-    assignees = ArrayField(models.UUIDField(), blank=True, default=list)
-    labels = ArrayField(models.UUIDField(), blank=True, default=list)
     cycle = models.UUIDField(null=True, blank=True)
     modules = ArrayField(models.UUIDField(), blank=True, default=list)
-    properties = models.JSONField(default=dict)
-    meta = models.JSONField(default=dict)
+    properties = models.JSONField(default=dict)  # issue properties
+    meta = models.JSONField(default=dict)  # issue meta
+    last_saved_at = models.DateTimeField(default=timezone.now)
+
+    issue = models.ForeignKey(
+        "db.Issue", on_delete=models.CASCADE, related_name="versions"
+    )
+    activity = models.ForeignKey(
+        "db.IssueActivity",
+        on_delete=models.SET_NULL,
+        null=True,
+        related_name="versions",
+    )
+    owned_by = models.ForeignKey(
+        settings.AUTH_USER_MODEL,
+        on_delete=models.CASCADE,
+        related_name="issue_versions",
+    )

     class Meta:
         verbose_name = "Issue Version"
@@ -721,39 +730,93 @@ class IssueVersion(ProjectBaseModel):

         Module = apps.get_model("db.Module")
         CycleIssue = apps.get_model("db.CycleIssue")
+        IssueAssignee = apps.get_model("db.IssueAssignee")
+        IssueLabel = apps.get_model("db.IssueLabel")

         cycle_issue = CycleIssue.objects.filter(issue=issue).first()

         cls.objects.create(
             issue=issue,
-            parent=issue.parent,
-            state=issue.state,
-            point=issue.point,
-            estimate_point=issue.estimate_point,
+            parent=issue.parent_id,
+            state=issue.state_id,
+            estimate_point=issue.estimate_point_id,
             name=issue.name,
-            description=issue.description,
-            description_html=issue.description_html,
-            description_stripped=issue.description_stripped,
-            description_binary=issue.description_binary,
             priority=issue.priority,
             start_date=issue.start_date,
             target_date=issue.target_date,
+            assignees=list(
+                IssueAssignee.objects.filter(issue=issue).values_list(
+                    "assignee_id", flat=True
+                )
+            ),
             sequence_id=issue.sequence_id,
+            labels=list(
+                IssueLabel.objects.filter(issue=issue).values_list(
+                    "label_id", flat=True
+                )
+            ),
             sort_order=issue.sort_order,
             completed_at=issue.completed_at,
             archived_at=issue.archived_at,
             is_draft=issue.is_draft,
             external_source=issue.external_source,
             external_id=issue.external_id,
-            type=issue.type,
-            last_saved_at=issue.last_saved_at,
-            assignees=issue.assignees,
-            labels=issue.labels,
-            cycle=cycle_issue.cycle if cycle_issue else None,
-            modules=Module.objects.filter(issue=issue).values_list("id", flat=True),
+            type=issue.type_id,
+            cycle=cycle_issue.cycle_id if cycle_issue else None,
+            modules=list(
+                Module.objects.filter(issue=issue).values_list("id", flat=True)
+            ),
+            properties={},
+            meta={},
+            last_saved_at=timezone.now(),
             owned_by=user,
         )
         return True
     except Exception as e:
         log_exception(e)
         return False
+
+
+class IssueDescriptionVersion(ProjectBaseModel):
+    issue = models.ForeignKey(
+        "db.Issue", on_delete=models.CASCADE, related_name="description_versions"
+    )
+    description_binary = models.BinaryField(null=True)
+    description_html = models.TextField(blank=True, default="<p></p>")
+    description_stripped = models.TextField(blank=True, null=True)
+    description_json = models.JSONField(default=dict, blank=True)
+    last_saved_at = models.DateTimeField(default=timezone.now)
+    owned_by = models.ForeignKey(
+        settings.AUTH_USER_MODEL,
+        on_delete=models.CASCADE,
+        related_name="issue_description_versions",
+    )
+
+    class Meta:
+        verbose_name = "Issue Description Version"
+        verbose_name_plural = "Issue Description Versions"
+        db_table = "issue_description_versions"
+
+    @classmethod
+    def log_issue_description_version(cls, issue, user):
+        """Log the issue description version"""
+        try:
+            cls.objects.create(
+                workspace_id=issue.workspace_id,
+                project_id=issue.project_id,
+                created_by_id=issue.created_by_id,
+                updated_by_id=issue.updated_by_id,
+                owned_by_id=user,
+                last_saved_at=timezone.now(),
+                issue_id=issue.id,
+                description_binary=issue.description_binary,
+                description_html=issue.description_html,
+                description_stripped=issue.description_stripped,
+                description_json=issue.description,
+            )
+            return True
+        except Exception as e:
+            log_exception(e)
+            return False
@@ -90,7 +90,7 @@ class PageLog(BaseModel):
    page = models.ForeignKey(Page, related_name="page_log", on_delete=models.CASCADE)
    entity_identifier = models.UUIDField(null=True)
    entity_name = models.CharField(
-       max_length=30, choices=TYPE_CHOICES, verbose_name="Transaction Type"
+       max_length=30, verbose_name="Transaction Type"
    )
    workspace = models.ForeignKey(
        "db.Workspace", on_delete=models.CASCADE, related_name="workspace_page_log"

apiserver/plane/db/models/sticky.py (new file, 48 lines)
@@ -0,0 +1,48 @@
# Django imports
from django.conf import settings
from django.db import models

# Module imports
from .base import BaseModel


class Sticky(BaseModel):
    name = models.TextField()

    description = models.JSONField(blank=True, default=dict)
    description_html = models.TextField(blank=True, default="<p></p>")
    description_stripped = models.TextField(blank=True, null=True)
    description_binary = models.BinaryField(null=True)

    logo_props = models.JSONField(default=dict)
    color = models.CharField(max_length=255, blank=True, null=True)
    background_color = models.CharField(max_length=255, blank=True, null=True)

    workspace = models.ForeignKey(
        "db.Workspace", on_delete=models.CASCADE, related_name="stickies"
    )
    owner = models.ForeignKey(
        settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="stickies"
    )
    sort_order = models.FloatField(default=65535)

    class Meta:
        verbose_name = "Sticky"
        verbose_name_plural = "Stickies"
        db_table = "stickies"
        ordering = ("-created_at",)

    def save(self, *args, **kwargs):
        if self._state.adding:
            # Get the maximum sort_order value from the database
            last_id = Sticky.objects.filter(workspace=self.workspace).aggregate(
                largest=models.Max("sort_order")
            )["largest"]
            # if a sticky already exists, place the new one after it
            if last_id is not None:
                self.sort_order = last_id + 10000

        super(Sticky, self).save(*args, **kwargs)

    def __str__(self):
        return str(self.name)

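The `save()` override above places each new sticky 10000 past the current workspace maximum `sort_order`, falling back to the field default of 65535 for the first one. A pure-Python sketch of that gap-based ordering (the helper name is invented; no Django involved):

```python
def next_sort_order(existing_orders, default=65535.0, step=10000.0):
    # Mirror Sticky.save(): keep the field default for the first item,
    # otherwise place the new item `step` past the current maximum.
    if not existing_orders:
        return default
    return max(existing_orders) + step

orders = []
for _ in range(3):
    orders.append(next_sort_order(orders))
# orders is now [65535.0, 75535.0, 85535.0]
```

The large gaps leave room to reposition items later without renumbering every row.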
@@ -26,6 +26,14 @@ def get_default_onboarding():
    }


+def get_mobile_default_onboarding():
+    return {
+        "profile_complete": False,
+        "workspace_create": False,
+        "workspace_join": False,
+    }


class User(AbstractBaseUser, PermissionsMixin):
    id = models.UUIDField(
        default=uuid.uuid4, unique=True, editable=False, db_index=True, primary_key=True

@@ -178,6 +186,12 @@ class Profile(TimeAuditModel):
    billing_address = models.JSONField(null=True)
    has_billing_address = models.BooleanField(default=False)
    company_name = models.CharField(max_length=255, blank=True)
+   # mobile
+   is_mobile_onboarded = models.BooleanField(default=False)
+   mobile_onboarding_step = models.JSONField(default=get_mobile_default_onboarding)
+   mobile_timezone_auto_set = models.BooleanField(default=False)
+   # language
+   language = models.CharField(max_length=255, default="en")

    class Meta:
        verbose_name = "Profile"

@@ -47,11 +47,18 @@ class Webhook(BaseModel):
        return f"{self.workspace.slug} {self.url}"

    class Meta:
-       unique_together = ["workspace", "url"]
+       unique_together = ["workspace", "url", "deleted_at"]
        verbose_name = "Webhook"
        verbose_name_plural = "Webhooks"
        db_table = "webhooks"
        ordering = ("-created_at",)
+       constraints = [
+           models.UniqueConstraint(
+               fields=["workspace", "url"],
+               condition=models.Q(deleted_at__isnull=True),
+               name="webhook_url_unique_url_when_deleted_at_null",
+           )
+       ]


class WebhookLog(BaseModel):

@@ -322,3 +322,23 @@ class WorkspaceUserProperties(BaseModel):

    def __str__(self):
        return f"{self.workspace.name} {self.user.email}"


+class WorkspaceUserLink(WorkspaceBaseModel):
+    title = models.CharField(max_length=255, null=True, blank=True)
+    url = models.TextField()
+    metadata = models.JSONField(default=dict)
+    owner = models.ForeignKey(
+        settings.AUTH_USER_MODEL,
+        on_delete=models.CASCADE,
+        related_name="owner_workspace_user_link",
+    )
+
+    class Meta:
+        verbose_name = "Workspace User Link"
+        verbose_name_plural = "Workspace User Links"
+        db_table = "workspace_user_links"
+        ordering = ("-created_at",)
+
+    def __str__(self):
+        return f"{self.workspace.id} {self.url}"

@@ -262,6 +262,9 @@ CELERY_IMPORTS = (
    "plane.license.bgtasks.tracer",
    # management tasks
    "plane.bgtasks.dummy_data_task",
+   # issue version tasks
+   "plane.bgtasks.issue_version_sync",
+   "plane.bgtasks.issue_description_version_sync",
)

# Sentry Settings

@@ -91,6 +91,7 @@ def issue_on_results(issues, group_by, sub_group_by):
            Case(
                When(
                    votes__isnull=False,
+                   votes__deleted_at__isnull=True,
                    then=JSONObject(
                        vote=F("votes__vote"),
                        actor_details=JSONObject(
@@ -117,13 +118,14 @@ def issue_on_results(issues, group_by, sub_group_by):
                default=None,
                output_field=JSONField(),
            ),
-           filter=Q(votes__isnull=False),
+           filter=Q(votes__isnull=False, votes__deleted_at__isnull=True),
            distinct=True,
        ),
        reaction_items=ArrayAgg(
            Case(
                When(
                    issue_reactions__isnull=False,
+                   issue_reactions__deleted_at__isnull=True,
                    then=JSONObject(
                        reaction=F("issue_reactions__reaction"),
                        actor_details=JSONObject(
@@ -150,7 +152,7 @@ def issue_on_results(issues, group_by, sub_group_by):
                default=None,
                output_field=JSONField(),
            ),
-           filter=Q(issue_reactions__isnull=False),
+           filter=Q(issue_reactions__isnull=False, issue_reactions__deleted_at__isnull=True),
            distinct=True,
        ),
    ).values(*required_fields, "vote_items", "reaction_items")

@@ -86,7 +86,13 @@ class EntityAssetEndpoint(BaseAPIView):
        )

        # Check if the file type is allowed
-       allowed_types = ["image/jpeg", "image/png", "image/webp"]
+       allowed_types = [
+           "image/jpeg",
+           "image/png",
+           "image/webp",
+           "image/jpg",
+           "image/gif",
+       ]
        if type not in allowed_types:
            return Response(
                {

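The hunk above widens the upload allow-list with `image/jpg` and `image/gif`. The same gate, restated as a small standalone check (the function name is made up for illustration):

```python
ALLOWED_TYPES = [
    "image/jpeg",
    "image/png",
    "image/webp",
    "image/jpg",
    "image/gif",
]

def is_allowed_asset_type(content_type):
    # Mirrors the endpoint's check: anything outside the explicit
    # image allow-list is rejected before the upload proceeds.
    return content_type in ALLOWED_TYPES
```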
@@ -701,6 +701,7 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
            Case(
                When(
                    votes__isnull=False,
+                   votes__deleted_at__isnull=True,
                    then=JSONObject(
                        vote=F("votes__vote"),
                        actor_details=JSONObject(
@@ -732,7 +733,11 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
                output_field=JSONField(),
            ),
            filter=Case(
-               When(votes__isnull=False, then=True),
+               When(
+                   votes__isnull=False,
+                   votes__deleted_at__isnull=True,
+                   then=True,
+               ),
                default=False,
                output_field=JSONField(),
            ),
@@ -742,6 +747,7 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
            Case(
                When(
                    issue_reactions__isnull=False,
+                   issue_reactions__deleted_at__isnull=True,
                    then=JSONObject(
                        reaction=F("issue_reactions__reaction"),
                        actor_details=JSONObject(
@@ -775,7 +781,11 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
                output_field=JSONField(),
            ),
            filter=Case(
-               When(issue_reactions__isnull=False, then=True),
+               When(
+                   issue_reactions__isnull=False,
+                   issue_reactions__deleted_at__isnull=True,
+                   then=True,
+               ),
                default=False,
                output_field=JSONField(),
            ),

apiserver/plane/utils/timezone_converter.py (new file, 100 lines)
@@ -0,0 +1,100 @@
import pytz
from plane.db.models import Project
from datetime import datetime, time
from datetime import timedelta


def user_timezone_converter(queryset, datetime_fields, user_timezone):
    # Create a timezone object for the user's timezone
    user_tz = pytz.timezone(user_timezone)

    # Check if queryset is a dictionary (single item) or a list of dictionaries
    if isinstance(queryset, dict):
        queryset_values = [queryset]
    else:
        queryset_values = list(queryset)

    # Iterate over the dictionaries in the list
    for item in queryset_values:
        # Iterate over the datetime fields
        for field in datetime_fields:
            # Convert the datetime field to the user's timezone
            if field in item and item[field]:
                item[field] = item[field].astimezone(user_tz)

    # If queryset was a single item, return a single item
    if isinstance(queryset, dict):
        return queryset_values[0]
    else:
        return queryset_values


def convert_to_utc(date, project_id, is_start_date=False):
    """
    Converts a date string to the project's local timezone at 12:00 AM
    and then converts it to UTC for storage.

    Args:
        date (str): The date string in "YYYY-MM-DD" format.
        project_id (int): The project's ID to fetch the associated timezone.

    Returns:
        datetime: The UTC datetime.
    """
    # Retrieve the project's timezone using the project ID
    project = Project.objects.get(id=project_id)
    project_timezone = project.timezone
    if not date or not project_timezone:
        raise ValueError("Both date and timezone must be provided.")

    # Parse the string into a date object
    start_date = datetime.strptime(date, "%Y-%m-%d").date()

    # Get the project's timezone
    local_tz = pytz.timezone(project_timezone)

    # Combine the date with 12:00 AM time
    local_datetime = datetime.combine(start_date, time.min)

    # Localize the datetime to the project's timezone
    localized_datetime = local_tz.localize(local_datetime)

    # If it's a start date, add one minute
    if is_start_date:
        localized_datetime += timedelta(minutes=1)

    # Convert the localized datetime to UTC
    utc_datetime = localized_datetime.astimezone(pytz.utc)

    # Return the UTC datetime for storage
    return utc_datetime


def convert_utc_to_project_timezone(utc_datetime, project_id):
    """
    Converts a UTC datetime (stored in the database) to the project's local timezone.

    Args:
        utc_datetime (datetime): The UTC datetime to be converted.
        project_id (int): The project's ID to fetch the associated timezone.

    Returns:
        datetime: The datetime in the project's local timezone.
    """
    # Retrieve the project's timezone using the project ID
    project = Project.objects.get(id=project_id)
    project_timezone = project.timezone
    if not project_timezone:
        raise ValueError("Project timezone must be provided.")

    # Get the timezone object for the project's timezone
    local_tz = pytz.timezone(project_timezone)

    # Convert the UTC datetime to the project's local timezone
    if utc_datetime.tzinfo is None:
        # Localize UTC datetime if it's naive (i.e., without timezone info)
        utc_datetime = pytz.utc.localize(utc_datetime)

    # Convert to the project's local timezone
    local_datetime = utc_datetime.astimezone(local_tz)

    return local_datetime

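`convert_to_utc` above pins a `YYYY-MM-DD` date to local midnight in the project's timezone before storing it as UTC. The same steps can be exercised without Django, here using the stdlib `zoneinfo` in place of `pytz` (the helper name and the chosen timezone are illustrative):

```python
from datetime import datetime, time, timedelta, timezone
from zoneinfo import ZoneInfo

def date_to_utc(date_str, tz_name, is_start_date=False):
    # Parse "YYYY-MM-DD", pin it to 12:00 AM in the given zone,
    # then convert to UTC -- the same steps convert_to_utc performs.
    day = datetime.strptime(date_str, "%Y-%m-%d").date()
    localized = datetime.combine(day, time.min, tzinfo=ZoneInfo(tz_name))
    if is_start_date:
        # convert_to_utc nudges start dates forward by one minute
        localized += timedelta(minutes=1)
    return localized.astimezone(timezone.utc)

# Midnight in Asia/Kolkata (UTC+5:30) is 18:30 UTC on the previous day.
dt = date_to_utc("2024-01-15", "Asia/Kolkata")
```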
@@ -1,26 +0,0 @@
-import pytz
-
-
-def user_timezone_converter(queryset, datetime_fields, user_timezone):
-    # Create a timezone object for the user's timezone
-    user_tz = pytz.timezone(user_timezone)
-
-    # Check if queryset is a dictionary (single item) or a list of dictionaries
-    if isinstance(queryset, dict):
-        queryset_values = [queryset]
-    else:
-        queryset_values = list(queryset)
-
-    # Iterate over the dictionaries in the list
-    for item in queryset_values:
-        # Iterate over the datetime fields
-        for field in datetime_fields:
-            # Convert the datetime field to the user's timezone
-            if field in item and item[field]:
-                item[field] = item[field].astimezone(user_tz)
-
-    # If queryset was a single item, return a single item
-    if isinstance(queryset, dict):
-        return queryset_values[0]
-    else:
-        return queryset_values
app.json (2 lines changed)
@@ -70,7 +70,7 @@
    "value": ""
  },
  "GITHUB_CLIENT_SECRET": {
-   "description": "Github Client Secret",
+   "description": "GitHub Client Secret",
    "value": ""
  },
  "NEXT_PUBLIC_API_BASE_URL": {

@@ -62,7 +62,7 @@ mkdir plane-selfhost

cd plane-selfhost

-curl -fsSL -o setup.sh https://raw.githubusercontent.com/makeplane/plane/master/deploy/selfhost/install.sh
+curl -fsSL -o setup.sh https://github.com/makeplane/plane/releases/latest/download/setup.sh

chmod +x setup.sh
```

@@ -1,54 +1,63 @@
-x-app-env: &app-env
-  environment:
-    - NGINX_PORT=${NGINX_PORT:-80}
-    - WEB_URL=${WEB_URL:-http://localhost}
-    - DEBUG=${DEBUG:-0}
-    - SENTRY_DSN=${SENTRY_DSN:-""}
-    - SENTRY_ENVIRONMENT=${SENTRY_ENVIRONMENT:-"production"}
-    - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS:-}
-    # Gunicorn Workers
-    - GUNICORN_WORKERS=${GUNICORN_WORKERS:-1}
-    #DB SETTINGS
-    - PGHOST=${PGHOST:-plane-db}
-    - PGDATABASE=${PGDATABASE:-plane}
-    - POSTGRES_USER=${POSTGRES_USER:-plane}
-    - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-plane}
-    - POSTGRES_DB=${POSTGRES_DB:-plane}
-    - POSTGRES_PORT=${POSTGRES_PORT:-5432}
-    - PGDATA=${PGDATA:-/var/lib/postgresql/data}
-    - DATABASE_URL=${DATABASE_URL:-postgresql://plane:plane@plane-db/plane}
-    # REDIS SETTINGS
-    - REDIS_HOST=${REDIS_HOST:-plane-redis}
-    - REDIS_PORT=${REDIS_PORT:-6379}
-    - REDIS_URL=${REDIS_URL:-redis://plane-redis:6379/}
+x-db-env: &db-env
+  PGHOST: ${PGHOST:-plane-db}
+  PGDATABASE: ${PGDATABASE:-plane}
+  POSTGRES_USER: ${POSTGRES_USER:-plane}
+  POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-plane}
+  POSTGRES_DB: ${POSTGRES_DB:-plane}
+  POSTGRES_PORT: ${POSTGRES_PORT:-5432}
+  PGDATA: ${PGDATA:-/var/lib/postgresql/data}
+
+x-redis-env: &redis-env
+  REDIS_HOST: ${REDIS_HOST:-plane-redis}
+  REDIS_PORT: ${REDIS_PORT:-6379}
+  REDIS_URL: ${REDIS_URL:-redis://plane-redis:6379/}
+
+x-minio-env: &minio-env
+  MINIO_ROOT_USER: ${AWS_ACCESS_KEY_ID:-access-key}
+  MINIO_ROOT_PASSWORD: ${AWS_SECRET_ACCESS_KEY:-secret-key}
+
+x-aws-s3-env: &aws-s3-env
+  AWS_REGION: ${AWS_REGION:-}
+  AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID:-access-key}
+  AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY:-secret-key}
+  AWS_S3_ENDPOINT_URL: ${AWS_S3_ENDPOINT_URL:-http://plane-minio:9000}
+  AWS_S3_BUCKET_NAME: ${AWS_S3_BUCKET_NAME:-uploads}
+
+x-proxy-env: &proxy-env
+  NGINX_PORT: ${NGINX_PORT:-80}
+  BUCKET_NAME: ${AWS_S3_BUCKET_NAME:-uploads}
+  FILE_SIZE_LIMIT: ${FILE_SIZE_LIMIT:-5242880}
+
+x-mq-env: &mq-env
+  # RabbitMQ Settings
+  RABBITMQ_HOST: ${RABBITMQ_HOST:-plane-mq}
+  RABBITMQ_PORT: ${RABBITMQ_PORT:-5672}
+  RABBITMQ_DEFAULT_USER: ${RABBITMQ_USER:-plane}
+  RABBITMQ_DEFAULT_PASS: ${RABBITMQ_PASSWORD:-plane}
+  RABBITMQ_DEFAULT_VHOST: ${RABBITMQ_VHOST:-plane}
+  RABBITMQ_VHOST: ${RABBITMQ_VHOST:-plane}
+
+x-live-env: &live-env
+  API_BASE_URL: ${API_BASE_URL:-http://api:8000}
+
+x-app-env: &app-env
+  WEB_URL: ${WEB_URL:-http://localhost}
+  DEBUG: ${DEBUG:-0}
+  SENTRY_DSN: ${SENTRY_DSN}
+  SENTRY_ENVIRONMENT: ${SENTRY_ENVIRONMENT:-production}
+  CORS_ALLOWED_ORIGINS: ${CORS_ALLOWED_ORIGINS}
+  GUNICORN_WORKERS: 1
+  USE_MINIO: ${USE_MINIO:-1}
+  DATABASE_URL: ${DATABASE_URL:-postgresql://plane:plane@plane-db/plane}
+  SECRET_KEY: ${SECRET_KEY:-60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5}
+  ADMIN_BASE_URL: ${ADMIN_BASE_URL}
+  SPACE_BASE_URL: ${SPACE_BASE_URL}
+  APP_BASE_URL: ${APP_BASE_URL}
+  AMQP_URL: ${AMQP_URL:-amqp://plane:plane@plane-mq:5672/plane}
-    # RabbitMQ Settings
-    - RABBITMQ_HOST=${RABBITMQ_HOST:-plane-mq}
-    - RABBITMQ_PORT=${RABBITMQ_PORT:-5672}
-    - RABBITMQ_DEFAULT_USER=${RABBITMQ_USER:-plane}
-    - RABBITMQ_DEFAULT_PASS=${RABBITMQ_PASSWORD:-plane}
-    - RABBITMQ_DEFAULT_VHOST=${RABBITMQ_VHOST:-plane}
-    - RABBITMQ_VHOST=${RABBITMQ_VHOST:-plane}
-    - AMQP_URL=${AMQP_URL:-amqp://plane:plane@plane-mq:5672/plane}
-    # Application secret
-    - SECRET_KEY=${SECRET_KEY:-60gp0byfz2dvffa45cxl20p1scy9xbpf6d8c5y0geejgkyp1b5}
-    # DATA STORE SETTINGS
-    - USE_MINIO=${USE_MINIO:-1}
-    - AWS_REGION=${AWS_REGION:-}
-    - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID:-"access-key"}
-    - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY:-"secret-key"}
-    - AWS_S3_ENDPOINT_URL=${AWS_S3_ENDPOINT_URL:-http://plane-minio:9000}
-    - AWS_S3_BUCKET_NAME=${AWS_S3_BUCKET_NAME:-uploads}
-    - MINIO_ROOT_USER=${MINIO_ROOT_USER:-"access-key"}
-    - MINIO_ROOT_PASSWORD=${MINIO_ROOT_PASSWORD:-"secret-key"}
-    - BUCKET_NAME=${BUCKET_NAME:-uploads}
-    - FILE_SIZE_LIMIT=${FILE_SIZE_LIMIT:-5242880}
-    # Live server env
-    - API_BASE_URL=${API_BASE_URL:-http://api:8000}

services:
  web:
    <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-frontend:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present

@@ -61,7 +70,6 @@ services:
      - worker

  space:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-space:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -75,7 +83,6 @@ services:
      - web

  admin:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-admin:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -88,12 +95,13 @@ services:
      - web

  live:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-live:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
    restart: unless-stopped
    command: node live/dist/server.js live
+   environment:
+     <<: [ *live-env ]
    deploy:
      replicas: ${LIVE_REPLICAS:-1}
    depends_on:
@@ -101,7 +109,6 @@ services:
      - web

  api:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -111,14 +118,14 @@ services:
      replicas: ${API_REPLICAS:-1}
    volumes:
      - logs_api:/code/plane/logs
+   environment:
+     <<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
    depends_on:
      - plane-db
      - plane-redis
      - plane-mq

  worker:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -126,6 +133,8 @@ services:
    command: ./bin/docker-entrypoint-worker.sh
    volumes:
      - logs_worker:/code/plane/logs
+   environment:
+     <<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
    depends_on:
      - api
      - plane-db
@@ -133,7 +142,6 @@ services:
      - plane-mq

  beat-worker:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -141,6 +149,8 @@ services:
    command: ./bin/docker-entrypoint-beat.sh
    volumes:
      - logs_beat-worker:/code/plane/logs
+   environment:
+     <<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
    depends_on:
      - api
      - plane-db
@@ -148,7 +158,6 @@ services:
      - plane-mq

  migrator:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-backend:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
@@ -156,21 +165,23 @@ services:
    command: ./bin/docker-entrypoint-migrator.sh
    volumes:
      - logs_migrator:/code/plane/logs
+   environment:
+     <<: [ *app-env, *db-env, *redis-env, *minio-env, *aws-s3-env, *proxy-env ]
    depends_on:
      - plane-db
      - plane-redis

  plane-db:
-   <<: *app-env
    image: postgres:15.7-alpine
    pull_policy: if_not_present
    restart: unless-stopped
    command: postgres -c 'max_connections=1000'
+   environment:
+     <<: *db-env
    volumes:
      - pgdata:/var/lib/postgresql/data

  plane-redis:
-   <<: *app-env
    image: valkey/valkey:7.2.5-alpine
    pull_policy: if_not_present
    restart: unless-stopped
@@ -178,30 +189,33 @@ services:
      - redisdata:/data

  plane-mq:
-   <<: *app-env
    image: rabbitmq:3.13.6-management-alpine
    restart: always
+   environment:
+     <<: *mq-env
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq

  plane-minio:
-   <<: *app-env
    image: minio/minio:latest
    pull_policy: if_not_present
    restart: unless-stopped
    command: server /export --console-address ":9090"
+   environment:
+     <<: *minio-env
    volumes:
      - uploads:/export

  # Comment this if you already have a reverse proxy running
  proxy:
-   <<: *app-env
    image: ${DOCKERHUB_USER:-makeplane}/plane-proxy:${APP_RELEASE:-stable}
    platform: ${DOCKER_PLATFORM:-}
    pull_policy: if_not_present
    restart: unless-stopped
    ports:
      - ${NGINX_PORT}:80
+   environment:
+     <<: *proxy-env
    depends_on:
      - web
      - api

@@ -4,9 +4,12 @@ BRANCH=${BRANCH:-master}
SCRIPT_DIR=$PWD
SERVICE_FOLDER=plane-app
PLANE_INSTALL_DIR=$PWD/$SERVICE_FOLDER
-export APP_RELEASE="stable"
+export APP_RELEASE=stable
export DOCKERHUB_USER=makeplane
export PULL_POLICY=${PULL_POLICY:-if_not_present}
+export GH_REPO=makeplane/plane
+export RELEASE_DOWNLOAD_URL="https://github.com/$GH_REPO/releases/download"
+export FALLBACK_DOWNLOAD_URL="https://raw.githubusercontent.com/$GH_REPO/$BRANCH/deploy/selfhost"

CPU_ARCH=$(uname -m)
OS_NAME=$(uname)

@@ -16,13 +19,6 @@ mkdir -p $PLANE_INSTALL_DIR/archive
DOCKER_FILE_PATH=$PLANE_INSTALL_DIR/docker-compose.yaml
DOCKER_ENV_PATH=$PLANE_INSTALL_DIR/plane.env

-SED_PREFIX=()
-if [ "$OS_NAME" == "Darwin" ]; then
-    SED_PREFIX=("-i" "")
-else
-    SED_PREFIX=("-i")
-fi

function print_header() {
clear

@@ -59,6 +55,17 @@ function spinner() {
    printf " \b\b\b\b" >&2
}

+function checkLatestRelease(){
+    echo "Checking for the latest release..." >&2
+    local latest_release=$(curl -s https://api.github.com/repos/$GH_REPO/releases/latest | grep -o '"tag_name": "[^"]*"' | sed 's/"tag_name": "//;s/"//g')
+    if [ -z "$latest_release" ]; then
+        echo "Failed to check for the latest release. Exiting..." >&2
+        exit 1
+    fi
+
+    echo $latest_release
+}

function initialize(){
    printf "Please wait while we check the availability of Docker images for the selected release ($APP_RELEASE) with ${UPPER_CPU_ARCH} support." >&2

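`checkLatestRelease` above extracts `tag_name` from the GitHub releases API with a grep/sed pipeline. The equivalent extraction in Python, run against a canned payload rather than a live request (the helper name and sample JSON are illustrative):

```python
import json

def latest_tag(release_json):
    # Pull "tag_name" out of a /releases/latest payload, failing
    # loudly when it is missing, like the script's empty-string check.
    tag = json.loads(release_json).get("tag_name")
    if not tag:
        raise RuntimeError("Failed to check for the latest release")
    return tag

payload = '{"tag_name": "v0.24.0", "prerelease": false}'
```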
@@ -130,8 +137,12 @@ function updateEnvFile() {
        echo "$key=$value" >> "$file"
        return
    else
        # if key exists, update the value
-       sed "${SED_PREFIX[@]}" "s/^$key=.*/$key=$value/g" "$file"
+       if [ "$OS_NAME" == "Darwin" ]; then
+           value=$(echo "$value" | sed 's/|/\\|/g')
+           sed -i '' "s|^$key=.*|$key=$value|g" "$file"
+       else
+           sed -i "s/^$key=.*/$key=$value/g" "$file"
+       fi
    fi
else
    echo "File not found: $file"

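`updateEnvFile` rewrites a `KEY=value` line in place and appends the key when it is absent; the Darwin branch exists because BSD `sed` needs `-i ''`, and any `|` in the value must be escaped once `|` becomes the delimiter. The same upsert expressed in Python over a list of lines (the helper name is invented):

```python
def upsert_env(lines, key, value):
    # Update KEY=... in place if the key exists, otherwise append it,
    # matching the sed/echo behaviour of updateEnvFile.
    prefix = key + "="
    for i, line in enumerate(lines):
        if line.startswith(prefix):
            lines[i] = prefix + value
            return lines
    lines.append(prefix + value)
    return lines

env = ["WEB_URL=http://localhost", "DEBUG=0"]
upsert_env(env, "DEBUG", "1")        # rewrites the existing line
upsert_env(env, "NGINX_PORT", "80")  # appends a new line
```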
@@ -182,7 +193,7 @@ function buildYourOwnImage(){
    local PLANE_TEMP_CODE_DIR=~/tmp/plane
    rm -rf $PLANE_TEMP_CODE_DIR
    mkdir -p $PLANE_TEMP_CODE_DIR
-   REPO=https://github.com/makeplane/plane.git
+   REPO=https://github.com/$GH_REPO.git
    git clone "$REPO" "$PLANE_TEMP_CODE_DIR" --branch "$BRANCH" --single-branch --depth 1

    cp "$PLANE_TEMP_CODE_DIR/deploy/selfhost/build.yml" "$PLANE_TEMP_CODE_DIR/build.yml"

@@ -204,6 +215,10 @@ function install() {
    echo "Begin Installing Plane"
    echo ""

+   if [ "$APP_RELEASE" == "stable" ]; then
+       export APP_RELEASE=$(checkLatestRelease)
+   fi
+
    local build_image=$(initialize)

    if [ "$build_image" == "build" ]; then

@@ -232,8 +247,49 @@ function download() {
        mv $PLANE_INSTALL_DIR/docker-compose.yaml $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml
    fi

-   curl -H 'Cache-Control: no-cache, no-store' -s -o $PLANE_INSTALL_DIR/docker-compose.yaml https://raw.githubusercontent.com/makeplane/plane/$BRANCH/deploy/selfhost/docker-compose.yml?$(date +%s)
-   curl -H 'Cache-Control: no-cache, no-store' -s -o $PLANE_INSTALL_DIR/variables-upgrade.env https://raw.githubusercontent.com/makeplane/plane/$BRANCH/deploy/selfhost/variables.env?$(date +%s)
+   RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$RELEASE_DOWNLOAD_URL/$APP_RELEASE/docker-compose.yml?$(date +%s)")
+   BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
+   STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
+
+   if [ "$STATUS" -eq 200 ]; then
+       echo "$BODY" > $PLANE_INSTALL_DIR/docker-compose.yaml
+   else
+       # Fallback to download from the raw github url
+       RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$FALLBACK_DOWNLOAD_URL/docker-compose.yml?$(date +%s)")
+       BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
+       STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
+
+       if [ "$STATUS" -eq 200 ]; then
+           echo "$BODY" > $PLANE_INSTALL_DIR/docker-compose.yaml
+       else
+           echo "Failed to download docker-compose.yml. HTTP Status: $STATUS"
+           echo "URL: $RELEASE_DOWNLOAD_URL/$APP_RELEASE/docker-compose.yml"
+           mv $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml $PLANE_INSTALL_DIR/docker-compose.yaml
+           exit 1
+       fi
+   fi
+
+   RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$RELEASE_DOWNLOAD_URL/$APP_RELEASE/variables.env?$(date +%s)")
+   BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
+   STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
+
+   if [ "$STATUS" -eq 200 ]; then
+       echo "$BODY" > $PLANE_INSTALL_DIR/variables-upgrade.env
+   else
+       # Fallback to download from the raw github url
+       RESPONSE=$(curl -H 'Cache-Control: no-cache, no-store' -s -w "HTTPSTATUS:%{http_code}" "$FALLBACK_DOWNLOAD_URL/variables.env?$(date +%s)")
+       BODY=$(echo "$RESPONSE" | sed -e 's/HTTPSTATUS\:.*//g')
+       STATUS=$(echo "$RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')
+
+       if [ "$STATUS" -eq 200 ]; then
+           echo "$BODY" > $PLANE_INSTALL_DIR/variables-upgrade.env
+       else
+           echo "Failed to download variables.env. HTTP Status: $STATUS"
+           echo "URL: $RELEASE_DOWNLOAD_URL/$APP_RELEASE/variables.env"
+           mv $PLANE_INSTALL_DIR/archive/$TS.docker-compose.yaml $PLANE_INSTALL_DIR/docker-compose.yaml
+           exit 1
+       fi
+   fi

    if [ -f "$DOCKER_ENV_PATH" ];
    then

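The download logic above tries the release asset first and falls back to the raw GitHub URL only on a non-200 status, giving up (and restoring the archived compose file) when both fail. That two-step pattern, sketched with injectable fetchers so nothing touches the network (all names invented):

```python
def fetch_with_fallback(primary, fallback):
    # Each callable returns (status, body), mirroring the curl
    # HTTPSTATUS trick used in the install script.
    status, body = primary()
    if status == 200:
        return body
    status, body = fallback()
    if status == 200:
        return body
    raise RuntimeError("download failed with HTTP %s" % status)

compose = fetch_with_fallback(
    lambda: (404, ""),              # release asset missing
    lambda: (200, "services: {}"),  # raw fallback succeeds
)
```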
@@ -335,6 +391,34 @@ function restartServices() {
    startServices
}

+function upgrade() {
+    local latest_release=$(checkLatestRelease)
+
+    echo ""
+    echo "Current release: $APP_RELEASE"
+
+    if [ "$latest_release" == "$APP_RELEASE" ]; then
+        echo ""
+        echo "You are already using the latest release"
+        exit 0
+    fi
+
+    echo "Latest release: $latest_release"
+    echo ""
+
+    # Check for confirmation to upgrade
+    echo "Do you want to upgrade to the latest release ($latest_release)?"
+    read -p "Continue? [y/N]: " confirm
+
+    if [[ ! "$confirm" =~ ^[Yy]$ ]]; then
+        echo "Exiting..."
+        exit 0
+    fi
+
+    export APP_RELEASE=$latest_release
+
+    echo "Upgrading Plane to the latest release..."
+    echo ""
+
+    echo "***** STOPPING SERVICES ****"
+    stopServices

@@ -47,9 +47,6 @@ AWS_ACCESS_KEY_ID=access-key
AWS_SECRET_ACCESS_KEY=secret-key
AWS_S3_ENDPOINT_URL=http://plane-minio:9000
AWS_S3_BUCKET_NAME=uploads
-MINIO_ROOT_USER=access-key
-MINIO_ROOT_PASSWORD=secret-key
-BUCKET_NAME=uploads
FILE_SIZE_LIMIT=5242880

# Gunicorn Workers

live/.prettierignore (new file, 6 lines)
@@ -0,0 +1,6 @@
.next
.turbo
out/
dist/
build/
node_modules/

live/.prettierrc (new file, 5 lines)
@@ -0,0 +1,5 @@
{
  "printWidth": 120,
  "tabWidth": 2,
  "trailingComma": "es5"
}

@@ -20,6 +20,7 @@
    "@hocuspocus/extension-logger": "^2.11.3",
    "@hocuspocus/extension-redis": "^2.13.5",
    "@hocuspocus/server": "^2.11.3",
+   "@plane/constants": "*",
    "@plane/editor": "*",
    "@plane/types": "*",
    "@sentry/node": "^8.28.0",

packages/constants/src/ai.ts (new file, 3 lines)
@@ -0,0 +1,3 @@
+export enum AI_EDITOR_TASKS {
+  ASK_ANYTHING = "ASK_ANYTHING",
+}
@@ -1,3 +1,36 @@
+export enum E_PASSWORD_STRENGTH {
+  EMPTY = "empty",
+  LENGTH_NOT_VALID = "length_not_valid",
+  STRENGTH_NOT_VALID = "strength_not_valid",
+  STRENGTH_VALID = "strength_valid",
+}
+
+export const PASSWORD_MIN_LENGTH = 8;
+
+export const SPACE_PASSWORD_CRITERIA = [
+  {
+    key: "min_8_char",
+    label: "Min 8 characters",
+    isCriteriaValid: (password: string) =>
+      password.length >= PASSWORD_MIN_LENGTH,
+  },
+  // {
+  //   key: "min_1_upper_case",
+  //   label: "Min 1 upper-case letter",
+  //   isCriteriaValid: (password: string) => PASSWORD_NUMBER_REGEX.test(password),
+  // },
+  // {
+  //   key: "min_1_number",
+  //   label: "Min 1 number",
+  //   isCriteriaValid: (password: string) => PASSWORD_CHAR_CAPS_REGEX.test(password),
+  // },
+  // {
+  //   key: "min_1_special_char",
+  //   label: "Min 1 special character",
+  //   isCriteriaValid: (password: string) => PASSWORD_SPECIAL_CHAR_REGEX.test(password),
+  // },
+];
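A short sketch of how the criteria list above can be evaluated. The constants are re-declared locally (rather than imported from `@plane/constants`) so the example is self-contained, and `meetsAllCriteria` is an illustrative helper, not part of the package:

```typescript
// Re-declared locally to keep the example self-contained; mirrors the single
// active entry of SPACE_PASSWORD_CRITERIA in the diff above.
const PASSWORD_MIN_LENGTH = 8;

const SPACE_PASSWORD_CRITERIA = [
  {
    key: "min_8_char",
    label: "Min 8 characters",
    isCriteriaValid: (password: string) => password.length >= PASSWORD_MIN_LENGTH,
  },
];

// A password satisfies the space criteria when every entry validates it.
const meetsAllCriteria = (password: string): boolean =>
  SPACE_PASSWORD_CRITERIA.every((c) => c.isCriteriaValid(password));
```

With only the length rule active, `meetsAllCriteria("12345678")` passes while `meetsAllCriteria("short")` fails; the commented-out entries would add further predicates to the same `every` check.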
 export enum EAuthPageTypes {
   PUBLIC = "PUBLIC",
   NON_AUTHENTICATED = "NON_AUTHENTICATED",
@@ -6,6 +39,14 @@ export enum EAuthPageTypes {
   AUTHENTICATED = "AUTHENTICATED",
 }

+export enum EPageTypes {
+  INIT = "INIT",
+  PUBLIC = "PUBLIC",
+  NON_AUTHENTICATED = "NON_AUTHENTICATED",
+  ONBOARDING = "ONBOARDING",
+  AUTHENTICATED = "AUTHENTICATED",
+}
+
 export enum EAuthModes {
   SIGN_IN = "SIGN_IN",
   SIGN_UP = "SIGN_UP",
@@ -17,15 +58,35 @@ export enum EAuthSteps {
   UNIQUE_CODE = "UNIQUE_CODE",
 }

 // TODO: remove this
 export enum EErrorAlertType {
   BANNER_ALERT = "BANNER_ALERT",
   TOAST_ALERT = "TOAST_ALERT",
   INLINE_FIRST_NAME = "INLINE_FIRST_NAME",
   INLINE_EMAIL = "INLINE_EMAIL",
   INLINE_PASSWORD = "INLINE_PASSWORD",
   INLINE_EMAIL_CODE = "INLINE_EMAIL_CODE",
 }

 export type TAuthErrorInfo = {
   type: EErrorAlertType;
   code: EAdminAuthErrorCodes;
   title: string;
   message: any;
 };

 export enum EAdminAuthErrorCodes {
   // Admin
   ADMIN_ALREADY_EXIST = "5150",
   REQUIRED_ADMIN_EMAIL_PASSWORD_FIRST_NAME = "5155",
   INVALID_ADMIN_EMAIL = "5160",
   INVALID_ADMIN_PASSWORD = "5165",
   REQUIRED_ADMIN_EMAIL_PASSWORD = "5170",
   ADMIN_AUTHENTICATION_FAILED = "5175",
   ADMIN_USER_ALREADY_EXIST = "5180",
   ADMIN_USER_DOES_NOT_EXIST = "5185",
   ADMIN_USER_DEACTIVATED = "5190",
 }

 export enum EAuthErrorCodes {
   // Global
   INSTANCE_NOT_CONFIGURED = "5000",
@@ -74,7 +135,7 @@ export enum EAuthErrorCodes {
   INCORRECT_OLD_PASSWORD = "5135",
   MISSING_PASSWORD = "5138",
   INVALID_NEW_PASSWORD = "5140",
-  // set passowrd
+  // set password
   PASSWORD_ALREADY_SET = "5145",
   // Admin
   ADMIN_ALREADY_EXIST = "5150",
@@ -1,18 +1,25 @@
 export const API_BASE_URL = process.env.NEXT_PUBLIC_API_BASE_URL || "";
 // PI Base Url
 export const PI_BASE_URL = process.env.NEXT_PUBLIC_PI_BASE_URL || "";
 export const API_BASE_PATH = process.env.NEXT_PUBLIC_API_BASE_PATH || "/";
 export const API_URL = encodeURI(`${API_BASE_URL}${API_BASE_PATH}`);
 // God Mode Admin App Base Url
 export const ADMIN_BASE_URL = process.env.NEXT_PUBLIC_ADMIN_BASE_URL || "";
-export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "";
-export const GOD_MODE_URL = encodeURI(`${ADMIN_BASE_URL}${ADMIN_BASE_PATH}/`);
+export const ADMIN_BASE_PATH = process.env.NEXT_PUBLIC_ADMIN_BASE_PATH || "/";
+export const GOD_MODE_URL = encodeURI(`${ADMIN_BASE_URL}${ADMIN_BASE_PATH}`);
 // Publish App Base Url
 export const SPACE_BASE_URL = process.env.NEXT_PUBLIC_SPACE_BASE_URL || "";
-export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "";
-export const SITES_URL = encodeURI(`${SPACE_BASE_URL}${SPACE_BASE_PATH}/`);
+export const SPACE_BASE_PATH = process.env.NEXT_PUBLIC_SPACE_BASE_PATH || "/";
+export const SITES_URL = encodeURI(`${SPACE_BASE_URL}${SPACE_BASE_PATH}`);
 // Live App Base Url
 export const LIVE_BASE_URL = process.env.NEXT_PUBLIC_LIVE_BASE_URL || "";
-export const LIVE_BASE_PATH = process.env.NEXT_PUBLIC_LIVE_BASE_PATH || "";
-export const LIVE_URL = encodeURI(`${LIVE_BASE_URL}${LIVE_BASE_PATH}/`);
+export const LIVE_BASE_PATH = process.env.NEXT_PUBLIC_LIVE_BASE_PATH || "/";
+export const LIVE_URL = encodeURI(`${LIVE_BASE_URL}${LIVE_BASE_PATH}`);
 // Web App Base Url
 export const WEB_BASE_URL = process.env.NEXT_PUBLIC_WEB_BASE_URL || "";
 export const WEB_BASE_PATH = process.env.NEXT_PUBLIC_WEB_BASE_PATH || "/";
 export const WEB_URL = encodeURI(`${WEB_BASE_URL}${WEB_BASE_PATH}`);
 // plane website url
 export const WEBSITE_URL =
   process.env.NEXT_PUBLIC_WEBSITE_URL || "https://plane.so";
 // support email
 export const SUPPORT_EMAIL =
   process.env.NEXT_PUBLIC_SUPPORT_EMAIL || "support@plane.so";
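The change above moves the trailing slash out of the URL templates and into the default base path, so a configured base path no longer picks up a duplicate slash. A small sketch of the concatenation, using hypothetical values rather than Plane's real environment:

```typescript
// Hypothetical sketch of the pattern used above; the host is a stand-in,
// not Plane's actual configuration.
const buildUrl = (baseUrl: string, basePath: string): string =>
  encodeURI(`${baseUrl}${basePath}`);

// New scheme: the default base path is "/", so the template appends nothing.
const godModeUrl = buildUrl("https://admin.example.com", "/");

// Old scheme: empty default path plus a hard-coded "/" in the template.
const oldGodModeUrl = encodeURI(`${"https://admin.example.com"}${""}/`);
```

Both yield `https://admin.example.com/` for the defaults; the difference only shows when a non-empty base path such as `/god-mode` is set, where the old template would have produced `.../god-mode/`.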
@@ -1,4 +1,11 @@
 export * from "./ai";
 export * from "./auth";
 export * from "./endpoints";
 export * from "./file";
 export * from "./instance";
 export * from "./issue";
 export * from "./metadata";
 export * from "./state";
 export * from "./swr";
 export * from "./user";
 export * from "./workspace";
@@ -1,5 +1,25 @@
 import { List, Kanban } from "lucide-react";

 export const ALL_ISSUES = "All Issues";

 export type TIssuePriorities = "urgent" | "high" | "medium" | "low" | "none";

 export type TIssueFilterKeys = "priority" | "state" | "labels";

 export type TIssueLayout =
   | "list"
   | "kanban"
   | "calendar"
   | "spreadsheet"
   | "gantt";

 export type TIssueFilterPriorityObject = {
   key: TIssuePriorities;
   title: string;
   className: string;
   icon: string;
 };

 export enum EIssueGroupByToServerOptions {
   "state" = "state_id",
   "priority" = "priority",
@@ -11,6 +31,7 @@ export enum EIssueGroupByToServerOptions {
   "target_date" = "target_date",
   "project" = "project_id",
   "created_by" = "created_by",
+  "team_project" = "project_id",
 }

 export enum EIssueGroupBYServerToProperty {
@@ -38,3 +59,127 @@ export enum EServerGroupByToFilterOptions {
   "project_id" = "project",
   "created_by" = "created_by",
 }

 export enum EIssueServiceType {
   ISSUES = "issues",
   EPICS = "epics",
 }

 export enum EIssueLayoutTypes {
   LIST = "list",
   KANBAN = "kanban",
   CALENDAR = "calendar",
   GANTT = "gantt_chart",
   SPREADSHEET = "spreadsheet",
 }

 export enum EIssuesStoreType {
   GLOBAL = "GLOBAL",
   PROFILE = "PROFILE",
   TEAM = "TEAM",
   PROJECT = "PROJECT",
   CYCLE = "CYCLE",
   MODULE = "MODULE",
   TEAM_VIEW = "TEAM_VIEW",
   PROJECT_VIEW = "PROJECT_VIEW",
   ARCHIVED = "ARCHIVED",
   DRAFT = "DRAFT",
   DEFAULT = "DEFAULT",
   WORKSPACE_DRAFT = "WORKSPACE_DRAFT",
   EPIC = "EPIC",
 }

 export enum EIssueFilterType {
   FILTERS = "filters",
   DISPLAY_FILTERS = "display_filters",
   DISPLAY_PROPERTIES = "display_properties",
   KANBAN_FILTERS = "kanban_filters",
 }

 export enum EIssueCommentAccessSpecifier {
   EXTERNAL = "EXTERNAL",
   INTERNAL = "INTERNAL",
 }

 export enum EIssueListRow {
   HEADER = "HEADER",
   ISSUE = "ISSUE",
   NO_ISSUES = "NO_ISSUES",
   QUICK_ADD = "QUICK_ADD",
 }

 export const ISSUE_DISPLAY_FILTERS_BY_LAYOUT: {
   [key in TIssueLayout]: Record<"filters", TIssueFilterKeys[]>;
 } = {
   list: {
     filters: ["priority", "state", "labels"],
   },
   kanban: {
     filters: ["priority", "state", "labels"],
   },
   calendar: {
     filters: ["priority", "state", "labels"],
   },
   spreadsheet: {
     filters: ["priority", "state", "labels"],
   },
   gantt: {
     filters: ["priority", "state", "labels"],
   },
 };

 export const ISSUE_PRIORITIES: {
   key: TIssuePriorities;
   title: string;
 }[] = [
   { key: "urgent", title: "Urgent" },
   { key: "high", title: "High" },
   { key: "medium", title: "Medium" },
   { key: "low", title: "Low" },
   { key: "none", title: "None" },
 ];

 export const ISSUE_PRIORITY_FILTERS: TIssueFilterPriorityObject[] = [
   {
     key: "urgent",
     title: "Urgent",
     className: "bg-red-500 border-red-500 text-white",
     icon: "error",
   },
   {
     key: "high",
     title: "High",
     className: "text-orange-500 border-custom-border-300",
     icon: "signal_cellular_alt",
   },
   {
     key: "medium",
     title: "Medium",
     className: "text-yellow-500 border-custom-border-300",
     icon: "signal_cellular_alt_2_bar",
   },
   {
     key: "low",
     title: "Low",
     className: "text-green-500 border-custom-border-300",
     icon: "signal_cellular_alt_1_bar",
   },
   {
     key: "none",
     title: "None",
     className: "text-gray-500 border-custom-border-300",
     icon: "block",
   },
 ];

 export const SITES_ISSUE_LAYOUTS: {
   key: TIssueLayout;
   title: string;
   icon: any;
 }[] = [
   { key: "list", title: "List", icon: List },
   { key: "kanban", title: "Kanban", icon: Kanban },
   // { key: "calendar", title: "Calendar", icon: Calendar },
   // { key: "spreadsheet", title: "Spreadsheet", icon: Sheet },
   // { key: "gantt", title: "Gantt chart", icon: GanttChartSquare },
 ];
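The group-by enums above form a client-to-server key mapping, and the newly added `team_project` key resolves to the same server field as `project`. A hedged, self-contained sketch of that lookup — the enum is re-declared locally (abbreviated to a few members) rather than imported from `@plane/constants`, and `toServerGroupBy` is an illustrative helper:

```typescript
// Abbreviated local copy of EIssueGroupByToServerOptions from the diff above.
enum EIssueGroupByToServerOptions {
  "state" = "state_id",
  "priority" = "priority",
  "project" = "project_id",
  "created_by" = "created_by",
  "team_project" = "project_id", // new in this change: aliases the project field
}

// Translate a client-side group-by key into the server-side query field.
const toServerGroupBy = (key: keyof typeof EIssueGroupByToServerOptions): string =>
  EIssueGroupByToServerOptions[key];
```

Duplicate values are legal in TypeScript string enums, which is what lets `team_project` and `project` share `project_id`.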
packages/constants/src/metadata.ts (new file, 23 lines)
@@ -0,0 +1,23 @@
+export const SITE_NAME =
+  "Plane | Simple, extensible, open-source project management tool.";
+export const SITE_TITLE =
+  "Plane | Simple, extensible, open-source project management tool.";
+export const SITE_DESCRIPTION =
+  "Open-source project management tool to manage issues, sprints, and product roadmaps with peace of mind.";
+export const SITE_KEYWORDS =
+  "software development, plan, ship, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
+export const SITE_URL = "https://app.plane.so/";
+export const TWITTER_USER_NAME =
+  "Plane | Simple, extensible, open-source project management tool.";
+
+// Plane Sites Metadata
+export const SPACE_SITE_NAME =
+  "Plane Publish | Make your Plane boards and roadmaps public with just one-click.";
+export const SPACE_SITE_TITLE =
+  "Plane Publish | Make your Plane boards public with one-click";
+export const SPACE_SITE_DESCRIPTION =
+  "Plane Publish is a customer feedback management tool built on top of plane.so";
+export const SPACE_SITE_KEYWORDS =
+  "software development, customer feedback, software, accelerate, code management, release management, project management, issue tracking, agile, scrum, kanban, collaboration";
+export const SPACE_SITE_URL = "https://app.plane.so/";
+export const SPACE_TWITTER_USER_NAME = "planepowers";
@@ -1,4 +1,9 @@
-import { TStateGroups } from "@plane/types";
+export type TStateGroups =
+  | "backlog"
+  | "unstarted"
+  | "started"
+  | "completed"
+  | "cancelled";

 export const STATE_GROUPS: {
   [key in TStateGroups]: {
@@ -34,4 +39,7 @@ export const STATE_GROUPS: {
   },
 };

-export const ARCHIVABLE_STATE_GROUPS = [STATE_GROUPS.completed.key, STATE_GROUPS.cancelled.key];
+export const ARCHIVABLE_STATE_GROUPS = [
+  STATE_GROUPS.completed.key,
+  STATE_GROUPS.cancelled.key,
+];
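A short sketch of how `ARCHIVABLE_STATE_GROUPS` can gate archiving. The type and constant are re-declared locally instead of importing `@plane/constants`, and `canArchive` is an illustrative helper, not part of the package:

```typescript
// Local stand-ins mirroring the diff above: the five state groups, and the
// two groups whose issues are eligible for archiving.
type TStateGroups = "backlog" | "unstarted" | "started" | "completed" | "cancelled";

const ARCHIVABLE_STATE_GROUPS: TStateGroups[] = ["completed", "cancelled"];

// An issue is archivable only when its state group is completed or cancelled.
const canArchive = (group: TStateGroups): boolean =>
  ARCHIVABLE_STATE_GROUPS.includes(group);
```

So `canArchive("completed")` holds, while in-flight groups such as `"started"` are rejected.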
@@ -1,4 +1,4 @@
-export const SWR_CONFIG = {
+export const DEFAULT_SWR_CONFIG = {
   refreshWhenHidden: false,
   revalidateIfStale: false,
   revalidateOnFocus: false,
Some files were not shown because too many files have changed in this diff.