
Merge remote-tracking branch 'origin/master' into fix/peter-fixes

Peter Clement 2022-11-02 12:09:48 +00:00
commit 5138559a69
350 changed files with 12283 additions and 3159 deletions


@ -79,11 +79,13 @@ spec:
- name: MINIO_URL
value: {{ .Values.services.objectStore.url }}
- name: PLUGIN_BUCKET_NAME
value: {{ .Values.services.objectStore.pluginBucketName | default "plugins" | quote }}
value: {{ .Values.services.objectStore.pluginBucketName | quote }}
- name: APPS_BUCKET_NAME
value: {{ .Values.services.objectStore.appsBucketName | default "apps" | quote }}
value: {{ .Values.services.objectStore.appsBucketName | quote }}
- name: GLOBAL_CLOUD_BUCKET_NAME
value: {{ .Values.services.objectStore.globalBucketName | default "global" | quote }}
value: {{ .Values.services.objectStore.globalBucketName | quote }}
- name: BACKUPS_BUCKET_NAME
value: {{ .Values.services.objectStore.backupsBucketName | quote }}
- name: PORT
value: {{ .Values.services.apps.port | quote }}
{{ if .Values.services.worker.publicApiRateLimitPerSecond }}


@ -78,11 +78,13 @@ spec:
- name: MINIO_URL
value: {{ .Values.services.objectStore.url }}
- name: PLUGIN_BUCKET_NAME
value: {{ .Values.services.objectStore.pluginBucketName | default "plugins" | quote }}
value: {{ .Values.services.objectStore.pluginBucketName | quote }}
- name: APPS_BUCKET_NAME
value: {{ .Values.services.objectStore.appsBucketName | default "apps" | quote }}
value: {{ .Values.services.objectStore.appsBucketName | quote }}
- name: GLOBAL_CLOUD_BUCKET_NAME
value: {{ .Values.services.objectStore.globalBucketName | default "global" | quote }}
value: {{ .Values.services.objectStore.globalBucketName | quote }}
- name: BACKUPS_BUCKET_NAME
value: {{ .Values.services.objectStore.backupsBucketName | quote }}
- name: PORT
value: {{ .Values.services.worker.port | quote }}
- name: MULTI_TENANCY


@ -1,12 +1,15 @@
## Dev Environment on Debian 11
### Install Node
### Install NVM & Node 14
NVM documentation: https://github.com/nvm-sh/nvm#installing-and-updating
Budibase requires a recent version of node (14+):
Install NVM
```
curl -sL https://deb.nodesource.com/setup_16.x | sudo bash -
apt -y install nodejs
node -v
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
```
Install Node 14
```
nvm install 14
```
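To confirm the toolchain before continuing (assuming NVM has been loaded into your shell), check the active version:
```
nvm use 14
node -v
```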
### Install npm requirements
@ -31,7 +34,7 @@ This setup process was tested on Debian 11 (bullseye) with version numbers show
- Docker: 20.10.5
- Docker-Compose: 1.29.2
- Node: v16.15.1
- Node: v14.20.1
- Yarn: 1.22.19
- Lerna: 5.1.4


@ -11,7 +11,7 @@ through brew.
### Install Node
Budibase requires a recent version of node (14+):
Budibase requires a recent version of node 14:
```
brew install node npm
node -v
@ -38,7 +38,7 @@ This setup process was tested on Mac OSX 12 (Monterey) with version numbers show
- Docker: 20.10.14
- Docker-Compose: 2.6.0
- Node: 18.3.0
- Node: 14.20.1
- Yarn: 1.22.19
- Lerna: 5.1.4
@ -59,4 +59,7 @@ The dev version will be available on port 10000 i.e.
http://127.0.0.1:10000/builder/admin
| **NOTE**: If you are working on an M1 Apple Silicon Mac, you will need to uncomment the `# platform: linux/amd64` line in
[hosting/docker-compose-dev.yaml](../hosting/docker-compose.dev.yaml)
[hosting/docker-compose-dev.yaml](../hosting/docker-compose.dev.yaml)
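One way to uncomment it from the repo root is a quick `sed` edit (a convenience sketch; editing the file by hand works just as well, and note that BSD `sed` on macOS needs the empty `-i ''` argument):
```
sed -i '' 's|# platform: linux/amd64|platform: linux/amd64|' hosting/docker-compose.dev.yaml
```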
### Troubleshooting
If there are errors with the `yarn setup` command, you can try installing NVM and Node 14, following the same instructions as for Debian 11.
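A minimal sketch of those steps, mirroring the Debian instructions above:
```
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
# restart your terminal (or source your shell profile) so nvm is on the PATH
nvm install 14
nvm use 14
```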

docs/DEV-SETUP-WINDOWS.md (new file, 81 lines)

@ -0,0 +1,81 @@
## Dev Environment on Windows 10/11 (WSL2)
### Install WSL with Ubuntu LTS
Enable WSL 2 on Windows 10/11 for docker support.
```
wsl --set-default-version 2
```
Install Ubuntu LTS.
```
wsl --install Ubuntu
```
Or follow the instructions here:
https://learn.microsoft.com/en-us/windows/wsl/install
### Install Docker in Windows
Download the installer from Docker and install it.
Check this URL for more detailed instructions:
https://docs.docker.com/desktop/install/windows-install/
You should follow the next steps from within the Ubuntu terminal.
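Assuming Docker Desktop's WSL integration is enabled for your Ubuntu distro, you can sanity-check it from the Ubuntu terminal before going further:
```
docker version
docker compose version
```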
### Install NVM & Node 14
NVM documentation: https://github.com/nvm-sh/nvm#installing-and-updating
Install NVM
```
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
```
Install Node 14
```
nvm install 14
```
### Install npm requirements
```
npm install -g yarn jest lerna
```
### Clone the repo
```
git clone https://github.com/Budibase/budibase.git
```
### Check Versions
This setup process was tested on Windows 11 with the version numbers shown below. Your mileage may vary using anything else.
- Docker: 20.10.7
- Docker-Compose: 2.10.2
- Node: v14.20.1
- Yarn: 1.22.19
- Lerna: 5.5.4
### Build
```
cd budibase
yarn setup
```
The `yarn setup` command runs several build steps, i.e.
```
node ./hosting/scripts/setup.js && yarn && yarn bootstrap && yarn build && yarn dev
```
So this command will actually run the application in dev mode. It creates .env files under `./packages/server` and `./packages/worker` and runs docker containers for each service via docker-compose.
The dev version will be available on port 10000 i.e.
http://127.0.0.1:10000/builder/admin
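As a rough sanity check (not part of the official setup), you can confirm the containers are up and that the builder answers on port 10000:
```
docker ps
curl -I http://127.0.0.1:10000/builder/admin
```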
### Working with the code
Here are the instructions for working on the application from within Visual Studio Code (in Windows) through WSL. All the commands and files live within the Ubuntu system, so it should run as if you were working on a Linux machine.
https://code.visualstudio.com/docs/remote/wsl
Note that you will be able to run the application from within the WSL terminal and access it from a browser in Windows.
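For example, with the Remote - WSL extension installed, the repo can be opened straight from the Ubuntu terminal:
```
cd budibase
code .
```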


@ -24,6 +24,21 @@ http {
default "upgrade";
}
upstream app-service {
server {{address}}:4001;
keepalive 32;
}
upstream worker-service {
server {{address}}:4002;
keepalive 32;
}
upstream builder {
server {{address}}:3000;
keepalive 32;
}
server {
listen 10000 default_server;
server_name _;
@ -43,45 +58,78 @@ http {
}
location ~ ^/api/(system|admin|global)/ {
proxy_pass http://{{ address }}:4002;
proxy_pass http://worker-service;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_http_version 1.1;
proxy_set_header Connection "";
}
location /api/ {
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_pass http://{{ address }}:4001;
proxy_pass http://app-service;
proxy_http_version 1.1;
proxy_set_header Connection "";
}
location = / {
proxy_pass http://{{ address }}:4001;
proxy_pass http://app-service;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_http_version 1.1;
proxy_set_header Connection "";
}
location /app_ {
proxy_pass http://{{ address }}:4001;
proxy_pass http://app-service;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_http_version 1.1;
proxy_set_header Connection "";
}
location /app {
proxy_pass http://{{ address }}:4001;
proxy_pass http://app-service;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_http_version 1.1;
proxy_set_header Connection "";
}
location /builder {
proxy_pass http://{{ address }}:3000;
proxy_pass http://builder;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
proxy_http_version 1.1;
proxy_set_header Connection "";
rewrite ^/builder(.*)$ /builder/$1 break;
}
location /builder/ {
proxy_pass http://{{ address }}:3000;
proxy_pass http://builder;
proxy_http_version 1.1;
proxy_set_header Connection $connection_upgrade;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
}
location /vite/ {
proxy_pass http://{{ address }}:3000;
proxy_pass http://builder;
proxy_read_timeout 120s;
proxy_connect_timeout 120s;
proxy_send_timeout 120s;
rewrite ^/vite(.*)$ /$1 break;
}
@ -91,7 +139,7 @@ http {
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
proxy_pass http://{{ address }}:4001;
proxy_pass http://app-service;
}
location / {


@ -171,11 +171,13 @@ http {
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $http_host;
proxy_connect_timeout 300;
proxy_http_version 1.1;
proxy_set_header Connection "";
chunked_transfer_encoding off;
proxy_pass http://$minio:9000;
}


@ -0,0 +1,24 @@
#!/bin/sh
# vim:sw=4:ts=4:et
set -e
ME=$(basename $0)
NGINX_CONF_FILE="/etc/nginx/nginx.conf"
DEFAULT_CONF_FILE="/etc/nginx/conf.d/default.conf"
# check if we have ipv6 available
if [ ! -f "/proc/net/if_inet6" ]; then
# ipv6 not available so delete lines from nginx conf
if [ -f "$NGINX_CONF_FILE" ]; then
sed -i '/listen \[::\]/d' $NGINX_CONF_FILE
fi
if [ -f "$DEFAULT_CONF_FILE" ]; then
sed -i '/listen \[::\]/d' $DEFAULT_CONF_FILE
fi
echo "$ME: info: ipv6 not available so delete lines from nginx conf"
else
echo "$ME: info: ipv6 is available so no need to delete lines from nginx conf"
fi
exit 0


@ -5,6 +5,7 @@ FROM nginx:latest
# override the output dir to output directly to /etc/nginx instead of /etc/nginx/conf.d
ENV NGINX_ENVSUBST_OUTPUT_DIR=/etc/nginx
COPY .generated-nginx.prod.conf /etc/nginx/templates/nginx.conf.template
# IPv6 removal needs to happen after envsubst
RUN rm -rf /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
COPY 80-listen-on-ipv6-by-default.sh /docker-entrypoint.d/80-listen-on-ipv6-by-default.sh


@ -4,6 +4,7 @@ echo ${TARGETBUILD} > /buildtarget.txt
if [[ "${TARGETBUILD}" = "aas" ]]; then
# Azure AppService uses /home for persistent data & SSH on port 2222
DATA_DIR=/home
WEBSITES_ENABLE_APP_SERVICE_STORAGE=true
mkdir -p $DATA_DIR/{search,minio,couch}
mkdir -p $DATA_DIR/couch/{dbs,views}
chown -R couchdb:couchdb $DATA_DIR/couch/


@ -22,6 +22,7 @@ declare -a DOCKER_VARS=("APP_PORT" "APPS_URL" "ARCHITECTURE" "BUDIBASE_ENVIRONME
# Azure App Service customisations
if [[ "${TARGETBUILD}" = "aas" ]]; then
DATA_DIR=/home
WEBSITES_ENABLE_APP_SERVICE_STORAGE=true
/etc/init.d/ssh start
else
DATA_DIR=${DATA_DIR:-/data}


@ -1,5 +1,5 @@
{
"version": "2.0.39",
"version": "2.1.6",
"npmClient": "yarn",
"packages": [
"packages/*"


@ -3,7 +3,6 @@
"private": true,
"devDependencies": {
"@rollup/plugin-json": "^4.0.2",
"@types/mongodb": "3.6.3",
"@typescript-eslint/parser": "4.28.0",
"babel-eslint": "^10.0.3",
"eslint": "^7.28.0",


@ -6,6 +6,7 @@ const {
updateAppId,
doInAppContext,
doInTenant,
doInContext,
} = require("./src/context")
const identity = require("./src/context/identity")
@ -19,4 +20,5 @@ module.exports = {
doInAppContext,
doInTenant,
identity,
doInContext,
}


@ -1,6 +1,6 @@
{
"name": "@budibase/backend-core",
"version": "2.0.39",
"version": "2.1.6",
"description": "Budibase backend core libraries used in server and worker",
"main": "dist/src/index.js",
"types": "dist/src/index.d.ts",
@ -20,12 +20,13 @@
"test:watch": "jest --watchAll"
},
"dependencies": {
"@budibase/types": "^2.0.39",
"@budibase/types": "^2.1.6",
"@shopify/jest-koa-mocks": "5.0.1",
"@techpass/passport-openidconnect": "0.3.2",
"aws-sdk": "2.1030.0",
"bcrypt": "5.0.1",
"bcryptjs": "2.4.3",
"bull": "4.10.1",
"dotenv": "16.0.1",
"emitter-listener": "1.1.2",
"ioredis": "4.28.0",
@ -62,6 +63,8 @@
]
},
"devDependencies": {
"@types/chance": "1.1.3",
"@types/ioredis": "4.28.0",
"@types/jest": "27.5.1",
"@types/koa": "2.0.52",
"@types/lodash": "4.14.180",
@ -72,6 +75,7 @@
"@types/semver": "7.3.7",
"@types/tar-fs": "2.0.1",
"@types/uuid": "8.3.4",
"chance": "1.1.3",
"ioredis-mock": "5.8.0",
"jest": "27.5.1",
"koa": "2.7.0",


@ -24,10 +24,15 @@ import {
} from "./middleware"
import { invalidateUser } from "./cache/user"
import { User } from "@budibase/types"
import { logAlert } from "./logging"
// Strategies
passport.use(new LocalStrategy(local.options, local.authenticate))
passport.use(new JwtStrategy(jwt.options, jwt.authenticate))
if (jwt.options.secretOrKey) {
passport.use(new JwtStrategy(jwt.options, jwt.authenticate))
} else {
logAlert("No JWT Secret supplied, cannot configure JWT strategy")
}
passport.serializeUser((user: User, done: any) => done(null, user))


@ -1,6 +1,7 @@
import BaseCache from "./base"
import { getWritethroughClient } from "../redis/init"
import { logWarn } from "../logging"
import PouchDB from "pouchdb"
const DEFAULT_WRITE_RATE_MS = 10000
let CACHE: BaseCache | null = null


@ -6,6 +6,7 @@ import { baseGlobalDBName } from "../db/tenancy"
import { IdentityContext } from "@budibase/types"
import { DEFAULT_TENANT_ID as _DEFAULT_TENANT_ID } from "../constants"
import { ContextKey } from "./constants"
import PouchDB from "pouchdb"
import {
updateUsing,
closeWithUsing,
@ -22,16 +23,15 @@ export const DEFAULT_TENANT_ID = _DEFAULT_TENANT_ID
let TEST_APP_ID: string | null = null
export const closeTenancy = async () => {
let db
try {
if (env.USE_COUCH) {
db = getGlobalDB()
const db = getGlobalDB()
await closeDB(db)
}
} catch (err) {
// no DB found - skip closing
return
}
await closeDB(db)
// clear from context now that database is closed/task is finished
cls.setOnContext(ContextKey.TENANT_ID, null)
cls.setOnContext(ContextKey.GLOBAL_DB, null)
@ -53,6 +53,9 @@ export const getTenantIDFromAppID = (appId: string) => {
if (!appId) {
return null
}
if (!isMultiTenant()) {
return DEFAULT_TENANT_ID
}
const split = appId.split(SEPARATOR)
const hasDev = split[1] === DocumentType.DEV
if ((hasDev && split.length === 3) || (!hasDev && split.length === 2)) {
@ -65,7 +68,16 @@ export const getTenantIDFromAppID = (appId: string) => {
}
}
// used for automations, API endpoints should always be in context already
export const doInContext = async (appId: string, task: any) => {
// gets the tenant ID from the app ID
const tenantId = getTenantIDFromAppID(appId)
return doInTenant(tenantId, async () => {
return doInAppContext(appId, async () => {
return task()
})
})
}
export const doInTenant = (tenantId: string | null, task: any) => {
// make sure default always selected in single tenancy
if (!env.MULTI_TENANCY) {


@ -21,6 +21,7 @@ export enum ViewName {
ACCOUNT_BY_EMAIL = "account_by_email",
PLATFORM_USERS_LOWERCASE = "platform_users_lowercase",
USER_BY_GROUP = "by_group_user",
APP_BACKUP_BY_TRIGGER = "by_trigger",
}
export const DeprecatedViews = {
@ -30,6 +31,10 @@ export const DeprecatedViews = {
],
}
export enum InternalTable {
USER_METADATA = "ta_users",
}
export enum DocumentType {
USER = "us",
GROUP = "gr",
@ -46,6 +51,23 @@ export enum DocumentType {
AUTOMATION_LOG = "log_au",
ACCOUNT_METADATA = "acc_metadata",
PLUGIN = "plg",
DATASOURCE = "datasource",
DATASOURCE_PLUS = "datasource_plus",
APP_BACKUP = "backup",
TABLE = "ta",
ROW = "ro",
AUTOMATION = "au",
LINK = "li",
WEBHOOK = "wh",
INSTANCE = "inst",
LAYOUT = "layout",
SCREEN = "screen",
QUERY = "query",
DEPLOYMENTS = "deployments",
METADATA = "metadata",
MEM_VIEW = "view",
USER_FLAG = "flag",
AUTOMATION_METADATA = "meta_au",
}
export const StaticDatabases = {


@ -1,91 +0,0 @@
const pouch = require("./pouch")
const env = require("../environment")
const openDbs = []
let PouchDB
let initialised = false
const dbList = new Set()
if (env.MEMORY_LEAK_CHECK) {
setInterval(() => {
console.log("--- OPEN DBS ---")
console.log(openDbs)
}, 5000)
}
const put =
dbPut =>
async (doc, options = {}) => {
if (!doc.createdAt) {
doc.createdAt = new Date().toISOString()
}
doc.updatedAt = new Date().toISOString()
return dbPut(doc, options)
}
const checkInitialised = () => {
if (!initialised) {
throw new Error("init has not been called")
}
}
exports.init = opts => {
PouchDB = pouch.getPouch(opts)
initialised = true
}
// NOTE: THIS IS A DANGEROUS FUNCTION - USE WITH CAUTION
// this function is prone to leaks, should only be used
// in situations that using the function doWithDB does not work
exports.dangerousGetDB = (dbName, opts) => {
checkInitialised()
if (env.isTest()) {
dbList.add(dbName)
}
const db = new PouchDB(dbName, opts)
if (env.MEMORY_LEAK_CHECK) {
openDbs.push(db.name)
}
const dbPut = db.put
db.put = put(dbPut)
return db
}
// use this function if you have called dangerousGetDB - close
// the databases you've opened once finished
exports.closeDB = async db => {
if (!db || env.isTest()) {
return
}
if (env.MEMORY_LEAK_CHECK) {
openDbs.splice(openDbs.indexOf(db.name), 1)
}
try {
// specifically await so that if there is an error, it can be ignored
return await db.close()
} catch (err) {
// ignore error, already closed
}
}
// we have to use a callback for this so that we can close
// the DB when we're done, without this manual requests would
// need to close the database when done with it to avoid memory leaks
exports.doWithDB = async (dbName, cb, opts = {}) => {
const db = exports.dangerousGetDB(dbName, opts)
// need this to be async so that we can correctly close DB after all
// async operations have been completed
try {
return await cb(db)
} finally {
await exports.closeDB(db)
}
}
exports.allDbs = () => {
if (!env.isTest()) {
throw new Error("Cannot be used outside test environment.")
}
checkInitialised()
return [...dbList]
}


@ -0,0 +1,133 @@
import * as pouch from "./pouch"
import env from "../environment"
import { checkSlashesInUrl } from "../helpers"
import fetch from "node-fetch"
import { PouchOptions, CouchFindOptions } from "@budibase/types"
import PouchDB from "pouchdb"
const openDbs: string[] = []
let Pouch: any
let initialised = false
const dbList = new Set()
if (env.MEMORY_LEAK_CHECK) {
setInterval(() => {
console.log("--- OPEN DBS ---")
console.log(openDbs)
}, 5000)
}
const put =
(dbPut: any) =>
async (doc: any, options = {}) => {
if (!doc.createdAt) {
doc.createdAt = new Date().toISOString()
}
doc.updatedAt = new Date().toISOString()
return dbPut(doc, options)
}
const checkInitialised = () => {
if (!initialised) {
throw new Error("init has not been called")
}
}
export async function init(opts?: PouchOptions) {
Pouch = pouch.getPouch(opts)
initialised = true
}
// NOTE: THIS IS A DANGEROUS FUNCTION - USE WITH CAUTION
// this function is prone to leaks, should only be used
// in situations that using the function doWithDB does not work
export function dangerousGetDB(dbName: string, opts?: any): PouchDB.Database {
checkInitialised()
if (env.isTest()) {
dbList.add(dbName)
}
const db = new Pouch(dbName, opts)
if (env.MEMORY_LEAK_CHECK) {
openDbs.push(db.name)
}
const dbPut = db.put
db.put = put(dbPut)
return db
}
// use this function if you have called dangerousGetDB - close
// the databases you've opened once finished
export async function closeDB(db: PouchDB.Database) {
if (!db || env.isTest()) {
return
}
if (env.MEMORY_LEAK_CHECK) {
openDbs.splice(openDbs.indexOf(db.name), 1)
}
try {
// specifically await so that if there is an error, it can be ignored
return await db.close()
} catch (err) {
// ignore error, already closed
}
}
// we have to use a callback for this so that we can close
// the DB when we're done, without this manual requests would
// need to close the database when done with it to avoid memory leaks
export async function doWithDB(dbName: string, cb: any, opts = {}) {
const db = dangerousGetDB(dbName, opts)
// need this to be async so that we can correctly close DB after all
// async operations have been completed
try {
return await cb(db)
} finally {
await closeDB(db)
}
}
export function allDbs() {
if (!env.isTest()) {
throw new Error("Cannot be used outside test environment.")
}
checkInitialised()
return [...dbList]
}
export async function directCouchQuery(
path: string,
method: string = "GET",
body?: any
) {
let { url, cookie } = pouch.getCouchInfo()
const couchUrl = `${url}/${path}`
const params: any = {
method: method,
headers: {
Authorization: cookie,
},
}
if (body && method !== "GET") {
params.body = JSON.stringify(body)
params.headers["Content-Type"] = "application/json"
}
const response = await fetch(checkSlashesInUrl(encodeURI(couchUrl)), params)
if (response.status < 300) {
return await response.json()
} else {
throw "Cannot connect to CouchDB instance"
}
}
export async function directCouchAllDbs(queryString?: string) {
let couchPath = "/_all_dbs"
if (queryString) {
couchPath += `?${queryString}`
}
return await directCouchQuery(couchPath)
}
export async function directCouchFind(dbName: string, opts: CouchFindOptions) {
const json = await directCouchQuery(`${dbName}/_find`, "POST", opts)
return { rows: json.docs, bookmark: json.bookmark }
}


@ -1,7 +1,7 @@
const PouchDB = require("pouchdb")
const env = require("../environment")
import PouchDB from "pouchdb"
import env from "../environment"
exports.getUrlInfo = (url = env.COUCH_DB_URL) => {
export const getUrlInfo = (url = env.COUCH_DB_URL) => {
let cleanUrl, username, password, host
if (url) {
// Ensure the URL starts with a protocol
@ -44,8 +44,8 @@ exports.getUrlInfo = (url = env.COUCH_DB_URL) => {
}
}
exports.getCouchInfo = () => {
const urlInfo = exports.getUrlInfo()
export const getCouchInfo = () => {
const urlInfo = getUrlInfo()
let username
let password
if (env.COUCH_DB_USERNAME) {
@ -82,11 +82,11 @@ exports.getCouchInfo = () => {
* This should be rarely used outside of the main application config.
* Exposed for exceptional cases such as in-memory views.
*/
exports.getPouch = (opts = {}) => {
let { url, cookie } = exports.getCouchInfo()
export const getPouch = (opts: any = {}) => {
let { url, cookie } = getCouchInfo()
let POUCH_DB_DEFAULTS = {
prefix: url,
fetch: (url, opts) => {
fetch: (url: string, opts: any) => {
// use a specific authorization cookie - be very explicit about how we authenticate
opts.headers.set("Authorization", cookie)
return PouchDB.fetch(url, opts)
@ -98,6 +98,7 @@ exports.getPouch = (opts = {}) => {
PouchDB.plugin(inMemory)
POUCH_DB_DEFAULTS = {
prefix: undefined,
// @ts-ignore
adapter: "memory",
}
}
@ -105,6 +106,7 @@ exports.getPouch = (opts = {}) => {
if (opts.onDisk) {
POUCH_DB_DEFAULTS = {
prefix: undefined,
// @ts-ignore
adapter: "leveldb",
}
}
@ -112,6 +114,7 @@ exports.getPouch = (opts = {}) => {
if (opts.replication) {
const replicationStream = require("pouchdb-replication-stream")
PouchDB.plugin(replicationStream.plugin)
// @ts-ignore
PouchDB.adapter("writableStream", replicationStream.adapters.writableStream)
}


@ -1,14 +1,17 @@
import { newid } from "../hashing"
import { DEFAULT_TENANT_ID, Configs } from "../constants"
import env from "../environment"
import { SEPARATOR, DocumentType, UNICODE_MAX, ViewName } from "./constants"
import {
SEPARATOR,
DocumentType,
UNICODE_MAX,
ViewName,
InternalTable,
} from "./constants"
import { getTenantId, getGlobalDB } from "../context"
import { getGlobalDBName } from "./tenancy"
import fetch from "node-fetch"
import { doWithDB, allDbs } from "./index"
import { getCouchInfo } from "./pouch"
import { doWithDB, allDbs, directCouchAllDbs } from "./index"
import { getAppMetadata } from "../cache/appMetadata"
import { checkSlashesInUrl } from "../helpers"
import { isDevApp, isDevAppID, getProdAppID } from "./conversions"
import { APP_PREFIX } from "./constants"
import * as events from "../events"
@ -43,8 +46,8 @@ export const generateAppID = (tenantId = null) => {
* @returns {object} Parameters which can then be used with an allDocs request.
*/
export function getDocParams(
docType: any,
docId: any = null,
docType: string,
docId?: string | null,
otherProps: any = {}
) {
if (docId == null) {
@ -57,6 +60,28 @@ export function getDocParams(
}
}
/**
* Gets the DB allDocs/query params for retrieving a row.
* @param {string|null} tableId The table in which the rows have been stored.
* @param {string|null} rowId The ID of the row which is being specifically queried for. This can be
* left null to get all the rows in the table.
* @param {object} otherProps Any other properties to add to the request.
* @returns {object} Parameters which can then be used with an allDocs request.
*/
export function getRowParams(
tableId?: string | null,
rowId?: string | null,
otherProps = {}
) {
if (tableId == null) {
return getDocParams(DocumentType.ROW, null, otherProps)
}
const endOfKey = rowId == null ? `${tableId}${SEPARATOR}` : rowId
return getDocParams(DocumentType.ROW, endOfKey, otherProps)
}
/**
* Retrieve the correct index for a view based on default design DB.
*/
@ -64,6 +89,39 @@ export function getQueryIndex(viewName: ViewName) {
return `database/${viewName}`
}
/**
* Gets a new row ID for the specified table.
* @param {string} tableId The table which the row is being created for.
* @param {string|null} id If an ID is to be used then the UUID can be substituted for this.
* @returns {string} The new ID which a row doc can be stored under.
*/
export function generateRowID(tableId: string, id?: string) {
id = id || newid()
return `${DocumentType.ROW}${SEPARATOR}${tableId}${SEPARATOR}${id}`
}
/**
* Check if a given ID is that of a table.
* @returns {boolean}
*/
export const isTableId = (id: string) => {
// this includes datasource plus tables
return (
id &&
(id.startsWith(`${DocumentType.TABLE}${SEPARATOR}`) ||
id.startsWith(`${DocumentType.DATASOURCE_PLUS}${SEPARATOR}`))
)
}
/**
* Check if a given ID is that of a datasource or datasource plus.
* @returns {boolean}
*/
export const isDatasourceId = (id: string) => {
// this covers both datasources and datasource plus
return id && id.startsWith(`${DocumentType.DATASOURCE}${SEPARATOR}`)
}
/**
* Generates a new workspace ID.
* @returns {string} The new workspace ID which the workspace doc can be stored under.
@ -109,6 +167,33 @@ export function getGlobalUserParams(globalId: any, otherProps: any = {}) {
}
}
/**
* Gets parameters for retrieving users, this is a utility function for the getDocParams function.
*/
export function getUserMetadataParams(userId?: string, otherProps = {}) {
return getRowParams(InternalTable.USER_METADATA, userId, otherProps)
}
/**
* Generates a new user ID based on the passed in global ID.
* @param {string} globalId The ID of the global user.
* @returns {string} The new user ID which the user doc can be stored under.
*/
export function generateUserMetadataID(globalId: string) {
return generateRowID(InternalTable.USER_METADATA, globalId)
}
/**
* Breaks up the ID to get the global ID.
*/
export function getGlobalIDFromUserMetadataID(id: string) {
const prefix = `${DocumentType.ROW}${SEPARATOR}${InternalTable.USER_METADATA}${SEPARATOR}`
if (!id || !id.includes(prefix)) {
return id
}
return id.split(prefix)[1]
}
export function getUsersByAppParams(appId: any, otherProps: any = {}) {
const prodAppId = getProdAppID(appId)
return {
@ -169,9 +254,9 @@ export function getRoleParams(roleId = null, otherProps = {}) {
return getDocParams(DocumentType.ROLE, roleId, otherProps)
}
export function getStartEndKeyURL(base: any, baseKey: any, tenantId = null) {
export function getStartEndKeyURL(baseKey: any, tenantId = null) {
const tenancy = tenantId ? `${SEPARATOR}${tenantId}` : ""
return `${base}?startkey="${baseKey}${tenancy}"&endkey="${baseKey}${tenancy}${UNICODE_MAX}"`
return `startkey="${baseKey}${tenancy}"&endkey="${baseKey}${tenancy}${UNICODE_MAX}"`
}
/**
@ -187,22 +272,10 @@ export async function getAllDbs(opts = { efficient: false }) {
return allDbs()
}
let dbs: any[] = []
let { url, cookie } = getCouchInfo()
async function addDbs(couchUrl: string) {
const response = await fetch(checkSlashesInUrl(encodeURI(couchUrl)), {
method: "GET",
headers: {
Authorization: cookie,
},
})
if (response.status === 200) {
let json = await response.json()
dbs = dbs.concat(json)
} else {
throw "Cannot connect to CouchDB instance"
}
async function addDbs(queryString?: string) {
const json = await directCouchAllDbs(queryString)
dbs = dbs.concat(json)
}
let couchUrl = `${url}/_all_dbs`
let tenantId = getTenantId()
if (!env.MULTI_TENANCY || (!efficient && tenantId === DEFAULT_TENANT_ID)) {
// just get all DBs when:
@ -210,12 +283,12 @@ export async function getAllDbs(opts = { efficient: false }) {
// - default tenant
// - apps dbs don't contain tenant id
// - non-default tenant dbs are filtered out application side in getAllApps
await addDbs(couchUrl)
await addDbs()
} else {
// get prod apps
await addDbs(getStartEndKeyURL(couchUrl, DocumentType.APP, tenantId))
await addDbs(getStartEndKeyURL(DocumentType.APP, tenantId))
// get dev apps
await addDbs(getStartEndKeyURL(couchUrl, DocumentType.APP_DEV, tenantId))
await addDbs(getStartEndKeyURL(DocumentType.APP_DEV, tenantId))
// add global db name
dbs.push(getGlobalDBName(tenantId))
}


@ -0,0 +1,12 @@
import { AppBackup, AppBackupRestoreEvent, Event } from "@budibase/types"
import { publishEvent } from "../events"
export async function appBackupRestored(backup: AppBackup) {
const properties: AppBackupRestoreEvent = {
appId: backup.appId,
backupName: backup.name!,
backupCreatedAt: backup.timestamp,
}
await publishEvent(Event.APP_BACKUP_RESTORED, properties)
}


@ -19,3 +19,4 @@ export * as installation from "./installation"
export * as backfill from "./backfill"
export * as group from "./group"
export * as plugin from "./plugin"
export * as backup from "./backup"


@ -4,6 +4,7 @@ import * as events from "./events"
import * as migrations from "./migrations"
import * as users from "./users"
import * as roles from "./security/roles"
import * as permissions from "./security/permissions"
import * as accounts from "./cloud/accounts"
import * as installation from "./installation"
import env from "./environment"
@ -19,6 +20,7 @@ import pino from "./pino"
import * as middleware from "./middleware"
import plugins from "./plugin"
import encryption from "./security/encryption"
import * as queue from "./queue"
// mimic the outer package exports
import * as db from "./pkg/db"
@ -37,6 +39,7 @@ const core = {
db,
...dbConstants,
redis,
locks: redis.redlock,
objectStore,
utils,
users,
@ -62,6 +65,8 @@ const core = {
...errorClasses,
middleware,
encryption,
queue,
permissions,
}
export = core


@ -11,7 +11,7 @@ export const DEFINITIONS: MigrationDefinition[] = [
},
{
type: MigrationType.GLOBAL,
name: MigrationName.QUOTAS_1,
name: MigrationName.SYNC_QUOTAS,
},
{
type: MigrationType.APP,
@ -33,8 +33,4 @@ export const DEFINITIONS: MigrationDefinition[] = [
type: MigrationType.GLOBAL,
name: MigrationName.GLOBAL_INFO_SYNC_USERS,
},
{
type: MigrationType.GLOBAL,
name: MigrationName.PLUGIN_COUNT,
},
]


@ -18,11 +18,16 @@ const STATE = {
bucketCreationPromises: {},
}
type ListParams = {
ContinuationToken?: string
}
const CONTENT_TYPE_MAP: any = {
html: "text/html",
css: "text/css",
js: "application/javascript",
json: "application/json",
gz: "application/gzip",
}
const STRING_CONTENT_TYPES = [
CONTENT_TYPE_MAP.html,
@ -32,16 +37,16 @@ const STRING_CONTENT_TYPES = [
]
// does normal sanitization and then swaps dev apps to apps
export function sanitizeKey(input: any) {
export function sanitizeKey(input: string) {
return sanitize(sanitizeBucket(input)).replace(/\\/g, "/")
}
// simply handles the dev app to app conversion
export function sanitizeBucket(input: any) {
export function sanitizeBucket(input: string) {
return input.replace(new RegExp(APP_DEV_PREFIX, "g"), APP_PREFIX)
}
function publicPolicy(bucketName: any) {
function publicPolicy(bucketName: string) {
return {
Version: "2012-10-17",
Statement: [
@ -69,7 +74,7 @@ const PUBLIC_BUCKETS = [
* @return {Object} an S3 object store object, check S3 Nodejs SDK for usage.
* @constructor
*/
export const ObjectStore = (bucket: any) => {
export const ObjectStore = (bucket: string) => {
const config: any = {
s3ForcePathStyle: true,
signatureVersion: "v4",
@ -93,7 +98,7 @@ export const ObjectStore = (bucket: any) => {
* Given an object store and a bucket name this will make sure the bucket exists,
* if it does not exist then it will create it.
*/
export const makeSureBucketExists = async (client: any, bucketName: any) => {
export const makeSureBucketExists = async (client: any, bucketName: string) => {
bucketName = sanitizeBucket(bucketName)
try {
await client
@ -145,7 +150,7 @@ export const upload = async ({
type,
metadata,
}: any) => {
const extension = [...filename.split(".")].pop()
const extension = filename.split(".").pop()
const fileBytes = fs.readFileSync(path)
const objectStore = ObjectStore(bucketName)
@ -168,8 +173,8 @@ export const upload = async ({
* through to the object store.
*/
export const streamUpload = async (
bucketName: any,
filename: any,
bucketName: string,
filename: string,
stream: any,
extra = {}
) => {
@ -202,7 +207,7 @@ export const streamUpload = async (
* retrieves the contents of a file from the object store, if it is a known content type it
* will be converted, otherwise it will be returned as a buffer stream.
*/
export const retrieve = async (bucketName: any, filepath: any) => {
export const retrieve = async (bucketName: string, filepath: string) => {
const objectStore = ObjectStore(bucketName)
const params = {
Bucket: sanitizeBucket(bucketName),
@ -217,10 +222,38 @@ export const retrieve = async (bucketName: any, filepath: any) => {
}
}
export const listAllObjects = async (bucketName: string, path: string) => {
const objectStore = ObjectStore(bucketName)
const list = (params: ListParams = {}) => {
return objectStore
.listObjectsV2({
...params,
Bucket: sanitizeBucket(bucketName),
Prefix: sanitizeKey(path),
})
.promise()
}
let isTruncated = false,
token,
objects: AWS.S3.Types.Object[] = []
do {
let params: ListParams = {}
if (token) {
params.ContinuationToken = token
}
const response = await list(params)
if (response.Contents) {
objects = objects.concat(response.Contents)
}
isTruncated = !!response.IsTruncated
} while (isTruncated)
return objects
}
/**
* Same as retrieval function but puts to a temporary file.
*/
export const retrieveToTmp = async (bucketName: any, filepath: any) => {
export const retrieveToTmp = async (bucketName: string, filepath: string) => {
bucketName = sanitizeBucket(bucketName)
filepath = sanitizeKey(filepath)
const data = await retrieve(bucketName, filepath)
@ -229,10 +262,31 @@ export const retrieveToTmp = async (bucketName: any, filepath: any) => {
return outputPath
}
export const retrieveDirectory = async (bucketName: string, path: string) => {
let writePath = join(budibaseTempDir(), v4())
fs.mkdirSync(writePath)
const objects = await listAllObjects(bucketName, path)
let fullObjects = await Promise.all(
objects.map(obj => retrieve(bucketName, obj.Key!))
)
let count = 0
for (let obj of objects) {
const filename = obj.Key!
const data = fullObjects[count++]
const possiblePath = filename.split("/")
if (possiblePath.length > 1) {
const dirs = possiblePath.slice(0, possiblePath.length - 1)
fs.mkdirSync(join(writePath, ...dirs), { recursive: true })
}
fs.writeFileSync(join(writePath, ...possiblePath), data)
}
return writePath
}
/**
* Delete a single file.
*/
export const deleteFile = async (bucketName: any, filepath: any) => {
export const deleteFile = async (bucketName: string, filepath: string) => {
const objectStore = ObjectStore(bucketName)
await makeSureBucketExists(objectStore, bucketName)
const params = {
@ -242,7 +296,7 @@ export const deleteFile = async (bucketName: any, filepath: any) => {
return objectStore.deleteObject(params)
}
export const deleteFiles = async (bucketName: any, filepaths: any) => {
export const deleteFiles = async (bucketName: string, filepaths: string[]) => {
const objectStore = ObjectStore(bucketName)
await makeSureBucketExists(objectStore, bucketName)
const params = {
@ -258,8 +312,8 @@ export const deleteFiles = async (bucketName: any, filepaths: any) => {
* Delete a path, including everything within.
*/
export const deleteFolder = async (
bucketName: any,
folder: any
bucketName: string,
folder: string
): Promise<any> => {
bucketName = sanitizeBucket(bucketName)
folder = sanitizeKey(folder)
@ -292,9 +346,9 @@ export const deleteFolder = async (
}
export const uploadDirectory = async (
bucketName: any,
localPath: any,
bucketPath: any
bucketName: string,
localPath: string,
bucketPath: string
) => {
bucketName = sanitizeBucket(bucketName)
let uploads = []
@ -326,7 +380,11 @@ exports.downloadTarballDirect = async (
await streamPipeline(response.body, zlib.Unzip(), tar.extract(path))
}
export const downloadTarball = async (url: any, bucketName: any, path: any) => {
export const downloadTarball = async (
url: string,
bucketName: string,
path: string
) => {
bucketName = sanitizeBucket(bucketName)
path = sanitizeKey(path)
const response = await fetch(url)


@ -1,5 +1,6 @@
const { join } = require("path")
const { tmpdir } = require("os")
const fs = require("fs")
const env = require("../environment")
/****************************************************
@ -16,6 +17,11 @@ exports.ObjectStoreBuckets = {
PLUGINS: env.PLUGIN_BUCKET_NAME,
}
exports.budibaseTempDir = function () {
return join(tmpdir(), ".budibase")
const bbTmp = join(tmpdir(), ".budibase")
if (!fs.existsSync(bbTmp)) {
fs.mkdirSync(bbTmp)
}
exports.budibaseTempDir = function () {
return bbTmp
}


@ -8,6 +8,7 @@ import {
updateAppId,
doInAppContext,
doInTenant,
doInContext,
} from "../context"
import * as identity from "../context/identity"
@ -20,5 +21,6 @@ export = {
updateAppId,
doInAppContext,
doInTenant,
doInContext,
identity,
}


@ -3,9 +3,11 @@
import Client from "../redis"
import utils from "../redis/utils"
import clients from "../redis/init"
import * as redlock from "../redis/redlock"
export = {
Client,
utils,
clients,
redlock,
}


@ -0,0 +1,4 @@
export enum JobQueue {
AUTOMATION = "automationQueue",
APP_BACKUP = "appBackupQueue",
}


@ -0,0 +1,127 @@
import events from "events"
/**
* Bull works with a Job wrapper around all messages that contains a lot more information about
* the state of the message, this object constructor implements the same schema of Bull jobs
* for the sake of maintaining API consistency.
* @param {string} queue The name of the queue which the message will be carried on.
* @param {object} message The JSON message which will be passed back to the consumer.
* @returns {Object} A new job which can now be put onto the queue, this is mostly an
* internal structure so that an in memory queue can be easily swapped for a Bull queue.
*/
function newJob(queue: string, message: any) {
return {
timestamp: Date.now(),
queue: queue,
data: message,
}
}
/**
* This is designed to replicate Bull (https://github.com/OptimalBits/bull) in memory as a sort of mock.
* It is relatively simple, using an event emitter internally to register when messages are available
* to the consumers - it can support many inputs and many consumers.
*/
class InMemoryQueue {
_name: string
_opts?: any
_messages: any[]
_emitter: EventEmitter
/**
* The constructor of the queue, exactly the same as that of Bull's.
* @param {string} name The name of the queue which is being configured.
* @param {object|null} opts This is not used by the in memory queue as there is no real use
* case when in memory, but is the same API as Bull
*/
constructor(name: string, opts = null) {
this._name = name
this._opts = opts
this._messages = []
this._emitter = new events.EventEmitter()
}
/**
* Same callback API as Bull, each callback passed to this will consume messages as they are
* available. Please note this is a queue service, not a notification service, so each
* consumer will receive different messages.
* @param {function<object>} func The callback function which will return a "Job", the same
* as the Bull API, within this job the property "data" contains the JSON message. Please
* note this is incredibly limited compared to Bull as in reality the Job would contain
* a lot more information about the queue and current status of Bull cluster.
*/
process(func: any) {
this._emitter.on("message", async () => {
if (this._messages.length <= 0) {
return
}
let msg = this._messages.shift()
let resp = func(msg)
if (resp.then != null) {
await resp
}
})
}
// simply puts a message to the queue and emits to the queue for processing
/**
* Simple function to replicate the add message functionality of Bull, putting
* a new message on the queue. This then emits an event which will be used to
* return the message to a consumer (if one is attached).
* @param {object} msg A message to be transported over the queue, this should be
* a JSON message as this is required by Bull.
* @param {boolean} repeat serves no purpose for the import queue.
*/
// eslint-disable-next-line no-unused-vars
add(msg: any, repeat: boolean) {
if (typeof msg !== "object") {
throw "Queue only supports carrying JSON."
}
this._messages.push(newJob(this._name, msg))
this._emitter.emit("message")
}
/**
* replicating the close function from bull, which waits for jobs to finish.
*/
async close() {
return []
}
/**
* This removes a cron which has been implemented, this is part of Bull API.
* @param {string} cronJobId The cron which is to be removed.
*/
removeRepeatableByKey(cronJobId: string) {
// TODO: implement for testing
console.log(cronJobId)
}
/**
* Implemented for tests
*/
getRepeatableJobs() {
return []
}
// eslint-disable-next-line no-unused-vars
removeJobs(pattern: string) {
// no-op
}
/**
* Implemented for tests
*/
async clean() {
return []
}
async getJob() {
return {}
}
on() {
// do nothing
}
}
export = InMemoryQueue


@ -0,0 +1,2 @@
export * from "./queue"
export * from "./constants"


@ -0,0 +1,101 @@
import { Job, JobId, Queue } from "bull"
import { JobQueue } from "./constants"
export type StalledFn = (job: Job) => Promise<void>
export function addListeners(
queue: Queue,
jobQueue: JobQueue,
removeStalledCb?: StalledFn
) {
logging(queue, jobQueue)
if (removeStalledCb) {
handleStalled(queue, removeStalledCb)
}
}
function handleStalled(queue: Queue, removeStalledCb?: StalledFn) {
queue.on("stalled", async (job: Job) => {
if (removeStalledCb) {
await removeStalledCb(job)
} else if (job.opts.repeat) {
const jobId = job.id
const repeatJobs = await queue.getRepeatableJobs()
for (let repeatJob of repeatJobs) {
if (repeatJob.id === jobId) {
await queue.removeRepeatableByKey(repeatJob.key)
}
}
console.log(`jobId=${jobId} disabled`)
}
})
}
function logging(queue: Queue, jobQueue: JobQueue) {
let eventType: string
switch (jobQueue) {
case JobQueue.AUTOMATION:
eventType = "automation-event"
break
case JobQueue.APP_BACKUP:
eventType = "app-backup-event"
break
}
if (process.env.NODE_DEBUG?.includes("bull")) {
queue
.on("error", (error: any) => {
// An error occurred.
console.error(`${eventType}=error error=${JSON.stringify(error)}`)
})
.on("waiting", (jobId: JobId) => {
// A Job is waiting to be processed as soon as a worker is idling.
console.log(`${eventType}=waiting jobId=${jobId}`)
})
.on("active", (job: Job, jobPromise: any) => {
// A job has started. You can use `jobPromise.cancel()` to abort it.
console.log(`${eventType}=active jobId=${job.id}`)
})
.on("stalled", (job: Job) => {
// A job has been marked as stalled. This is useful for debugging job
// workers that crash or pause the event loop.
console.error(
`${eventType}=stalled jobId=${job.id} job=${JSON.stringify(job)}`
)
})
.on("progress", (job: Job, progress: any) => {
// A job's progress was updated!
console.log(
`${eventType}=progress jobId=${job.id} progress=${progress}`
)
})
.on("completed", (job: Job, result) => {
// A job successfully completed with a `result`.
console.log(`${eventType}=completed jobId=${job.id} result=${result}`)
})
.on("failed", (job, err: any) => {
// A job failed with reason `err`!
console.log(`${eventType}=failed jobId=${job.id} error=${err}`)
})
.on("paused", () => {
// The queue has been paused.
console.log(`${eventType}=paused`)
})
.on("resumed", (job: Job) => {
// The queue has been resumed.
console.log(`${eventType}=resumed jobId=${job.id}`)
})
.on("cleaned", (jobs: Job[], type: string) => {
// Old jobs have been cleaned from the queue. `jobs` is an array of cleaned
// jobs, and `type` is the type of jobs cleaned.
console.log(`${eventType}=cleaned length=${jobs.length} type=${type}`)
})
.on("drained", () => {
// Emitted every time the queue has processed all the waiting jobs (even if there can be some delayed jobs not yet processed)
console.log(`${eventType}=drained`)
})
.on("removed", (job: Job) => {
// A job successfully removed.
console.log(`${eventType}=removed jobId=${job.id}`)
})
}
}


@ -0,0 +1,51 @@
import env from "../environment"
import { getRedisOptions } from "../redis/utils"
import { JobQueue } from "./constants"
import InMemoryQueue from "./inMemoryQueue"
import BullQueue from "bull"
import { addListeners, StalledFn } from "./listeners"
const { opts: redisOpts, redisProtocolUrl } = getRedisOptions()
const CLEANUP_PERIOD_MS = 60 * 1000
let QUEUES: BullQueue.Queue[] | InMemoryQueue[] = []
let cleanupInterval: NodeJS.Timeout
async function cleanup() {
for (let queue of QUEUES) {
await queue.clean(CLEANUP_PERIOD_MS, "completed")
}
}
export function createQueue<T>(
jobQueue: JobQueue,
opts: { removeStalledCb?: StalledFn } = {}
): BullQueue.Queue<T> {
const queueConfig: any = redisProtocolUrl || { redis: redisOpts }
let queue: any
if (!env.isTest()) {
queue = new BullQueue(jobQueue, queueConfig)
} else {
queue = new InMemoryQueue(jobQueue, queueConfig)
}
addListeners(queue, jobQueue, opts?.removeStalledCb)
QUEUES.push(queue)
if (!cleanupInterval) {
cleanupInterval = setInterval(cleanup, CLEANUP_PERIOD_MS)
// fire off an initial cleanup
cleanup().catch(err => {
console.error(`Unable to cleanup automation queue initially - ${err}`)
})
}
return queue
}
exports.shutdown = async () => {
if (QUEUES.length) {
clearInterval(cleanupInterval)
for (let queue of QUEUES) {
await queue.close()
}
QUEUES = []
}
console.log("Queues shutdown")
}


@ -1,27 +1,23 @@
const Client = require("./index")
const utils = require("./utils")
const { getRedlock } = require("./redlock")
let userClient, sessionClient, appClient, cacheClient, writethroughClient
let migrationsRedlock
// turn retry off so that only one instance can ever hold the lock
const migrationsRedlockConfig = { retryCount: 0 }
let userClient,
sessionClient,
appClient,
cacheClient,
writethroughClient,
lockClient
async function init() {
userClient = await new Client(utils.Databases.USER_CACHE).init()
sessionClient = await new Client(utils.Databases.SESSIONS).init()
appClient = await new Client(utils.Databases.APP_METADATA).init()
cacheClient = await new Client(utils.Databases.GENERIC_CACHE).init()
lockClient = await new Client(utils.Databases.LOCKS).init()
writethroughClient = await new Client(
utils.Databases.WRITE_THROUGH,
utils.SelectableDatabases.WRITE_THROUGH
).init()
// pass the underlying ioredis client to redlock
migrationsRedlock = getRedlock(
cacheClient.getClient(),
migrationsRedlockConfig
)
}
process.on("exit", async () => {
@ -30,6 +26,7 @@ process.on("exit", async () => {
if (appClient) await appClient.finish()
if (cacheClient) await cacheClient.finish()
if (writethroughClient) await writethroughClient.finish()
if (lockClient) await lockClient.finish()
})
module.exports = {
@ -63,10 +60,10 @@ module.exports = {
}
return writethroughClient
},
getMigrationsRedlock: async () => {
if (!migrationsRedlock) {
getLockClient: async () => {
if (!lockClient) {
await init()
}
return migrationsRedlock
return lockClient
},
}


@ -1,14 +1,37 @@
import Redlock from "redlock"
import Redlock, { Options } from "redlock"
import { getLockClient } from "./init"
import { LockOptions, LockType } from "@budibase/types"
import * as tenancy from "../tenancy"
export const getRedlock = (redisClient: any, opts = { retryCount: 10 }) => {
return new Redlock([redisClient], {
let noRetryRedlock: Redlock | undefined
const getClient = async (type: LockType): Promise<Redlock> => {
switch (type) {
case LockType.TRY_ONCE: {
if (!noRetryRedlock) {
noRetryRedlock = await newRedlock(OPTIONS.TRY_ONCE)
}
return noRetryRedlock
}
default: {
throw new Error(`Could not get redlock client: ${type}`)
}
}
}
export const OPTIONS = {
TRY_ONCE: {
// immediately throws an error if the lock is already held
retryCount: 0,
},
DEFAULT: {
// the expected clock drift; for more details
// see http://redis.io/topics/distlock
driftFactor: 0.01, // multiplied by lock ttl to determine drift time
// the max number of times Redlock will attempt
// to lock a resource before erroring
retryCount: opts.retryCount,
retryCount: 10,
// the time in ms between attempts
retryDelay: 200, // time in ms
@ -16,6 +39,50 @@ export const getRedlock = (redisClient: any, opts = { retryCount: 10 }) => {
// the max time in ms randomly added to retries
// to improve performance under high contention
// see https://www.awsarchitectureblog.com/2015/03/backoff.html
retryJitter: 200, // time in ms
})
retryJitter: 100, // time in ms
},
}
export const newRedlock = async (opts: Options = {}) => {
let options = { ...OPTIONS.DEFAULT, ...opts }
const redisWrapper = await getLockClient()
const client = redisWrapper.getClient()
return new Redlock([client], options)
}
export const doWithLock = async (opts: LockOptions, task: any) => {
const redlock = await getClient(opts.type)
let lock
try {
// acquire lock
let name: string
if (opts.systemLock) {
name = opts.name
} else {
name = `${tenancy.getTenantId()}_${opts.name}`
}
if (opts.nameSuffix) {
name = name + `_${opts.nameSuffix}`
}
lock = await redlock.lock(name, opts.ttl)
// perform locked task
return task()
} catch (e: any) {
// lock limit exceeded
if (e.name === "LockError") {
if (opts.type === LockType.TRY_ONCE) {
// don't throw for try-once locks, they will always error
// due to retry count (0) exceeded
return
} else {
throw e
}
} else {
throw e
}
} finally {
if (lock) {
await lock.unlock()
}
}
}


@ -28,6 +28,7 @@ exports.Databases = {
LICENSES: "license",
GENERIC_CACHE: "data_cache",
WRITE_THROUGH: "writeThrough",
LOCKS: "locks",
}
/**


@ -0,0 +1,23 @@
import { generator, uuid } from "."
import { AuthType, CloudAccount, Hosting } from "@budibase/types"
import * as db from "../../../src/db/utils"
export const cloudAccount = (): CloudAccount => {
return {
accountId: uuid(),
createdAt: Date.now(),
verified: true,
verificationSent: true,
tier: "",
email: generator.email(),
tenantId: generator.word(),
hosting: Hosting.CLOUD,
authType: AuthType.PASSWORD,
password: generator.word(),
tenantName: generator.word(),
name: generator.name(),
size: "10+",
profession: "Software Engineer",
budibaseUserId: db.generateGlobalUserID(),
}
}


@ -0,0 +1 @@
export { v4 as uuid } from "uuid"


@ -1 +1,8 @@
export * from "./common"
import Chance from "chance"
export const generator = new Chance()
export * as koa from "./koa"
export * as accounts from "./accounts"
export * as licenses from "./licenses"


@ -0,0 +1,18 @@
import { AccountPlan, License, PlanType, Quotas } from "@budibase/types"
const newPlan = (type: PlanType = PlanType.FREE): AccountPlan => {
return {
type,
}
}
export const newLicense = (opts: {
quotas: Quotas
planType?: PlanType
}): License => {
return {
features: [],
quotas: opts.quotas,
plan: newPlan(opts.planType),
}
}


@ -543,6 +543,36 @@
semver "^7.3.5"
tar "^6.1.11"
"@msgpackr-extract/msgpackr-extract-darwin-arm64@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-darwin-arm64/-/msgpackr-extract-darwin-arm64-2.1.2.tgz#9571b87be3a3f2c46de05585470bc4f3af2f6f00"
integrity sha512-TyVLn3S/+ikMDsh0gbKv2YydKClN8HaJDDpONlaZR+LVJmsxLFUgA+O7zu59h9+f9gX1aj/ahw9wqa6rosmrYQ==
"@msgpackr-extract/msgpackr-extract-darwin-x64@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-darwin-x64/-/msgpackr-extract-darwin-x64-2.1.2.tgz#bfbc6936ede2955218f5621a675679a5fe8e6f4c"
integrity sha512-YPXtcVkhmVNoMGlqp81ZHW4dMxK09msWgnxtsDpSiZwTzUBG2N+No2bsr7WMtBKCVJMSD6mbAl7YhKUqkp/Few==
"@msgpackr-extract/msgpackr-extract-linux-arm64@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-linux-arm64/-/msgpackr-extract-linux-arm64-2.1.2.tgz#22555e28382af2922e7450634c8a2f240bb9eb82"
integrity sha512-vHZ2JiOWF2+DN9lzltGbhtQNzDo8fKFGrf37UJrgqxU0yvtERrzUugnfnX1wmVfFhSsF8OxrfqiNOUc5hko1Zg==
"@msgpackr-extract/msgpackr-extract-linux-arm@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-linux-arm/-/msgpackr-extract-linux-arm-2.1.2.tgz#ffb6ae1beea7ac572b6be6bf2a8e8162ebdd8be7"
integrity sha512-42R4MAFeIeNn+L98qwxAt360bwzX2Kf0ZQkBBucJ2Ircza3asoY4CDbgiu9VWklq8gWJVSJSJBwDI+c/THiWkA==
"@msgpackr-extract/msgpackr-extract-linux-x64@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-linux-x64/-/msgpackr-extract-linux-x64-2.1.2.tgz#7caf62eebbfb1345de40f75e89666b3d4194755f"
integrity sha512-RjRoRxg7Q3kPAdUSC5EUUPlwfMkIVhmaRTIe+cqHbKrGZ4M6TyCA/b5qMaukQ/1CHWrqYY2FbKOAU8Hg0pQFzg==
"@msgpackr-extract/msgpackr-extract-win32-x64@2.1.2":
version "2.1.2"
resolved "https://registry.yarnpkg.com/@msgpackr-extract/msgpackr-extract-win32-x64/-/msgpackr-extract-win32-x64-2.1.2.tgz#f2d8b9ddd8d191205ed26ce54aba3dfc5ae3e7c9"
integrity sha512-rIZVR48zA8hGkHIK7ED6+ZiXsjRCcAVBJbm8o89OKAMTmEAQ2QvoOxoiu3w2isAaWwzgtQIOFIqHwvZDyLKCvw==
"@shopify/jest-koa-mocks@5.0.1":
version "5.0.1"
resolved "https://registry.yarnpkg.com/@shopify/jest-koa-mocks/-/jest-koa-mocks-5.0.1.tgz#fba490b6b7985fbb571eb9974897d396a3642e94"
@ -663,6 +693,11 @@
"@types/connect" "*"
"@types/node" "*"
"@types/chance@1.1.3":
version "1.1.3"
resolved "https://registry.yarnpkg.com/@types/chance/-/chance-1.1.3.tgz#d19fe9391288d60fdccd87632bfc9ab2b4523fea"
integrity sha512-X6c6ghhe4/sQh4XzcZWSFaTAUOda38GQHmq9BUanYkOE/EO7ZrkazwKmtsj3xzTjkLWmwULE++23g3d3CCWaWw==
"@types/connect@*":
version "3.4.35"
resolved "https://registry.yarnpkg.com/@types/connect/-/connect-3.4.35.tgz#5fcf6ae445e4021d1fc2219a4873cc73a3bb2ad1"
@ -728,6 +763,13 @@
resolved "https://registry.yarnpkg.com/@types/http-errors/-/http-errors-1.8.2.tgz#7315b4c4c54f82d13fa61c228ec5c2ea5cc9e0e1"
integrity sha512-EqX+YQxINb+MeXaIqYDASb6U6FCHbWjkj4a1CKDBks3d/QiB2+PqBLyO72vLDgAO1wUI4O+9gweRcQK11bTL/w==
"@types/ioredis@4.28.0":
version "4.28.0"
resolved "https://registry.yarnpkg.com/@types/ioredis/-/ioredis-4.28.0.tgz#609b2ea0d91231df2dd7f67dd77436bc72584911"
integrity sha512-HSA/JQivJgV0e+353gvgu6WVoWvGRe0HyHOnAN2AvbVIhUlJBhNnnkP8gEEokrDWrxywrBkwo8NuDZ6TVPL9XA==
dependencies:
"@types/node" "*"
"@types/istanbul-lib-coverage@*", "@types/istanbul-lib-coverage@^2.0.0", "@types/istanbul-lib-coverage@^2.0.1":
version "2.0.4"
resolved "https://registry.yarnpkg.com/@types/istanbul-lib-coverage/-/istanbul-lib-coverage-2.0.4.tgz#8467d4b3c087805d63580480890791277ce35c44"
@ -1492,6 +1534,21 @@ buffer@^5.5.0, buffer@^5.6.0:
base64-js "^1.3.1"
ieee754 "^1.1.13"
bull@4.10.1:
version "4.10.1"
resolved "https://registry.yarnpkg.com/bull/-/bull-4.10.1.tgz#f14974b6089358b62b495a2cbf838aadc098e43f"
integrity sha512-Fp21tRPb2EaZPVfmM+ONZKVz2RA+to+zGgaTLyCKt3JMSU8OOBqK8143OQrnGuGpsyE5G+9FevFAGhdZZfQP2g==
dependencies:
cron-parser "^4.2.1"
debuglog "^1.0.0"
get-port "^5.1.1"
ioredis "^4.28.5"
lodash "^4.17.21"
msgpackr "^1.5.2"
p-timeout "^3.2.0"
semver "^7.3.2"
uuid "^8.3.0"
cache-content-type@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/cache-content-type/-/cache-content-type-1.0.1.tgz#035cde2b08ee2129f4a8315ea8f00a00dba1453c"
@ -1555,6 +1612,11 @@ chalk@^4.0.0, chalk@^4.1.0:
ansi-styles "^4.1.0"
supports-color "^7.1.0"
chance@1.1.3:
version "1.1.3"
resolved "https://registry.yarnpkg.com/chance/-/chance-1.1.3.tgz#414f08634ee479c7a316b569050ea20751b82dd3"
integrity sha512-XeJsdoVAzDb1WRPRuMBesRSiWpW1uNTo5Fd7mYxPJsAfgX71+jfuCOHOdbyBz2uAUZ8TwKcXgWk3DMedFfJkbg==
char-regex@^1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/char-regex/-/char-regex-1.0.2.tgz#d744358226217f981ed58f479b1d6bcc29545dcf"
@ -1754,6 +1816,13 @@ core-util-is@~1.0.0:
resolved "https://registry.yarnpkg.com/core-util-is/-/core-util-is-1.0.3.tgz#a6042d3634c2b27e9328f837b965fac83808db85"
integrity sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==
cron-parser@^4.2.1:
version "4.6.0"
resolved "https://registry.yarnpkg.com/cron-parser/-/cron-parser-4.6.0.tgz#404c3fdbff10ae80eef6b709555d577ef2fd2e0d"
integrity sha512-guZNLMGUgg6z4+eGhmHGw7ft+v6OQeuHzd1gcLxCo9Yg/qoxmG3nindp2/uwGCLizEisf2H0ptqeVXeoCpP6FA==
dependencies:
luxon "^3.0.1"
cross-spawn@^7.0.3:
version "7.0.3"
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.3.tgz#f73a85b9d5d41d045551c177e2882d4ac85728a6"
@ -1827,6 +1896,11 @@ debug@~3.1.0:
dependencies:
ms "2.0.0"
debuglog@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/debuglog/-/debuglog-1.0.1.tgz#aa24ffb9ac3df9a2351837cfb2d279360cd78492"
integrity sha512-syBZ+rnAK3EgMsH2aYEOLUW7mZSY9Gb+0wUMCFsZvcmiz+HigA0LOcq/HoQqVuGG+EKykunc7QG2bzrponfaSw==
decimal.js@^10.2.1:
version "10.3.1"
resolved "https://registry.yarnpkg.com/decimal.js/-/decimal.js-10.3.1.tgz#d8c3a444a9c6774ba60ca6ad7261c3a94fd5e783"
@ -2308,6 +2382,11 @@ get-package-type@^0.1.0:
resolved "https://registry.yarnpkg.com/get-package-type/-/get-package-type-0.1.0.tgz#8de2d803cff44df3bc6c456e6668b36c3926e11a"
integrity sha512-pjzuKtY64GYfWizNAJ0fr9VqttZkNiK2iS430LtIHzjBEr6bX8Am2zm4sW4Ro5wjWW5cAlRL1qAMTcXbjNAO2Q==
get-port@^5.1.1:
version "5.1.1"
resolved "https://registry.yarnpkg.com/get-port/-/get-port-5.1.1.tgz#0469ed07563479de6efb986baf053dcd7d4e3193"
integrity sha512-g/Q1aTSDOxFpchXC4i8ZWvxA1lnPqx/JHqcpIw0/LX9T8x/GBbi6YnlN5nhaKIFkT8oFsscUKgDJYxfwfS6QsQ==
get-stream@^4.1.0:
version "4.1.0"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-4.1.0.tgz#c1b255575f3dc21d59bfc79cd3d2b46b1c3a54b5"
@ -2642,6 +2721,23 @@ ioredis@4.28.0:
redis-parser "^3.0.0"
standard-as-callback "^2.1.0"
ioredis@^4.28.5:
version "4.28.5"
resolved "https://registry.yarnpkg.com/ioredis/-/ioredis-4.28.5.tgz#5c149e6a8d76a7f8fa8a504ffc85b7d5b6797f9f"
integrity sha512-3GYo0GJtLqgNXj4YhrisLaNNvWSNwSS2wS4OELGfGxH8I69+XfNdnmV1AyN+ZqMh0i7eX+SWjrwFKDBDgfBC1A==
dependencies:
cluster-key-slot "^1.1.0"
debug "^4.3.1"
denque "^1.1.0"
lodash.defaults "^4.2.0"
lodash.flatten "^4.4.0"
lodash.isarguments "^3.1.0"
p-map "^2.1.0"
redis-commands "1.7.0"
redis-errors "^1.2.0"
redis-parser "^3.0.0"
standard-as-callback "^2.1.0"
is-arrayish@^0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
@ -3715,6 +3811,11 @@ ltgt@2.2.1, ltgt@^2.1.2, ltgt@~2.2.0:
resolved "https://registry.yarnpkg.com/ltgt/-/ltgt-2.2.1.tgz#f35ca91c493f7b73da0e07495304f17b31f87ee5"
integrity sha512-AI2r85+4MquTw9ZYqabu4nMwy9Oftlfa/e/52t9IjtfG+mGBbTNdAoZ3RQKLHR6r0wQnwZnPIEh/Ya6XTWAKNA==
luxon@^3.0.1:
version "3.0.4"
resolved "https://registry.yarnpkg.com/luxon/-/luxon-3.0.4.tgz#d179e4e9f05e092241e7044f64aaa54796b03929"
integrity sha512-aV48rGUwP/Vydn8HT+5cdr26YYQiUZ42NM6ToMoaGKwYfWbfLeRkEu1wXWMHBZT6+KyLfcbbtVcoQFCbbPjKlw==
make-dir@^3.0.0, make-dir@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/make-dir/-/make-dir-3.1.0.tgz#415e967046b3a7f1d185277d84aa58203726a13f"
@ -3862,6 +3963,27 @@ ms@^2.1.1, ms@^2.1.3:
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2"
integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==
msgpackr-extract@^2.1.2:
version "2.1.2"
resolved "https://registry.yarnpkg.com/msgpackr-extract/-/msgpackr-extract-2.1.2.tgz#56272030f3e163e1b51964ef8b1cd5e7240c03ed"
integrity sha512-cmrmERQFb19NX2JABOGtrKdHMyI6RUyceaPBQ2iRz9GnDkjBWFjNJC0jyyoOfZl2U/LZE3tQCCQc4dlRyA8mcA==
dependencies:
node-gyp-build-optional-packages "5.0.3"
optionalDependencies:
"@msgpackr-extract/msgpackr-extract-darwin-arm64" "2.1.2"
"@msgpackr-extract/msgpackr-extract-darwin-x64" "2.1.2"
"@msgpackr-extract/msgpackr-extract-linux-arm" "2.1.2"
"@msgpackr-extract/msgpackr-extract-linux-arm64" "2.1.2"
"@msgpackr-extract/msgpackr-extract-linux-x64" "2.1.2"
"@msgpackr-extract/msgpackr-extract-win32-x64" "2.1.2"
msgpackr@^1.5.2:
version "1.7.2"
resolved "https://registry.yarnpkg.com/msgpackr/-/msgpackr-1.7.2.tgz#68d6debf5999d6b61abb6e7046a689991ebf7261"
integrity sha512-mWScyHTtG6TjivXX9vfIy2nBtRupaiAj0HQ2mtmpmYujAmqZmaaEVPaSZ1NKLMvicaMLFzEaMk0ManxMRg8rMQ==
optionalDependencies:
msgpackr-extract "^2.1.2"
napi-macros@~2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/napi-macros/-/napi-macros-2.0.0.tgz#2b6bae421e7b96eb687aa6c77a7858640670001b"
@ -3909,6 +4031,11 @@ node-forge@^0.7.1:
resolved "https://registry.yarnpkg.com/node-forge/-/node-forge-0.7.6.tgz#fdf3b418aee1f94f0ef642cd63486c77ca9724ac"
integrity sha512-sol30LUpz1jQFBjOKwbjxijiE3b6pjd74YwfD0fJOKPjF+fONKb2Yg8rYgS6+bK6VDl+/wfr4IYpC7jDzLUIfw==
node-gyp-build-optional-packages@5.0.3:
version "5.0.3"
resolved "https://registry.yarnpkg.com/node-gyp-build-optional-packages/-/node-gyp-build-optional-packages-5.0.3.tgz#92a89d400352c44ad3975010368072b41ad66c17"
integrity sha512-k75jcVzk5wnnc/FMxsf4udAoTEUv2jY3ycfdSd3yWu6Cnd1oee6/CfZJApyscA4FJOmdoixWwiwOyf16RzD5JA==
node-gyp-build@~4.1.0:
version "4.1.1"
resolved "https://registry.yarnpkg.com/node-gyp-build/-/node-gyp-build-4.1.1.tgz#d7270b5d86717068d114cc57fff352f96d745feb"
@ -4065,6 +4192,11 @@ p-cancelable@^1.0.0:
resolved "https://registry.yarnpkg.com/p-cancelable/-/p-cancelable-1.1.0.tgz#d078d15a3af409220c886f1d9a0ca2e441ab26cc"
integrity sha512-s73XxOZ4zpt1edZYZzvhqFa6uvQc1vwUa0K0BdtIZgQMAJj9IbebH+JkgKZc9h+B05PKHLOTl4ajG1BmNrVZlw==
p-finally@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/p-finally/-/p-finally-1.0.0.tgz#3fbcfb15b899a44123b34b6dcc18b724336a2cae"
integrity sha512-LICb2p9CB7FS+0eR1oqWnHhp0FljGLZCWBE9aix0Uye9W8LTQPwMTYVGWQWIw9RdQiDg4+epXQODwIYJtSJaow==
p-limit@^2.2.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/p-limit/-/p-limit-2.3.0.tgz#3dd33c647a214fdfffd835933eb086da0dc21db1"
@ -4084,6 +4216,13 @@ p-map@^2.1.0:
resolved "https://registry.yarnpkg.com/p-map/-/p-map-2.1.0.tgz#310928feef9c9ecc65b68b17693018a665cea175"
integrity sha512-y3b8Kpd8OAN444hxfBbFfj1FY/RjtTd8tzYwhUqNYXx0fXx2iX4maP4Qr6qhIKbQXI02wTLAda4fYUbDagTUFw==
p-timeout@^3.2.0:
version "3.2.0"
resolved "https://registry.yarnpkg.com/p-timeout/-/p-timeout-3.2.0.tgz#c7e17abc971d2a7962ef83626b35d635acf23dfe"
integrity sha512-rhIwUycgwwKcP9yTOOFK/AKsAopjjCakVqLHePO3CC6Mir1Z99xT+R63jZxAT5lFZLa2inS5h+ZS2GvR99/FBg==
dependencies:
p-finally "^1.0.0"
p-try@^2.0.0:
version "2.2.0"
resolved "https://registry.yarnpkg.com/p-try/-/p-try-2.2.0.tgz#cb2868540e313d61de58fafbe35ce9004d5540e6"
@ -5350,7 +5489,7 @@ uuid@8.1.0:
resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.1.0.tgz#6f1536eb43249f473abc6bd58ff983da1ca30d8d"
integrity sha512-CI18flHDznR0lq54xBycOVmphdCYnQLKn8abKn7PXUiKUGdEd+/l9LWNJmugXel4hXq7S+RMNl34ecyC9TntWg==
uuid@8.3.2, uuid@^8.3.2:
uuid@8.3.2, uuid@^8.3.0, uuid@^8.3.2:
version "8.3.2"
resolved "https://registry.yarnpkg.com/uuid/-/uuid-8.3.2.tgz#80d5b5ced271bb9af6c445f21a1a04c606cefbe2"
integrity sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==

View file

@ -1,7 +1,7 @@
{
"name": "@budibase/bbui",
"description": "A UI solution used in the different Budibase projects.",
"version": "2.0.39",
"version": "2.1.6",
"license": "MPL-2.0",
"svelte": "src/index.js",
"module": "dist/bbui.es.js",
@ -38,7 +38,7 @@
],
"dependencies": {
"@adobe/spectrum-css-workflow-icons": "^1.2.1",
"@budibase/string-templates": "^2.0.39",
"@budibase/string-templates": "^2.1.6",
"@spectrum-css/actionbutton": "^1.0.1",
"@spectrum-css/actiongroup": "^1.0.1",
"@spectrum-css/avatar": "^3.0.2",

View file

@ -1,18 +1,18 @@
export default function clickOutside(element, callbackFunction) {
function onClick(event) {
if (!element.contains(event.target)) {
callbackFunction()
callbackFunction(event)
}
}
document.body.addEventListener("mousedown", onClick, true)
document.body.addEventListener("click", onClick, true)
return {
update(newCallbackFunction) {
callbackFunction = newCallbackFunction
},
destroy() {
document.body.removeEventListener("mousedown", onClick, true)
document.body.removeEventListener("click", onClick, true)
},
}
}
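
The updated action above now listens for capture-phase "click" events and forwards the event object to the callback, which is what lets the popover components further down stop propagation when they close. A minimal usage sketch outside of Svelte's `use:` directive, assuming only the `(element, callback)` contract shown in the diff; the file path and surrounding names are illustrative:

```js
// Illustrative consumer of the updated clickOutside action.
import clickOutside from "./clickOutside" // path is an assumption

export function attachPopover(popoverEl, state) {
  const handleOutsideClick = event => {
    if (state.open) {
      // Stop the closing click from also triggering whatever sits underneath
      event.stopPropagation()
      state.open = false
    }
  }

  // Equivalent of `use:clickOutside={handleOutsideClick}` in the Svelte components
  const action = clickOutside(popoverEl, handleOutsideClick)
  return () => action.destroy()
}
```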

View file

@ -119,6 +119,13 @@
return "var(--spectrum-global-color-static-gray-900)"
}
const handleOutsideClick = event => {
if (open) {
event.stopPropagation()
open = false
}
}
</script>
<div class="container">
@ -131,7 +138,7 @@
</div>
{#if open}
<div
use:clickOutside={() => (open = false)}
use:clickOutside={handleOutsideClick}
transition:fly|local={{ y: -20, duration: 200 }}
class="spectrum-Popover spectrum-Popover--bottom spectrum-Picker-popover is-open"
class:spectrum-Popover--align-right={alignRight}

View file

@ -17,7 +17,7 @@
export let timeOnly = false
export let ignoreTimezones = false
export let time24hr = false
export let range = false
const dispatch = createEventDispatcher()
const flatpickrId = `${uuid()}-wrapper`
let open = false
@ -41,6 +41,7 @@
time_24hr: time24hr || false,
altFormat: timeOnly ? "H:i" : enableTime ? "F j Y, H:i" : "F j, Y",
wrap: true,
mode: range ? "range" : null,
appendTo,
disableMobile: "true",
onReady: () => {
@ -64,7 +65,6 @@
if (newValue) {
newValue = newValue.toISOString()
}
// If time only, set the date component to 2000-01-01
if (timeOnly) {
// Classic flatpickr causing issues.
@ -95,7 +95,11 @@
.slice(0, -1)
}
dispatch("change", newValue)
if (range) {
dispatch("change", event.detail)
} else {
dispatch("change", newValue)
}
}
const clearDateOnBackspace = event => {
@ -160,7 +164,7 @@
{#key redrawOptions}
<Flatpickr
bind:flatpickr
value={parseDate(value)}
value={range ? value : parseDate(value)}
on:open={onOpen}
on:close={onClose}
options={flatpickrOptions}

View file

@ -43,6 +43,7 @@
let selectedImageIdx = 0
let fileDragged = false
let selectedUrl
let fileInput
$: selectedImage = value?.[selectedImageIdx] ?? null
$: fileCount = value?.length ?? 0
$: isImage =
@ -102,6 +103,7 @@
await deleteAttachments(
value.filter((x, idx) => idx === selectedImageIdx).map(item => item.key)
)
fileInput.value = ""
}
selectedImageIdx = 0
}
@ -234,6 +236,7 @@
type="file"
multiple
accept={extensions}
bind:this={fileInput}
on:change={handleFile}
/>
<svg

View file

@ -102,6 +102,13 @@
}
return value
}
const handleOutsideClick = event => {
if (open) {
event.stopPropagation()
open = false
}
}
</script>
<div
@ -151,7 +158,7 @@
{disabled}
class:is-open={open}
aria-haspopup="listbox"
on:mousedown={onClick}
on:click={onClick}
>
<span class="spectrum-Picker-label">
<div>
@ -168,7 +175,7 @@
</button>
{#if open}
<div
use:clickOutside={() => (open = false)}
use:clickOutside={handleOutsideClick}
transition:fly|local={{ y: -20, duration: 200 }}
class="spectrum-Popover spectrum-Popover--bottom spectrum-Picker-popover is-open"
>

View file

@ -19,6 +19,7 @@
export let placeholderOption = null
export let options = []
export let isOptionSelected = () => false
export let isOptionEnabled = () => true
export let onSelectOption = () => {}
export let getOptionLabel = option => option
export let getOptionValue = option => option
@ -84,7 +85,7 @@
class:is-invalid={!!error}
class:is-open={open}
aria-haspopup="listbox"
on:mousedown={onClick}
on:click={onClick}
>
{#if fieldIcon}
<span class="option-extra">
@ -164,6 +165,7 @@
aria-selected="true"
tabindex="0"
on:click={() => onSelectOption(getOptionValue(option, idx))}
class:is-disabled={!isOptionEnabled(option)}
>
{#if getOptionIcon(option, idx)}
<span class="option-extra">
@ -256,4 +258,7 @@
.spectrum-Popover :global(.spectrum-Search .spectrum-Textfield-icon) {
top: 9px;
}
.spectrum-Menu-item.is-disabled {
pointer-events: none;
}
</style>

View file

@ -87,6 +87,20 @@
updateValue(event.target.value)
}
}
const handlePrimaryOutsideClick = event => {
if (primaryOpen) {
event.stopPropagation()
primaryOpen = false
}
}
const handleSecondaryOutsideClick = event => {
if (secondaryOpen) {
event.stopPropagation()
secondaryOpen = false
}
}
</script>
<div
@ -148,7 +162,7 @@
</div>
{#if primaryOpen}
<div
use:clickOutside={() => (primaryOpen = false)}
use:clickOutside={handlePrimaryOutsideClick}
transition:fly|local={{ y: -20, duration: 200 }}
class="spectrum-Popover spectrum-Popover--bottom spectrum-Picker-popover is-open"
class:auto-width={autoWidth}
@ -256,7 +270,7 @@
{disabled}
class:is-open={secondaryOpen}
aria-haspopup="listbox"
on:mousedown={onClickSecondary}
on:click={onClickSecondary}
>
{#if secondaryFieldIcon}
<span class="option-left">
@ -281,7 +295,7 @@
</button>
{#if secondaryOpen}
<div
use:clickOutside={() => (secondaryOpen = false)}
use:clickOutside={handleSecondaryOutsideClick}
transition:fly|local={{ y: -20, duration: 200 }}
class="spectrum-Popover spectrum-Popover--bottom spectrum-Picker-popover is-open"
style="width: 30%"

View file

@ -12,6 +12,7 @@
export let getOptionValue = option => option
export let getOptionIcon = () => null
export let getOptionColour = () => null
export let isOptionEnabled
export let readonly = false
export let quiet = false
export let autoWidth = false
@ -66,6 +67,7 @@
{getOptionValue}
{getOptionIcon}
{getOptionColour}
{isOptionEnabled}
{autocomplete}
{sort}
isPlaceholder={value == null || value === ""}

View file

@ -14,11 +14,17 @@
export let placeholder = null
export let appendTo = undefined
export let ignoreTimezones = false
export let range = false
const dispatch = createEventDispatcher()
const onChange = e => {
value = e.detail
if (range) {
// Flatpickr can't take two dates and work out what to display; it needs to be provided a string
// like "Date1 to Date2", hence passing that in specifically from the array.
value = e?.detail[1]
} else {
value = e.detail
}
dispatch("change", e.detail)
}
</script>
@ -34,6 +40,7 @@
{time24hr}
{appendTo}
{ignoreTimezones}
{range}
on:change={onChange}
/>
</Field>

View file

@ -15,6 +15,7 @@
export let getOptionValue = option => extractProperty(option, "value")
export let getOptionIcon = option => option?.icon
export let getOptionColour = option => option?.colour
export let isOptionEnabled
export let quiet = false
export let autoWidth = false
export let sort = false
@ -49,6 +50,7 @@
{getOptionValue}
{getOptionIcon}
{getOptionColour}
{isOptionEnabled}
on:change={onChange}
on:click
/>

View file

@ -50,6 +50,13 @@
dispatch("change", value)
open = false
}
const handleOutsideClick = event => {
if (open) {
event.stopPropagation()
open = false
}
}
</script>
<div class="container">
@ -64,7 +71,7 @@
</div>
{#if open}
<div
use:clickOutside={() => (open = false)}
use:clickOutside={handleOutsideClick}
transition:fly={{ y: -20, duration: 200 }}
class="spectrum-Popover spectrum-Popover--bottom spectrum-Picker-popover is-open"
class:spectrum-Popover--align-right={alignRight}

View file

@ -33,6 +33,13 @@
open = false
}
const handleOutsideClick = e => {
if (open) {
e.stopPropagation()
hide()
}
}
let open = null
function handleEscape(e) {
@ -47,7 +54,7 @@
<div
tabindex="0"
use:positionDropdown={{ anchor, align, maxWidth }}
use:clickOutside={hide}
use:clickOutside={handleOutsideClick}
on:keydown={handleEscape}
class={"spectrum-Popover is-open " + (tooltipClasses || "")}
role="presentation"

View file

@ -56,6 +56,7 @@
{schema}
value={cellValue}
on:clickrelationship
on:buttonclick
>
<slot />
</svelte:component>

View file

@ -387,6 +387,7 @@
schema={schema[field]}
value={deepGet(row, field)}
on:clickrelationship
on:buttonclick
>
<slot />
</CellRenderer>

Binary file not shown (new image added, 314 KiB).

View file

@ -20,7 +20,9 @@ filterTests(["smoke", "all"], () => {
cy.get(".spectrum-Form-itemField").eq(3).should('contain', 'App User')
// User should not have app access
cy.get(interact.LIST_ITEMS, { timeout: 500 }).should("contain", "No apps")
cy.get(".spectrum-Heading").contains("Apps").parent().within(() => {
cy.get(interact.LIST_ITEMS, { timeout: 500 }).should("contain", "This user has access to no apps")
})
})
if (Cypress.env("TEST_ENV")) {

View file

@ -2,7 +2,7 @@ import filterTests from "../support/filterTests"
const interact = require('../support/interact')
filterTests(['smoke', 'all'], () => {
context("Auto Screens UI", () => {
xcontext("Auto Screens UI", () => {
before(() => {
cy.login()
cy.deleteAllApps()
@ -54,6 +54,7 @@ filterTests(['smoke', 'all'], () => {
cy.createDatasourceScreen([initialTable, secondTable])
// Confirm screens have been auto generated
// Previously generated tables are suffixed with numbers - as expected
cy.wait(1000)
cy.get(interact.BODY).should('contain', 'cypress-tests-2')
.and('contain', 'cypress-tests-2/:id')
.and('contain', 'cypress-tests-2/new/row')

View file

@ -1,7 +1,7 @@
import filterTests from "../../support/filterTests"
filterTests(["all"], () => {
context("MySQL Datasource Testing", () => {
xcontext("MySQL Datasource Testing", () => {
if (Cypress.env("TEST_ENV")) {
before(() => {
cy.login()

View file

@ -1,7 +1,7 @@
import filterTests from "../../support/filterTests"
filterTests(["all"], () => {
context("PostgreSQL Datasource Testing", () => {
xcontext("PostgreSQL Datasource Testing", () => {
if (Cypress.env("TEST_ENV")) {
before(() => {
cy.login()

View file

@ -22,7 +22,7 @@ filterTests(["smoke", "all"], () => {
cy.wait("@queryError")
cy.get("@queryError")
.its("response.body")
.should("have.property", "message", "Invalid URL: http://random text?")
.should("have.property", "message", "Invalid URL: http://random text")
cy.get("@queryError")
.its("response.body")
.should("have.property", "status", 400)

View file

@ -1,5 +1,5 @@
import filterTests from "../support/filterTests"
const interact = require('../support/interact')
const interact = require("../support/interact")
filterTests(["smoke", "all"], () => {
context("Query Level Transformers", () => {

View file

@ -2,7 +2,7 @@ import filterTests from "../support/filterTests"
const interact = require("../support/interact")
filterTests(["all"], () => {
context("Rename an App", () => {
xcontext("Rename an App", () => {
beforeEach(() => {
cy.login()
cy.createTestApp()

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/builder",
"version": "2.0.39",
"version": "2.1.6",
"license": "GPL-3.0",
"private": true,
"scripts": {
@ -71,10 +71,10 @@
}
},
"dependencies": {
"@budibase/bbui": "^2.0.39",
"@budibase/client": "^2.0.39",
"@budibase/frontend-core": "^2.0.39",
"@budibase/string-templates": "^2.0.39",
"@budibase/bbui": "^2.1.6",
"@budibase/client": "^2.1.6",
"@budibase/frontend-core": "^2.1.6",
"@budibase/string-templates": "^2.1.6",
"@sentry/browser": "5.19.1",
"@spectrum-css/page": "^3.0.1",
"@spectrum-css/vars": "^3.0.1",

View file

@ -185,43 +185,42 @@ export const makeComponentUnique = component => {
// Replace component ID
const oldId = component._id
const newId = Helpers.uuid()
component._id = newId
let definition = JSON.stringify(component)
if (component._children?.length) {
let children = JSON.stringify(component._children)
// Replace all instances of this ID in HBS bindings
definition = definition.replace(new RegExp(oldId, "g"), newId)
// Replace all instances of this ID in child HBS bindings
children = children.replace(new RegExp(oldId, "g"), newId)
// Replace all instances of this ID in JS bindings
const bindings = findHBSBlocks(definition)
bindings.forEach(binding => {
// JSON.stringify will have escaped double quotes, so we need
// to account for that
let sanitizedBinding = binding.replace(/\\"/g, '"')
// Replace all instances of this ID in child JS bindings
const bindings = findHBSBlocks(children)
bindings.forEach(binding => {
// JSON.stringify will have escaped double quotes, so we need
// to account for that
let sanitizedBinding = binding.replace(/\\"/g, '"')
// Check if this is a valid JS binding
let js = decodeJSBinding(sanitizedBinding)
if (js != null) {
// Replace ID inside JS binding
js = js.replace(new RegExp(oldId, "g"), newId)
// Check if this is a valid JS binding
let js = decodeJSBinding(sanitizedBinding)
if (js != null) {
// Replace ID inside JS binding
js = js.replace(new RegExp(oldId, "g"), newId)
// Create new valid JS binding
let newBinding = encodeJSBinding(js)
// Create new valid JS binding
let newBinding = encodeJSBinding(js)
// Replace escaped double quotes
newBinding = newBinding.replace(/"/g, '\\"')
// Replace escaped double quotes
newBinding = newBinding.replace(/"/g, '\\"')
// Insert new JS back into binding.
// A single string replace here is better than a regex as
// the binding contains special characters, and we only need
// to replace a single instance.
definition = definition.replace(binding, newBinding)
}
})
// Insert new JS back into binding.
// A single string replace here is better than a regex as
// the binding contains special characters, and we only need
// to replace a single instance.
children = children.replace(binding, newBinding)
}
})
// Recurse on all children
component._children = JSON.parse(children)
component._children.forEach(makeComponentUnique)
// Recurse on all children
component = JSON.parse(definition)
return {
...component,
_children: component._children?.map(makeComponentUnique),
}
}
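
Worth flagging for reviewers: this hunk changes `makeComponentUnique` so that it rebuilds the definition (including `_children`) from a stringified copy and returns the new tree, which is why the paste and eject logic later in the diff reassign the result instead of relying on in-place mutation. A small hedged sketch of calling it under the new contract; the import paths are assumptions:

```js
// Clone a component tree and re-ID every node, plus any HBS/JS bindings
// that referenced the old IDs.
import { cloneDeep } from "lodash" // import location is an assumption
import { makeComponentUnique } from "./componentUtils" // path is an assumption

const duplicateForPaste = original => {
  const copy = cloneDeep(original)
  // Use the returned tree — the function builds and returns a new definition
  // rather than fully rewriting the argument in place.
  return makeComponentUnique(copy)
}
```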

View file

@ -169,7 +169,12 @@ export const getComponentBindableProperties = (asset, componentId) => {
/**
* Gets all data provider components above a component.
*/
export const getContextProviderComponents = (asset, componentId, type) => {
export const getContextProviderComponents = (
asset,
componentId,
type,
options = { includeSelf: false }
) => {
if (!asset || !componentId) {
return []
}
@ -177,7 +182,9 @@ export const getContextProviderComponents = (asset, componentId, type) => {
// Get the component tree leading up to this component, ignoring the component
// itself
const path = findComponentPath(asset.props, componentId)
path.pop()
if (!options?.includeSelf) {
path.pop()
}
// Filter by only data provider components
return path.filter(component => {
@ -798,6 +805,17 @@ export const buildFormSchema = component => {
if (!component) {
return schema
}
// If this is a form block, simply use the fields setting
if (component._component.endsWith("formblock")) {
let schema = {}
component.fields?.forEach(field => {
schema[field] = { type: "string" }
})
return schema
}
// Otherwise find all field component children
const settings = getComponentSettings(component._component)
const fieldSetting = settings.find(
setting => setting.key === "field" && setting.type.startsWith("field/")

View file

@ -182,7 +182,70 @@ export const getFrontendStore = () => {
return state
})
},
validate: screen => {
// Recursive function to find any illegal children in component trees
const findIllegalChild = (
component,
illegalChildren = [],
legalDirectChildren = []
) => {
const type = component._component
if (illegalChildren.includes(type)) {
return type
}
if (
legalDirectChildren.length &&
!legalDirectChildren.includes(type)
) {
return type
}
if (!component?._children?.length) {
return
}
const definition = store.actions.components.getDefinition(
component._component
)
// Reset whitelist for direct children
legalDirectChildren = []
if (definition?.legalDirectChildren?.length) {
legalDirectChildren = definition.legalDirectChildren.map(x => {
return `@budibase/standard-components/${x}`
})
}
// Append blacklisted components and remove duplicates
if (definition?.illegalChildren?.length) {
const blacklist = definition.illegalChildren.map(x => {
return `@budibase/standard-components/${x}`
})
illegalChildren = [...new Set([...illegalChildren, ...blacklist])]
}
// Recurse on all children
for (let child of component._children) {
const illegalChild = findIllegalChild(
child,
illegalChildren,
legalDirectChildren
)
if (illegalChild) {
return illegalChild
}
}
}
// Validate the entire tree and throw an error if an illegal child is
// found anywhere
const illegalChild = findIllegalChild(screen.props)
if (illegalChild) {
const def = store.actions.components.getDefinition(illegalChild)
throw `You can't place a ${def.name} here`
}
},
save: async screen => {
store.actions.screens.validate(screen)
const state = get(store)
const creatingNewScreen = screen._id === undefined
const savedScreen = await API.saveScreen(screen)
@ -330,6 +393,16 @@ export const getFrontendStore = () => {
return state
})
},
sendEvent: (name, payload) => {
const { previewEventHandler } = get(store)
previewEventHandler?.(name, payload)
},
registerEventHandler: handler => {
store.update(state => {
state.previewEventHandler = handler
return state
})
},
},
layouts: {
select: layoutId => {
@ -435,13 +508,17 @@ export const getFrontendStore = () => {
return {
_id: Helpers.uuid(),
_component: definition.component,
_styles: { normal: {}, hover: {}, active: {} },
_styles: {
normal: {},
hover: {},
active: {},
},
_instanceName: `New ${definition.friendlyName || definition.name}`,
...cloneDeep(props),
...extras,
}
},
create: async (componentName, presetProps) => {
create: async (componentName, presetProps, parent, index) => {
const state = get(store)
const componentInstance = store.actions.components.createInstance(
componentName,
@ -451,48 +528,62 @@ export const getFrontendStore = () => {
return
}
// Patch selected screen
await store.actions.screens.patch(screen => {
// Find the selected component
const currentComponent = findComponent(
screen.props,
state.selectedComponentId
)
if (!currentComponent) {
return false
}
// Find parent node to attach this component to
let parentComponent
if (currentComponent) {
// Use selected component as parent if one is selected
const definition = store.actions.components.getDefinition(
currentComponent._component
)
if (definition?.hasChildren) {
// Use selected component if it allows children
parentComponent = currentComponent
// Insert in position if specified
if (parent && index != null) {
await store.actions.screens.patch(screen => {
let parentComponent = findComponent(screen.props, parent)
if (!parentComponent._children?.length) {
parentComponent._children = [componentInstance]
} else {
// Otherwise we need to use the parent of this component
parentComponent = findComponentParent(
screen.props,
currentComponent._id
)
parentComponent._children.splice(index, 0, componentInstance)
}
} else {
// Use screen or layout if no component is selected
parentComponent = screen.props
}
})
}
// Attach new component
if (!parentComponent) {
return false
}
if (!parentComponent._children) {
parentComponent._children = []
}
parentComponent._children.push(componentInstance)
})
// Otherwise we work out where this component should be inserted
else {
await store.actions.screens.patch(screen => {
// Find the selected component
const currentComponent = findComponent(
screen.props,
state.selectedComponentId
)
if (!currentComponent) {
return false
}
// Find parent node to attach this component to
let parentComponent
if (currentComponent) {
// Use selected component as parent if one is selected
const definition = store.actions.components.getDefinition(
currentComponent._component
)
if (definition?.hasChildren) {
// Use selected component if it allows children
parentComponent = currentComponent
} else {
// Otherwise we need to use the parent of this component
parentComponent = findComponentParent(
screen.props,
currentComponent._id
)
}
} else {
// Use screen or layout if no component is selected
parentComponent = screen.props
}
// Attach new component
if (!parentComponent) {
return false
}
if (!parentComponent._children) {
parentComponent._children = []
}
parentComponent._children.push(componentInstance)
})
}
// Select new component
store.update(state => {
@ -509,12 +600,11 @@ export const getFrontendStore = () => {
},
patch: async (patchFn, componentId, screenId) => {
// Use selected component by default
if (!componentId && !screenId) {
if (!componentId || !screenId) {
const state = get(store)
componentId = state.selectedComponentId
screenId = state.selectedScreenId
componentId = componentId || state.selectedComponentId
screenId = screenId || state.selectedScreenId
}
// Invalid if only a screen or component ID provided
if (!componentId || !screenId || !patchFn) {
return
}
@ -577,16 +667,14 @@ export const getFrontendStore = () => {
})
// Select the parent if cutting
if (cut) {
if (cut && selectParent) {
const screen = get(selectedScreen)
const parent = findComponentParent(screen?.props, component._id)
if (parent) {
if (selectParent) {
store.update(state => {
state.selectedComponentId = parent._id
return state
})
}
store.update(state => {
state.selectedComponentId = parent._id
return state
})
}
}
},
@ -597,21 +685,29 @@ export const getFrontendStore = () => {
}
let newComponentId
// Remove copied component if cutting, regardless of whether pasting works
let componentToPaste = cloneDeep(state.componentToPaste)
if (componentToPaste.isCut) {
store.update(state => {
delete state.componentToPaste
return state
})
}
// Patch screen
const patch = screen => {
// Get up to date ref to target
targetComponent = findComponent(screen.props, targetComponent._id)
if (!targetComponent) {
return
return false
}
const cut = state.componentToPaste.isCut
const originalId = state.componentToPaste._id
let componentToPaste = cloneDeep(state.componentToPaste)
const cut = componentToPaste.isCut
const originalId = componentToPaste._id
delete componentToPaste.isCut
// Make new component unique if copying
if (!cut) {
makeComponentUnique(componentToPaste)
componentToPaste = makeComponentUnique(componentToPaste)
}
newComponentId = componentToPaste._id
@ -661,11 +757,8 @@ export const getFrontendStore = () => {
const targetScreenId = targetScreen?._id || state.selectedScreenId
await store.actions.screens.patch(patch, targetScreenId)
// Select the new component
store.update(state => {
// Remove copied component if cutting
if (state.componentToPaste.isCut) {
delete state.componentToPaste
}
state.selectedScreenId = targetScreenId
state.selectedComponentId = newComponentId
return state
@ -869,6 +962,15 @@ export const getFrontendStore = () => {
}
})
},
updateStyles: async (styles, id) => {
const patchFn = component => {
component._styles.normal = {
...component._styles.normal,
...styles,
}
}
await store.actions.components.patch(patchFn, id)
},
updateCustomStyle: async style => {
await store.actions.components.patch(component => {
component._styles.custom = style
@ -891,6 +993,50 @@ export const getFrontendStore = () => {
component[name] = value
})
},
requestEjectBlock: componentId => {
store.actions.preview.sendEvent("eject-block", componentId)
},
handleEjectBlock: async (componentId, ejectedDefinition) => {
let nextSelectedComponentId
await store.actions.screens.patch(screen => {
const block = findComponent(screen.props, componentId)
const parent = findComponentParent(screen.props, componentId)
// Sanity check
if (!block || !parent?._children?.length) {
return false
}
// Attach block children back into ejected definition, using the
// _containsSlot flag to know where to insert them
const slotContainer = findAllMatchingComponents(
ejectedDefinition,
x => x._containsSlot
)[0]
if (slotContainer) {
delete slotContainer._containsSlot
slotContainer._children = [
...(slotContainer._children || []),
...(block._children || []),
]
}
// Replace block with ejected definition
ejectedDefinition = makeComponentUnique(ejectedDefinition)
const index = parent._children.findIndex(x => x._id === componentId)
parent._children[index] = ejectedDefinition
nextSelectedComponentId = ejectedDefinition._id
})
// Select new root component
if (nextSelectedComponentId) {
store.update(state => {
state.selectedComponentId = nextSelectedComponentId
return state
})
}
},
},
links: {
save: async (url, title) => {
@ -936,6 +1082,19 @@ export const getFrontendStore = () => {
}))
},
},
dnd: {
start: component => {
store.actions.preview.sendEvent("dragging-new-component", {
dragging: true,
component,
})
},
stop: () => {
store.actions.preview.sendEvent("dragging-new-component", {
dragging: false,
})
},
},
}
return store

View file

@ -1,13 +1,8 @@
import sanitizeUrl from "./utils/sanitizeUrl"
import { Screen } from "./utils/Screen"
import { Component } from "./utils/Component"
import {
makeBreadcrumbContainer,
makeMainForm,
makeTitleContainer,
makeSaveButton,
makeDatasourceFormComponents,
} from "./utils/commonComponents"
import { makeBreadcrumbContainer } from "./utils/commonComponents"
import { getSchemaForDatasource } from "../../dataBinding"
export default function (tables) {
return tables.map(table => {
@ -23,48 +18,55 @@ export default function (tables) {
export const newRowUrl = table => sanitizeUrl(`/${table.name}/new/row`)
export const NEW_ROW_TEMPLATE = "NEW_ROW_TEMPLATE"
function generateTitleContainer(table, formId) {
return makeTitleContainer("New Row").addChild(makeSaveButton(table, formId))
const rowListUrl = table => sanitizeUrl(`/${table.name}`)
const getFields = schema => {
let columns = []
Object.entries(schema || {}).forEach(([field, fieldSchema]) => {
if (!field || !fieldSchema) {
return
}
if (!fieldSchema?.autocolumn) {
columns.push(field)
}
})
return columns
}
const createScreen = table => {
const screen = new Screen()
.instanceName(`${table.name} - New`)
.customProps({
hAlign: "center",
})
.route(newRowUrl(table))
const form = makeMainForm()
.instanceName("Form")
const generateFormBlock = table => {
const datasource = { type: "table", tableId: table._id }
const { schema } = getSchemaForDatasource(null, datasource, {
formSchema: true,
})
const formBlock = new Component("@budibase/standard-components/formblock")
formBlock
.customProps({
title: "New row",
actionType: "Create",
actionUrl: rowListUrl(table),
showDeleteButton: false,
showSaveButton: true,
fields: getFields(schema),
dataSource: {
label: table.name,
tableId: table._id,
type: "table",
},
labelPosition: "left",
size: "spectrum--medium",
})
const fieldGroup = new Component("@budibase/standard-components/fieldgroup")
.instanceName("Field Group")
.customProps({
labelPosition: "left",
})
// Add all form fields from this schema to the field group
const datasource = { type: "table", tableId: table._id }
makeDatasourceFormComponents(datasource).forEach(component => {
fieldGroup.addChild(component)
})
// Add all children to the form
const formId = form._json._id
form
.addChild(makeBreadcrumbContainer(table.name, "New"))
.addChild(generateTitleContainer(table, formId))
.addChild(fieldGroup)
return screen.addChild(form).json()
.instanceName(`${table.name} - Form block`)
return formBlock
}
const createScreen = table => {
const formBlock = generateFormBlock(table)
const screen = new Screen()
.instanceName(`${table.name} - New`)
.route(newRowUrl(table))
return screen
.addChild(makeBreadcrumbContainer(table.name, "New row"))
.addChild(formBlock)
.json()
}

View file

@ -1,15 +1,8 @@
import sanitizeUrl from "./utils/sanitizeUrl"
import { rowListUrl } from "./rowListScreen"
import { Screen } from "./utils/Screen"
import { Component } from "./utils/Component"
import { makePropSafe } from "@budibase/string-templates"
import {
makeBreadcrumbContainer,
makeTitleContainer,
makeSaveButton,
makeMainForm,
makeDatasourceFormComponents,
} from "./utils/commonComponents"
import { makeBreadcrumbContainer } from "./utils/commonComponents"
import { getSchemaForDatasource } from "../../dataBinding"
export default function (tables) {
return tables.map(table => {
@ -25,125 +18,53 @@ export default function (tables) {
export const ROW_DETAIL_TEMPLATE = "ROW_DETAIL_TEMPLATE"
export const rowDetailUrl = table => sanitizeUrl(`/${table.name}/:id`)
function generateTitleContainer(table, title, formId, repeaterId) {
const saveButton = makeSaveButton(table, formId)
const deleteButton = new Component("@budibase/standard-components/button")
.text("Delete")
.customProps({
type: "secondary",
quiet: true,
size: "M",
onClick: [
{
parameters: {
tableId: table._id,
rowId: `{{ ${makePropSafe(repeaterId)}.${makePropSafe("_id")} }}`,
revId: `{{ ${makePropSafe(repeaterId)}.${makePropSafe("_rev")} }}`,
confirm: true,
},
"##eventHandlerType": "Delete Row",
},
{
parameters: {
url: rowListUrl(table),
},
"##eventHandlerType": "Navigate To",
},
],
})
.instanceName("Delete Button")
const rowListUrl = table => sanitizeUrl(`/${table.name}`)
const buttons = new Component("@budibase/standard-components/container")
.instanceName("Button Container")
.customProps({
direction: "row",
hAlign: "right",
vAlign: "middle",
size: "shrink",
gap: "M",
})
.addChild(deleteButton)
.addChild(saveButton)
const getFields = schema => {
let columns = []
Object.entries(schema || {}).forEach(([field, fieldSchema]) => {
if (!field || !fieldSchema) {
return
}
if (!fieldSchema?.autocolumn) {
columns.push(field)
}
})
return columns
}
return makeTitleContainer(title).addChild(buttons)
const generateFormBlock = table => {
const datasource = { type: "table", tableId: table._id }
const { schema } = getSchemaForDatasource(null, datasource, {
formSchema: true,
})
const formBlock = new Component("@budibase/standard-components/formblock")
formBlock
.customProps({
title: "Edit row",
actionType: "Update",
actionUrl: rowListUrl(table),
showDeleteButton: true,
showSaveButton: true,
fields: getFields(schema),
dataSource: {
label: table.name,
tableId: table._id,
type: "table",
},
labelPosition: "left",
size: "spectrum--medium",
})
.instanceName(`${table.name} - Form block`)
return formBlock
}
const createScreen = table => {
const provider = new Component("@budibase/standard-components/dataprovider")
.instanceName(`Data Provider`)
.customProps({
dataSource: {
label: table.name,
name: table._id,
tableId: table._id,
type: "table",
},
filter: [
{
field: "_id",
operator: "equal",
type: "string",
value: `{{ ${makePropSafe("url")}.${makePropSafe("id")} }}`,
valueType: "Binding",
},
],
limit: 1,
paginate: false,
})
const repeater = new Component("@budibase/standard-components/repeater")
.instanceName("Repeater")
.customProps({
dataProvider: `{{ literal ${makePropSafe(provider._json._id)} }}`,
noRowsMessage: "We couldn't find a row to display",
})
const form = makeMainForm()
.instanceName("Form")
.customProps({
actionType: "Update",
size: "spectrum--medium",
dataSource: {
label: table.name,
tableId: table._id,
type: "table",
},
})
const fieldGroup = new Component("@budibase/standard-components/fieldgroup")
.instanceName("Field Group")
.customProps({
labelPosition: "left",
})
// Add all form fields from this schema to the field group
const datasource = { type: "table", tableId: table._id }
makeDatasourceFormComponents(datasource).forEach(component => {
fieldGroup.addChild(component)
})
// Add all children to the form
const formId = form._json._id
const repeaterId = repeater._json._id
const heading = table.primaryDisplay
? `{{ ${makePropSafe(repeaterId)}.${makePropSafe(table.primaryDisplay)} }}`
: null
form
.addChild(makeBreadcrumbContainer(table.name, heading || "Edit"))
.addChild(
generateTitleContainer(table, heading || "Edit Row", formId, repeaterId)
)
.addChild(fieldGroup)
repeater.addChild(form)
provider.addChild(repeater)
return new Screen()
.instanceName(`${table.name} - Detail`)
.route(rowDetailUrl(table))
.customProps({
hAlign: "center",
})
.addChild(provider)
.addChild(makeBreadcrumbContainer(table.name, "Edit row"))
.addChild(generateFormBlock(table))
.json()
}

View file

@ -2,7 +2,6 @@ import sanitizeUrl from "./utils/sanitizeUrl"
import { newRowUrl } from "./newRowScreen"
import { Screen } from "./utils/Screen"
import { Component } from "./utils/Component"
import { makePropSafe } from "@budibase/string-templates"
export default function (tables) {
return tables.map(table => {
@ -18,48 +17,17 @@ export default function (tables) {
export const ROW_LIST_TEMPLATE = "ROW_LIST_TEMPLATE"
export const rowListUrl = table => sanitizeUrl(`/${table.name}`)
function generateTitleContainer(table) {
const newButton = new Component("@budibase/standard-components/button")
.text("Create New")
.customProps({
size: "M",
type: "primary",
onClick: [
{
parameters: {
url: newRowUrl(table),
},
"##eventHandlerType": "Navigate To",
},
],
})
.instanceName("New Button")
const heading = new Component("@budibase/standard-components/heading")
.instanceName("Title")
.text(table.name)
.customProps({
size: "M",
align: "left",
})
return new Component("@budibase/standard-components/container")
.customProps({
direction: "row",
hAlign: "stretch",
vAlign: "middle",
size: "shrink",
gap: "M",
})
.instanceName("Title Container")
.addChild(heading)
.addChild(newButton)
}
const createScreen = table => {
const provider = new Component("@budibase/standard-components/dataprovider")
.instanceName(`Data Provider`)
const generateTableBlock = table => {
const tableBlock = new Component("@budibase/standard-components/tableblock")
tableBlock
.customProps({
linkRows: true,
linkURL: `${rowListUrl(table)}/:id`,
showAutoColumns: false,
showTitleButton: true,
titleButtonText: "Create new",
titleButtonURL: newRowUrl(table),
title: table.name,
dataSource: {
label: table.name,
name: table._id,
@ -68,41 +36,16 @@ const createScreen = table => {
},
size: "spectrum--medium",
paginate: true,
limit: 8,
})
const spectrumTable = new Component("@budibase/standard-components/table")
.customProps({
dataProvider: `{{ literal ${makePropSafe(provider._json._id)} }}`,
showAutoColumns: false,
quiet: false,
rowCount: 8,
})
.instanceName(`${table.name} Table`)
const safeTableId = makePropSafe(spectrumTable._json._id)
const safeRowId = makePropSafe("_id")
const viewLink = new Component("@budibase/standard-components/link")
.customProps({
text: "View",
url: `${rowListUrl(table)}/{{ ${safeTableId}.${safeRowId} }}`,
size: "S",
color: "var(--spectrum-global-color-gray-600)",
align: "left",
})
.normalStyle({
["margin-left"]: "16px",
["margin-right"]: "16px",
})
.instanceName("View Link")
spectrumTable.addChild(viewLink)
provider.addChild(spectrumTable)
.instanceName(`${table.name} - Table block`)
return tableBlock
}
const createScreen = table => {
return new Screen()
.route(rowListUrl(table))
.instanceName(`${table.name} - List`)
.addChild(generateTitleContainer(table))
.addChild(provider)
.addChild(generateTableBlock(table))
.json()
}

View file

@ -65,6 +65,11 @@ export function makeBreadcrumbContainer(tableName, text) {
vAlign: "middle",
size: "shrink",
})
.normalStyle({
width: "600px",
"margin-right": "auto",
"margin-left": "auto",
})
.instanceName("Breadcrumbs")
.addChild(link)
.addChild(arrowText)
@ -138,6 +143,7 @@ const fieldTypeToComponentMap = {
attachment: "attachmentfield",
link: "relationshipfield",
json: "jsonfield",
barcodeqr: "codescanner",
}
export function makeDatasourceFormComponents(datasource) {

View file

@ -261,6 +261,7 @@
} else {
return [
FIELDS.STRING,
FIELDS.BARCODEQR,
FIELDS.LONGFORM,
FIELDS.OPTIONS,
FIELDS.DATETIME,
@ -314,7 +315,7 @@
const relatedTable = $tables.list.find(
tbl => tbl._id === fieldInfo.tableId
)
if (inUse(relatedTable, fieldInfo.fieldName)) {
if (inUse(relatedTable, fieldInfo.fieldName) && !originalName) {
newError.relatedName = `Column name already in use in table ${relatedTable.name}`
}
}

View file

@ -10,10 +10,14 @@
import KeyValueBuilder from "components/integration/KeyValueBuilder.svelte"
import { capitalise } from "helpers"
import { IntegrationTypes } from "constants/backend"
import { createValidationStore } from "helpers/validation/yup"
import { createEventDispatcher } from "svelte"
export let datasource
export let schema
export let creating
const validation = createValidationStore()
const dispatch = createEventDispatcher()
function filter([key, value]) {
if (!value) {
@ -31,6 +35,17 @@
.filter(el => filter(el))
.map(([key]) => key)
// setup the validation for each required field
$: configKeys.forEach(key => {
if (schema[key].required) {
validation.addValidatorType(key, schema[key].type, schema[key].required)
}
})
// run the validation whenever the config changes
$: validation.check(config)
// dispatch the validation result
$: dispatch("valid", $validation.valid)
let addButton
function getDisplayName(key) {
@ -79,6 +94,7 @@
type={schema[configKey].type}
on:change
bind:value={config[configKey]}
error={$validation.errors[configKey]}
/>
</div>
{:else}
@ -88,6 +104,7 @@
type={schema[configKey].type}
on:change
bind:value={config[configKey]}
error={$validation.errors[configKey]}
/>
</div>
{/if}

View file

@ -13,6 +13,7 @@
// kill the reference so the input isn't saved
let datasource = cloneDeep(integration)
let skipFetch = false
let isValid = false
$: name =
IntegrationNames[datasource.type] || datasource.name || datasource.type
@ -53,6 +54,7 @@
return true
}}
size="L"
disabled={!isValid}
>
<Layout noPadding>
<Body size="XS"
@ -63,5 +65,6 @@
schema={datasource.schema}
bind:datasource
creating={true}
on:valid={e => (isValid = e.detail)}
/>
</ModalContent>

View file

@ -124,6 +124,14 @@
label: "Multi-select",
value: FIELDS.ARRAY.type,
},
{
label: "Barcode/QR",
value: FIELDS.BARCODEQR.type,
},
{
label: "Long Form Text",
value: FIELDS.LONGFORM.type,
},
]
</script>

View file

@ -53,6 +53,7 @@ const componentMap = {
"field/link": FormFieldSelect,
"field/array": FormFieldSelect,
"field/json": FormFieldSelect,
"field/barcode/qr": FormFieldSelect,
// Some validation types are the same as others, so not all types are
// explicitly listed here. e.g. options uses string validation
"validation/string": ValidationEditor,

View file

@ -21,6 +21,7 @@
export let key
export let actions
export let bindings = []
export let nested
$: showAvailableActions = !actions?.length
@ -187,6 +188,7 @@
this={selectedActionComponent}
parameters={selectedAction.parameters}
bindings={allBindings}
{nested}
/>
</div>
{/key}

View file

@ -12,6 +12,7 @@
export let value = []
export let name
export let bindings
export let nested
let drawer
let tmpValue
@ -90,6 +91,7 @@
eventType={name}
{bindings}
{key}
{nested}
/>
</Drawer>

View file

@ -1,16 +1,31 @@
<script>
import { Body } from "@budibase/bbui"
import { Label, Body } from "@budibase/bbui"
import DrawerBindableInput from "components/common/bindings/DrawerBindableInput.svelte"
export let parameters
export let bindings = []
</script>
<Body size="S">Navigate To screen, or leave blank.</Body>
<br />
<div class="root">
<Body size="S">This action doesn't require any additional settings.</Body>
<Body size="S">
This action won't do anything if there isn't a screen modal open.
</Body>
<Label small>Screen</Label>
<DrawerBindableInput
title="Destination URL"
placeholder="/screen"
value={parameters.url}
on:change={value => (parameters.url = value.detail)}
{bindings}
/>
</div>
<style>
.root {
display: grid;
align-items: center;
gap: var(--spacing-m);
grid-template-columns: auto 1fr;
max-width: 400px;
margin: 0 auto;
}
</style>

View file

@ -10,11 +10,13 @@
export let parameters
export let bindings = []
export let nested
$: formComponents = getContextProviderComponents(
$currentAsset,
$store.selectedComponentId,
"form"
"form",
{ includeSelf: nested }
)
$: schemaComponents = getContextProviderComponents(
$currentAsset,

View file

@ -0,0 +1,13 @@
<script>
import { ActionButton } from "@budibase/bbui"
const eject = () => {
document.dispatchEvent(
new KeyboardEvent("keydown", { key: "e", ctrlKey: true })
)
}
</script>
<div>
<ActionButton secondary on:click={eject}>Eject block</ActionButton>
</div>
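
This new button never touches the store itself; it synthesises a Ctrl+E keydown and relies on an existing keyboard-shortcut handler to turn that into an eject request. That handler is not part of this diff, so the following is purely an assumed sketch of what it could look like, reusing the `requestEjectBlock` action added to the frontend store above:

```js
// Hypothetical shortcut handler reacting to the synthetic Ctrl+E event.
import { get } from "svelte/store"
import { store } from "builderStore" // import path is an assumption

document.addEventListener("keydown", event => {
  if (event.ctrlKey && event.key === "e") {
    event.preventDefault()
    // Ask the preview to eject the currently selected block
    store.actions.components.requestEjectBlock(get(store).selectedComponentId)
  }
})
```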

View file

@ -24,18 +24,17 @@
const getOptions = (schema, type) => {
let entries = Object.entries(schema ?? {})
let types = []
if (type === "field/options") {
if (type === "field/options" || type === "field/barcode/qr") {
// allow options to be used on both options and string fields
types = [type, "field/string"]
} else {
types = [type]
}
types = types.map(type => type.split("/")[1])
entries = entries.filter(entry => types.includes(entry[1].type))
types = types.map(type => type.slice(type.indexOf("/") + 1))
entries = entries.filter(entry => types.includes(entry[1].type))
return entries.map(entry => entry[0])
}
</script>

View file

@ -20,6 +20,7 @@
export let componentBindings = []
export let nested = false
export let highlighted = false
export let info = null
$: nullishValue = value == null || value === ""
$: allBindings = getAllBindings(bindings, componentBindings, nested)
@ -94,11 +95,15 @@
bindings={allBindings}
name={key}
text={label}
{nested}
{key}
{type}
{...props}
/>
</div>
{#if info}
<div class="text">{@html info}</div>
{/if}
</div>
<style>
@ -123,4 +128,9 @@
.control {
position: relative;
}
.text {
margin-top: var(--spectrum-global-dimension-size-65);
font-size: var(--spectrum-global-dimension-font-size-75);
color: var(--grey-6);
}
</style>

View file

@ -4,6 +4,7 @@
export let value
export let bindings
export let placeholder
$: urlOptions = $store.screens
.map(screen => screen.routing?.route)
@ -13,6 +14,7 @@
<DrawerBindableCombobox
{value}
{bindings}
{placeholder}
on:change
options={urlOptions}
appendBindingsAsOptions={false}

View file

@ -1,7 +1,16 @@
<script>
import Editor from "./QueryEditor.svelte"
import FieldsBuilder from "./QueryFieldsBuilder.svelte"
import { Label, Input } from "@budibase/bbui"
import {
Label,
Input,
Select,
Divider,
Layout,
Icon,
Button,
ActionButton,
} from "@budibase/bbui"
const QueryTypes = {
SQL: "sql",
@ -15,6 +24,8 @@
export let editable = true
export let height = 500
let stepEditors = []
$: urlDisplay =
schema.urlDisplay &&
`${datasource.config.url}${
@ -24,6 +35,39 @@
function updateQuery({ detail }) {
query.fields[schema.type] = detail.value
}
function updateEditorsOnDelete(deleteIndex) {
for (let i = deleteIndex; i < query.fields.steps?.length - 1; i++) {
stepEditors[i].update(query.fields.steps[i + 1].value?.value)
}
}
function updateEditorsOnSwap(actionIndex, targetIndex) {
const target = query.fields.steps[targetIndex].value?.value
stepEditors[targetIndex].update(
query.fields.steps[actionIndex].value?.value
)
stepEditors[actionIndex].update(target)
}
function setEditorTemplate(fromKey, toKey, index) {
const currentValue = query.fields.steps[index].value?.value
if (
!currentValue ||
currentValue.toString().replace("\\s", "").length < 3 ||
schema.steps.filter(step => step.key === fromKey)[0]?.template ===
currentValue
) {
query.fields.steps[index].value.value = schema.steps.filter(
step => step.key === toKey
)[0]?.template
stepEditors[index].update(query.fields.steps[index].value.value)
}
query.fields.steps[index].key = toKey
}
$: shouldDisplayJsonBox =
schema.type === QueryTypes.JSON &&
query.fields.extra?.actionType !== "pipeline"
</script>
{#if schema}
@ -38,7 +82,7 @@
value={query.fields.sql}
parameters={query.parameters}
/>
{:else if schema.type === QueryTypes.JSON}
{:else if shouldDisplayJsonBox}
<Editor
editorHeight={height}
label="Query"
@ -56,6 +100,118 @@
<Input thin outline disabled value={urlDisplay} />
</div>
{/if}
{:else if query.fields.extra?.actionType === "pipeline"}
<br />
{#if !query.fields.steps?.length}
<div class="controls">
<Button
secondary
slot="buttons"
on:click={() => {
query.fields.steps = [
{
key: "$match",
value: "{\n\t\n}",
},
]
}}>Add stage</Button
>
</div>
<br />
{:else}
{#each query.fields.steps ?? [] as step, index}
<div class="block">
<div class="subblock">
<Divider noMargin />
<div class="blockSection">
<div class="block-options">
Stage {index + 1}
<div class="block-actions">
<div style="margin-right: 24px;">
{#if index > 0}
<ActionButton
quiet
on:click={() => {
updateEditorsOnSwap(index, index - 1)
const target = query.fields.steps[index - 1].key
query.fields.steps[index - 1].key =
query.fields.steps[index].key
query.fields.steps[index].key = target
}}
icon="ChevronUp"
/>
{/if}
{#if index < query.fields.steps.length - 1}
<ActionButton
quiet
on:click={() => {
updateEditorsOnSwap(index, index + 1)
const target = query.fields.steps[index + 1].key
query.fields.steps[index + 1].key =
query.fields.steps[index].key
query.fields.steps[index].key = target
}}
icon="ChevronDown"
/>
{/if}
</div>
<ActionButton
on:click={() => {
updateEditorsOnDelete(index)
query.fields.steps.splice(index, 1)
query.fields.steps = [...query.fields.steps]
}}
icon="DeleteOutline"
/>
</div>
</div>
<Layout noPadding gap="S">
<div class="fields">
<div class="block-field">
<Select
value={step.key}
options={schema.steps.map(s => s.key)}
on:change={({ detail }) => {
setEditorTemplate(step.key, detail, index)
}}
/>
<Editor
bind:this={stepEditors[index]}
editorHeight={height / 2}
mode="json"
value={typeof step.value === "string"
? step.value
: step.value.value}
on:change={({ detail }) => {
query.fields.steps[index].value = detail
}}
/>
</div>
</div>
</Layout>
</div>
</div>
<div class="separator" />
{#if index === query.fields.steps.length - 1}
<Icon
hoverable
name="AddCircle"
size="S"
on:click={() => {
query.fields.steps = [
...query.fields.steps,
{
key: "$match",
value: "{\n\t\n}",
},
]
}}
/>
<br />
{/if}
</div>
{/each}
{/if}
{/if}
{/key}
{/if}
@ -67,4 +223,57 @@
grid-gap: var(--spacing-l);
align-items: center;
}
.blockSection {
padding: var(--spacing-xl);
}
.block {
display: flex;
flex-direction: column;
justify-content: flex-start;
align-items: center;
margin-top: -6px;
}
.subblock {
width: 480px;
font-size: 16px;
background-color: var(--background);
border: 1px solid var(--spectrum-global-color-gray-300);
border-radius: 4px 4px 4px 4px;
}
.block-options {
justify-content: space-between;
display: flex;
align-items: center;
padding-bottom: 24px;
}
.block-actions {
justify-content: space-between;
display: flex;
align-items: right;
}
.fields {
display: flex;
flex-direction: column;
justify-content: flex-start;
align-items: stretch;
gap: var(--spacing-s);
}
.block-field {
display: grid;
grid-gap: 5px;
}
.separator {
width: 1px;
height: 25px;
border-left: 1px dashed var(--grey-4);
color: var(--grey-4);
/* center horizontally */
align-self: center;
}
.controls {
display: flex;
align-items: center;
justify-content: right;
}
</style>
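
For reviewers unfamiliar with the new pipeline mode: the editor above assembles `query.fields.steps` as an ordered list of `{ key, value }` entries, where `key` is the aggregation stage name chosen in the Select and `value` carries the JSON text from the stage editor (a raw string for a freshly added stage, or the editor's `{ value }` change payload). A hedged example of the shape produced for a two-stage pipeline — `$match` comes from the diff's default, while the second stage key and both stage bodies are made up:

```js
// Illustrative shape of a saved pipeline query's fields.
const query = {
  fields: {
    extra: { actionType: "pipeline" },
    steps: [
      { key: "$match", value: { value: '{ "status": "active" }' } },
      { key: "$group", value: { value: '{ "_id": "$status", "count": { "$sum": 1 } }' } },
    ],
  },
}
```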

View file

@ -1,5 +1,5 @@
<script>
import { Layout, Table, Select, Pagination } from "@budibase/bbui"
import { Layout, Table, Select, Pagination, Button } from "@budibase/bbui"
import DateTimeRenderer from "components/common/renderers/DateTimeRenderer.svelte"
import StatusRenderer from "./StatusRenderer.svelte"
import HistoryDetailsPanel from "./HistoryDetailsPanel.svelte"
@ -7,12 +7,16 @@
import { createPaginationStore } from "helpers/pagination"
import { onMount } from "svelte"
import dayjs from "dayjs"
import { auth, licensing, admin } from "stores/portal"
import { Constants } from "@budibase/frontend-core"
const ERROR = "error",
SUCCESS = "success",
STOPPED = "stopped"
export let app
$: licensePlan = $auth.user?.license?.plan
let pageInfo = createPaginationStore()
let runHistory = null
let showPanel = false
@ -26,6 +30,8 @@
$: fetchLogs(automationId, status, page, timeRange)
const timeOptions = [
{ value: "90-d", label: "Past 90 days" },
{ value: "30-d", label: "Past 30 days" },
{ value: "1-w", label: "Past week" },
{ value: "1-d", label: "Past day" },
{ value: "1-h", label: "Past 1 hour" },
@ -131,10 +137,20 @@
</div>
<div class="select">
<Select
placeholder="Past 30 days"
placeholder="All"
label="Date range"
bind:value={timeRange}
options={timeOptions}
isOptionEnabled={x => {
if (licensePlan?.type === Constants.PlanType.FREE) {
return ["1-w", "30-d", "90-d"].indexOf(x.value) < 0
} else if (licensePlan?.type === Constants.PlanType.TEAM) {
return ["90-d"].indexOf(x.value) < 0
} else if (licensePlan?.type === Constants.PlanType.PRO) {
return ["30-d", "90-d"].indexOf(x.value) < 0
}
return true
}}
/>
</div>
<div class="select">
@ -145,6 +161,14 @@
options={statusOptions}
/>
</div>
{#if (licensePlan?.type !== Constants.PlanType.ENTERPRISE && $auth.user.accountPortalAccess) || !$admin.cloud}
<div class="pro-upgrade">
<div class="pro-copy">Expand your automation log history</div>
<Button primary newStyles on:click={$licensing.goToUpgradePage()}>
Upgrade
</Button>
</div>
{/if}
</div>
{#if runHistory}
<div>
@ -221,4 +245,15 @@
.panelOpen {
grid-template-columns: auto 420px;
}
.pro-upgrade {
display: flex;
align-items: center;
justify-content: flex-end;
flex: 1;
}
.pro-copy {
margin-right: var(--spacing-l);
}
</style>

View file

@ -0,0 +1,114 @@
<script>
import {
ActionMenu,
MenuItem,
Icon,
Input,
Heading,
Body,
Modal,
} from "@budibase/bbui"
import ConfirmDialog from "components/common/ConfirmDialog.svelte"
import CreateRestoreModal from "./CreateRestoreModal.svelte"
import { createEventDispatcher } from "svelte"
export let row
let deleteDialog
let restoreDialog
let updateDialog
let name
let restoreBackupModal
const dispatch = createEventDispatcher()
const onClickRestore = name => {
dispatch("buttonclick", {
type: "backupRestore",
name,
backupId: row._id,
restoreBackupName: name,
})
}
const onClickDelete = () => {
dispatch("buttonclick", {
type: "backupDelete",
backupId: row._id,
})
}
const onClickUpdate = () => {
dispatch("buttonclick", {
type: "backupUpdate",
backupId: row._id,
name,
})
}
async function downloadExport() {
window.open(`/api/apps/${row.appId}/backups/${row._id}/file`, "_blank")
}
</script>
<div class="cell">
<ActionMenu align="right">
<div slot="control">
<Icon size="M" hoverable name="MoreSmallList" />
</div>
{#if row.type !== "restore"}
<MenuItem on:click={restoreDialog.show} icon="Revert">Restore</MenuItem>
<MenuItem on:click={deleteDialog.show} icon="Delete">Delete</MenuItem>
<MenuItem on:click={downloadExport} icon="Download">Download</MenuItem>
{/if}
<MenuItem on:click={updateDialog.show} icon="Edit">Update</MenuItem>
</ActionMenu>
</div>
<Modal bind:this={restoreBackupModal}>
<CreateRestoreModal confirm={name => onClickRestore(name)} />
</Modal>
<ConfirmDialog
bind:this={deleteDialog}
okText="Delete Backup"
onOk={onClickDelete}
title="Confirm Deletion"
>
Are you sure you wish to delete the backup
<i>{row.name}?</i>
This action cannot be undone.
</ConfirmDialog>
<ConfirmDialog
bind:this={restoreDialog}
okText="Continue"
onOk={restoreBackupModal?.show}
title="Confirm restore"
warning={false}
>
<Heading size="S">{row.name || "Backup"}</Heading>
<Body size="S">{new Date(row.timestamp).toLocaleString()}</Body>
</ConfirmDialog>
<ConfirmDialog
bind:this={updateDialog}
disabled={!name}
okText="Confirm"
onOk={onClickUpdate}
title="Update Backup"
warning={false}
>
<Input label="Backup name" placeholder={row.name} bind:value={name} />
</ConfirmDialog>
<style>
.cell {
display: flex;
flex-direction: row;
gap: var(--spacing-m);
align-items: center;
margin-left: auto;
}
</style>

View file

@@ -0,0 +1,41 @@
<script>
import { Icon } from "@budibase/bbui"
export let row
$: automations = row?.automations
$: datasources = row?.datasources
$: screens = row?.screens
</script>
<div class="cell">
{#if automations != null && screens != null && datasources != null}
<div class="item">
<Icon name="Data" />
<div>{datasources || 0}</div>
</div>
<div class="item">
<Icon name="WebPage" />
<div>{screens || 0}</div>
</div>
<div class="item">
<Icon name="JourneyVoyager" />
<div>{automations || 0}</div>
</div>
{/if}
</div>
<style>
.cell {
display: flex;
flex-direction: row;
gap: calc(var(--spacing-xl) * 2);
align-items: center;
}
.item {
display: flex;
gap: var(--spacing-s);
flex-direction: row;
}
</style>

View file

@@ -0,0 +1,345 @@
<script>
import {
ActionButton,
Button,
DatePicker,
Divider,
Layout,
Modal,
notifications,
Pagination,
Select,
Heading,
Body,
Tags,
Tag,
Table,
Page,
} from "@budibase/bbui"
import { backups, licensing, auth, admin } from "stores/portal"
import { createPaginationStore } from "helpers/pagination"
import AppSizeRenderer from "./AppSizeRenderer.svelte"
import CreateBackupModal from "./CreateBackupModal.svelte"
import ActionsRenderer from "./ActionsRenderer.svelte"
import DateRenderer from "./DateRenderer.svelte"
import UserRenderer from "./UserRenderer.svelte"
import StatusRenderer from "./StatusRenderer.svelte"
import TypeRenderer from "./TypeRenderer.svelte"
import BackupsDefault from "assets/backups-default.png"
import { onMount } from "svelte"
export let app
let backupData = null
let modal
let pageInfo = createPaginationStore()
let filterOpt = null
let startDate = null
let endDate = null
let filters = getFilters()
let loaded = false
$: page = $pageInfo.page
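// Re-run the search whenever the type filter, page or date range changes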
$: fetchBackups(filterOpt, page, startDate, endDate)
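// Build the type/trigger combinations shown in the "Type" filter dropdown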
function getFilters() {
const options = []
let types = ["backup"]
let triggers = ["manual", "publish", "scheduled", "restoring"]
for (let type of types) {
for (let trigger of triggers) {
let label = `${trigger} ${type}`
label = label.charAt(0).toUpperCase() + label?.slice(1)
options.push({ label, value: { type, trigger } })
}
}
options.push({
label: `Manual restore`,
value: { type: "restore", trigger: "manual" },
})
return options
}
const schema = {
type: {
displayName: "Type",
width: "auto",
},
createdAt: {
displayName: "Date",
width: "auto",
},
name: {
displayName: "Name",
width: "auto",
},
appSize: {
displayName: "App size",
width: "auto",
},
createdBy: {
displayName: "User",
width: "auto",
},
status: {
displayName: "Status",
width: "auto",
},
actions: {
displayName: null,
width: "5%",
},
}
const customRenderers = [
{ column: "appSize", component: AppSizeRenderer },
{ column: "actions", component: ActionsRenderer },
{ column: "createdAt", component: DateRenderer },
{ column: "createdBy", component: UserRenderer },
{ column: "status", component: StatusRenderer },
{ column: "type", component: TypeRenderer },
]
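// Spread each backup's contents (datasource/screen/automation counts) onto the top level so the table schema can read them directly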
function flattenBackups(backups) {
return backups.map(backup => {
return {
...backup,
...backup?.contents,
}
})
}
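// Query the backups store for the current app with the active filters and update pagination state from the response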
async function fetchBackups(filters, page, startDate, endDate) {
const response = await backups.searchBackups({
appId: app.instance._id,
...filters,
page,
startDate,
endDate,
})
pageInfo.fetched(response.hasNextPage, response.nextPage)
// flatten so we have an easier structure to use for the table schema
backupData = flattenBackups(response.data)
}
async function createManualBackup(name) {
try {
let response = await backups.createManualBackup({
appId: app.instance._id,
name,
})
await fetchBackups(filterOpt, page)
notifications.success(response.message)
} catch {
notifications.error("Unable to create backup")
}
}
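// Routes "buttonclick" events from the actions renderer to the matching delete/restore/update call, then refreshes the list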
async function handleButtonClick({ detail }) {
if (detail.type === "backupDelete") {
await backups.deleteBackup({
appId: app.instance._id,
backupId: detail.backupId,
})
await fetchBackups(filterOpt, page)
} else if (detail.type === "backupRestore") {
await backups.restoreBackup({
appId: app.instance._id,
backupId: detail.backupId,
name: detail.restoreBackupName,
})
await fetchBackups(filterOpt, page)
} else if (detail.type === "backupUpdate") {
await backups.updateBackup({
appId: app.instance._id,
backupId: detail.backupId,
name: detail.name,
})
await fetchBackups(filterOpt, page)
}
}
onMount(() => {
fetchBackups(filterOpt, page, startDate, endDate)
loaded = true
})
</script>
<div class="root">
{#if !$licensing.backupsEnabled}
<Page wide={false}>
<Layout gap="XS" noPadding>
<div class="title">
<Heading size="M">Backups</Heading>
<Tags>
<Tag icon="LockClosed">Pro plan</Tag>
</Tags>
</div>
<div>
<Body>
Back up your apps and restore them to their previous state.
{#if !$auth.accountPortalAccess && !$licensing.groupsEnabled && $admin.cloud}
Contact your account holder to upgrade your plan.
{/if}
</Body>
</div>
<Divider />
<div class="pro-buttons">
{#if $auth.accountPortalAccess}
<Button
newStyles
primary
disabled={!$auth.accountPortalAccess && $admin.cloud}
on:click={() => $licensing.goToUpgradePage()}
>
Upgrade
</Button>
{/if}
<!--Show the view plans button-->
<Button
newStyles
secondary
on:click={() => {
window.open("https://budibase.com/pricing/", "_blank")
}}
>
View plans
</Button>
</div>
</Layout>
</Page>
{:else if backupData?.length === 0 && !loaded && !filterOpt && !startDate}
<Page wide={false}>
<div class="align">
<img
width="220px"
height="130px"
src={BackupsDefault}
alt="BackupsDefault"
/>
<Layout gap="S">
<Heading>You have no backups yet</Heading>
<div class="opacity">
<Body size="S">You can manually back up your app any time</Body>
</div>
<div class="padding">
<Button on:click={modal.show} cta>Create Backup</Button>
</div>
</Layout>
</div>
</Page>
{:else if loaded}
<Layout noPadding gap="M" alignContent="start">
<div class="search">
<div class="select">
<Select
placeholder="All"
label="Type"
options={filters}
getOptionValue={filter => filter.value}
getOptionLabel={filter => filter.label}
bind:value={filterOpt}
/>
</div>
<div>
<DatePicker
range={true}
label={"Filter Range"}
on:change={e => {
if (e.detail[0].length > 1) {
startDate = e.detail[0][0].toISOString()
endDate = e.detail[0][1].toISOString()
}
}}
/>
</div>
<div class="split-buttons">
<ActionButton on:click={modal.show} icon="SaveAsFloppy"
>Create new backup</ActionButton
>
</div>
</div>
<div>
<Table
{schema}
disableSorting
allowSelectRows={false}
allowEditColumns={false}
allowEditRows={false}
data={backupData}
{customRenderers}
placeholderText="No backups found"
border={false}
on:buttonclick={handleButtonClick}
/>
<div class="pagination">
<Pagination
page={$pageInfo.pageNumber}
hasPrevPage={$pageInfo.loading ? false : $pageInfo.hasPrevPage}
hasNextPage={$pageInfo.loading ? false : $pageInfo.hasNextPage}
goToPrevPage={pageInfo.prevPage}
goToNextPage={pageInfo.nextPage}
/>
</div>
</div>
</Layout>
{/if}
</div>
<Modal bind:this={modal}>
<CreateBackupModal {createManualBackup} />
</Modal>
<style>
.root {
display: grid;
grid-template-columns: 1fr;
height: 100%;
padding: var(--spectrum-alias-grid-gutter-medium)
var(--spectrum-alias-grid-gutter-large);
}
.search {
display: flex;
gap: var(--spacing-xl);
width: 100%;
align-items: flex-end;
}
.select {
flex-basis: 150px;
}
.pagination {
display: flex;
flex-direction: row;
justify-content: flex-end;
margin-top: var(--spacing-xl);
}
.split-buttons {
display: flex;
align-items: center;
justify-content: flex-end;
flex: 1;
gap: var(--spacing-xl);
}
.title {
display: flex;
flex-direction: row;
align-items: center;
gap: var(--spacing-m);
}
.align {
margin-top: 5%;
text-align: center;
}
.pro-buttons {
display: flex;
gap: var(--spacing-m);
}
</style>

View file

@@ -0,0 +1,22 @@
<script>
import { ModalContent, Input } from "@budibase/bbui"
import { auth } from "stores/portal"
export let createManualBackup
let templateName = $auth.user.firstName
? `${$auth.user.firstName}'s Backup`
: "New Backup"
let name = templateName
</script>
<ModalContent
onConfirm={() => createManualBackup(name)}
title="Create new backup"
disabled={!name}
confirmText="Create"
><Input label="Backup name" bind:value={name} /></ModalContent
>
<style>
</style>

View file

@@ -0,0 +1,27 @@
<script>
import { ModalContent, Input, Body } from "@budibase/bbui"
import { auth } from "stores/portal"
export let confirm
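// Called with the chosen backup name when the user confirms the restore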
let templateName = $auth.user.firstName
? `${$auth.user.firstName}'s Backup`
: "Restore Backup"
let name = templateName
</script>
<ModalContent
onConfirm={() => confirm(name)}
title="Back up your current version"
confirmText="Confirm Restore"
disabled={!name}
>
<Body size="S"
>Create a backup of your current app to allow you to roll back after
restoring this backup</Body
>
<Input label="Backup name" bind:value={name} />
</ModalContent>
<style>
</style>

View file

@@ -0,0 +1,21 @@
<script>
import { Icon } from "@budibase/bbui"
export let value
</script>
<div class="cell">
{#if value != null}
<Icon name="Data" />
<div>{value || 0}</div>
{/if}
</div>
<style>
.cell {
display: flex;
flex-direction: row;
gap: var(--spacing-m);
align-items: center;
}
</style>

View file

@@ -0,0 +1,22 @@
<script>
import DateTimeRenderer from "components/common/renderers/DateTimeRenderer.svelte"
import dayjs from "dayjs"
import relativeTime from "dayjs/plugin/relativeTime"
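// Register the relativeTime plugin so fromNow() is available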
dayjs.extend(relativeTime)
export let value
$: timeSince = dayjs(value).fromNow()
</script>
<div class="cell">
{timeSince} - <DateTimeRenderer {value} />
</div>
<style>
.cell {
display: flex;
flex-direction: row;
gap: var(--spacing-m);
align-items: center;
}
</style>

View file

@@ -0,0 +1,15 @@
<script>
import { Badge } from "@budibase/bbui"
export let value = "started"
$: status = value[0].toUpperCase() + value?.slice(1)
</script>
<Badge
grey={value === "started" || value === "pending"}
green={value === "complete"}
red={value === "failed"}
size="S"
>
{status}
</Badge>

View file

@@ -0,0 +1,20 @@
<script>
export let row
$: baseTrig = row?.trigger || "manual"
$: type = row?.type || "backup"
$: trigger = baseTrig.charAt(0).toUpperCase() + baseTrig.slice(1)
</script>
<div class="cell">
{trigger}
{type}
</div>
<style>
.cell {
display: flex;
flex-direction: row;
align-items: center;
}
</style>

View file

@@ -0,0 +1,24 @@
<script>
export let value
let firstName = value?.firstName
let lastName = value?.lastName || ""
$: username =
firstName && lastName ? `${firstName} ${lastName}` : value?.email
</script>
<div class="cell">
{#if value != null}
<div>{username}</div>
{/if}
</div>
<style>
.cell {
display: flex;
flex-direction: row;
gap: var(--spacing-m);
align-items: center;
}
</style>

Some files were not shown because too many files have changed in this diff.