
Merge branch 'develop' into fix/return-url

Rory Powell committed 2022-01-13 14:11:20 +00:00
commit f055d392b5
87 changed files with 2053 additions and 1074 deletions

View file

@ -79,6 +79,8 @@ Component libraries are collections of components as well as the definition of t
### Getting Started For Contributors
#### 1. Prerequisites
NodeJS Version `14.x.x`
*yarn* - `npm install -g yarn`
*jest* - `npm install -g jest`
@ -177,36 +179,7 @@ To enable this mode, use:
```
yarn mode:account
```
### CI
#### PR Job
After your PR is submitted, a GitHub Action (found at `.github/workflows/budibase_ci.yml`) will run checks against the changes, such as linting, building and testing.
The job will run when changes are pushed to, or targeted at, `master` and `develop`
#### Release Develop
To test changes before a release, a prerelease action (found at `.github/workflows/release-develop.yml`) will run to build and release develop versions of npm packages and docker images. On each subsequent commit to develop, a new alpha version of the npm packages is created and released.
For example:
- `feature1` -> `develop` = `v0.9.160-alpha.0`
- `feature2` -> `develop` = `v0.9.160-alpha.1`
The job will run when changes are pushed to `develop`
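For illustration, these version bumps behave like standard semver prerelease increments. A minimal sketch using the `semver` npm package (the actual bumping is handled by the release tooling, not this code):
```js
const semver = require("semver")

// First commit to develop after v0.9.159 starts the next patch prerelease
semver.inc("0.9.159", "prepatch", "alpha") // -> "0.9.160-alpha.0"

// Each later commit to develop bumps only the prerelease counter
semver.inc("0.9.160-alpha.0", "prerelease", "alpha") // -> "0.9.160-alpha.1"
```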
#### Release Job
To release changes, a release job (found at `.github/workflows/release.yml`) will run to create final versions of npm packages and docker images.
Following the example above:
- `develop` -> `master` = `v0.9.160`
The job will run when changes are pushed to `master`
#### Release Self Host Job
To release the self-hosted version of docker images, an additional job (found at `.github/workflows/release-selfhost.yml`) must be run manually. This will release docker images to docker hub under the tag `latest`, to be picked up by self-hosted installations.
An overview of the CI pipelines can be found [here](./workflows/README.md)
### Troubleshooting
Sometimes, things go wrong. This can be due to incompatible updates on the budibase platform. To clear down your development environment and start again, follow **Step 6. Cleanup**, then proceed from **Step 3. Install and Build** in the setup guide above. You should have a fresh Budibase installation.

.github/workflows/README.md vendored (new file, +93 lines)
View file

@ -0,0 +1,93 @@
# Budibase CI Pipelines
Welcome to the budibase CI pipelines directory. This document details what each of the CI pipelines is for, and some common combinations.
## All CI Pipelines
### Note
- When running workflow dispatch jobs, ensure you always run them off the `master` branch. It defaults to `develop`, so double check before running any jobs.
### Standard CI Build Job (budibase_ci.yml)
Triggers:
- PR or push to develop
- PR or push to master
The standard CI build job is what runs when you raise a PR to develop or master. It:
- Installs all dependencies
- Builds the project
- Runs the unit tests
- Generates test coverage metrics with Codecov
- Runs the Cypress tests
### Release Develop Job (release-develop.yml)
Triggers:
- Push to develop
The job responsible for building, tagging and pushing docker images out to the test and staging environments. It:
- Installs all dependencies
- Builds the project
- Runs the unit tests
- Publishes the budibase JS packages to NPM under a prerelease tag
- Builds, tags and pushes docker images to docker hub under the `develop` tag
These images will then be pulled by the test and staging environments, which update to the latest develop build automatically. Discord notifications are sent to the #infra channel when this occurs.
### Release Job (release.yml)
Triggers:
- Push to master
This job is responsible for building and pushing the latest code to NPM and docker hub, so that it can be deployed. It:
- Installs all dependencies
- Builds the project
- Runs the unit tests
- Publishes the budibase JS packages to NPM under a release tag (always incremented by patch versions)
- Builds, tags and pushes docker images to docker hub under the `v.x.x.x` tag (the tag of the NPM release)
### Release Selfhost Job (release-selfhost.yml)
Triggers:
- Manual Workflow Dispatch Trigger
This job is responsible for delivering the latest version of budibase to those who are self-hosting.
This job relies on the release job having run first, so that the latest image has been pushed to dockerhub. The job will then pull the latest version from `lerna.json` and try to find an image in dockerhub corresponding to that version. For example, if the version in `lerna.json` is `1.0.0`, it will (a rough sketch of this flow follows the list):
- Pull the images for all budibase services tagged `v1.0.0` from dockerhub
- Tag these images as `latest`
- Push them back to dockerhub. This means anyone who pulls `latest` (self-hosters using docker-compose) will get the latest version.
- Build and release the budibase helm chart for kubernetes users
- Perform a github release with the latest version. Previous releases can be seen at https://github.com/Budibase/budibase/releases
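A minimal sketch of that pull/retag/push flow, in Node for illustration only — the real job uses docker CLI steps inside the workflow, and the image names below are assumptions, not confirmed by this diff:
```js
const { execSync } = require("child_process")
const { version } = require("./lerna.json") // e.g. "1.0.0"

// Hypothetical service image names; the authoritative list lives in the workflow
const images = ["budibase/apps", "budibase/worker"]

for (const image of images) {
  execSync(`docker pull ${image}:v${version}`) // pull the versioned release image
  execSync(`docker tag ${image}:v${version} ${image}:latest`) // retag it as latest
  execSync(`docker push ${image}:latest`) // push latest back to dockerhub
}
```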
### Cloud Deploy (deploy-cloud.yml)
Triggers:
- Manual Workflow Dispatch Trigger
This job is responsible for deploying to our production cloud Kubernetes environment. You must run the release job first, to ensure that the latest images have been built and pushed to docker hub. You can also manually enter a version number for this job, so you can perform rollbacks or upgrade to a specific version. After kicking off this job, the following will occur:
- Checks out the master branch
- Pulls the latest `values.yaml` from budibase-infra, a private repo containing Budibase's infrastructure configuration
- Gets the latest budibase version from `lerna.json`, if it hasn't been specified in the workflow when you kicked it off
- Configures AWS Credentials
- Deploys the helm chart in the budibase repo to our production EKS cluster, injecting the `values.yaml` we pulled from budibase-infra
- Fires off a discord webhook in the #infra channel to show that the deployment completed successfully.
## Common Workflows
### Deploy Changes to Production (Release)
- Merge `develop` into `master`
- Wait for budibase CI job and release job to run
- Run cloud deploy job
- Run release selfhost job
### Deploy Changes to Production (Hotfix)
- Branch off `master`
- Perform your hotfix
- Merge back into `master`
- Wait for budibase CI job and release job to run
- Run cloud deploy job
- Run release selfhost job
### Rollback A Bad Cloud Deployment
- Kick off cloud deploy job
- Ensure you are running off master
- Enter the version number of the last known good version of budibase. For example `1.0.0`

View file

@ -41,4 +41,6 @@ jobs:
files: ./packages/server/coverage/clover.xml
name: codecov-umbrella
verbose: true
# TODO: parallelise this
- run: yarn test:e2e:ci

View file

@ -12,6 +12,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Fail if branch is not master
if: github.ref != 'refs/heads/master'
run: |
echo "Ref is not master, you must run this job from master."
exit 1
- uses: actions/checkout@v2
- name: Pull values.yaml from budibase-infra

View file

@ -23,16 +23,19 @@ jobs:
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: eu-west-1
- name: 'Get Previous tag'
id: previoustag
uses: "WyriHaximus/github-action-get-previous-tag@v1"
- name: Get the latest budibase release version
id: version
run: |
release_version=$(cat lerna.json | jq -r '.version')
echo "RELEASE_VERSION=$release_version" >> $GITHUB_ENV
# - name: Pull values.yaml from budibase-infra
# run: |
# curl -H "Authorization: token ${{ secrets.GH_PERSONAL_TOKEN }}" \
# -H 'Accept: application/vnd.github.v3.raw' \
# -o values.preprod.yaml \
# -L https://api.github.com/repos/budibase/budibase-infra/contents/kubernetes/values.preprod.yaml
- name: Pull values.yaml from budibase-infra
run: |
curl -H "Authorization: token ${{ secrets.GH_PERSONAL_TOKEN }}" \
-H 'Accept: application/vnd.github.v3.raw' \
-o values.preprod.yaml \
-L https://api.github.com/repos/budibase/budibase-infra/contents/kubernetes/budibase-preprod/values.yaml
wc -l values.preprod.yaml
- name: Deploy to Preprod Environment
uses: deliverybot/helm@v1
@ -41,13 +44,17 @@ jobs:
namespace: budibase
chart: charts/budibase
token: ${{ github.token }}
helm: helm3
values: |
globals:
appVersion: ${{ steps.previoustag.outputs.tag }}
# value-files: >-
# [
# "charts/budibase/values.yaml"
# ]
appVersion: v${{ env.RELEASE_VERSION }}
ingress:
enabled: true
nginx: true
value-files: >-
[
"values.preprod.yaml"
]
env:
KUBECONFIG_FILE: '${{ secrets.PREPROD_KUBECONFIG }}'

View file

@ -4,6 +4,16 @@ on:
push:
branches:
- develop
paths:
- '.aws/**'
- '.github/**'
- 'charts/**'
- 'packages/**'
- 'scripts/**'
- 'package.json'
- 'yarn.lock'
env:
POSTHOG_TOKEN: ${{ secrets.POSTHOG_TOKEN }}

View file

@ -9,6 +9,7 @@ jobs:
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: 14.x
@ -42,6 +43,7 @@ jobs:
uses: azure/setup-helm@v1
id: helm-install
<<<<<<< HEAD
- name: Build CLI executables
run: |
pushd packages/cli
@ -68,6 +70,22 @@ jobs:
# github_token: ${{ secrets.GITHUB_TOKEN }}
# publish_dir: ./public
# full_commit_message: "Helm Release: ${{ env.RELEASE_VERSION }}"
=======
- name: Build and release helm chart
run: |
git config user.name "Budibase Helm Bot"
git config user.email "<>"
git pull
helm package charts/budibase
git checkout gh-pages
mv *.tgz docs
helm repo index docs
git add -A
git commit -m "Helm Release: ${{ env.RELEASE_VERSION }}"
git push
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
>>>>>>> 157877a60444104d17efedf937bc7d48fc3006c5
- name: Perform Github Release
uses: softprops/action-gh-release@v1

View file

@ -3,7 +3,17 @@ name: Budibase Release
on:
push:
branches:
- test
- master
paths:
- '.aws/**'
- '.github/**'
- 'charts/**'
- 'packages/**'
- 'scripts/**'
- 'package.json'
- 'yarn.lock'
env:
POSTHOG_TOKEN: ${{ secrets.POSTHOG_TOKEN }}
@ -56,34 +66,3 @@ jobs:
DOCKER_USER: ${{ secrets.DOCKER_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKER_API_KEY }}
BUDIBASE_RELEASE_VERSION: ${{ steps.previoustag.outputs.tag }}
# - name: Pull values.yaml from budibase-infra
# run: |
# curl -H "Authorization: token ${{ secrets.GH_PERSONAL_TOKEN }}" \
# -H 'Accept: application/vnd.github.v3.raw' \
# -o values.preprod.yaml \
# -L https://api.github.com/repos/budibase/budibase-infra/contents/kubernetes/values.preprod.yaml
# - name: Deploy to Preprod Environment
# uses: deliverybot/helm@v1
# with:
# release: budibase-preprod
# namespace: budibase
# chart: charts/budibase
# token: ${{ github.token }}
# values: |
# globals:
# appVersion: ${{ steps.previoustag.outputs.tag }}
# # value-files: >-
# # [
# # "charts/budibase/values.yaml"
# # ]
# env:
# KUBECONFIG_FILE: '${{ secrets.PREPROD_KUBECONFIG }}'
# - name: Discord Webhook Action
# uses: tsickert/discord-webhook@v4.0.0
# with:
# webhook-url: ${{ secrets.PROD_DEPLOY_WEBHOOK_URL }}
# content: "Preprod Deployment Complete: ${{ env.RELEASE_VERSION }} deployed to Budibase Pre-prod."
# embed-title: ${{ env.RELEASE_VERSION }}

View file

@ -0,0 +1,9 @@
dependencies:
- name: couchdb
repository: https://apache.github.io/couchdb-helm
version: 3.3.4
- name: ingress-nginx
repository: https://kubernetes.github.io/ingress-nginx
version: 4.0.13
digest: sha256:20892705c2d8e64c98257d181063a514ac55013e2b43399a6e54868a97f97845
generated: "2021-12-30T18:55:30.878411Z"

View file

@ -11,14 +11,14 @@ sources:
- https://github.com/Budibase/budibase
- https://budibase.com
type: application
version: 1.0.0
appVersion: 1.0.20
version: 0.2.5
appVersion: 1.0.25
dependencies:
- name: couchdb
version: 3.3.4
repository: https://apache.github.io/couchdb-helm
condition: services.couchdb.enabled
- name: ingress-nginx
version: 3.35.0
repository: https://github.com/kubernetes/ingress-nginx
version: 4.0.13
repository: https://kubernetes.github.io/ingress-nginx
condition: ingress.nginx

Binary file not shown.

View file

@ -9,12 +9,11 @@ metadata:
app.kubernetes.io/name: budibase-proxy
name: proxy-service
spec:
type: NodePort
ports:
- port: {{ .Values.services.proxy.port }}
- name: {{ .Values.services.proxy.port | quote }}
port: {{ .Values.services.proxy.port }}
targetPort: {{ .Values.services.proxy.port }}
protocol: TCP
selector:
app.kubernetes.io/name: budibase-proxy
status:
loadBalancer: {}
loadBalancer: {}

View file

@ -40,7 +40,7 @@ service:
port: 10000
ingress:
enabled: false
enabled: true
aws: false
nginx: true
certificateArn: ""
@ -302,4 +302,4 @@ couchdb:
initialDelaySeconds: 0
periodSeconds: 10
successThreshold: 1
timeoutSeconds: 1
timeoutSeconds: 1

View file

@ -1,5 +1,5 @@
{
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"npmClient": "yarn",
"packages": [
"packages/*"

View file

@ -0,0 +1 @@
module.exports = require("./src/auth")

View file

@ -0,0 +1 @@
module.exports = require("./src/middleware")

View file

@ -0,0 +1,4 @@
module.exports = {
...require("./src/objectStore"),
...require("./src/objectStore/utils"),
}

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/backend-core",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"description": "Budibase backend core libraries used in server and worker",
"main": "src/index.js",
"author": "Budibase",

View file

@ -0,0 +1,45 @@
const passport = require("koa-passport")
const LocalStrategy = require("passport-local").Strategy
const JwtStrategy = require("passport-jwt").Strategy
const { getGlobalDB } = require("./tenancy")
const {
jwt,
local,
authenticated,
google,
oidc,
auditLog,
tenancy,
appTenancy,
authError,
} = require("./middleware")
// Strategies
passport.use(new LocalStrategy(local.options, local.authenticate))
passport.use(new JwtStrategy(jwt.options, jwt.authenticate))
passport.serializeUser((user, done) => done(null, user))
passport.deserializeUser(async (user, done) => {
const db = getGlobalDB()
try {
// Avoid shadowing the callback's user argument when re-fetching from the DB
const dbUser = await db.get(user._id)
return done(null, dbUser)
} catch (err) {
console.error("User not found", err)
return done(null, false, { message: "User not found" })
}
})
module.exports = {
buildAuthMiddleware: authenticated,
passport,
google,
oidc,
jwt: require("jsonwebtoken"),
buildTenancyMiddleware: tenancy,
buildAppTenancyMiddleware: appTenancy,
auditLog,
authError,
}

View file

@ -1,71 +1,17 @@
const passport = require("koa-passport")
const LocalStrategy = require("passport-local").Strategy
const JwtStrategy = require("passport-jwt").Strategy
const { StaticDatabases } = require("./db/utils")
const { getGlobalDB } = require("./tenancy")
const {
jwt,
local,
authenticated,
google,
oidc,
auditLog,
tenancy,
appTenancy,
authError,
} = require("./middleware")
const { setDB } = require("./db")
const userCache = require("./cache/user")
// Strategies
passport.use(new LocalStrategy(local.options, local.authenticate))
passport.use(new JwtStrategy(jwt.options, jwt.authenticate))
passport.serializeUser((user, done) => done(null, user))
passport.deserializeUser(async (user, done) => {
const db = getGlobalDB()
try {
const user = await db.get(user._id)
return done(null, user)
} catch (err) {
console.error("User not found", err)
return done(null, false, { message: "User not found" })
}
})
module.exports = {
init(pouch) {
setDB(pouch)
},
db: require("./db/utils"),
redis: {
Client: require("./redis"),
utils: require("./redis/utils"),
},
objectStore: {
...require("./objectStore"),
...require("./objectStore/utils"),
},
utils: {
...require("./utils"),
...require("./hashing"),
},
auth: {
buildAuthMiddleware: authenticated,
passport,
google,
oidc,
jwt: require("jsonwebtoken"),
buildTenancyMiddleware: tenancy,
buildAppTenancyMiddleware: appTenancy,
auditLog,
authError,
},
cache: {
user: userCache,
},
StaticDatabases,
constants: require("./constants"),
// Default exports from the library. Ideally these shouldn't be used;
// prefer the scoped syntax require("@budibase/backend-core/db") instead
StaticDatabases: require("./db/utils").StaticDatabases,
db: require("../db"),
redis: require("../redis"),
objectStore: require("../objectStore"),
utils: require("../utils"),
cache: require("../cache"),
auth: require("../auth"),
constants: require("../constants"),
}
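Together with the new `auth`, `middleware`, `objectStore` and `utils` entry files added above, this means consumers should import scoped modules rather than the monolithic default export. A short sketch of the intended usage (assuming the package exposes these paths, as the comment above suggests):
```js
// Preferred: scoped entry points
const auth = require("@budibase/backend-core/auth")
const db = require("@budibase/backend-core/db")

// The auth module added in this diff exposes buildAuthMiddleware,
// passport, authError, etc.
console.log(typeof auth.buildAuthMiddleware) // "function"
```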

View file

@ -0,0 +1,4 @@
module.exports = {
...require("./src/utils"),
...require("./src/hashing"),
}

View file

@ -1,7 +1,7 @@
{
"name": "@budibase/bbui",
"description": "A UI solution used in the different Budibase projects.",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"license": "MPL-2.0",
"svelte": "src/index.js",
"module": "dist/bbui.es.js",

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/builder",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"license": "GPL-3.0",
"private": true,
"scripts": {
@ -65,10 +65,10 @@
}
},
"dependencies": {
"@budibase/bbui": "^1.0.27-alpha.12",
"@budibase/client": "^1.0.27-alpha.12",
"@budibase/bbui": "^1.0.27-alpha.13",
"@budibase/client": "^1.0.27-alpha.13",
"@budibase/colorpicker": "1.1.2",
"@budibase/string-templates": "^1.0.27-alpha.12",
"@budibase/string-templates": "^1.0.27-alpha.13",
"@sentry/browser": "5.19.1",
"@spectrum-css/page": "^3.0.1",
"@spectrum-css/vars": "^3.0.1",

View file

@ -133,5 +133,6 @@
.iconText {
margin-top: 1px;
font-size: var(--spectrum-global-dimension-font-size-50);
flex: 0 0 34px;
}
</style>

View file

@ -31,7 +31,10 @@
export let menuItems
export let showMenu = false
let fields = Object.entries(object).map(([name, value]) => ({ name, value }))
let fields = Object.entries(object || {}).map(([name, value]) => ({
name,
value,
}))
let fieldActivity = buildFieldActivity(activity)
$: object = fields.reduce(

View file

@ -219,3 +219,13 @@ export const RestBodyTypes = [
{ name: "raw (XML)", value: "xml" },
{ name: "raw (Text)", value: "text" },
]
export const PaginationTypes = [
{ label: "Page number based", value: "page" },
{ label: "Cursor based", value: "cursor" },
]
export const PaginationLocations = [
{ label: "Query parameters", value: "query" },
{ label: "Request body", value: "body" },
]

View file

@ -84,7 +84,7 @@ export function customQueryIconText(datasource, query) {
case "read":
return "GET"
case "delete":
return "DELETE"
return "DEL"
case "patch":
return "PATCH"
}

View file

@ -1,5 +1,7 @@
// Do not use any aliased imports in common files, as these will be bundled
// by multiple bundlers which may not be able to resolve them
// by multiple bundlers which may not be able to resolve them.
// This will eventually be replaced by the new client implementation when we
// add a core package.
import { writable, derived, get } from "svelte/store"
import * as API from "../builderStore/api"
import { buildLuceneQuery } from "./lucene"

View file

@ -122,12 +122,16 @@ export const luceneQuery = (docs, query) => {
// Process a string match (fails if the value does not start with the string)
const stringMatch = match("string", (docValue, testValue) => {
return !docValue || !docValue.startsWith(testValue)
return (
!docValue || !docValue?.toLowerCase().startsWith(testValue?.toLowerCase())
)
})
// Process a fuzzy match (treat the same as starts with when running locally)
const fuzzyMatch = match("fuzzy", (docValue, testValue) => {
return !docValue || !docValue.startsWith(testValue)
return (
!docValue || !docValue?.toLowerCase().startsWith(testValue?.toLowerCase())
)
})
// Process a range match

View file

@ -30,6 +30,8 @@
import {
RestBodyTypes as bodyTypes,
SchemaTypeOptions,
PaginationLocations,
PaginationTypes,
} from "constants/backend"
import JSONPreview from "components/integration/JSONPreview.svelte"
import AccessLevelSelect from "components/integration/AccessLevelSelect.svelte"
@ -269,6 +271,9 @@
query.fields.bodyType = RawRestBodyTypes.NONE
}
}
if (query && !query.fields.pagination) {
query.fields.pagination = {}
}
dynamicVariables = getDynamicVariables(datasource, query._id)
})
</script>
@ -343,6 +348,42 @@
/>
<RestBodyInput bind:bodyType={query.fields.bodyType} bind:query />
</Tab>
<Tab title="Pagination">
<div class="pagination">
<Select
label="Pagination type"
bind:value={query.fields.pagination.type}
options={PaginationTypes}
placeholder="None"
/>
{#if query.fields.pagination.type}
<Select
label="Pagination parameters location"
bind:value={query.fields.pagination.location}
options={PaginationLocations}
placeholder="Choose where to send pagination parameters"
/>
<Input
label={query.fields.pagination.type === "page"
? "Page number parameter name "
: "Request cursor parameter name"}
bind:value={query.fields.pagination.pageParam}
/>
<Input
label={query.fields.pagination.type === "page"
? "Page size parameter name"
: "Request limit parameter name"}
bind:value={query.fields.pagination.sizeParam}
/>
{#if query.fields.pagination.type === "cursor"}
<Input
label="Response body parameter name for cursor"
bind:value={query.fields.pagination.responseParam}
/>
{/if}
{/if}
</div>
</Tab>
<Tab title="Transformer">
<Layout noPadding>
{#if !$flags.queryTransformerBanner}
@ -564,4 +605,9 @@
.auth-select {
width: 200px;
}
.pagination {
display: grid;
grid-template-columns: 1fr 1fr;
gap: var(--spacing-m);
}
</style>

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/cli",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"description": "Budibase CLI, for developers, self hosting and migrations.",
"main": "src/index.js",
"bin": {

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/client",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"license": "MPL-2.0",
"module": "dist/budibase-client.js",
"main": "dist/budibase-client.js",
@ -19,9 +19,9 @@
"dev:builder": "rollup -cw"
},
"dependencies": {
"@budibase/bbui": "^1.0.27-alpha.12",
"@budibase/bbui": "^1.0.27-alpha.13",
"@budibase/standard-components": "^0.9.139",
"@budibase/string-templates": "^1.0.27-alpha.12",
"@budibase/string-templates": "^1.0.27-alpha.13",
"regexparam": "^1.3.0",
"shortid": "^2.2.15",
"svelte-spa-router": "^3.0.5"

View file

@ -1,133 +0,0 @@
import { cloneDeep } from "lodash/fp"
import { fetchTableData, fetchTableDefinition } from "./tables"
import { fetchViewData } from "./views"
import { fetchRelationshipData } from "./relationships"
import { FieldTypes } from "../constants"
import { executeQuery, fetchQueryDefinition } from "./queries"
import {
convertJSONSchemaToTableSchema,
getJSONArrayDatasourceSchema,
} from "builder/src/builderStore/jsonUtils"
/**
* Fetches all rows for a particular Budibase data source.
*/
export const fetchDatasource = async dataSource => {
if (!dataSource || !dataSource.type) {
return []
}
// Fetch all rows in data source
const { type, tableId, fieldName } = dataSource
let rows = [],
info = {}
if (type === "table") {
rows = await fetchTableData(tableId)
} else if (type === "view") {
rows = await fetchViewData(dataSource)
} else if (type === "query") {
// Set the default query params
let parameters = cloneDeep(dataSource.queryParams || {})
for (let param of dataSource.parameters) {
if (!parameters[param.name]) {
parameters[param.name] = param.default
}
}
const { data, ...rest } = await executeQuery({
queryId: dataSource._id,
parameters,
})
info = rest
rows = data
} else if (type === FieldTypes.LINK) {
rows = await fetchRelationshipData({
rowId: dataSource.rowId,
tableId: dataSource.rowTableId,
fieldName,
})
}
// Ensure the result is always an array
return { rows: Array.isArray(rows) ? rows : [], info }
}
/**
* Fetches the schema of any kind of datasource.
*/
export const fetchDatasourceSchema = async dataSource => {
if (!dataSource) {
return null
}
const { type } = dataSource
let schema
// Nested providers should already have exposed their own schema
if (type === "provider") {
schema = dataSource.value?.schema
}
// Field sources have their schema statically defined
if (type === "field") {
if (dataSource.fieldType === "attachment") {
schema = {
url: {
type: "string",
},
name: {
type: "string",
},
}
} else if (dataSource.fieldType === "array") {
schema = {
value: {
type: "string",
},
}
}
}
// JSON arrays need their table definitions fetched.
// We can then extract their schema as a subset of the table schema.
if (type === "jsonarray") {
const table = await fetchTableDefinition(dataSource.tableId)
schema = getJSONArrayDatasourceSchema(table?.schema, dataSource)
}
// Tables, views and links can be fetched by table ID
if (
(type === "table" || type === "view" || type === "link") &&
dataSource.tableId
) {
const table = await fetchTableDefinition(dataSource.tableId)
schema = table?.schema
}
// Queries can be fetched by query ID
if (type === "query" && dataSource._id) {
const definition = await fetchQueryDefinition(dataSource._id)
schema = definition?.schema
}
// Sanity check
if (!schema) {
return null
}
// Check for any JSON fields so we can add any top level properties
let jsonAdditions = {}
Object.keys(schema).forEach(fieldKey => {
const fieldSchema = schema[fieldKey]
if (fieldSchema?.type === "json") {
const jsonSchema = convertJSONSchemaToTableSchema(fieldSchema, {
squashObjects: true,
})
Object.keys(jsonSchema).forEach(jsonKey => {
jsonAdditions[`${fieldKey}.${jsonKey}`] = {
type: jsonSchema[jsonKey].type,
nestedJSON: true,
}
})
}
})
return { ...schema, ...jsonAdditions }
}

View file

@ -1,6 +1,5 @@
export * from "./rows"
export * from "./auth"
export * from "./datasources"
export * from "./tables"
export * from "./attachments"
export * from "./views"

View file

@ -4,7 +4,7 @@ import API from "./api"
/**
* Executes a query against an external data connector.
*/
export const executeQuery = async ({ queryId, parameters }) => {
export const executeQuery = async ({ queryId, pagination, parameters }) => {
const query = await fetchQueryDefinition(queryId)
if (query?.datasourceId == null) {
notificationStore.actions.error("That query couldn't be found")
@ -14,6 +14,7 @@ export const executeQuery = async ({ queryId, parameters }) => {
url: `/api/v2/queries/${queryId}`,
body: {
parameters,
pagination,
},
})
if (res.error) {
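For reference, a sketch of how a caller might use the new option (the query ID and parameter values are hypothetical; the `pagination` shape matches what QueryFetch sends further down in this diff):
```js
const res = await executeQuery({
  queryId: "query_abc123", // hypothetical query ID
  parameters: { status: "open" }, // hypothetical bound query parameters
  pagination: { page: 1, limit: 10 }, // page/limit shape used by QueryFetch
})
```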

View file

@ -19,6 +19,16 @@
export let isScreen = false
export let isBlock = false
// Ref to the svelte component
let ref
// Initial settings are passed in on first render of the component.
// When cachedSettings is first set, this object is set to
// reference cachedSettings, so that mutations to cachedSettings also affect
// initialSettings, but it does not get caught by svelte invalidation - which
// would happen if we spread cachedSettings directly to the component.
let initialSettings
// Component settings are the un-enriched settings for this component that
// need to be enriched at this level.
// Nested settings are the un-enriched block settings that are to be passed on
@ -267,16 +277,26 @@
const cacheSettings = (enriched, nested, conditional) => {
const allSettings = { ...enriched, ...nested, ...conditional }
if (!cachedSettings) {
cachedSettings = allSettings
cachedSettings = { ...allSettings }
initialSettings = cachedSettings
} else {
Object.keys(allSettings).forEach(key => {
if (!propsAreSame(allSettings[key], cachedSettings[key])) {
const same = propsAreSame(allSettings[key], cachedSettings[key])
if (!same) {
cachedSettings[key] = allSettings[key]
assignSetting(key, allSettings[key])
}
})
}
}
// Assigns a certain setting to this component.
// We manually use the svelte $set function to avoid triggering additional
// reactive statements.
const assignSetting = (key, value) => {
ref?.$$set?.({ [key]: value })
}
// Generates a key used to determine when components need to fully remount.
// Currently only toggling editing requires remounting.
const getRenderKey = (id, editing) => {
@ -299,7 +319,7 @@
data-id={id}
data-name={name}
>
<svelte:component this={constructor} {...cachedSettings}>
<svelte:component this={constructor} bind:this={ref} {...initialSettings}>
{#if children.length}
{#each children as child (child._id)}
<svelte:self instance={child} />

View file

@ -1,13 +1,9 @@
<script>
import { getContext } from "svelte"
import { ProgressCircle, Pagination } from "@budibase/bbui"
import {
buildLuceneQuery,
luceneQuery,
luceneSort,
luceneLimit,
} from "builder/src/helpers/lucene"
import Placeholder from "./Placeholder.svelte"
import { fetchData } from "utils/fetch/fetchData.js"
import { buildLuceneQuery } from "builder/src/helpers/lucene"
export let dataSource
export let filter
@ -16,85 +12,30 @@
export let limit
export let paginate
const { API, styleable, Provider, ActionTypes } = getContext("sdk")
const { styleable, Provider, ActionTypes } = getContext("sdk")
const component = getContext("component")
// Loading flag every time data is being fetched
let loading = false
// Loading flag for the initial load
// Mark as loaded if we have no datasource so we don't stall forever
let loaded = !dataSource
let schemaLoaded = false
// Provider state
let rows = []
let allRows = []
let info = {}
let schema = {}
let bookmarks = [null]
let pageNumber = 0
let query = null
// We need to manage our lucene query manually as we want to allow components
// to extend it
let queryExtensions = {}
// Sorting can be overridden at run time, so we can't use the prop directly
let currentSortColumn = sortColumn
let currentSortOrder = sortOrder
// Reset the current sort state to props if props change
$: currentSortColumn = sortColumn
$: currentSortOrder = sortOrder
$: defaultQuery = buildLuceneQuery(filter)
$: extendQuery(defaultQuery, queryExtensions)
$: internalTable = dataSource?.type === "table"
$: nestedProvider = dataSource?.type === "provider"
$: hasNextPage = bookmarks[pageNumber + 1] != null
$: hasPrevPage = pageNumber > 0
$: getSchema(dataSource)
$: sortType = getSortType(schema, currentSortColumn)
$: query = extendQuery(defaultQuery, queryExtensions)
// Wait until schema loads before loading data, so that we can determine
// the correct sort type first time
$: {
if (schemaLoaded) {
fetchData(
dataSource,
schema,
query,
limit,
currentSortColumn,
currentSortOrder,
sortType,
paginate
)
}
}
// Reactively filter and sort rows if required
$: {
if (internalTable) {
// Internal tables are already processed server-side
rows = allRows
} else {
// For anything else we use client-side implementations to filter, sort
// and limit
const filtered = luceneQuery(allRows, query)
const sorted = luceneSort(
filtered,
currentSortColumn,
currentSortOrder,
sortType
)
rows = luceneLimit(sorted, limit)
}
}
// Keep our data fetch instance up to date
$: fetch = createFetch(dataSource)
$: fetch.update({
query,
sortColumn,
sortOrder,
limit,
paginate,
})
// Build our action context
$: actions = [
{
type: ActionTypes.RefreshDatasource,
callback: () => refresh(),
callback: () => fetch.refresh(),
metadata: { dataSource },
},
{
@ -108,11 +49,15 @@
{
type: ActionTypes.SetDataProviderSorting,
callback: ({ column, order }) => {
let newOptions = {}
if (column) {
currentSortColumn = column
newOptions.sortColumn = column
}
if (order) {
currentSortOrder = order
newOptions.sortOrder = order
}
if (Object.keys(newOptions)?.length) {
fetch.update(newOptions)
}
},
},
@ -120,166 +65,30 @@
// Build our data context
$: dataContext = {
rows,
info,
schema,
rowsLength: rows?.length,
rows: $fetch.rows,
info: $fetch.info,
schema: $fetch.schema,
rowsLength: $fetch.rows.length,
// Undocumented properties. These aren't supposed to be used in builder
// bindings, but are used internally by other components
id: $component?.id,
state: {
query,
sortColumn: currentSortColumn,
sortOrder: currentSortOrder,
query: $fetch.query,
sortColumn: $fetch.sortColumn,
sortOrder: $fetch.sortOrder,
},
loaded,
loaded: $fetch.loaded,
}
const getSortType = (schema, sortColumn) => {
if (!schema || !sortColumn || !schema[sortColumn]) {
return "string"
}
const type = schema?.[sortColumn]?.type
return type === "number" ? "number" : "string"
}
const refresh = async () => {
if (schemaLoaded && !nestedProvider) {
fetchData(
dataSource,
schema,
query,
limit,
currentSortColumn,
currentSortOrder,
sortType,
paginate
)
}
}
const fetchData = async (
dataSource,
schema,
query,
limit,
sortColumn,
sortOrder,
sortType,
paginate
) => {
loading = true
if (dataSource?.type === "table") {
// Sanity check sort column, as using a non-existent column will prevent
// results coming back at all
const sort = schema?.[sortColumn] ? sortColumn : undefined
// For internal tables we use server-side processing
const res = await API.searchTable({
tableId: dataSource.tableId,
query,
limit,
sort,
sortOrder: sortOrder?.toLowerCase() ?? "ascending",
sortType,
paginate,
})
pageNumber = 0
allRows = res.rows
if (res.hasNextPage) {
bookmarks = [null, res.bookmark]
} else {
bookmarks = [null]
}
} else if (dataSource?.type === "provider") {
// For providers referencing another provider, just use the rows it
// provides
allRows = dataSource?.value?.rows || []
} else if (
dataSource?.type === "field" ||
dataSource?.type === "jsonarray"
) {
// These sources will be available directly from context.
// Enrich non object elements into objects to ensure a valid schema.
const data = dataSource?.value || []
if (Array.isArray(data) && data[0] && typeof data[0] !== "object") {
allRows = data.map(value => ({ value }))
} else {
allRows = data
}
} else {
// For other data sources like queries or views, fetch all rows from the
// server
const data = await API.fetchDatasource(dataSource)
allRows = data.rows
info = data.info
}
loading = false
loaded = true
}
const getSchema = async dataSource => {
let newSchema = (await API.fetchDatasourceSchema(dataSource)) || {}
// Ensure there are "name" properties for all fields and that field schema
// are objects
Object.entries(newSchema).forEach(([fieldName, fieldSchema]) => {
if (typeof fieldSchema === "string") {
newSchema[fieldName] = {
type: fieldSchema,
name: fieldName,
}
} else {
newSchema[fieldName] = {
...fieldSchema,
name: fieldName,
}
}
})
schema = newSchema
schemaLoaded = true
}
const nextPage = async () => {
if (!hasNextPage || !internalTable) {
return
}
const sort = schema?.[currentSortColumn] ? currentSortColumn : undefined
const res = await API.searchTable({
tableId: dataSource?.tableId,
const createFetch = datasource => {
return fetchData(datasource, {
query,
bookmark: bookmarks[pageNumber + 1],
sortColumn,
sortOrder,
limit,
sort,
sortOrder: currentSortOrder?.toLowerCase() ?? "ascending",
sortType,
paginate: true,
paginate,
})
pageNumber++
allRows = res.rows
if (res.hasNextPage) {
bookmarks[pageNumber + 1] = res.bookmark
}
}
const prevPage = async () => {
if (!hasPrevPage || !internalTable) {
return
}
const sort = schema?.[currentSortColumn] ? currentSortColumn : undefined
const res = await API.searchTable({
tableId: dataSource?.tableId,
query,
bookmark: bookmarks[pageNumber - 1],
limit,
sort,
sortOrder: currentSortOrder?.toLowerCase() ?? "ascending",
sortType,
paginate: true,
})
pageNumber--
allRows = res.rows
}
const addQueryExtension = (key, extension) => {
@ -309,16 +118,13 @@
}
})
})
if (JSON.stringify(query) !== JSON.stringify(extendedQuery)) {
query = extendedQuery
}
return extendedQuery
}
</script>
<div use:styleable={$component.styles} class="container">
<Provider {actions} data={dataContext}>
{#if !loaded}
{#if !$fetch.loaded}
<div class="loading">
<ProgressCircle />
</div>
@ -328,14 +134,14 @@
{:else}
<slot />
{/if}
{#if paginate && internalTable}
{#if paginate && $fetch.supportsPagination}
<div class="pagination">
<Pagination
page={pageNumber + 1}
{hasPrevPage}
{hasNextPage}
goToPrevPage={prevPage}
goToNextPage={nextPage}
page={$fetch.pageNumber + 1}
hasPrevPage={$fetch.hasPrevPage}
hasNextPage={$fetch.hasNextPage}
goToPrevPage={fetch.prevPage}
goToNextPage={fetch.nextPage}
/>
</div>
{/if}

View file

@ -30,7 +30,7 @@
export let cardButtonOnClick
export let linkColumn
const { API, styleable } = getContext("sdk")
const { fetchDatasourceSchema, styleable } = getContext("sdk")
const context = getContext("context")
const component = getContext("component")
const schemaComponentMap = {
@ -45,6 +45,7 @@
let dataProviderId
let repeaterId
let schema
let schemaLoaded = false
$: fetchSchema(dataSource)
$: enrichedSearchColumns = enrichSearchColumns(searchColumns, schema)
@ -111,103 +112,106 @@
// Load the datasource schema so we can determine column types
const fetchSchema = async dataSource => {
if (dataSource) {
schema = await API.fetchDatasourceSchema(dataSource)
schema = await fetchDatasourceSchema(dataSource)
}
schemaLoaded = true
}
</script>
<Block>
<div class="card-list" use:styleable={$component.styles}>
<BlockComponent type="form" bind:id={formId} props={{ dataSource }}>
{#if title || enrichedSearchColumns?.length || showTitleButton}
<div class="header" class:mobile={$context.device.mobile}>
<div class="title">
<Heading>{title || ""}</Heading>
{#if schemaLoaded}
<Block>
<div class="card-list" use:styleable={$component.styles}>
<BlockComponent type="form" bind:id={formId} props={{ dataSource }}>
{#if title || enrichedSearchColumns?.length || showTitleButton}
<div class="header" class:mobile={$context.device.mobile}>
<div class="title">
<Heading>{title || ""}</Heading>
</div>
<div class="controls">
{#if enrichedSearchColumns?.length}
<div
class="search"
style="--cols:{enrichedSearchColumns?.length}"
>
{#each enrichedSearchColumns as column}
<BlockComponent
type={column.componentType}
props={{
field: column.name,
placeholder: column.name,
text: column.name,
autoWidth: true,
}}
/>
{/each}
</div>
{/if}
{#if showTitleButton}
<BlockComponent
type="button"
props={{
onClick: titleButtonAction,
text: titleButtonText,
type: "cta",
}}
/>
{/if}
</div>
</div>
<div class="controls">
{#if enrichedSearchColumns?.length}
<div
class="search"
style="--cols:{enrichedSearchColumns?.length}"
>
{#each enrichedSearchColumns as column}
<BlockComponent
type={column.componentType}
props={{
field: column.name,
placeholder: column.name,
text: column.name,
autoWidth: true,
}}
/>
{/each}
</div>
{/if}
{#if showTitleButton}
<BlockComponent
type="button"
props={{
onClick: titleButtonAction,
text: titleButtonText,
type: "cta",
}}
/>
{/if}
</div>
</div>
{/if}
<BlockComponent
type="dataprovider"
bind:id={dataProviderId}
props={{
dataSource,
filter: enrichedFilter,
sortColumn,
sortOrder,
paginate,
limit,
}}
>
{/if}
<BlockComponent
type="repeater"
bind:id={repeaterId}
context="repeater"
type="dataprovider"
bind:id={dataProviderId}
props={{
dataProvider: `{{ literal ${safe(dataProviderId)} }}`,
direction: "row",
hAlign: "stretch",
vAlign: "top",
gap: "M",
noRowsMessage: "No rows found",
}}
styles={{
display: "grid",
"grid-template-columns": `repeat(auto-fill, minmax(min(${cardWidth}px, 100%), 1fr))`,
dataSource,
filter: enrichedFilter,
sortColumn,
sortOrder,
paginate,
limit,
}}
>
<BlockComponent
type="spectrumcard"
type="repeater"
bind:id={repeaterId}
context="repeater"
props={{
title: cardTitle,
subtitle: cardSubtitle,
description: cardDescription,
imageURL: cardImageURL,
horizontal: cardHorizontal,
showButton: showCardButton,
buttonText: cardButtonText,
buttonOnClick: cardButtonOnClick,
linkURL: fullCardURL,
linkPeek: cardPeek,
dataProvider: `{{ literal ${safe(dataProviderId)} }}`,
direction: "row",
hAlign: "stretch",
vAlign: "top",
gap: "M",
noRowsMessage: "No rows found",
}}
styles={{
width: "auto",
display: "grid",
"grid-template-columns": `repeat(auto-fill, minmax(min(${cardWidth}px, 100%), 1fr))`,
}}
/>
>
<BlockComponent
type="spectrumcard"
props={{
title: cardTitle,
subtitle: cardSubtitle,
description: cardDescription,
imageURL: cardImageURL,
horizontal: cardHorizontal,
showButton: showCardButton,
buttonText: cardButtonText,
buttonOnClick: cardButtonOnClick,
linkURL: fullCardURL,
linkPeek: cardPeek,
}}
styles={{
width: "auto",
}}
/>
</BlockComponent>
</BlockComponent>
</BlockComponent>
</BlockComponent>
</div>
</Block>
</div>
</Block>
{/if}
<style>
.header {

View file

@ -26,7 +26,7 @@
export let titleButtonURL
export let titleButtonPeek
const { API, styleable } = getContext("sdk")
const { fetchDatasourceSchema, styleable } = getContext("sdk")
const context = getContext("context")
const component = getContext("component")
const schemaComponentMap = {
@ -40,6 +40,7 @@
let formId
let dataProviderId
let schema
let schemaLoaded = false
$: fetchSchema(dataSource)
$: enrichedSearchColumns = enrichSearchColumns(searchColumns, schema)
@ -89,82 +90,85 @@
// Load the datasource schema so we can determine column types
const fetchSchema = async dataSource => {
if (dataSource) {
schema = await API.fetchDatasourceSchema(dataSource)
schema = await fetchDatasourceSchema(dataSource)
}
schemaLoaded = true
}
</script>
<Block>
<div class={size} use:styleable={$component.styles}>
<BlockComponent type="form" bind:id={formId} props={{ dataSource }}>
{#if title || enrichedSearchColumns?.length || showTitleButton}
<div class="header" class:mobile={$context.device.mobile}>
<div class="title">
<Heading>{title || ""}</Heading>
{#if schemaLoaded}
<Block>
<div class={size} use:styleable={$component.styles}>
<BlockComponent type="form" bind:id={formId} props={{ dataSource }}>
{#if title || enrichedSearchColumns?.length || showTitleButton}
<div class="header" class:mobile={$context.device.mobile}>
<div class="title">
<Heading>{title || ""}</Heading>
</div>
<div class="controls">
{#if enrichedSearchColumns?.length}
<div
class="search"
style="--cols:{enrichedSearchColumns?.length}"
>
{#each enrichedSearchColumns as column}
<BlockComponent
type={column.componentType}
props={{
field: column.name,
placeholder: column.name,
text: column.name,
autoWidth: true,
}}
/>
{/each}
</div>
{/if}
{#if showTitleButton}
<BlockComponent
type="button"
props={{
onClick: titleButtonAction,
text: titleButtonText,
type: "cta",
}}
/>
{/if}
</div>
</div>
<div class="controls">
{#if enrichedSearchColumns?.length}
<div
class="search"
style="--cols:{enrichedSearchColumns?.length}"
>
{#each enrichedSearchColumns as column}
<BlockComponent
type={column.componentType}
props={{
field: column.name,
placeholder: column.name,
text: column.name,
autoWidth: true,
}}
/>
{/each}
</div>
{/if}
{#if showTitleButton}
<BlockComponent
type="button"
props={{
onClick: titleButtonAction,
text: titleButtonText,
type: "cta",
}}
/>
{/if}
</div>
</div>
{/if}
<BlockComponent
type="dataprovider"
bind:id={dataProviderId}
props={{
dataSource,
filter: enrichedFilter,
sortColumn,
sortOrder,
paginate,
limit: rowCount,
}}
>
{/if}
<BlockComponent
type="table"
type="dataprovider"
bind:id={dataProviderId}
props={{
dataProvider: `{{ literal ${safe(dataProviderId)} }}`,
columns: tableColumns,
showAutoColumns,
rowCount,
quiet,
size,
linkRows,
linkURL,
linkColumn,
linkPeek,
dataSource,
filter: enrichedFilter,
sortColumn,
sortOrder,
paginate,
limit: rowCount,
}}
/>
>
<BlockComponent
type="table"
props={{
dataProvider: `{{ literal ${safe(dataProviderId)} }}`,
columns: tableColumns,
showAutoColumns,
rowCount,
quiet,
size,
linkRows,
linkURL,
linkColumn,
linkPeek,
}}
/>
</BlockComponent>
</BlockComponent>
</BlockComponent>
</div>
</Block>
</div>
</Block>
{/if}
<style>
.header {

View file

@ -10,7 +10,7 @@
export let actionType = "Create"
const context = getContext("context")
const { API } = getContext("sdk")
const { API, fetchDatasourceSchema } = getContext("sdk")
let loaded = false
let schema
@ -61,7 +61,7 @@
// For all other cases, just grab the normal schema
else {
const dataSourceSchema = await API.fetchDatasourceSchema(dataSource)
const dataSourceSchema = await fetchDatasourceSchema(dataSource)
schema = dataSourceSchema || {}
}

View file

@ -9,6 +9,7 @@ import {
import { styleable } from "utils/styleable"
import { linkable } from "utils/linkable"
import { getAction } from "utils/getAction"
import { fetchDatasourceSchema } from "utils/schema.js"
import Provider from "components/context/Provider.svelte"
import { ActionTypes } from "constants"
@ -22,6 +23,7 @@ export default {
styleable,
linkable,
getAction,
fetchDatasourceSchema,
Provider,
ActionTypes,
}

View file

@ -0,0 +1,407 @@
import { writable, derived, get } from "svelte/store"
import {
buildLuceneQuery,
luceneLimit,
luceneQuery,
luceneSort,
} from "builder/src/helpers/lucene"
import { fetchTableDefinition } from "api"
/**
* Parent class which handles the implementation of fetching data from an
* internal table or datasource plus.
* For other types of datasource, this class is overridden and extended.
*/
export default class DataFetch {
// Feature flags
featureStore = writable({
supportsSearch: false,
supportsSort: false,
supportsPagination: false,
})
// Config
options = {
datasource: null,
limit: 10,
// Search config
filter: null,
query: null,
// Sorting config
sortColumn: null,
sortOrder: "ascending",
sortType: null,
// Pagination config
paginate: true,
}
// State of the fetch
store = writable({
rows: [],
info: null,
schema: null,
loading: false,
loaded: false,
query: null,
pageNumber: 0,
cursor: null,
cursors: [],
})
/**
* Constructs a new DataFetch instance.
* @param opts the fetch options
*/
constructor(opts) {
// Merge options with their default values
this.options = {
...this.options,
...opts,
}
// Bind all functions to properly scope "this"
this.getData = this.getData.bind(this)
this.getPage = this.getPage.bind(this)
this.getInitialData = this.getInitialData.bind(this)
this.determineFeatureFlags = this.determineFeatureFlags.bind(this)
this.enrichSchema = this.enrichSchema.bind(this)
this.refresh = this.refresh.bind(this)
this.update = this.update.bind(this)
this.hasNextPage = this.hasNextPage.bind(this)
this.hasPrevPage = this.hasPrevPage.bind(this)
this.nextPage = this.nextPage.bind(this)
this.prevPage = this.prevPage.bind(this)
// Derive certain properties to return
this.derivedStore = derived(
[this.store, this.featureStore],
([$store, $featureStore]) => {
return {
...$store,
...$featureStore,
hasNextPage: this.hasNextPage($store),
hasPrevPage: this.hasPrevPage($store),
}
}
)
// Mark as loaded if we have no datasource
if (!this.options.datasource) {
this.store.update($store => ({ ...$store, loaded: true }))
return
}
// Initially fetch data but don't bother waiting for the result
this.getInitialData()
}
/**
* Extend the svelte store subscribe method so that instances of this class
* can be treated like stores
*/
get subscribe() {
return this.derivedStore.subscribe
}
/**
* Fetches a fresh set of data from the server, resetting pagination
*/
async getInitialData() {
const { datasource, filter, sortColumn, paginate } = this.options
const tableId = datasource?.tableId
// Ensure table ID exists
if (!tableId) {
return
}
// Fetch datasource definition and determine feature flags
const definition = await this.constructor.getDefinition(datasource)
const features = this.determineFeatureFlags(definition)
this.featureStore.set({
supportsSearch: !!features?.supportsSearch,
supportsSort: !!features?.supportsSort,
supportsPagination: paginate && !!features?.supportsPagination,
})
// Fetch and enrich schema
let schema = this.constructor.getSchema(datasource, definition)
schema = this.enrichSchema(schema)
if (!schema) {
return
}
// Determine what sort type to use
if (!this.options.sortType) {
let sortType = "string"
if (sortColumn) {
const type = schema?.[sortColumn]?.type
sortType = type === "number" ? "number" : "string"
}
this.options.sortType = sortType
}
// Build the lucene query
let query = this.options.query
if (!query) {
query = buildLuceneQuery(filter)
}
// Update store
this.store.update($store => ({
...$store,
definition,
schema,
query,
loading: true,
}))
// Actually fetch data
const page = await this.getPage()
this.store.update($store => ({
...$store,
loading: false,
loaded: true,
pageNumber: 0,
rows: page.rows,
info: page.info,
cursors: paginate && page.hasNextPage ? [null, page.cursor] : [null],
}))
}
/**
* Fetches some filtered, sorted and paginated data
*/
async getPage() {
const { sortColumn, sortOrder, sortType, limit } = this.options
const { query } = get(this.store)
const features = get(this.featureStore)
// Get the actual data
let { rows, info, hasNextPage, cursor } = await this.getData()
// If we don't support searching, do a client search
if (!features.supportsSearch) {
rows = luceneQuery(rows, query)
}
// If we don't support sorting, do a client-side sort
if (!features.supportsSort) {
rows = luceneSort(rows, sortColumn, sortOrder, sortType)
}
// If we don't support pagination, do a client-side limit
if (!features.supportsPagination) {
rows = luceneLimit(rows, limit)
}
return {
rows,
info,
hasNextPage,
cursor,
}
}
/**
* Fetches a single page of data from the remote resource.
* Must be overridden by a datasource specific child class.
*/
async getData() {
return {
rows: [],
info: null,
hasNextPage: false,
cursor: null,
}
}
/**
* Gets the definition for this datasource.
* Defaults to fetching a table definition.
* @param datasource
* @return {object} the definition
*/
static async getDefinition(datasource) {
if (!datasource?.tableId) {
return null
}
return await fetchTableDefinition(datasource.tableId)
}
/**
* Gets the schema definition for a datasource.
* Defaults to getting the "schema" property of the definition.
* @param datasource the datasource
* @param definition the datasource definition
* @return {object} the schema
*/
static getSchema(datasource, definition) {
return definition?.schema
}
/**
* Enriches the schema and ensures that entries are objects with names
* @param schema the datasource schema
* @return {object} the enriched datasource schema
*/
enrichSchema(schema) {
if (schema == null) {
return null
}
let enrichedSchema = {}
Object.entries(schema).forEach(([fieldName, fieldSchema]) => {
if (typeof fieldSchema === "string") {
enrichedSchema[fieldName] = {
type: fieldSchema,
name: fieldName,
}
} else {
enrichedSchema[fieldName] = {
...fieldSchema,
name: fieldName,
}
}
})
return enrichedSchema
}
/**
* Determine the feature flag for this datasource definition
* @param definition
*/
// eslint-disable-next-line no-unused-vars
determineFeatureFlags(definition) {
return {
supportsSearch: false,
supportsSort: false,
supportsPagination: false,
}
}
/**
* Resets the data set and updates options
* @param newOptions any new options
*/
async update(newOptions) {
// Check if any settings have actually changed
let refresh = false
const entries = Object.entries(newOptions || {})
for (let [key, value] of entries) {
if (JSON.stringify(value) !== JSON.stringify(this.options[key])) {
refresh = true
break
}
}
if (!refresh) {
return
}
// Assign new options and reload data
this.options = {
...this.options,
...newOptions,
}
await this.getInitialData()
}
/**
* Loads the same page again
*/
async refresh() {
if (get(this.store).loading) {
return
}
this.store.update($store => ({ ...$store, loading: true }))
const { rows, info } = await this.getPage()
this.store.update($store => ({ ...$store, rows, info, loading: false }))
}
/**
* Determines whether there is a next page of data based on the state of the
* store
* @param state the current store state
* @return {boolean} whether there is a next page of data or not
*/
hasNextPage(state) {
return state.cursors[state.pageNumber + 1] != null
}
/**
* Determines whether there is a previous page of data based on the state of
* the store
* @param state the current store state
* @return {boolean} whether there is a previous page of data or not
*/
hasPrevPage(state) {
return state.pageNumber > 0
}
/**
* Fetches the next page of data
*/
async nextPage() {
const state = get(this.derivedStore)
if (state.loading || !this.options.paginate || !state.hasNextPage) {
return
}
// Fetch next page
const nextCursor = state.cursors[state.pageNumber + 1]
this.store.update($store => ({
...$store,
loading: true,
cursor: nextCursor,
pageNumber: $store.pageNumber + 1,
}))
const { rows, info, hasNextPage, cursor } = await this.getPage()
// Update state
this.store.update($store => {
let { cursors, pageNumber } = $store
if (hasNextPage) {
cursors[pageNumber + 1] = cursor
}
return {
...$store,
rows,
info,
cursors,
loading: false,
}
})
}
/**
* Fetches the previous page of data
*/
async prevPage() {
const state = get(this.derivedStore)
if (state.loading || !this.options.paginate || !state.hasPrevPage) {
return
}
// Fetch previous page
const prevCursor = state.cursors[state.pageNumber - 1]
this.store.update($store => ({
...$store,
loading: true,
cursor: prevCursor,
pageNumber: $store.pageNumber - 1,
}))
const { rows, info } = await this.getPage()
// Update state
this.store.update($store => {
return {
...$store,
rows,
info,
loading: false,
}
})
}
}

View file

@ -0,0 +1,44 @@
import DataFetch from "./DataFetch.js"
export default class FieldFetch extends DataFetch {
static async getDefinition(datasource) {
// Field sources have their schema statically defined
let schema
if (datasource.fieldType === "attachment") {
schema = {
url: {
type: "string",
},
name: {
type: "string",
},
}
} else if (datasource.fieldType === "array") {
schema = {
value: {
type: "string",
},
}
}
return { schema }
}
async getData() {
const { datasource } = this.options
// These sources will be available directly from context
const data = datasource?.value || []
let rows = []
if (Array.isArray(data) && data[0] && typeof data[0] !== "object") {
rows = data.map(value => ({ value }))
} else {
rows = data
}
return {
rows: rows || [],
hasNextPage: false,
cursor: null,
}
}
}

View file

@ -0,0 +1,13 @@
import FieldFetch from "./FieldFetch.js"
import { fetchTableDefinition } from "api"
import { getJSONArrayDatasourceSchema } from "builder/src/builderStore/jsonUtils"
export default class JSONArrayFetch extends FieldFetch {
static async getDefinition(datasource) {
// JSON arrays need their table definitions fetched.
// We can then extract their schema as a subset of the table schema.
const table = await fetchTableDefinition(datasource.tableId)
const schema = getJSONArrayDatasourceSchema(table?.schema, datasource)
return { schema }
}
}

View file

@ -0,0 +1,20 @@
import DataFetch from "./DataFetch.js"
export default class NestedProviderFetch extends DataFetch {
static async getDefinition(datasource) {
// Nested providers should already have exposed their own schema
return {
schema: datasource?.value?.schema,
}
}
async getData() {
const { datasource } = this.options
// Pull the rows from the existing data provider
return {
rows: datasource?.value?.rows || [],
hasNextPage: false,
cursor: null,
}
}
}

View file

@ -0,0 +1,68 @@
import DataFetch from "./DataFetch.js"
import { executeQuery, fetchQueryDefinition } from "api"
import { cloneDeep } from "lodash/fp"
import { get } from "svelte/store"
export default class QueryFetch extends DataFetch {
determineFeatureFlags(definition) {
const supportsPagination =
!!definition?.fields?.pagination?.type &&
!!definition?.fields?.pagination?.location &&
!!definition?.fields?.pagination?.pageParam
return { supportsPagination }
}
static async getDefinition(datasource) {
if (!datasource?._id) {
return null
}
return await fetchQueryDefinition(datasource._id)
}
async getData() {
const { datasource, limit, paginate } = this.options
const { supportsPagination } = get(this.featureStore)
const { cursor, definition } = get(this.store)
const type = definition?.fields?.pagination?.type
// Set the default query params
let parameters = cloneDeep(datasource?.queryParams || {})
// default to an empty array so the loop is safe when no parameters exist
for (let param of datasource?.parameters || []) {
if (!parameters[param.name]) {
parameters[param.name] = param.default
}
}
// Add pagination to query if supported
let queryPayload = { queryId: datasource?._id, parameters }
if (paginate && supportsPagination) {
const requestCursor = type === "page" ? parseInt(cursor || 1) : cursor
queryPayload.pagination = { page: requestCursor, limit }
}
// Execute query
const { data, pagination, ...rest } = await executeQuery(queryPayload)
// Derive pagination info from response
let nextCursor = null
let hasNextPage = false
if (paginate && supportsPagination) {
if (type === "page") {
// For "page number" pagination, increment the existing page number
nextCursor = queryPayload.pagination.page + 1
hasNextPage = data?.length === limit && limit > 0
} else {
// For "cursor" pagination, the cursor should be in the response
nextCursor = pagination?.cursor
hasNextPage = nextCursor != null
}
}
return {
rows: data || [],
info: rest,
cursor: nextCursor,
hasNextPage,
}
}
}

View file

@ -0,0 +1,16 @@
import DataFetch from "./DataFetch.js"
import { fetchRelationshipData } from "api"
export default class RelationshipFetch extends DataFetch {
async getData() {
const { datasource } = this.options
const res = await fetchRelationshipData({
rowId: datasource?.rowId,
tableId: datasource?.rowTableId,
fieldName: datasource?.fieldName,
})
return {
rows: res || [],
}
}
}

View file

@ -0,0 +1,37 @@
import { get } from "svelte/store"
import DataFetch from "./DataFetch.js"
import { searchTable } from "api"
export default class TableFetch extends DataFetch {
determineFeatureFlags() {
return {
supportsSearch: true,
supportsSort: true,
supportsPagination: true,
}
}
async getData() {
const { datasource, limit, sortColumn, sortOrder, sortType, paginate } =
this.options
const { tableId } = datasource
const { cursor, query } = get(this.store)
// Search table
const res = await searchTable({
tableId,
query,
limit,
sort: sortColumn,
sortOrder: sortOrder?.toLowerCase() ?? "ascending",
sortType,
paginate,
bookmark: cursor,
})
return {
rows: res?.rows || [],
hasNextPage: res?.hasNextPage || false,
cursor: res?.bookmark || null,
}
}
}

View file

@ -0,0 +1,16 @@
import DataFetch from "./DataFetch.js"
import { fetchViewData } from "api"
export default class ViewFetch extends DataFetch {
static getSchema(datasource, definition) {
return definition?.views?.[datasource.name]?.schema
}
async getData() {
const { datasource } = this.options
const res = await fetchViewData(datasource)
return {
rows: res || [],
}
}
}

View file

@ -0,0 +1,22 @@
import TableFetch from "./TableFetch.js"
import ViewFetch from "./ViewFetch.js"
import QueryFetch from "./QueryFetch.js"
import RelationshipFetch from "./RelationshipFetch.js"
import NestedProviderFetch from "./NestedProviderFetch.js"
import FieldFetch from "./FieldFetch.js"
import JSONArrayFetch from "./JSONArrayFetch.js"
const DataFetchMap = {
table: TableFetch,
view: ViewFetch,
query: QueryFetch,
link: RelationshipFetch,
provider: NestedProviderFetch,
field: FieldFetch,
jsonarray: JSONArrayFetch,
}
export const fetchData = (datasource, options) => {
const Fetch = DataFetchMap[datasource?.type] || TableFetch
return new Fetch({ datasource, ...options })
}
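A hedged usage sketch of the factory above (not part of this commit); the option names come from the fetch classes earlier in this diff, while the import path and IDs are assumptions:

import { fetchData } from "./fetch" // assumed import path

// Table datasources resolve to TableFetch; unrecognised types fall back to it too.
const fetch = fetchData(
  { type: "table", tableId: "ta_users" }, // hypothetical table ID
  { limit: 10, paginate: true, sortColumn: "name", sortOrder: "Ascending" }
)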

View file

@ -0,0 +1,53 @@
import { convertJSONSchemaToTableSchema } from "builder/src/builderStore/jsonUtils"
import TableFetch from "./fetch/TableFetch.js"
import ViewFetch from "./fetch/ViewFetch.js"
import QueryFetch from "./fetch/QueryFetch.js"
import RelationshipFetch from "./fetch/RelationshipFetch.js"
import NestedProviderFetch from "./fetch/NestedProviderFetch.js"
import FieldFetch from "./fetch/FieldFetch.js"
import JSONArrayFetch from "./fetch/JSONArrayFetch.js"
/**
 * Fetches the schema of any kind of datasource.
 * Each datasource fetch class implements its own logic for retrieving
 * the schema of its datasource type.
 */
export const fetchDatasourceSchema = async datasource => {
const handler = {
table: TableFetch,
view: ViewFetch,
query: QueryFetch,
link: RelationshipFetch,
provider: NestedProviderFetch,
field: FieldFetch,
jsonarray: JSONArrayFetch,
}[datasource?.type]
if (!handler) {
return null
}
// Get the datasource definition and then schema
const definition = await handler.getDefinition(datasource)
const schema = handler.getSchema(datasource, definition)
if (!schema) {
return null
}
// Check for any JSON fields so we can add any top level properties
let jsonAdditions = {}
Object.keys(schema).forEach(fieldKey => {
const fieldSchema = schema[fieldKey]
if (fieldSchema?.type === "json") {
const jsonSchema = convertJSONSchemaToTableSchema(fieldSchema, {
squashObjects: true,
})
Object.keys(jsonSchema).forEach(jsonKey => {
jsonAdditions[`${fieldKey}.${jsonKey}`] = {
type: jsonSchema[jsonKey].type,
nestedJSON: true,
}
})
}
})
return { ...schema, ...jsonAdditions }
}
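A small worked example of the JSON-flattening step above (not part of this commit); the exact output shape depends on convertJSONSchemaToTableSchema, so treat it as approximate:

// Input: a table schema containing a JSON field
const schema = {
  name: { type: "string" },
  address: {
    type: "json",
    schema: {
      city: { type: "string" },
      zip: { type: "string" },
    },
  },
}

// With squashObjects: true, the loop above appends dotted top-level
// entries for each nested key, roughly:
//   "address.city" -> { type: "string", nestedJSON: true }
//   "address.zip"  -> { type: "string", nestedJSON: true }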

View file

@ -1,7 +1,7 @@
{
"name": "@budibase/server",
"email": "hi@budibase.com",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"description": "Budibase Web Server",
"main": "src/index.ts",
"repository": {
@ -70,9 +70,9 @@
"license": "GPL-3.0",
"dependencies": {
"@apidevtools/swagger-parser": "^10.0.3",
"@budibase/backend-core": "^1.0.27-alpha.12",
"@budibase/client": "^1.0.27-alpha.12",
"@budibase/string-templates": "^1.0.27-alpha.12",
"@budibase/backend-core": "^1.0.27-alpha.13",
"@budibase/client": "^1.0.27-alpha.13",
"@budibase/string-templates": "^1.0.27-alpha.13",
"@bull-board/api": "^3.7.0",
"@bull-board/koa": "^3.7.0",
"@elastic/elasticsearch": "7.10.0",

View file

@ -4,7 +4,7 @@ const env = require("../../environment")
const { checkSlashesInUrl } = require("../../utilities")
const { request } = require("../../utilities/workerRequests")
const { clearLock } = require("../../utilities/redis")
const { Replication } = require("@budibase/backend-core").db
const { Replication } = require("@budibase/backend-core/db")
const { DocumentTypes } = require("../../db/utils")
const { app: appCache } = require("@budibase/backend-core/cache")

View file

@ -1,7 +1,7 @@
const CouchDB = require("../../db")
const { getDeployedApps } = require("../../utilities/workerRequests")
const { getScopedConfig } = require("@budibase/backend-core/db")
const { Configs } = require("@budibase/backend-core").constants
const { Configs } = require("@budibase/backend-core/constants")
const { checkSlashesInUrl } = require("../../utilities")
exports.fetchUrls = async ctx => {

View file

@ -0,0 +1,161 @@
const { processString } = require("@budibase/string-templates")
const CouchDB = require("../../db")
const {
generateQueryID,
getQueryParams,
isProdAppID,
} = require("../../db/utils")
const { BaseQueryVerbs } = require("../../constants")
const { Thread, ThreadType } = require("../../threads")
const env = require("../../environment")
const Runner = new Thread(ThreadType.QUERY, {
timeoutMs: env.QUERY_THREAD_TIMEOUT || 10000,
})
// simple function to mark all read queries as readable
function enrichQueries(input) {
const wasArray = Array.isArray(input)
const queries = wasArray ? input : [input]
for (let query of queries) {
if (query.queryVerb === BaseQueryVerbs.READ) {
query.readable = true
}
}
return wasArray ? queries : queries[0]
}
exports.fetch = async function (ctx) {
const db = new CouchDB(ctx.appId)
const body = await db.allDocs(
getQueryParams(null, {
include_docs: true,
})
)
ctx.body = enrichQueries(body.rows.map(row => row.doc))
}
exports.save = async function (ctx) {
const db = new CouchDB(ctx.appId)
const query = ctx.request.body
if (!query._id) {
query._id = generateQueryID(query.datasourceId)
}
const response = await db.put(query)
query._rev = response.rev
ctx.body = query
ctx.message = `Query ${query.name} saved successfully.`
}
async function enrichQueryFields(fields, parameters = {}) {
const enrichedQuery = {}
// enrich the fields with dynamic parameters
for (let key of Object.keys(fields)) {
if (fields[key] == null) {
continue
}
if (typeof fields[key] === "object") {
// enrich nested fields object
enrichedQuery[key] = await enrichQueryFields(fields[key], parameters)
} else if (typeof fields[key] === "string") {
// enrich string value as normal
enrichedQuery[key] = await processString(fields[key], parameters, {
noHelpers: true,
})
} else {
enrichedQuery[key] = fields[key]
}
}
if (
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
) {
try {
enrichedQuery.json = JSON.parse(
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
)
} catch (err) {
throw { message: `Invalid JSON - error: ${err}` }
}
delete enrichedQuery.customData
}
return enrichedQuery
}
exports.find = async function (ctx) {
const db = new CouchDB(ctx.appId)
const query = enrichQueries(await db.get(ctx.params.queryId))
// remove properties that could be dangerous in a real app
if (isProdAppID(ctx.appId)) {
delete query.fields
delete query.parameters
}
ctx.body = query
}
exports.preview = async function (ctx) {
const db = new CouchDB(ctx.appId)
const datasource = await db.get(ctx.request.body.datasourceId)
const { fields, parameters, queryVerb, transformer } = ctx.request.body
const enrichedQuery = await enrichQueryFields(fields, parameters)
try {
const { rows, keys } = await Runner.run({
datasource,
queryVerb,
query: enrichedQuery,
transformer,
})
ctx.body = {
rows,
schemaFields: [...new Set(keys)],
}
} catch (err) {
ctx.throw(400, err)
}
}
exports.execute = async function (ctx) {
const db = new CouchDB(ctx.appId)
const query = await db.get(ctx.params.queryId)
const datasource = await db.get(query.datasourceId)
const enrichedQuery = await enrichQueryFields(
query.fields,
ctx.request.body.parameters
)
// call the relevant CRUD method on the integration class
try {
const { rows } = await Runner.run({
datasource,
queryVerb: query.queryVerb,
query: enrichedQuery,
transformer: query.transformer,
})
ctx.body = rows
} catch (err) {
ctx.throw(400, err)
}
}
exports.destroy = async function (ctx) {
const db = new CouchDB(ctx.appId)
await db.remove(ctx.params.queryId, ctx.params.revId)
ctx.message = `Query deleted.`
ctx.status = 200
}
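For readers skimming the controller, a standalone sketch of what enrichQueryFields does (an illustration, not part of this commit; plain string replacement stands in for @budibase/string-templates):

// Walk the fields and bind {{ param }} placeholders from the parameters.
function demoEnrich(fields, parameters = {}) {
  const enriched = {}
  for (const key of Object.keys(fields)) {
    const value = fields[key]
    if (value == null) continue
    if (typeof value === "object") {
      // recurse into nested field objects, e.g. headers
      enriched[key] = demoEnrich(value, parameters)
    } else if (typeof value === "string") {
      // processString does this with full handlebars support
      enriched[key] = value.replace(/{{\s*([\w.]+)\s*}}/g, (_, p) => parameters[p] ?? "")
    } else {
      enriched[key] = value
    }
  }
  return enriched
}

// demoEnrich({ path: "users/{{ id }}" }, { id: "42" }) -> { path: "users/42" }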

View file

@ -140,11 +140,12 @@ async function execute(ctx, opts = { rowsOnly: false }) {
// call the relevant CRUD method on the integration class
try {
const { rows, extra } = await Runner.run({
const { rows, pagination, extra } = await Runner.run({
appId: ctx.appId,
datasource,
queryVerb: query.queryVerb,
fields: query.fields,
pagination: ctx.request.body.pagination,
parameters: ctx.request.body.parameters,
transformer: query.transformer,
queryId: ctx.params.queryId,
@ -152,7 +153,7 @@ async function execute(ctx, opts = { rowsOnly: false }) {
if (opts && opts.rowsOnly) {
ctx.body = rows
} else {
ctx.body = { data: rows, ...extra }
ctx.body = { data: rows, pagination, ...extra }
}
} catch (err) {
ctx.throw(400, err)
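For clarity, a sketch of the response body this endpoint returns once pagination is forwarded (values invented; the spread fields come from the integration):

// Shape of ctx.body when opts.rowsOnly is false:
const exampleBody = {
  data: [{ id: 1 }, { id: 2 }],     // query rows
  pagination: { cursor: "abc123" }, // new: forwarded from Runner.run
  // plus any ...extra fields, e.g. raw/headers info from a REST query
}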

View file

@ -4,7 +4,7 @@ const {
auditLog,
buildTenancyMiddleware,
buildAppTenancyMiddleware,
} = require("@budibase/backend-core").auth
} = require("@budibase/backend-core/auth")
const currentApp = require("../middleware/currentapp")
const compress = require("koa-compress")
const zlib = require("zlib")

View file

@ -1,6 +1,6 @@
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
const { UserStatus } = require("@budibase/backend-core").constants
const { ObjectStoreBuckets } = require("@budibase/backend-core").objectStore
const { UserStatus } = require("@budibase/backend-core/constants")
const { ObjectStoreBuckets } = require("@budibase/backend-core/objectStore")
exports.JobQueues = {
AUTOMATIONS: "automationQueue",

View file

@ -232,6 +232,8 @@ export interface RestQueryFields {
json: object
method: string
authConfigId: string
pagination: PaginationConfig | null
paginationValues: PaginationValues | null
}
export interface RestConfig {
@ -252,6 +254,19 @@ export interface RestConfig {
]
}
export interface PaginationConfig {
type: string
location: string
pageParam: string
sizeParam: string | null
responseParam: string | null
}
export interface PaginationValues {
page: string | number | null
limit: number | null
}
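// For illustration (not part of this commit): a hypothetical pair of
// values satisfying the two interfaces above.
const examplePagination = {
  type: "page",          // or "cursor"
  location: "query",     // where params are injected: "query" or "body"
  pageParam: "page",     // request param carrying the page number or cursor
  sizeParam: "per_page", // request param carrying the page size
  responseParam: null,   // cursor APIs: response field holding the next cursor
}
const examplePaginationValues = { page: 2, limit: 25 }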
export interface Query {
_id?: string
datasourceId: string

View file

@ -65,6 +65,7 @@ module.exports = {
DEPLOYMENT_CREDENTIALS_URL: process.env.DEPLOYMENT_CREDENTIALS_URL,
ALLOW_DEV_AUTOMATIONS: process.env.ALLOW_DEV_AUTOMATIONS,
DISABLE_THREADING: process.env.DISABLE_THREADING,
QUERY_THREAD_TIMEOUT: process.env.QUERY_THREAD_TIMEOUT,
_set(key, value) {
process.env[key] = value
module.exports[key] = value

View file

@ -80,6 +80,17 @@ module DynamoModule {
},
},
},
describe: {
type: QueryTypes.FIELDS,
customisable: true,
readable: true,
fields: {
table: {
type: DatasourceFieldTypes.STRING,
required: true,
},
},
},
get: {
type: QueryTypes.FIELDS,
customisable: true,
@ -180,6 +191,13 @@ module DynamoModule {
return response
}
async describe(query: { table: string }) {
const params = {
TableName: query.table,
}
return new AWS.DynamoDB().describeTable(params).promise()
}
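// Usage sketch (not part of this commit): the table name and the
// "integration" instance are invented; call from an async context.
// Resolves with AWS's DescribeTable payload (key schema, indexes, counts):
//   const description = await integration.describe({ table: "my-table" })
//   console.log(description.Table.KeySchema)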
async get(query: { table: string; json: object }) {
const params = {
TableName: query.table,

View file

@ -4,9 +4,11 @@ import {
QueryTypes,
RestConfig,
RestQueryFields as RestQuery,
PaginationConfig,
AuthType,
BasicAuthConfig,
BearerAuthConfig,
PaginationValues,
} from "../definitions/datasource"
import { IntegrationBase } from "./base/IntegrationBase"
@ -40,6 +42,9 @@ const coreFields = {
type: DatasourceFieldTypes.STRING,
enum: Object.values(BodyTypes),
},
pagination: {
type: DatasourceFieldTypes.OBJECT,
},
}
module RestModule {
@ -115,7 +120,7 @@ module RestModule {
this.config = config
}
async parseResponse(response: any) {
async parseResponse(response: any, pagination: PaginationConfig | null) {
let data, raw, headers
const contentType = response.headers.get("content-type") || ""
try {
@ -154,6 +159,13 @@ module RestModule {
for (let [key, value] of Object.entries(headers)) {
headers[key] = Array.isArray(value) ? value[0] : value
}
// Check if a pagination cursor exists in the response
let nextCursor = null
if (pagination?.responseParam) {
nextCursor = data?.[pagination.responseParam]
}
return {
data,
info: {
@ -165,10 +177,35 @@ module RestModule {
raw,
headers,
},
pagination: {
cursor: nextCursor,
},
}
}
getUrl(path: string, queryString: string): string {
getUrl(path: string, queryString: string, pagination: PaginationConfig | null, paginationValues: PaginationValues | null): string {
// Add pagination params to query string if required
if (pagination?.location === "query" && paginationValues) {
const { pageParam, sizeParam } = pagination
const params = new URLSearchParams()
// Append page number or cursor param if configured
if (pageParam && paginationValues.page != null) {
params.append(pageParam, paginationValues.page)
}
// Append page size param if configured
if (sizeParam && paginationValues.limit != null) {
params.append(sizeParam, paginationValues.limit)
}
// Prepend query string with pagination params
let paginationString = params.toString()
if (paginationString) {
queryString = `${paginationString}&${queryString}`
}
}
const main = `${path}?${queryString}`
let complete = main
if (this.config.url && !main.startsWith("http")) {
@ -180,20 +217,36 @@ module RestModule {
return complete
}
addBody(bodyType: string, body: string | any, input: any) {
let error, object, string
try {
string = typeof body !== "string" ? JSON.stringify(body) : body
object = typeof body === "object" ? body : JSON.parse(body)
} catch (err) {
error = err
}
addBody(bodyType: string, body: string | any, input: any, pagination: PaginationConfig | null, paginationValues: PaginationValues | null) {
if (!input.headers) {
input.headers = {}
}
if (bodyType === BodyTypes.NONE) {
return input
}
let error, object: any = {}, string = ""
try {
if (body) {
string = typeof body !== "string" ? JSON.stringify(body) : body
object = typeof body === "object" ? body : JSON.parse(body)
}
} catch (err) {
error = err
}
// Util to add pagination values to a certain body type
const addPaginationToBody = (insertFn: Function) => {
if (pagination?.location === "body") {
if (pagination?.pageParam && paginationValues?.page != null) {
insertFn(pagination.pageParam, paginationValues.page)
}
if (pagination?.sizeParam && paginationValues?.limit != null) {
insertFn(pagination.sizeParam, paginationValues.limit)
}
}
}
switch (bodyType) {
case BodyTypes.NONE:
break
case BodyTypes.TEXT:
// content type defaults to plaintext
input.body = string
@ -203,6 +256,9 @@ module RestModule {
for (let [key, value] of Object.entries(object)) {
params.append(key, value)
}
addPaginationToBody((key: string, value: any) => {
params.append(key, value)
})
input.body = params
break
case BodyTypes.FORM_DATA:
@ -210,6 +266,9 @@ module RestModule {
for (let [key, value] of Object.entries(object)) {
form.append(key, value)
}
addPaginationToBody((key: string, value: any) => {
form.append(key, value)
})
input.body = form
break
case BodyTypes.XML:
@ -219,13 +278,15 @@ module RestModule {
input.body = string
input.headers["Content-Type"] = "application/xml"
break
default:
case BodyTypes.JSON:
// if JSON error, throw it
if (error) {
throw "Invalid JSON for request body"
}
input.body = string
addPaginationToBody((key: string, value: any) => {
object[key] = value
})
input.body = JSON.stringify(object)
input.headers["Content-Type"] = "application/json"
break
}
@ -271,6 +332,8 @@ module RestModule {
bodyType,
requestBody,
authConfigId,
pagination,
paginationValues
} = query
const authHeaders = this.getAuthHeaders(authConfigId)
@ -289,14 +352,12 @@ module RestModule {
}
let input: any = { method, headers: this.headers }
if (requestBody) {
input = this.addBody(bodyType, requestBody, input)
}
input = this.addBody(bodyType, requestBody, input, pagination, paginationValues)
this.startTimeMs = performance.now()
const url = this.getUrl(path, queryString)
const url = this.getUrl(path, queryString, pagination, paginationValues)
const response = await fetch(url, input)
return await this.parseResponse(response)
return await this.parseResponse(response, pagination)
}
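// To isolate the query-string branch of getUrl above, a standalone
// reconstruction (a sketch under the same assumptions, not the shipped code):
const { URLSearchParams } = require("url")

function paginatedQueryString(queryString, pagination, values) {
  if (pagination?.location !== "query" || !values) {
    return queryString
  }
  const params = new URLSearchParams()
  if (pagination.pageParam && values.page != null) {
    params.append(pagination.pageParam, values.page)
  }
  if (pagination.sizeParam && values.limit != null) {
    params.append(pagination.sizeParam, values.limit)
  }
  const prefix = params.toString()
  return prefix ? `${prefix}&${queryString}` : queryString
}

// paginatedQueryString("q=x",
//   { location: "query", pageParam: "page", sizeParam: "size" },
//   { page: 3, limit: 10 })
//   -> "page=3&size=10&q=x"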
async create(opts: RestQuery) {

View file

@ -8,6 +8,8 @@ module S3Module {
region: string
accessKeyId: string
secretAccessKey: string
s3ForcePathStyle: boolean
endpoint?: string
}
const SCHEMA: Integration = {
@ -18,7 +20,7 @@ module S3Module {
datasource: {
region: {
type: "string",
required: true,
required: false,
default: "us-east-1",
},
accessKeyId: {
@ -29,6 +31,15 @@ module S3Module {
type: "password",
required: true,
},
endpoint: {
type: "string",
required: false,
},
signatureVersion: {
type: "string",
required: false,
default: "v4"
},
},
query: {
read: {
@ -46,16 +57,16 @@ module S3Module {
class S3Integration implements IntegrationBase {
private readonly config: S3Config
private client: any
private connectionPromise: Promise<any>
constructor(config: S3Config) {
this.config = config
this.connectionPromise = this.connect()
this.client = new AWS.S3()
}
if (this.config.endpoint) {
this.config.s3ForcePathStyle = true
} else {
delete this.config.endpoint
}
async connect() {
AWS.config.update(this.config)
this.client = new AWS.S3(this.config)
}
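// The removed and added lines above are interleaved without +/- markers,
// so the net result is easy to misread. One reading of the resulting
// constructor (a reconstruction, not authoritative):
constructor(config) {
  this.config = config
  // A custom endpoint (e.g. MinIO) requires path-style addressing;
  // otherwise drop the empty endpoint key so the SDK uses its defaults.
  if (this.config.endpoint) {
    this.config.s3ForcePathStyle = true
  } else {
    delete this.config.endpoint
  }
  // The client is now built eagerly from config, replacing the old
  // async connect() + AWS.config.update() flow.
  this.client = new AWS.S3(this.config)
}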
async read(query: { bucket: string }) {

View file

@ -4,19 +4,23 @@ jest.mock("node-fetch", () =>
raw: () => {
return { "content-type": ["application/json"] }
},
get: () => ["application/json"]
get: () => ["application/json"],
},
json: jest.fn(),
text: jest.fn()
json: jest.fn(() => ({
my_next_cursor: 123,
})),
text: jest.fn(),
}))
)
const fetch = require("node-fetch")
const RestIntegration = require("../rest")
const { AuthType } = require("../rest")
const FormData = require("form-data")
const { URLSearchParams } = require("url")
const HEADERS = {
"Accept": "application/json",
"Content-Type": "application/json"
Accept: "application/json",
"Content-Type": "application/json",
}
class TestConfiguration {
@ -165,17 +169,20 @@ describe("REST Integration", () => {
status: 200,
json: json ? async () => json : undefined,
text: text ? async () => text : undefined,
headers: { get: key => key === "content-length" ? 100 : header, raw: () => ({ "content-type": header }) }
headers: {
get: key => (key === "content-length" ? 100 : header),
raw: () => ({ "content-type": header }),
},
}
}
it("should be able to parse JSON response", async () => {
const input = buildInput({a: 1}, null, "application/json")
const input = buildInput({ a: 1 }, null, "application/json")
const output = await config.integration.parseResponse(input)
expect(output.data).toEqual({a: 1})
expect(output.data).toEqual({ a: 1 })
expect(output.info.code).toEqual(200)
expect(output.info.size).toEqual("100B")
expect(output.extra.raw).toEqual(JSON.stringify({a: 1}))
expect(output.extra.raw).toEqual(JSON.stringify({ a: 1 }))
expect(output.extra.headers["content-type"]).toEqual("application/json")
})
@ -192,7 +199,7 @@ describe("REST Integration", () => {
const text = "<root><a>1</a><b>2</b></root>"
const input = buildInput(null, text, "application/xml")
const output = await config.integration.parseResponse(input)
expect(output.data).toEqual({a: "1", b: "2"})
expect(output.data).toEqual({ a: "1", b: "2" })
expect(output.extra.raw).toEqual(text)
expect(output.extra.headers["content-type"]).toEqual("application/xml")
})
@ -202,53 +209,309 @@ describe("REST Integration", () => {
const basicAuth = {
_id: "c59c14bd1898a43baa08da68959b24686",
name: "basic-1",
type : AuthType.BASIC,
config : {
type: AuthType.BASIC,
config: {
username: "user",
password: "password"
}
password: "password",
},
}
const bearerAuth = {
_id: "0d91d732f34e4befabeff50b392a8ff3",
name: "bearer-1",
type : AuthType.BEARER,
config : {
"token": "mytoken"
}
type: AuthType.BEARER,
config: {
token: "mytoken",
},
}
beforeEach(() => {
config = new TestConfiguration({
url: BASE_URL,
authConfigs : [basicAuth, bearerAuth]
authConfigs: [basicAuth, bearerAuth],
})
})
it("adds basic auth", async () => {
const query = {
authConfigId: "c59c14bd1898a43baa08da68959b24686"
authConfigId: "c59c14bd1898a43baa08da68959b24686",
}
await config.integration.read(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/?`, {
method: "GET",
headers: {
Authorization: "Basic dXNlcjpwYXNzd29yZA=="
Authorization: "Basic dXNlcjpwYXNzd29yZA==",
},
})
})
it("adds bearer auth", async () => {
const query = {
authConfigId: "0d91d732f34e4befabeff50b392a8ff3"
authConfigId: "0d91d732f34e4befabeff50b392a8ff3",
}
await config.integration.read(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/?`, {
method: "GET",
headers: {
Authorization: "Bearer mytoken"
Authorization: "Bearer mytoken",
},
})
})
})
describe("page based pagination", () => {
it("can paginate using query params", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
path: "api",
pagination: {
type: "page",
location: "query",
pageParam,
sizeParam,
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
await config.integration.read(query)
expect(fetch).toHaveBeenCalledWith(
`${BASE_URL}/api?${pageParam}=${pageValue}&${sizeParam}=${sizeValue}&`,
{
headers: {},
method: "GET",
}
)
})
it("can paginate using JSON request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "json",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: JSON.stringify({
[pageParam]: pageValue,
[sizeParam]: sizeValue,
}),
headers: {
"Content-Type": "application/json",
},
method: "POST",
})
})
it("can paginate using form-data request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "form",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: expect.any(FormData),
headers: {},
method: "POST",
})
const sentData = JSON.stringify(fetch.mock.calls[0][1].body)
expect(sentData).toContain(pageParam)
expect(sentData).toContain(sizeParam)
})
it("can paginate using form-encoded request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "encoded",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: expect.any(URLSearchParams),
headers: {},
method: "POST",
})
const sentData = fetch.mock.calls[0][1].body
expect(sentData.has(pageParam)).toEqual(true)
expect(sentData.get(pageParam)).toEqual(pageValue.toString())
expect(sentData.has(sizeParam)).toEqual(true)
expect(sentData.get(sizeParam)).toEqual(sizeValue.toString())
})
})
describe("cursor based pagination", () => {
it("can paginate using query params", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
path: "api",
pagination: {
type: "cursor",
location: "query",
pageParam,
sizeParam,
responseParam: "my_next_cursor",
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
const res = await config.integration.read(query)
expect(fetch).toHaveBeenCalledWith(
`${BASE_URL}/api?${pageParam}=${pageValue}&${sizeParam}=${sizeValue}&`,
{
headers: {},
method: "GET",
}
)
expect(res.pagination.cursor).toEqual(123)
})
it("can paginate using JSON request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "json",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
responseParam: "my_next_cursor",
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
const res = await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: JSON.stringify({
[pageParam]: pageValue,
[sizeParam]: sizeValue,
}),
headers: {
"Content-Type": "application/json",
},
method: "POST",
})
expect(res.pagination.cursor).toEqual(123)
})
it("can paginate using form-data request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "form",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
responseParam: "my_next_cursor",
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
const res = await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: expect.any(FormData),
headers: {},
method: "POST",
})
const sentData = JSON.stringify(fetch.mock.calls[0][1].body)
expect(sentData).toContain(pageParam)
expect(sentData).toContain(sizeParam)
expect(res.pagination.cursor).toEqual(123)
})
it("can paginate using form-encoded request body", async () => {
const pageParam = "my_page_param"
const sizeParam = "my_size_param"
const pageValue = 3
const sizeValue = 10
const query = {
bodyType: "encoded",
path: "api",
pagination: {
type: "page",
location: "body",
pageParam,
sizeParam,
responseParam: "my_next_cursor",
},
paginationValues: {
page: pageValue,
limit: sizeValue,
},
}
const res = await config.integration.create(query)
expect(fetch).toHaveBeenCalledWith(`${BASE_URL}/api?`, {
body: expect.any(URLSearchParams),
headers: {},
method: "POST",
})
const sentData = fetch.mock.calls[0][1].body
expect(sentData.has(pageParam)).toEqual(true)
expect(sentData.get(pageParam)).toEqual(pageValue.toString())
expect(sentData.has(sizeParam)).toEqual(true)
expect(sentData.get(sizeParam)).toEqual(sizeValue.toString())
expect(res.pagination.cursor).toEqual(123)
})
})
})

View file

@ -1,6 +1,10 @@
const { getAppId, setCookie, getCookie, clearCookie } =
require("@budibase/backend-core").utils
const { Cookies } = require("@budibase/backend-core").constants
const {
getAppId,
setCookie,
getCookie,
clearCookie,
} = require("@budibase/backend-core/utils")
const { Cookies } = require("@budibase/backend-core/constants")
const { getRole } = require("@budibase/backend-core/roles")
const { BUILTIN_ROLE_IDS } = require("@budibase/backend-core/roles")
const { generateUserMetadataID, isDevAppID } = require("../db/utils")

View file

@ -32,34 +32,30 @@ function mockAuthWithNoCookie() {
},
},
}))
jest.mock("@budibase/backend-core", () => ({
utils: {
getAppId: jest.fn(),
setCookie: jest.fn(),
getCookie: jest.fn(),
},
constants: {
Cookies: {},
},
jest.mock("@budibase/backend-core/utils", () => ({
getAppId: jest.fn(),
setCookie: jest.fn(),
getCookie: jest.fn(),
}))
jest.mock("@budibase/backend-core/constants", () => ({
Cookies: {},
}))
}
function mockAuthWithCookie() {
jest.resetModules()
mockWorker()
jest.mock("@budibase/backend-core", () => ({
utils: {
getAppId: () => {
return "app_test"
},
setCookie: jest.fn(),
getCookie: () => ({appId: "app_different", roleId: "PUBLIC"}),
jest.mock("@budibase/backend-core/utils", () => ({
getAppId: () => {
return "app_test"
},
constants: {
Cookies: {
Auth: "auth",
CurrentApp: "currentapp",
},
setCookie: jest.fn(),
getCookie: () => ({appId: "app_different", roleId: "PUBLIC"}),
}))
jest.mock("@budibase/backend-core/constants", () => ({
Cookies: {
Auth: "auth",
CurrentApp: "currentapp",
},
}))
}
@ -121,7 +117,7 @@ describe("Current app middleware", () => {
async function checkExpected(setCookie) {
config.setUser()
await config.executeMiddleware()
const cookieFn = require("@budibase/backend-core").utils.setCookie
let { setCookie: cookieFn } = require("@budibase/backend-core/utils")
if (setCookie) {
expect(cookieFn).toHaveBeenCalled()
} else {
@ -140,32 +136,30 @@ describe("Current app middleware", () => {
it("should perform correct when no cookie exists", async () => {
mockReset()
jest.mock("@budibase/backend-core", () => ({
utils: {
getAppId: () => {
return "app_test"
},
setCookie: jest.fn(),
getCookie: jest.fn(),
},
constants: {
Cookies: {},
jest.mock("@budibase/backend-core/utils", () => ({
getAppId: () => {
return "app_test"
},
setCookie: jest.fn(),
getCookie: jest.fn(),
}))
jest.mock("@budibase/backend-core/constants", () => ({
Cookies: {},
}))
await checkExpected(true)
})
it("lastly check what occurs when cookie doesn't need updated", async () => {
mockReset()
jest.mock("@budibase/backend-core", () => ({
utils: {
getAppId: () => {
return "app_test"
},
setCookie: jest.fn(),
getCookie: () => ({appId: "app_test", roleId: "PUBLIC"}),
jest.mock("@budibase/backend-core/utils", () => ({
getAppId: () => {
return "app_test"
},
constants: { Cookies: {} },
setCookie: jest.fn(),
getCookie: () => ({appId: "app_test", roleId: "PUBLIC"}),
}))
jest.mock("@budibase/backend-core/constants", () => ({
Cookies: {},
}))
await checkExpected(false)
})

View file

@ -15,8 +15,8 @@ const {
const controllers = require("./controllers")
const supertest = require("supertest")
const { cleanup } = require("../../utilities/fileSystem")
const { Cookies, Headers } = require("@budibase/backend-core").constants
const { jwt } = require("@budibase/backend-core").auth
const { Cookies, Headers } = require("@budibase/backend-core/constants")
const { jwt } = require("@budibase/backend-core/auth")
const core = require("@budibase/backend-core")
const { getGlobalDB } = require("@budibase/backend-core/tenancy")
const { createASession } = require("@budibase/backend-core/sessions")

View file

@ -4,7 +4,7 @@ const actions = require("../automations/actions")
const automationUtils = require("../automations/automationUtils")
const AutomationEmitter = require("../events/AutomationEmitter")
const { processObject } = require("@budibase/string-templates")
const { DEFAULT_TENANT_ID } = require("@budibase/backend-core").constants
const { DEFAULT_TENANT_ID } = require("@budibase/backend-core/constants")
const CouchDB = require("../db")
const { DocumentTypes, isDevAppID } = require("../db/utils")
const { doInTenant } = require("@budibase/backend-core/tenancy")

View file

@ -5,6 +5,9 @@ const { integrations } = require("../integrations")
const { processStringSync } = require("@budibase/string-templates")
const CouchDB = require("../db")
const IS_TRIPLE_BRACE = new RegExp(/^{{3}.*}{3}$/)
const IS_HANDLEBARS = new RegExp(/^{{2}.*}{2}$/)
class QueryRunner {
constructor(input, flags = { noRecursiveQuery: false }) {
this.appId = input.appId
@ -12,6 +15,7 @@ class QueryRunner {
this.queryVerb = input.queryVerb
this.fields = input.fields
this.parameters = input.parameters
this.pagination = input.pagination
this.transformer = input.transformer
this.queryId = input.queryId
this.noRecursiveQuery = flags.noRecursiveQuery
@ -27,7 +31,13 @@ class QueryRunner {
let { datasource, fields, queryVerb, transformer } = this
// pre-query, make sure datasource variables are added to parameters
const parameters = await this.addDatasourceVariables()
const query = threadUtils.enrichQueryFields(fields, parameters)
let query = this.enrichQueryFields(fields, parameters)
// Add pagination values for REST queries
if (this.pagination) {
query.paginationValues = this.pagination
}
const Integration = integrations[datasource.source]
if (!Integration) {
throw "Integration type does not exist."
@ -37,11 +47,13 @@ class QueryRunner {
let output = threadUtils.formatResponse(await integration[queryVerb](query))
let rows = output,
info = undefined,
extra = undefined
extra = undefined,
pagination = undefined
if (threadUtils.hasExtraData(output)) {
rows = output.data
info = output.info
extra = output.extra
pagination = output.pagination
}
// transform as required
@ -83,7 +95,7 @@ class QueryRunner {
integration.end()
}
return { rows, keys, info, extra }
return { rows, keys, info, extra, pagination }
}
async runAnotherQuery(queryId, parameters) {
@ -159,6 +171,50 @@ class QueryRunner {
}
return parameters
}
enrichQueryFields(fields, parameters = {}) {
const enrichedQuery = {}
// enrich the fields with dynamic parameters
for (let key of Object.keys(fields)) {
if (fields[key] == null) {
continue
}
if (typeof fields[key] === "object") {
// enrich nested fields object
enrichedQuery[key] = this.enrichQueryFields(fields[key], parameters)
} else if (typeof fields[key] === "string") {
// enrich string value as normal
let value = fields[key]
// add triple brace to avoid escaping e.g. '=' in cookie header
if (IS_HANDLEBARS.test(value) && !IS_TRIPLE_BRACE.test(value)) {
value = `{${value}}`
}
enrichedQuery[key] = processStringSync(value, parameters, {
noHelpers: true,
})
} else {
enrichedQuery[key] = fields[key]
}
}
if (
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
) {
try {
enrichedQuery.json = JSON.parse(
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
)
} catch (err) {
// no json found, ignore
}
delete enrichedQuery.customData
}
return enrichedQuery
}
}
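// Standalone sketch (not part of this commit) of the triple-brace guard
// used in enrichQueryFields above: handlebars HTML-escapes double-brace
// bindings, which would mangle characters like "=" in cookie headers.
const IS_TRIPLE_BRACE = new RegExp(/^{{3}.*}{3}$/)
const IS_HANDLEBARS = new RegExp(/^{{2}.*}{2}$/)

function unescapeBinding(value) {
  // "{{ cookie }}" -> "{{{ cookie }}}"; triple-brace values pass through
  if (IS_HANDLEBARS.test(value) && !IS_TRIPLE_BRACE.test(value)) {
    return `{${value}}`
  }
  return value
}

// unescapeBinding("{{ cookie }}")   -> "{{{ cookie }}}"
// unescapeBinding("{{{ cookie }}}") -> "{{{ cookie }}}"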
module.exports = (input, callback) => {

View file

@ -3,14 +3,10 @@ const CouchDB = require("../db")
const { init } = require("@budibase/backend-core")
const redis = require("@budibase/backend-core/redis")
const { SEPARATOR } = require("@budibase/backend-core/db")
const { processStringSync } = require("@budibase/string-templates")
const VARIABLE_TTL_SECONDS = 3600
let client
const IS_TRIPLE_BRACE = new RegExp(/^{{3}.*}{3}$/)
const IS_HANDLEBARS = new RegExp(/^{{2}.*}{2}$/)
async function getClient() {
if (!client) {
client = await new redis.Client(redis.utils.Databases.QUERY_VARS).init()
@ -80,49 +76,3 @@ exports.hasExtraData = response => {
response.info != null
)
}
exports.enrichQueryFields = (fields, parameters = {}) => {
const enrichedQuery = {}
// enrich the fields with dynamic parameters
for (let key of Object.keys(fields)) {
if (fields[key] == null) {
continue
}
if (typeof fields[key] === "object") {
// enrich nested fields object
enrichedQuery[key] = this.enrichQueryFields(fields[key], parameters)
} else if (typeof fields[key] === "string") {
// enrich string value as normal
let value = fields[key]
// add triple brace to avoid escaping e.g. '=' in cookie header
if (IS_HANDLEBARS.test(value) && !IS_TRIPLE_BRACE.test(value)) {
value = `{${value}}`
}
enrichedQuery[key] = processStringSync(value, parameters, {
noHelpers: true,
})
} else {
enrichedQuery[key] = fields[key]
}
}
if (
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
) {
try {
enrichedQuery.json = JSON.parse(
enrichedQuery.json ||
enrichedQuery.customData ||
enrichedQuery.requestBody
)
} catch (err) {
// no json found, ignore
}
delete enrichedQuery.customData
}
return enrichedQuery
}

View file

@ -1,7 +1,7 @@
const { join } = require("./centralPath")
const { homedir } = require("os")
const env = require("../environment")
const { budibaseTempDir } = require("@budibase/backend-core").objectStore
const { budibaseTempDir } = require("@budibase/backend-core/objectStore")
module.exports.budibaseAppsDir = function () {
return env.BUDIBASE_DIR || join(homedir(), ".budibase")

View file

@ -9,7 +9,7 @@ const {
deleteFolder,
uploadDirectory,
downloadTarball,
} = require("@budibase/backend-core").objectStore
} = require("@budibase/backend-core/objectStore")
/***********************************
* NOTE *

View file

@ -1,6 +1,6 @@
const env = require("../environment")
const { OBJ_STORE_DIRECTORY } = require("../constants")
const { sanitizeKey } = require("@budibase/backend-core/src/objectStore")
const { sanitizeKey } = require("@budibase/backend-core/objectStore")
const CouchDB = require("../db")
const { generateMetadataID } = require("../db/utils")
const Readable = require("stream").Readable
@ -34,7 +34,7 @@ exports.checkSlashesInUrl = url => {
* @return {string} The base URL of the object store (MinIO or S3).
*/
exports.objectStoreUrl = () => {
if (env.SELF_HOSTED) {
if (env.SELF_HOSTED || env.MINIO_URL) {
// can use a relative url for this as all goes through the proxy (this is hosted in minio)
return OBJ_STORE_DIRECTORY
} else {

View file

@ -983,10 +983,10 @@
resolved "https://registry.yarnpkg.com/@bcoe/v8-coverage/-/v8-coverage-0.2.3.tgz#75a2e8b51cb758a7553d6804a5932d7aace75c39"
integrity sha512-0hYQ8SB4Db5zvZB4axdMHGwEaQjkZzFjQiN9LVYvIFB2nSUHW9tYpxWriPrWDASIxiaXax83REcLxuSdnGPZtw==
"@budibase/backend-core@^1.0.27-alpha.0":
version "1.0.27-alpha.0"
resolved "https://registry.yarnpkg.com/@budibase/backend-core/-/auth-1.0.27-alpha.0.tgz#8020c205d20d722983906426cb5a1aaf5cc6aba4"
integrity sha512-sfXJjQJsFWfgElsHGHn7beERcsrUA5cotN2p9XEp15SrMeEmy4s9a6K58b779QB/d28GXKXtSJwmM/DrptJetQ==
"@budibase/backend-core@^1.0.27-alpha.13":
version "1.0.27-alpha.13"
resolved "https://registry.yarnpkg.com/@budibase/backend-core/-/backend-core-1.0.27-alpha.13.tgz#89f46e081eb7b342f483fd0eccd72c42b2b2fa6c"
integrity sha512-NiasBvZ5wTpvANG9AjuO34DHMTqWQWSpabLcgwBY0tNG4ekh+wvSCPjCcUvN/bBpOzrVMQ8C4hmS4pvv342BhQ==
dependencies:
"@techpass/passport-openidconnect" "^0.3.0"
aws-sdk "^2.901.0"
@ -1056,26 +1056,64 @@
svelte-flatpickr "^3.2.3"
svelte-portal "^1.0.0"
"@budibase/bbui@^1.0.27-alpha.0":
version "1.58.13"
resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-1.58.13.tgz#59df9c73def2d81c75dcbd2266c52c19db88dbd7"
integrity sha512-Zk6CKXdBfKsTVzA1Xs5++shdSSZLfphVpZuKVbjfzkgtuhyH7ruucexuSHEpFsxjW5rEKgKIBoRFzCK5vPvN0w==
"@budibase/bbui@^1.0.35":
version "1.0.35"
resolved "https://registry.yarnpkg.com/@budibase/bbui/-/bbui-1.0.35.tgz#a51886886772257d31e2c6346dbec46fe0c9fd85"
integrity sha512-8qeAzTujtO7uvhj+dMiyW4BTkQ7dC4xF1CNIwyuTnDwIeFDlXYgNb09VVRs3+nWcX2e2eC53EUs1RnLUoSlTsw==
dependencies:
markdown-it "^12.0.2"
quill "^1.3.7"
sirv-cli "^0.4.6"
svelte-flatpickr "^2.4.0"
"@adobe/spectrum-css-workflow-icons" "^1.2.1"
"@spectrum-css/actionbutton" "^1.0.1"
"@spectrum-css/actiongroup" "^1.0.1"
"@spectrum-css/avatar" "^3.0.2"
"@spectrum-css/button" "^3.0.1"
"@spectrum-css/buttongroup" "^3.0.2"
"@spectrum-css/checkbox" "^3.0.2"
"@spectrum-css/dialog" "^3.0.1"
"@spectrum-css/divider" "^1.0.3"
"@spectrum-css/dropzone" "^3.0.2"
"@spectrum-css/fieldgroup" "^3.0.2"
"@spectrum-css/fieldlabel" "^3.0.1"
"@spectrum-css/icon" "^3.0.1"
"@spectrum-css/illustratedmessage" "^3.0.2"
"@spectrum-css/inlinealert" "^2.0.1"
"@spectrum-css/inputgroup" "^3.0.2"
"@spectrum-css/label" "^2.0.10"
"@spectrum-css/link" "^3.1.1"
"@spectrum-css/menu" "^3.0.1"
"@spectrum-css/modal" "^3.0.1"
"@spectrum-css/pagination" "^3.0.3"
"@spectrum-css/picker" "^1.0.1"
"@spectrum-css/popover" "^3.0.1"
"@spectrum-css/progressbar" "^1.0.2"
"@spectrum-css/progresscircle" "^1.0.2"
"@spectrum-css/radio" "^3.0.2"
"@spectrum-css/search" "^3.0.2"
"@spectrum-css/sidenav" "^3.0.2"
"@spectrum-css/statuslight" "^3.0.2"
"@spectrum-css/stepper" "^3.0.3"
"@spectrum-css/switch" "^1.0.2"
"@spectrum-css/table" "^3.0.1"
"@spectrum-css/tabs" "^3.0.1"
"@spectrum-css/tags" "^3.0.2"
"@spectrum-css/textfield" "^3.0.1"
"@spectrum-css/toast" "^3.0.1"
"@spectrum-css/tooltip" "^3.0.3"
"@spectrum-css/treeview" "^3.0.2"
"@spectrum-css/typography" "^3.0.1"
"@spectrum-css/underlay" "^2.0.9"
"@spectrum-css/vars" "^3.0.1"
dayjs "^1.10.4"
svelte-flatpickr "^3.2.3"
svelte-portal "^1.0.0"
turndown "^7.0.0"
"@budibase/client@^1.0.27-alpha.0":
version "1.0.27-alpha.0"
resolved "https://registry.yarnpkg.com/@budibase/client/-/client-1.0.27-alpha.0.tgz#5393d51f4fd08307aad01dd62fcd717acaa38d68"
integrity sha512-wAGiPjZ4n8j69Y0em1nkkUlabcTx7aw7F9MgUusX1oMPihQ0lnBn1Z3rnHON2tRk3rTcdlnitPfGFqsVFFWsCg==
"@budibase/client@^1.0.27-alpha.13":
version "1.0.35"
resolved "https://registry.yarnpkg.com/@budibase/client/-/client-1.0.35.tgz#b832e7e7e35032fb35fe5492fbb721db1da15394"
integrity sha512-maL3V29PQb9VjgnPZq44GSDZCuamAGp01bheUeJxEeskjQqZUdf8QC7Frf1mT+ZjgKJf3gU6qtFOxmWRbVzVbw==
dependencies:
"@budibase/bbui" "^1.0.27-alpha.0"
"@budibase/bbui" "^1.0.35"
"@budibase/standard-components" "^0.9.139"
"@budibase/string-templates" "^1.0.27-alpha.0"
"@budibase/string-templates" "^1.0.35"
regexparam "^1.3.0"
shortid "^2.2.15"
svelte-spa-router "^3.0.5"
@ -1125,10 +1163,10 @@
svelte-apexcharts "^1.0.2"
svelte-flatpickr "^3.1.0"
"@budibase/string-templates@^1.0.27-alpha.0":
version "1.0.27-alpha.0"
resolved "https://registry.yarnpkg.com/@budibase/string-templates/-/string-templates-1.0.27-alpha.0.tgz#89f72e0599e94f95540c9e4fb7948bec5d645526"
integrity sha512-MQXyw+/oIJg2Ezs3GK/HJ2p01ANpl1IjUP/HxDZhTiGUXPDwHXGDKE+t32tiwsYY2l+cn8wHy2DOQbLsRoZhVg==
"@budibase/string-templates@^1.0.27-alpha.13", "@budibase/string-templates@^1.0.35":
version "1.0.35"
resolved "https://registry.yarnpkg.com/@budibase/string-templates/-/string-templates-1.0.35.tgz#a888f1e9327bb36416336a91a95a43cb34e6a42d"
integrity sha512-8HxSv0ru+cgSmphqtOm1pmBM8rc0TRC/6RQGzQefmFFQFfm/SBLAVLLWRmZxAOYTxt4mittGWeL4y05FqEuocg==
dependencies:
"@budibase/handlebars-helpers" "^0.11.7"
dayjs "^1.10.4"
@ -1873,11 +1911,6 @@
"@nodelib/fs.scandir" "2.1.5"
fastq "^1.6.0"
"@polka/url@^0.5.0":
version "0.5.0"
resolved "https://registry.yarnpkg.com/@polka/url/-/url-0.5.0.tgz#b21510597fd601e5d7c95008b76bf0d254ebfd31"
integrity sha512-oZLYFEAzUKyi3SKnXvj32ZCEGH6RDnao7COuCVhDydMS9NrCSVXhM79VaKyP5+Zc33m0QXEd2DN3UkU7OsHcfw==
"@sendgrid/client@^7.1.1":
version "7.6.0"
resolved "https://registry.yarnpkg.com/@sendgrid/client/-/client-7.6.0.tgz#f90cb8759c96e1d90224f29ad98f8fdc2be287f3"
@ -2065,6 +2098,11 @@
resolved "https://registry.yarnpkg.com/@spectrum-css/illustratedmessage/-/illustratedmessage-3.0.8.tgz#69ef0c935bcc5027f233a78de5aeb0064bf033cb"
integrity sha512-HvC4dywDi11GdrXQDCvKQ0vFlrXLTyJuc9UKf7meQLCGoJbGYDBwe+tHXNK1c6gPMD9BoL6pPMP1K/vRzR4EBQ==
"@spectrum-css/inlinealert@^2.0.1":
version "2.0.6"
resolved "https://registry.yarnpkg.com/@spectrum-css/inlinealert/-/inlinealert-2.0.6.tgz#4c5e923a1f56a96cc1adb30ef1f06ae04f2c6376"
integrity sha512-OpvvoWP02wWyCnF4IgG8SOPkXymovkC9cGtgMS1FdDubnG3tJZB/JeKTsRR9C9Vt3WBaOmISRdSKlZ4lC9CFzA==
"@spectrum-css/inputgroup@^3.0.2":
version "3.0.8"
resolved "https://registry.yarnpkg.com/@spectrum-css/inputgroup/-/inputgroup-3.0.8.tgz#fc23afc8a73c24d17249c9d2337e8b42085b298b"
@ -3979,11 +4017,6 @@ clone-response@1.0.2, clone-response@^1.0.2:
dependencies:
mimic-response "^1.0.0"
clone@^2.1.1:
version "2.1.2"
resolved "https://registry.yarnpkg.com/clone/-/clone-2.1.2.tgz#1b7f4b9f591f1e8f83670401600345a02887435f"
integrity sha1-G39Ln1kfHo+DZwQBYANFoCiHQ18=
cls-hooked@^4.2.2:
version "4.2.2"
resolved "https://registry.yarnpkg.com/cls-hooked/-/cls-hooked-4.2.2.tgz#ad2e9a4092680cdaffeb2d3551da0e225eae1908"
@ -4184,11 +4217,6 @@ configstore@^5.0.1:
write-file-atomic "^3.0.0"
xdg-basedir "^4.0.0"
console-clear@^1.1.0:
version "1.1.1"
resolved "https://registry.yarnpkg.com/console-clear/-/console-clear-1.1.1.tgz#995e20cbfbf14dd792b672cde387bd128d674bf7"
integrity sha512-pMD+MVR538ipqkG5JXeOEbKWS5um1H4LUUccUQG68qpeqBYbzYy79Gh55jkd2TtPdRfUaLWdv6LPP//5Zt0aPQ==
consolidate@^0.16.0:
version "0.16.0"
resolved "https://registry.yarnpkg.com/consolidate/-/consolidate-0.16.0.tgz#a11864768930f2f19431660a65906668f5fbdc16"
@ -4536,18 +4564,6 @@ dedent@^0.7.0:
resolved "https://registry.yarnpkg.com/dedent/-/dedent-0.7.0.tgz#2495ddbaf6eb874abb0e1be9df22d2e5a544326c"
integrity sha1-JJXduvbrh0q7Dhvp3yLS5aVEMmw=
deep-equal@^1.0.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-1.1.1.tgz#b5c98c942ceffaf7cb051e24e1434a25a2e6076a"
integrity sha512-yd9c5AdiqVcR+JjcwUQb9DkhJc8ngNr0MahEBGvDiJw8puWab2yZlh+nkasOnZP+EGTAP6rRp2JzJhJZzvNF8g==
dependencies:
is-arguments "^1.0.4"
is-date-object "^1.0.1"
is-regex "^1.0.4"
object-is "^1.0.1"
object-keys "^1.1.1"
regexp.prototype.flags "^1.2.0"
deep-equal@~1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/deep-equal/-/deep-equal-1.0.1.tgz#f5d260292b660e084eff4cdbc9f08ad3247448b5"
@ -4758,11 +4774,6 @@ domexception@^2.0.1:
dependencies:
webidl-conversions "^5.0.0"
domino@^2.1.6:
version "2.1.6"
resolved "https://registry.yarnpkg.com/domino/-/domino-2.1.6.tgz#fe4ace4310526e5e7b9d12c7de01b7f485a57ffe"
integrity sha512-3VdM/SXBZX2omc9JF9nOPCtDaYQ67BGp5CoLpIQlO2KCAPETs8TcDHacF26jXadGbvUteZzRTeos2fhID5+ucQ==
dot-prop@^5.2.0:
version "5.3.0"
resolved "https://registry.yarnpkg.com/dot-prop/-/dot-prop-5.3.0.tgz#90ccce708cd9cd82cc4dc8c3ddd9abdd55b20e88"
@ -5341,11 +5352,6 @@ event-target-shim@^5.0.0:
resolved "https://registry.yarnpkg.com/event-target-shim/-/event-target-shim-5.0.1.tgz#5d4d3ebdf9583d63a5333ce2deb7480ab2b05789"
integrity sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==
eventemitter3@^2.0.3:
version "2.0.3"
resolved "https://registry.yarnpkg.com/eventemitter3/-/eventemitter3-2.0.3.tgz#b5e1079b59fb5e1ba2771c0a993be060a58c99ba"
integrity sha1-teEHm1n7XhuidxwKmTvgYKWMmbo=
events@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/events/-/events-1.1.1.tgz#9ebdb7635ad099c70dcc4c2a1f5004288e8bd924"
@ -5484,7 +5490,7 @@ extend-shallow@^3.0.0, extend-shallow@^3.0.2:
assign-symbols "^1.0.0"
is-extendable "^1.0.1"
extend@^3.0.0, extend@^3.0.2, extend@~3.0.2:
extend@^3.0.0, extend@~3.0.2:
version "3.0.2"
resolved "https://registry.yarnpkg.com/extend/-/extend-3.0.2.tgz#f8b1136b4071fbd8eb140aff858b1019ec2915fa"
integrity sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g==
@ -5532,11 +5538,6 @@ fast-deep-equal@^3.1.1:
resolved "https://registry.yarnpkg.com/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz#3a7d56b559d6cbc3eb512325244e619a65c6c525"
integrity sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==
fast-diff@1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/fast-diff/-/fast-diff-1.1.2.tgz#4b62c42b8e03de3f848460b639079920695d0154"
integrity sha512-KaJUt+M9t1qaIteSvjc6P3RbMdXsNhK61GRftR6SNxqmhthcd9MGIi4T+o0jD8LUSpSnSKXE20nLtJ3fOHxQig==
fast-glob@^3.1.1:
version "3.2.7"
resolved "https://registry.yarnpkg.com/fast-glob/-/fast-glob-3.2.7.tgz#fd6cb7a2d7e9aa7a7846111e85a196d6b2f766a1"
@ -5964,11 +5965,6 @@ get-paths@0.0.7:
dependencies:
pify "^4.0.1"
get-port@^3.2.0:
version "3.2.0"
resolved "https://registry.yarnpkg.com/get-port/-/get-port-3.2.0.tgz#dd7ce7de187c06c8bf353796ac71e099f0980ebc"
integrity sha1-3Xzn3hh8Bsi/NTeWrHHgmfCYDrw=
get-port@^5.1.1:
version "5.1.1"
resolved "https://registry.yarnpkg.com/get-port/-/get-port-5.1.1.tgz#0469ed07563479de6efb986baf053dcd7d4e3193"
@ -6709,14 +6705,6 @@ is-accessor-descriptor@^1.0.0:
dependencies:
kind-of "^6.0.0"
is-arguments@^1.0.4:
version "1.1.1"
resolved "https://registry.yarnpkg.com/is-arguments/-/is-arguments-1.1.1.tgz#15b3f88fda01f2a97fec84ca761a560f123efa9b"
integrity sha512-8Q7EARjzEnKpt/PCD7e1cgUS0a6X8u5tdSiMqXhojOdoV9TsMsiO+9VLC5vAmO8N7/GmXn7yjR8qnA6bVAEzfA==
dependencies:
call-bind "^1.0.2"
has-tostringtag "^1.0.0"
is-arrayish@^0.2.1:
version "0.2.1"
resolved "https://registry.yarnpkg.com/is-arrayish/-/is-arrayish-0.2.1.tgz#77c99840527aa8ecb1a8ba697b80645a7a926a9d"
@ -6981,7 +6969,7 @@ is-property@^1.0.2:
resolved "https://registry.yarnpkg.com/is-property/-/is-property-1.0.2.tgz#57fe1c4e48474edd65b09911f26b1cd4095dda84"
integrity sha1-V/4cTkhHTt1lsJkR8msc1Ald2oQ=
is-regex@^1.0.4, is-regex@^1.1.4:
is-regex@^1.1.4:
version "1.1.4"
resolved "https://registry.yarnpkg.com/is-regex/-/is-regex-1.1.4.tgz#eef5663cd59fa4c0ae339505323df6854bb15958"
integrity sha512-kvRdxDsxZjhzUX07ZnLydzS1TU/TJlTUHHY4YLL87e37oUA49DfkLqgy+VjFocowy29cKvcSiu+kIv728jTTVg==
@ -8267,7 +8255,7 @@ klaw-sync@^6.0.0:
dependencies:
graceful-fs "^4.1.11"
kleur@^3.0.0, kleur@^3.0.3:
kleur@^3.0.3:
version "3.0.3"
resolved "https://registry.yarnpkg.com/kleur/-/kleur-3.0.3.tgz#a79c9ecc86ee1ce3fa6206d1216c501f147fc07e"
integrity sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==
@ -8691,11 +8679,6 @@ loader-utils@^2.0.0:
emojis-list "^3.0.0"
json5 "^2.1.2"
local-access@^1.0.1:
version "1.1.0"
resolved "https://registry.yarnpkg.com/local-access/-/local-access-1.1.0.tgz#e007c76ba2ca83d5877ba1a125fc8dfe23ba4798"
integrity sha512-XfegD5pyTAfb+GY6chk283Ox5z8WexG56OvM06RWLpAc/UHozO8X6xAxEkIitZOtsSMM1Yr3DkHgW5W+onLhCw==
locate-path@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/locate-path/-/locate-path-3.0.0.tgz#dbec3b3ab759758071b58fe59fc41871af21400e"
@ -8962,17 +8945,6 @@ map-visit@^1.0.0:
dependencies:
object-visit "^1.0.0"
markdown-it@^12.0.2:
version "12.3.0"
resolved "https://registry.yarnpkg.com/markdown-it/-/markdown-it-12.3.0.tgz#11490c61b412b8f41530319c005ecdcd4367171f"
integrity sha512-T345UZZ6ejQWTjG6PSEHplzNy5m4kF6zvUpHVDv8Snl/pEU0OxIK0jGg8YLVNwJvT8E0YJC7/2UvssJDk/wQCQ==
dependencies:
argparse "^2.0.1"
entities "~2.1.0"
linkify-it "^3.0.1"
mdurl "^1.0.1"
uc.micro "^1.0.5"
markdown-it@^12.2.0:
version "12.2.0"
resolved "https://registry.yarnpkg.com/markdown-it/-/markdown-it-12.2.0.tgz#091f720fd5db206f80de7a8d1f1a7035fd0d38db"
@ -9113,11 +9085,6 @@ mime@^1.3.4, mime@^1.4.1:
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==
mime@^2.3.1:
version "2.6.0"
resolved "https://registry.yarnpkg.com/mime/-/mime-2.6.0.tgz#a2a682a95cd4d0cb1d6257e28f83da7e35800367"
integrity sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg==
mimic-fn@^2.0.0, mimic-fn@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/mimic-fn/-/mimic-fn-2.1.0.tgz#7ed2c2ccccaf84d3ffcb7a69b57711fc2083401b"
@ -9202,11 +9169,6 @@ mri@1.1.4:
resolved "https://registry.yarnpkg.com/mri/-/mri-1.1.4.tgz#7cb1dd1b9b40905f1fac053abe25b6720f44744a"
integrity sha512-6y7IjGPm8AzlvoUrwAaw1tLnUBudaS3752vcd8JtrpGGQn+rXIe63LFVHm/YMwtqAuh+LJPCFdlLYPWM1nYn6w==
mri@^1.1.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/mri/-/mri-1.2.0.tgz#6721480fec2a11a4889861115a48b6cbe7cc8f0b"
integrity sha512-tzzskb3bG8LvYGFF/mDTpq3jpI6Q9wc3LEmBaghu+DdCssd1FakN7Bc0hVNmEyGq1bq3RgfkCb3cmQLpNPOroA==
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
@ -9524,14 +9486,6 @@ object-inspect@^1.11.0, object-inspect@^1.9.0:
resolved "https://registry.yarnpkg.com/object-inspect/-/object-inspect-1.11.0.tgz#9dceb146cedd4148a0d9e51ab88d34cf509922b1"
integrity sha512-jp7ikS6Sd3GxQfZJPyH3cjcbJF6GZPClgdV+EFygjFLQ5FmW/dRUnTd9PQ9k0JhoNDabWFbpF1yCdSWCC6gexg==
object-is@^1.0.1:
version "1.1.5"
resolved "https://registry.yarnpkg.com/object-is/-/object-is-1.1.5.tgz#b9deeaa5fc7f1846a0faecdceec138e5778f53ac"
integrity sha512-3cyDsyHgtmi7I7DfSSI2LDp6SK2lwvtbg0p0R1e0RvTqF5ceGx+K2dfSjm1bKDMVCFEDAQvy+o8c6a7VujOddw==
dependencies:
call-bind "^1.0.2"
define-properties "^1.1.3"
object-keys@^1.0.12, object-keys@^1.0.6, object-keys@^1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/object-keys/-/object-keys-1.1.1.tgz#1c47f272df277f3b1daf061677d9c82e2322c60e"
@ -9760,11 +9714,6 @@ pako@^1.0.5:
resolved "https://registry.yarnpkg.com/pako/-/pako-1.0.11.tgz#6c9599d340d54dfd3946380252a35705a6b992bf"
integrity sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==
parchment@^1.1.4:
version "1.1.4"
resolved "https://registry.yarnpkg.com/parchment/-/parchment-1.1.4.tgz#aeded7ab938fe921d4c34bc339ce1168bc2ffde5"
integrity sha512-J5FBQt/pM2inLzg4hEWmzQx/8h8D0CiDxaG3vyp9rKrQRSDgBlhjdP5jQGgosEajXPSQouXGHOmVdgo7QmJuOg==
parent-module@^1.0.0:
version "1.0.1"
resolved "https://registry.yarnpkg.com/parent-module/-/parent-module-1.0.1.tgz#691d2709e78c79fae3a156622452d00762caaaa2"
@ -10609,27 +10558,6 @@ quick-format-unescaped@^4.0.3:
resolved "https://registry.yarnpkg.com/quick-format-unescaped/-/quick-format-unescaped-4.0.4.tgz#93ef6dd8d3453cbc7970dd614fad4c5954d6b5a7"
integrity sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg==
quill-delta@^3.6.2:
version "3.6.3"
resolved "https://registry.yarnpkg.com/quill-delta/-/quill-delta-3.6.3.tgz#b19fd2b89412301c60e1ff213d8d860eac0f1032"
integrity sha512-wdIGBlcX13tCHOXGMVnnTVFtGRLoP0imqxM696fIPwIf5ODIYUHIvHbZcyvGlZFiFhK5XzDC2lpjbxRhnM05Tg==
dependencies:
deep-equal "^1.0.1"
extend "^3.0.2"
fast-diff "1.1.2"
quill@^1.3.7:
version "1.3.7"
resolved "https://registry.yarnpkg.com/quill/-/quill-1.3.7.tgz#da5b2f3a2c470e932340cdbf3668c9f21f9286e8"
integrity sha512-hG/DVzh/TiknWtE6QmWAF/pxoZKYxfe3J/d/+ShUWkDvvkZQVTPeVmUJVu1uE6DDooC4fWTiCLh84ul89oNz5g==
dependencies:
clone "^2.1.1"
deep-equal "^1.0.1"
eventemitter3 "^2.0.3"
extend "^3.0.2"
parchment "^1.1.4"
quill-delta "^3.6.2"
randombytes@^2.1.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/randombytes/-/randombytes-2.1.0.tgz#df6f84372f0270dc65cdf6291349ab7a473d4f2a"
@ -10845,14 +10773,6 @@ regex-not@^1.0.0, regex-not@^1.0.2:
extend-shallow "^3.0.2"
safe-regex "^1.1.0"
regexp.prototype.flags@^1.2.0:
version "1.3.1"
resolved "https://registry.yarnpkg.com/regexp.prototype.flags/-/regexp.prototype.flags-1.3.1.tgz#7ef352ae8d159e758c0eadca6f8fcb4eef07be26"
integrity sha512-JiBdRBq91WlY7uRJ0ds7R+dU02i6LKi8r3BuQhNXn+kmeLN+EfHhfjqMRis1zJxnlu88hq/4dx0P2OP3APRTOA==
dependencies:
call-bind "^1.0.2"
define-properties "^1.1.3"
regexparam@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/regexparam/-/regexparam-2.0.0.tgz#059476767d5f5f87f735fc7922d133fd1a118c8c"
@ -11122,13 +11042,6 @@ rxjs@^6.6.0:
dependencies:
tslib "^1.9.0"
sade@^1.4.0:
version "1.7.4"
resolved "https://registry.yarnpkg.com/sade/-/sade-1.7.4.tgz#ea681e0c65d248d2095c90578c03ca0bb1b54691"
integrity sha512-y5yauMD93rX840MwUJr7C1ysLFBgMspsdTo4UVrDg3fXDvtwOyIqykhVAAm6fk/3au77773itJStObgK+LKaiA==
dependencies:
mri "^1.1.0"
safe-buffer@*, safe-buffer@^5.1.0, safe-buffer@^5.1.1, safe-buffer@^5.1.2, safe-buffer@~5.2.0:
version "5.2.1"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.2.1.tgz#1eaf9fa9bdb1fdd4ec75f58f9cdb4e6b7827eec6"
@ -11387,27 +11300,6 @@ simple-swizzle@^0.2.2:
dependencies:
is-arrayish "^0.3.1"
sirv-cli@^0.4.6:
version "0.4.6"
resolved "https://registry.yarnpkg.com/sirv-cli/-/sirv-cli-0.4.6.tgz#c28ab20deb3b34637f5a60863dc350f055abca04"
integrity sha512-/Vj85/kBvPL+n9ibgX6FicLE8VjidC1BhlX67PYPBfbBAphzR6i0k0HtU5c2arejfU3uzq8l3SYPCwl1x7z6Ww==
dependencies:
console-clear "^1.1.0"
get-port "^3.2.0"
kleur "^3.0.0"
local-access "^1.0.1"
sade "^1.4.0"
sirv "^0.4.6"
tinydate "^1.0.0"
sirv@^0.4.6:
version "0.4.6"
resolved "https://registry.yarnpkg.com/sirv/-/sirv-0.4.6.tgz#185e44eb93d24009dd183b7494285c5180b81f22"
integrity sha512-rYpOXlNbpHiY4nVXxuDf4mXPvKz1reZGap/LkWp9TvcZ84qD/nPBjjH/6GZsgIjVMbOslnY8YYULAyP8jMn1GQ==
dependencies:
"@polka/url" "^0.5.0"
mime "^2.3.1"
sisteransi@^1.0.5:
version "1.0.5"
resolved "https://registry.yarnpkg.com/sisteransi/-/sisteransi-1.0.5.tgz#134d681297756437cc05ca01370d3a7a571075ed"
@ -11956,13 +11848,6 @@ svelte-apexcharts@^1.0.2:
dependencies:
apexcharts "^3.19.2"
svelte-flatpickr@^2.4.0:
version "2.4.0"
resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-2.4.0.tgz#190871fc3305956c8c8fd3601cd036b8ac71ef49"
integrity sha512-UUC5Te+b0qi4POg7VDwfGh0m5W3Hf64OwkfOTj6FEe/dYZN4cBzpQ82EuuQl0CTbbBAsMkcjJcixV1d2V6EHCQ==
dependencies:
flatpickr "^4.5.2"
svelte-flatpickr@^3.1.0, svelte-flatpickr@^3.2.3:
version "3.2.4"
resolved "https://registry.yarnpkg.com/svelte-flatpickr/-/svelte-flatpickr-3.2.4.tgz#1824e26a5dc151d14906cfc7dfd100aefd1b072d"
@ -12272,11 +12157,6 @@ tinycolor2@^1.4.1:
resolved "https://registry.yarnpkg.com/tinycolor2/-/tinycolor2-1.4.2.tgz#3f6a4d1071ad07676d7fa472e1fac40a719d8803"
integrity sha512-vJhccZPs965sV/L2sU4oRQVAos0pQXwsvTLkWYdqJ+a8Q5kPFzJTuOFwy7UniPli44NKQGAglksjvOcpo95aZA==
tinydate@^1.0.0:
version "1.3.0"
resolved "https://registry.yarnpkg.com/tinydate/-/tinydate-1.3.0.tgz#e6ca8e5a22b51bb4ea1c3a2a4fd1352dbd4c57fb"
integrity sha512-7cR8rLy2QhYHpsBDBVYnnWXm8uRTr38RoZakFSW7Bs7PzfMPNZthuMLkwqZv7MTu8lhQ91cOFYS5a7iFj2oR3w==
tmp@^0.0.33:
version "0.0.33"
resolved "https://registry.yarnpkg.com/tmp/-/tmp-0.0.33.tgz#6d34335889768d21b2bcda0aa277ced3b1bfadf9"
@ -12488,13 +12368,6 @@ tunnel@0.0.6:
resolved "https://registry.yarnpkg.com/tunnel/-/tunnel-0.0.6.tgz#72f1314b34a5b192db012324df2cc587ca47f92c"
integrity sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==
turndown@^7.0.0:
version "7.1.1"
resolved "https://registry.yarnpkg.com/turndown/-/turndown-7.1.1.tgz#96992f2d9b40a1a03d3ea61ad31b5a5c751ef77f"
integrity sha512-BEkXaWH7Wh7e9bd2QumhfAXk5g34+6QUmmWx+0q6ThaVOLuLUqsnkq35HQ5SBHSaxjSfSM7US5o4lhJNH7B9MA==
dependencies:
domino "^2.1.6"
tweetnacl@^0.14.3, tweetnacl@~0.14.0:
version "0.14.5"
resolved "https://registry.yarnpkg.com/tweetnacl/-/tweetnacl-0.14.5.tgz#5ae68177f192d4456269d108afa93ff8743f4f64"

View file

@ -1,6 +1,6 @@
{
"name": "@budibase/string-templates",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"description": "Handlebars wrapper for Budibase templating.",
"main": "src/index.cjs",
"module": "dist/bundle.mjs",


@@ -1,7 +1,7 @@
{
"name": "@budibase/worker",
"email": "hi@budibase.com",
"version": "1.0.27-alpha.12",
"version": "1.0.27-alpha.13",
"description": "Budibase background service",
"main": "src/index.js",
"repository": {
@@ -29,8 +29,8 @@
"author": "Budibase",
"license": "GPL-3.0",
"dependencies": {
"@budibase/backend-core": "^1.0.27-alpha.12",
"@budibase/string-templates": "^1.0.27-alpha.12",
"@budibase/backend-core": "^1.0.27-alpha.13",
"@budibase/string-templates": "^1.0.27-alpha.13",
"@koa/router": "^8.0.0",
"@sentry/node": "^6.0.0",
"@techpass/passport-openidconnect": "^0.3.0",

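The source-file hunks below all apply the same refactor: backend-core modules are now required through the package's public subpath entry points (`@budibase/backend-core/auth`, `/utils`, `/constants`, and so on) instead of being destructured from the root export or pulled from `src/` internals. A minimal sketch of the before/after pattern, using only imports that appear in the hunks:

```js
// Before: destructure off the root export (evaluates the whole package),
// or reach into src/ paths that couple callers to internals.
const { passport } = require("@budibase/backend-core").auth
const { google } = require("@budibase/backend-core/src/middleware")

// After: require the dedicated public subpath directly.
const { passport } = require("@budibase/backend-core/auth")
const { google } = require("@budibase/backend-core/middleware")
const { Headers } = require("@budibase/backend-core/constants")
```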

@@ -1,7 +1,7 @@
const core = require("@budibase/backend-core")
const { getScopedConfig } = require("@budibase/backend-core/db")
const { google } = require("@budibase/backend-core/src/middleware")
const { oidc } = require("@budibase/backend-core/src/middleware")
const { google } = require("@budibase/backend-core/middleware")
const { oidc } = require("@budibase/backend-core/middleware")
const { Configs, EmailTemplatePurpose } = require("../../../constants")
const { sendEmail, isEmailConfigured } = require("../../../utilities/email")
const {


@@ -7,8 +7,10 @@ const {
} = require("@budibase/backend-core/db")
const { Configs } = require("../../../constants")
const email = require("../../../utilities/email")
const { upload, ObjectStoreBuckets } =
require("@budibase/backend-core").objectStore
const {
upload,
ObjectStoreBuckets,
} = require("@budibase/backend-core/objectStore")
const CouchDB = require("../../../db")
const { getGlobalDB, getTenantId } = require("@budibase/backend-core/tenancy")
const env = require("../../../environment")


@@ -3,8 +3,12 @@ const {
StaticDatabases,
generateNewUsageQuotaDoc,
} = require("@budibase/backend-core/db")
const { hash, getGlobalUserByEmail, saveUser, platformLogout } =
require("@budibase/backend-core").utils
const {
hash,
getGlobalUserByEmail,
saveUser,
platformLogout,
} = require("@budibase/backend-core/utils")
const { EmailTemplatePurpose } = require("../../../constants")
const { checkInviteCode } = require("../../../utilities/redis")
const { sendEmail } = require("../../../utilities/email")


@@ -2,8 +2,11 @@ const Router = require("@koa/router")
const compress = require("koa-compress")
const zlib = require("zlib")
const { routes } = require("./routes")
const { buildAuthMiddleware, auditLog, buildTenancyMiddleware } =
require("@budibase/backend-core").auth
const {
buildAuthMiddleware,
auditLog,
buildTenancyMiddleware,
} = require("@budibase/backend-core/auth")
const PUBLIC_ENDPOINTS = [
// old deprecated endpoints kept for backwards compat


@@ -54,7 +54,7 @@ describe("/api/global/auth", () => {
})
describe("oidc", () => {
const auth = require("@budibase/backend-core").auth
const auth = require("@budibase/backend-core/auth")
// mock the oidc strategy implementation and return value
strategyFactory = jest.fn()
@@ -104,4 +104,4 @@
})
})
})
})


@@ -1,10 +1,10 @@
const env = require("../../../../environment")
const controllers = require("./controllers")
const supertest = require("supertest")
const { jwt } = require("@budibase/backend-core").auth
const { Cookies } = require("@budibase/backend-core").constants
const { jwt } = require("@budibase/backend-core/auth")
const { Cookies } = require("@budibase/backend-core/constants")
const { Configs, LOGO_URL } = require("../../../../constants")
const { getGlobalUserByEmail } = require("@budibase/backend-core").utils
const { getGlobalUserByEmail } = require("@budibase/backend-core/utils")
const { createASession } = require("@budibase/backend-core/sessions")
const { newid } = require("@budibase/backend-core/src/hashing")
const { TENANT_ID } = require("./structures")


@@ -1,4 +1,4 @@
const { Configs } = require("@budibase/backend-core").constants
const { Configs } = require("@budibase/backend-core/constants")
exports.LOGO_URL =
"https://d33wubrfki0l68.cloudfront.net/aac32159d7207b5085e74a7ef67afbb7027786c5/2b1fd/img/logo/bb-emblem.svg"


@@ -6,7 +6,7 @@ const Koa = require("koa")
const destroyable = require("server-destroy")
const koaBody = require("koa-body")
const koaSession = require("koa-session")
const { passport } = require("@budibase/backend-core").auth
const { passport } = require("@budibase/backend-core/auth")
const logger = require("koa-pino-logger")
const http = require("http")
const api = require("./api")


@@ -1,5 +1,5 @@
const env = require("../environment")
const { Headers } = require("@budibase/backend-core").constants
const { Headers } = require("@budibase/backend-core/constants")
/**
* This is a restricted endpoint in the cloud.


@@ -1,5 +1,5 @@
const { Client, utils } = require("@budibase/backend-core/redis")
const { newid } = require("@budibase/backend-core").utils
const { newid } = require("@budibase/backend-core/utils")
function getExpirySecondsForDB(db) {
switch (db) {