
Merge remote-tracking branch 'origin/master' into feature/app-list-actions

commit 42893c1218 by Dean, 2024-03-04 10:00:03 +00:00
110 changed files with 3532 additions and 891 deletions

i18n/README.kr.md (new file, 221 lines)

@ -0,0 +1,221 @@
<p align="center">
<a href="https://www.budibase.com">
<img alt="Budibase" src="https://res.cloudinary.com/daog6scxm/image/upload/v1696515725/Branding/Assets/Symbol/RGB/Full%20Colour/Budibase_Symbol_RGB_FullColour_cbqvha_1_z5cwq2.svg" width="60" />
</a>
</p>
<h1 align="center">
Budibase
</h1>
<h3 align="center">
Build custom business tools on your own infrastructure in minutes.
</h3>
<p align="center">
Budibase is an open-source low-code platform that lets developers and IT professionals build and automate custom applications in minutes.
</p>
<h3 align="center">
🤖 🎨 🚀
</h3>
<p align="center">
<img alt="Budibase design ui" src="https://res.cloudinary.com/daog6scxm/image/upload/v1633524049/ui/design-ui-wide-mobile_gdaveq.jpg">
</p>
<p align="center">
<a href="https://github.com/Budibase/budibase/releases">
<img alt="GitHub all releases" src="https://img.shields.io/github/downloads/Budibase/budibase/total">
</a>
<a href="https://github.com/Budibase/budibase/releases">
<img alt="GitHub release (latest by date)" src="https://img.shields.io/github/v/release/Budibase/budibase">
</a>
<a href="https://twitter.com/intent/follow?screen_name=budibase">
<img src="https://img.shields.io/twitter/follow/budibase?style=social" alt="Follow @budibase" />
</a>
<img src="https://img.shields.io/badge/Contributor%20Covenant-v2.0%20adopted-ff69b4.svg" alt="Code of conduct" />
<a href="https://codecov.io/gh/Budibase/budibase">
<img src="https://codecov.io/gh/Budibase/budibase/graph/badge.svg?token=E8W2ZFXQOH"/>
</a>
</p>
<h3 align="center">
<a href="https://docs.budibase.com/getting-started">소개</a>
<span> · </span>
<a href="https://docs.budibase.com">문서</a>
<span> · </span>
<a href="https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas">기능 요청</a>
<span> · </span>
<a href="https://github.com/Budibase/budibase/issues">버그 보고</a>
<span> · </span>
지원: <a href="https://github.com/Budibase/budibase/discussions">토론</a>
</h3>
<br /><br />
## ✨ Features
### Build "real" software
With Budibase you can build high-performance single-page applications. You can also make them responsive, giving your users a great experience.
<br /><br />
### Open source and extensible
Budibase is open source and released under the GPL v3 license. This should give you confidence that Budibase will always be around. Because we aim to be developer friendly, you can fork and modify the source code as much as you like, or contribute directly to Budibase.
<br /><br />
### Load existing data or start from scratch
Budibase lets you pull in data from a number of sources, including MongoDB, CouchDB, PostgreSQL, MySQL, Airtable, S3, DynamoDB, or a REST API.
Or, if you prefer, you can start from scratch and build your own application with Budibase without any external tools. [Suggest a data source](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
<p align="center">
<img alt="Budibase data" src="https://res.cloudinary.com/daog6scxm/image/upload/v1636970242/Out%20of%20beta%20launch/data_n1tlhf.png">
</p>
<br /><br />
### Design and build applications with powerful built-in components
Budibase comes with beautifully designed, powerful components that make building a UI easy. It also offers plenty of CSS styling options, so you can get as creative as you like.
[Request a new component](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
<p align="center">
<img alt="Budibase design" src="https://res.cloudinary.com/daog6scxm/image/upload/v1636970243/Out%20of%20beta%20launch/design-like-a-pro_qhlfeu.gif">
</p>
<br /><br />
### Automate processes, integrate with other tools, and connect to webhooks!
Save time by automating workflows and manual processes. From connecting webhook events to automating emails, just tell Budibase what to do and it handles the rest. [Create a new automation](https://github.com/Budibase/automations) or [request one](https://github.com/Budibase/budibase/discussions?discussions_q=category%3AIdeas).
<p align="center">
<img alt="Budibase automations" src="https://res.cloudinary.com/daog6scxm/image/upload/v1636970486/Out%20of%20beta%20launch/automation_riro7u.png">
</p>
<br /><br />
### Your favourite tools
Budibase integrates with a range of tools, so you can build applications however you prefer.
<p align="center">
<img alt="Budibase integrations" src="https://res.cloudinary.com/daog6scxm/image/upload/v1636970242/Out%20of%20beta%20launch/integrations_kc7dqt.png">
</p>
<br /><br />
### A paradise for admins
Budibase scales flexibly to projects of any size. You can self-host it on your own or your organisation's servers and manage users, onboarding, SMTP, apps, groups, theming and more in one place. You can also give users or groups an app portal and hand user management over to group admins.
- Promo video: https://youtu.be/xoljVpty_Kw
<br /><br /><br />
## 🏁 Getting started
Host Budibase on your own infrastructure using Docker, Kubernetes, or Digital Ocean, or use Budibase in the cloud if you just want to build applications quickly without any hosting worries.
### [Get started with self-hosted Budibase](https://docs.budibase.com/docs/hosting-methods)
- [Docker - single ARM compatible image](https://docs.budibase.com/docs/docker)
- [Docker Compose](https://docs.budibase.com/docs/docker-compose)
- [Kubernetes](https://docs.budibase.com/docs/kubernetes-k8s)
- [Digital Ocean](https://docs.budibase.com/docs/digitalocean)
- [Portainer](https://docs.budibase.com/docs/portainer)
### [Get started with Budibase in the cloud](https://budibase.com)
<br /><br />
## 🎓 Learning Budibase
Read the [Budibase documentation](https://docs.budibase.com/docs).
<br />
<br /><br />
## 💬 Community
We invite you to join the Budibase community, where you can ask questions, help others, and have great conversations with other Budibase users.
[GitHub discussions](https://github.com/Budibase/budibase/discussions)
<br /><br /><br />
## ❗ Code of conduct
Budibase takes special care to provide a welcoming, mutually respectful environment for people from all walks of life. We expect the same from our community.
Read our [**code of conduct**](https://github.com/Budibase/budibase/blob/HEAD/.github/CODE_OF_CONDUCT.md).
<br />
<br /><br />
## 🙌 Contributing to Budibase
From filing a bug report to fixing a bug in the code, every contribution is appreciated and welcome. If you plan to implement a new feature or change an API, please [open a new issue](https://github.com/Budibase/budibase/issues) first,
so we can make sure your effort is not wasted.
Instructions for setting up a Budibase environment for development are available [here](https://github.com/Budibase/budibase/tree/HEAD/docs/CONTRIBUTING.md).
### Not sure where to start?
This is the best place to start contributing: the [First time issues project](https://github.com/Budibase/budibase/projects/22).
### Repository organisation
Budibase is a monorepo managed by Lerna. Lerna builds and publishes the Budibase packages, keeping them in sync whenever changes are made. Broadly, these are the packages that make up Budibase:
- [packages/builder](https://github.com/Budibase/budibase/tree/HEAD/packages/builder) - contains the code for the client-side Svelte application of the Budibase builder.
- [packages/client](https://github.com/Budibase/budibase/tree/HEAD/packages/client) - the module that runs in the browser and renders a Budibase application from its definition.
- [packages/server](https://github.com/Budibase/budibase/tree/HEAD/packages/server) - the server side of Budibase. This Koa application is responsible for serving the builder with what it needs to create Budibase applications. It also provides an API for interacting with the database and file storage.
For more details, see [CONTRIBUTING.md](https://github.com/Budibase/budibase/blob/HEAD/docs/CONTRIBUTING.md)
<br /><br />
## 📝 License
Budibase is open source, licensed under the [GPL v3](https://www.gnu.org/licenses/gpl-3.0.en.html). The client and component libraries are licensed under the [MPL](https://directory.fsf.org/wiki/License:MPL-2.0) - so the applications you build can be licensed however you like.
<br /><br />
## ⭐ Stargazers over time
[![Stargazers over time](https://starchart.cc/Budibase/budibase.svg)](https://starchart.cc/Budibase/budibase)
If you are having issues between updates of the builder, please use [this guide](https://github.com/Budibase/budibase/blob/HEAD/docs/CONTRIBUTING.md#troubleshooting) to clear your environment.
<br /><br />
## Contributors ✨
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="http://martinmck.com"><img src="https://avatars1.githubusercontent.com/u/11256663?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Martin McKeaveney</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=shogunpurple" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=shogunpurple" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=shogunpurple" title="Tests">⚠️</a> <a href="#infra-shogunpurple" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="http://www.michaeldrury.co.uk/"><img src="https://avatars2.githubusercontent.com/u/4407001?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Michael Drury</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=mike12345567" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=mike12345567" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=mike12345567" title="Tests">⚠️</a> <a href="#infra-mike12345567" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://github.com/aptkingston"><img src="https://avatars3.githubusercontent.com/u/9075550?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Andrew Kingston</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=aptkingston" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=aptkingston" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=aptkingston" title="Tests">⚠️</a> <a href="#design-aptkingston" title="Design">🎨</a></td>
<td align="center"><a href="https://budibase.com/"><img src="https://avatars3.githubusercontent.com/u/3524181?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Michael Shanks</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=mjashanks" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=mjashanks" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=mjashanks" title="Tests">⚠️</a></td>
<td align="center"><a href="https://github.com/kevmodrome"><img src="https://avatars3.githubusercontent.com/u/534488?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Kevin Åberg Kultalahti</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=kevmodrome" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=kevmodrome" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=kevmodrome" title="Tests">⚠️</a></td>
<td align="center"><a href="https://www.budibase.com/"><img src="https://avatars2.githubusercontent.com/u/49767913?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Joe</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=joebudi" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=joebudi" title="Code">💻</a> <a href="#content-joebudi" title="Content">🖋</a> <a href="#design-joebudi" title="Design">🎨</a></td>
<td align="center"><a href="https://github.com/Rory-Powell"><img src="https://avatars.githubusercontent.com/u/8755148?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Rory Powell</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=Rory-Powell" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=Rory-Powell" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=Rory-Powell" title="Tests">⚠️</a></td>
</tr>
<tr>
<td align="center"><a href="https://github.com/PClmnt"><img src="https://avatars.githubusercontent.com/u/5665926?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Peter Clement</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=PClmnt" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=PClmnt" title="Documentation">📖</a> <a href="https://github.com/Budibase/budibase/commits?author=PClmnt" title="Tests">⚠️</a></td>
<td align="center"><a href="https://github.com/Conor-Mack"><img src="https://avatars1.githubusercontent.com/u/36074859?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Conor_Mack</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=Conor-Mack" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=Conor-Mack" title="Tests">⚠️</a></td>
<td align="center"><a href="https://github.com/pngwn"><img src="https://avatars1.githubusercontent.com/u/12937446?v=4?s=100" width="100px;" alt=""/><br /><sub><b>pngwn</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=pngwn" title="Code">💻</a> <a href="https://github.com/Budibase/budibase/commits?author=pngwn" title="Tests">⚠️</a></td>
<td align="center"><a href="https://github.com/HugoLd"><img src="https://avatars0.githubusercontent.com/u/26521848?v=4?s=100" width="100px;" alt=""/><br /><sub><b>HugoLd</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=HugoLd" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/victoriasloan"><img src="https://avatars.githubusercontent.com/u/9913651?v=4?s=100" width="100px;" alt=""/><br /><sub><b>victoriasloan</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=victoriasloan" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/yashank09"><img src="https://avatars.githubusercontent.com/u/37672190?v=4?s=100" width="100px;" alt=""/><br /><sub><b>yashank09</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=yashank09" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/SOVLOOKUP"><img src="https://avatars.githubusercontent.com/u/53158137?v=4?s=100" width="100px;" alt=""/><br /><sub><b>SOVLOOKUP</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=SOVLOOKUP" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/seoulaja"><img src="https://avatars.githubusercontent.com/u/15101654?v=4?s=100" width="100px;" alt=""/><br /><sub><b>seoulaja</b></sub></a><br /><a href="#translation-seoulaja" title="Translation">🌍</a></td>
<td align="center"><a href="https://github.com/mslourens"><img src="https://avatars.githubusercontent.com/u/1907152?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Maurits Lourens</b></sub></a><br /><a href="https://github.com/Budibase/budibase/commits?author=mslourens" title="Tests">⚠️</a> <a href="https://github.com/Budibase/budibase/commits?author=mslourens" title="Code">💻</a></td>
</tr>
</table>
<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification.
Contributions of any kind are welcome!


@ -1,5 +1,5 @@
{
"version": "2.20.10",
"version": "2.21.0",
"npmClient": "yarn",
"packages": [
"packages/*",

@ -1 +1 @@
Subproject commit de6d44c372a7f48ca0ce8c6c0c19311d4bc21646
Subproject commit 19f7a5829f4d23cbc694136e45d94482a59a475a


@ -11,6 +11,7 @@ import {
Document,
isDocument,
RowResponse,
RowValue,
} from "@budibase/types"
import { getCouchInfo } from "./connections"
import { directCouchUrlCall } from "./utils"
@ -221,7 +222,7 @@ export class DatabaseImpl implements Database {
})
}
async allDocs<T extends Document>(
async allDocs<T extends Document | RowValue>(
params: DatabaseQueryOpts
): Promise<AllDocsResponse<T>> {
return this.performCall(db => {


@ -1,5 +1,4 @@
import {
DocumentScope,
DocumentDestroyResponse,
DocumentInsertResponse,
DocumentBulkResponse,
@ -13,6 +12,7 @@ import {
DatabasePutOpts,
DatabaseQueryOpts,
Document,
RowValue,
} from "@budibase/types"
import tracer from "dd-trace"
import { Writable } from "stream"
@ -79,7 +79,7 @@ export class DDInstrumentedDatabase implements Database {
})
}
allDocs<T extends Document>(
allDocs<T extends Document | RowValue>(
params: DatabaseQueryOpts
): Promise<AllDocsResponse<T>> {
return tracer.trace("db.allDocs", span => {


@ -74,7 +74,7 @@ export function getGlobalIDFromUserMetadataID(id: string) {
* Generates a template ID.
* @param ownerId The owner/user of the template, this could be global or a workspace level.
*/
export function generateTemplateID(ownerId: any) {
export function generateTemplateID(ownerId: string) {
return `${DocumentType.TEMPLATE}${SEPARATOR}${ownerId}${SEPARATOR}${newid()}`
}
@ -105,7 +105,7 @@ export function prefixRoleID(name: string) {
* Generates a new dev info document ID - this is scoped to a user.
* @returns The new dev info ID which info for dev (like api key) can be stored under.
*/
export const generateDevInfoID = (userId: any) => {
export const generateDevInfoID = (userId: string) => {
return `${DocumentType.DEV_INFO}${SEPARATOR}${userId}`
}


@ -6,7 +6,7 @@ import { Plugin } from "@budibase/types"
// URLS
export function enrichPluginURLs(plugins: Plugin[]) {
export function enrichPluginURLs(plugins?: Plugin[]): Plugin[] {
if (!plugins || !plugins.length) {
return []
}


@ -1,11 +1,11 @@
<script>
import FontAwesomeIcon from "./FontAwesomeIcon.svelte"
import { Popover, Heading, Body } from "@budibase/bbui"
import { licensing } from "stores/portal"
import { isEnabled, TENANT_FEATURE_FLAGS } from "helpers/featureFlags"
import { licensing } from "stores/portal"
import { isPremiumOrAbove } from "helpers/planTitle"
$: isBusinessAndAbove =
$licensing.isBusinessPlan || $licensing.isEnterprisePlan
$: premiumOrAboveLicense = isPremiumOrAbove($licensing?.license.plan.type)
let show
let hide
@ -56,22 +56,25 @@
<div class="divider" />
{#if isEnabled(TENANT_FEATURE_FLAGS.LICENSING)}
<a
href={isBusinessAndAbove
href={premiumOrAboveLicense
? "mailto:support@budibase.com"
: "/builder/portal/account/usage"}
>
<div class="premiumLinkContent" class:disabled={!isBusinessAndAbove}>
<div
class="premiumLinkContent"
class:disabled={!premiumOrAboveLicense}
>
<div class="icon">
<FontAwesomeIcon name="fa-solid fa-envelope" />
</div>
<Body size="S">Email support</Body>
</div>
{#if !isBusinessAndAbove}
{#if !premiumOrAboveLicense}
<div class="premiumBadge">
<div class="icon">
<FontAwesomeIcon name="fa-solid fa-lock" />
</div>
<Body size="XS">Business</Body>
<Body size="XS">Premium</Body>
</div>
{/if}
</a>


@ -1,9 +1,9 @@
<script>
import { Label, Select, Body, Multiselect } from "@budibase/bbui"
import { findAllMatchingComponents, findComponent } from "helpers/components"
import { selectedScreen } from "stores/builder"
import { Label, Select, Body } from "@budibase/bbui"
import { onMount } from "svelte"
import { getDatasourceForProvider, getSchemaForDatasource } from "dataBinding"
import ColumnEditor from "../../ColumnEditor/ColumnEditor.svelte"
import { findAllMatchingComponents } from "helpers/components"
import { selectedScreen } from "stores/builder"
export let parameters
@ -18,37 +18,65 @@
},
]
const DELIMITERS = [
{
label: ",",
value: ",",
},
{
label: ";",
value: ";",
},
{
label: ":",
value: ":",
},
{
label: "|",
value: "|",
},
{
label: "~",
value: "~",
},
{
label: "[tab]",
value: "\t",
},
{
label: "[space]",
value: " ",
},
]
$: tables = findAllMatchingComponents($selectedScreen?.props, component =>
component._component.endsWith("table")
).map(table => ({
label: table._instanceName,
value: table._id,
}))
)
$: tableBlocks = findAllMatchingComponents(
$selectedScreen?.props,
component => component._component.endsWith("tableblock")
).map(block => ({
label: block._instanceName,
value: `${block._id}-table`,
)
$: components = tables.concat(tableBlocks)
$: componentOptions = components.map(table => ({
label: table._instanceName,
value: table._component.includes("tableblock")
? `${table._id}-table`
: table._id,
}))
$: componentOptions = tables.concat(tableBlocks)
$: columnOptions = getColumnOptions(parameters.tableComponentId)
const getColumnOptions = tableId => {
// Strip block suffix if block component
if (tableId?.includes("-")) {
tableId = tableId.split("-")[0]
}
const selectedTable = findComponent($selectedScreen?.props, tableId)
const datasource = getDatasourceForProvider($selectedScreen, selectedTable)
const { schema } = getSchemaForDatasource($selectedScreen, datasource)
return Object.keys(schema || {})
}
$: selectedTableId = parameters.tableComponentId?.includes("-")
? parameters.tableComponentId.split("-")[0]
: parameters.tableComponentId
$: selectedTable = components.find(
component => component._id === selectedTableId
)
onMount(() => {
if (!parameters.type) {
parameters.type = "csv"
}
if (!parameters.delimiter) {
parameters.delimiter = ","
}
})
</script>
@ -67,13 +95,30 @@
options={componentOptions}
on:change={() => (parameters.columns = [])}
/>
<span />
<Label small>Export as</Label>
<Select bind:value={parameters.type} options={FORMATS} />
<Select
bind:value={parameters.delimiter}
placeholder={null}
options={DELIMITERS}
disabled={parameters.type !== "csv"}
/>
<Label small>Export columns</Label>
<Multiselect
placeholder="All columns"
bind:value={parameters.columns}
options={columnOptions}
<ColumnEditor
value={parameters.columns}
allowCellEditing={false}
componentInstance={selectedTable}
on:change={e => {
const columns = e.detail
parameters.columns = columns
parameters.customHeaders = columns.reduce((headerMap, column) => {
return {
[column.name]: column.displayName,
...headerMap,
}
}, {})
}}
/>
</div>
</div>
@ -97,8 +142,8 @@
.params {
display: grid;
column-gap: var(--spacing-xs);
row-gap: var(--spacing-s);
grid-template-columns: 90px 1fr;
row-gap: var(--spacing-m);
grid-template-columns: 90px 1fr 90px;
align-items: center;
}
</style>


@ -29,6 +29,12 @@
allowLinks: true,
})
$: {
value = (value || []).filter(
column => (schema || {})[column.name || column] !== undefined
)
}
const getText = value => {
if (!value?.length) {
return "All columns"


@ -116,7 +116,6 @@
$: pagerText = `Page ${currentPage} of ${totalPages}`
</script>
<!-- svelte-ignore a11y-click-events-have-key-events -->
<div bind:this={buttonAnchor}>
<ActionButton on:click={dropdown.show}>
{displayValue}


@ -17,6 +17,10 @@ export function breakQueryString(qs) {
return paramObj
}
function isEncoded(str) {
return typeof str == "string" && decodeURIComponent(str) !== str
}
export function buildQueryString(obj) {
let str = ""
if (obj) {
@ -35,7 +39,7 @@ export function buildQueryString(obj) {
value = value.replace(binding, marker)
bindingMarkers[marker] = binding
})
let encoded = encodeURIComponent(value || "")
let encoded = isEncoded(value) ? value : encodeURIComponent(value || "")
Object.entries(bindingMarkers).forEach(([marker, binding]) => {
encoded = encoded.replace(marker, binding)
})
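For clarity, here is a small standalone sketch of the guard introduced above: values that already look percent-encoded are passed through untouched, so they are never encoded twice. `encodeOnce` is an illustrative name for this sketch, not part of the module.

```ts
// Sketch of the double-encoding guard (mirrors the diff above, illustrative only).
function isEncoded(str: unknown): boolean {
  return typeof str === "string" && decodeURIComponent(str) !== str
}

function encodeOnce(value: string): string {
  // Already-encoded values are left alone; everything else is encoded once.
  return isEncoded(value) ? value : encodeURIComponent(value || "")
}

console.log(encodeOnce("a,b,c")) // "a%2Cb%2Cc"
console.log(encodeOnce("a%2Cb%2Cc")) // "a%2Cb%2Cc" (unchanged, matching the new test)
```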


@ -25,3 +25,7 @@ export function getFormattedPlanName(userPlanType) {
}
return `${planName} Plan`
}
export function isPremiumOrAbove(userPlanType) {
return ![PlanType.PRO, PlanType.TEAM, PlanType.FREE].includes(userPlanType)
}
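A hedged sketch of how this helper gates the "Email support" link in the help menu change above; only the three excluded plans are taken from the diff, and the `PlanType` string values here are stand-ins for illustration.

```ts
// Illustrative only: enum values are assumed, not taken from the real types package.
enum PlanType {
  FREE = "free",
  TEAM = "team",
  PRO = "pro",
  PREMIUM = "premium",
  ENTERPRISE = "enterprise",
}

function isPremiumOrAbove(userPlanType: PlanType): boolean {
  return ![PlanType.PRO, PlanType.TEAM, PlanType.FREE].includes(userPlanType)
}

console.log(isPremiumOrAbove(PlanType.TEAM)) // false -> "Email support" stays locked
console.log(isPremiumOrAbove(PlanType.PREMIUM)) // true  -> mailto link is enabled
```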


@ -39,4 +39,11 @@ describe("check query string utils", () => {
expect(broken.key1).toBe(obj2.key1)
expect(broken.key2).toBe(obj2.key2)
})
it("should not encode a URL more than once when building the query string", () => {
const queryString = buildQueryString({
values: "a%2Cb%2Cc",
})
expect(queryString).toBe("values=a%2Cb%2Cc")
})
})


@ -69,11 +69,12 @@
// brought back to the same screen.
const topItemNavigate = path => () => {
const activeTopNav = $layout.children.find(c => $isActive(c.path))
if (!activeTopNav) return
builderStore.setPreviousTopNavPath(
activeTopNav.path,
window.location.pathname
)
if (activeTopNav) {
builderStore.setPreviousTopNavPath(
activeTopNav.path,
window.location.pathname
)
}
$goto($builderStore.previousTopNavPath[path] || path)
}


@ -341,7 +341,11 @@ const exportDataHandler = async action => {
tableId: selection.tableId,
rows: selection.selectedRows,
format: action.parameters.type,
columns: action.parameters.columns,
columns: action.parameters.columns?.map(
column => column.name || column
),
delimiter: action.parameters.delimiter,
customHeaders: action.parameters.customHeaders,
})
download(
new Blob([data], { type: "text/plain" }),


@ -89,13 +89,24 @@ export const buildRowEndpoints = API => ({
* @param rows the array of rows to export
* @param format the format to export (csv or json)
* @param columns which columns to export (all if undefined)
* @param delimiter how values should be separated in a CSV (default is comma)
*/
exportRows: async ({ tableId, rows, format, columns, search }) => {
exportRows: async ({
tableId,
rows,
format,
columns,
search,
delimiter,
customHeaders,
}) => {
return await API.post({
url: `/api/${tableId}/rows/exportRows?format=${format}`,
body: {
rows,
columns,
delimiter,
customHeaders,
...search,
},
parseResponse: async response => {
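As a rough illustration of what the extended endpoint now sends, here is the shape of the request body with the two new fields; the interface name and example values are assumptions for this sketch, not exported types.

```ts
// Illustrative request body for POST /api/:tableId/rows/exportRows?format=csv
interface ExportRowsBody {
  rows?: string[] // rows to export
  columns?: string[] // which columns to export (all if undefined)
  delimiter?: string // new: CSV value separator, defaults to ","
  customHeaders?: Record<string, string> // new: column name -> header label
}

const body: ExportRowsBody = {
  rows: ["row-id-1", "row-id-2"],
  columns: ["name", "startDate"],
  delimiter: ";",
  customHeaders: { startDate: "Start date" },
}

console.log(JSON.stringify(body, null, 2))
```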


@ -1,3 +1,3 @@
#!/bin/bash
docker-compose down
docker-compose down -v
docker volume prune -f


@ -3,12 +3,12 @@ set -e
if [[ -n $CI ]]
then
export NODE_OPTIONS="--max-old-space-size=4096 --no-node-snapshot"
export NODE_OPTIONS="--max-old-space-size=4096 --no-node-snapshot $NODE_OPTIONS"
echo "jest --coverage --maxWorkers=2 --forceExit --workerIdleMemoryLimit=2000MB --bail $@"
jest --coverage --maxWorkers=2 --forceExit --workerIdleMemoryLimit=2000MB --bail $@
else
# --maxWorkers performs better in development
export NODE_OPTIONS="--no-node-snapshot"
export NODE_OPTIONS="--no-node-snapshot $NODE_OPTIONS"
echo "jest --coverage --maxWorkers=2 --forceExit $@"
jest --coverage --maxWorkers=2 --forceExit $@
fi


@ -48,6 +48,9 @@ import {
PlanType,
Screen,
UserCtx,
CreateAppRequest,
FetchAppDefinitionResponse,
FetchAppPackageResponse,
} from "@budibase/types"
import { BASE_LAYOUT_PROP_IDS } from "../../constants/layouts"
import sdk from "../../sdk"
@ -59,23 +62,23 @@ import * as appMigrations from "../../appMigrations"
async function getLayouts() {
const db = context.getAppDB()
return (
await db.allDocs(
await db.allDocs<Layout>(
getLayoutParams(null, {
include_docs: true,
})
)
).rows.map((row: any) => row.doc)
).rows.map(row => row.doc!)
}
async function getScreens() {
const db = context.getAppDB()
return (
await db.allDocs(
await db.allDocs<Screen>(
getScreenParams(null, {
include_docs: true,
})
)
).rows.map((row: any) => row.doc)
).rows.map(row => row.doc!)
}
function getUserRoleId(ctx: UserCtx) {
@ -117,8 +120,8 @@ function checkAppName(
}
interface AppTemplate {
templateString: string
useTemplate: string
templateString?: string
useTemplate?: string
file?: {
type?: string
path: string
@ -175,14 +178,16 @@ export const addSampleData = async (ctx: UserCtx) => {
ctx.status = 200
}
export async function fetch(ctx: UserCtx) {
export async function fetch(ctx: UserCtx<void, App[]>) {
ctx.body = await sdk.applications.fetch(
ctx.query.status as AppStatus,
ctx.user
)
}
export async function fetchAppDefinition(ctx: UserCtx) {
export async function fetchAppDefinition(
ctx: UserCtx<void, FetchAppDefinitionResponse>
) {
const layouts = await getLayouts()
const userRoleId = getUserRoleId(ctx)
const accessController = new roles.AccessController()
@ -197,10 +202,12 @@ export async function fetchAppDefinition(ctx: UserCtx) {
}
}
export async function fetchAppPackage(ctx: UserCtx) {
export async function fetchAppPackage(
ctx: UserCtx<void, FetchAppPackageResponse>
) {
const db = context.getAppDB()
const appId = context.getAppId()
let application = await db.get<any>(DocumentType.APP_METADATA)
let application = await db.get<App>(DocumentType.APP_METADATA)
const layouts = await getLayouts()
let screens = await getScreens()
const license = await licensing.cache.getCachedLicense()
@ -232,17 +239,21 @@ export async function fetchAppPackage(ctx: UserCtx) {
}
}
async function performAppCreate(ctx: UserCtx) {
async function performAppCreate(ctx: UserCtx<CreateAppRequest, App>) {
const apps = (await dbCore.getAllApps({ dev: true })) as App[]
const name = ctx.request.body.name,
possibleUrl = ctx.request.body.url,
encryptionPassword = ctx.request.body.encryptionPassword
const {
name,
url,
encryptionPassword,
useTemplate,
templateKey,
templateString,
} = ctx.request.body
checkAppName(ctx, apps, name)
const url = sdk.applications.getAppUrl({ name, url: possibleUrl })
checkAppUrl(ctx, apps, url)
const appUrl = sdk.applications.getAppUrl({ name, url })
checkAppUrl(ctx, apps, appUrl)
const { useTemplate, templateKey, templateString } = ctx.request.body
const instanceConfig: AppTemplate = {
useTemplate,
key: templateKey,
@ -273,7 +284,7 @@ async function performAppCreate(ctx: UserCtx) {
version: envCore.VERSION,
componentLibraries: ["@budibase/standard-components"],
name: name,
url: url,
url: appUrl,
template: templateKey,
instance,
tenantId: tenancy.getTenantId(),
@ -435,7 +446,9 @@ export async function create(ctx: UserCtx) {
// This endpoint currently operates as a PATCH rather than a PUT
// Thus name and url fields are handled only if present
export async function update(ctx: UserCtx) {
export async function update(
ctx: UserCtx<{ name?: string; url?: string }, App>
) {
const apps = (await dbCore.getAllApps({ dev: true })) as App[]
// validation
const name = ctx.request.body.name,
@ -508,7 +521,7 @@ export async function revertClient(ctx: UserCtx) {
const revertedToVersion = application.revertableVersion
const appPackageUpdates = {
version: revertedToVersion,
revertableVersion: null,
revertableVersion: undefined,
}
const app = await updateAppPackage(appPackageUpdates, ctx.params.appId)
await events.app.versionReverted(app, currentVersion, revertedToVersion)
@ -683,12 +696,15 @@ export async function duplicateApp(ctx: UserCtx) {
ctx.status = 200
}
export async function updateAppPackage(appPackage: any, appId: any) {
export async function updateAppPackage(
appPackage: Partial<App>,
appId: string
) {
return context.doInAppContext(appId, async () => {
const db = context.getAppDB()
const application = await db.get<App>(DocumentType.APP_METADATA)
const newAppPackage = { ...application, ...appPackage }
const newAppPackage: App = { ...application, ...appPackage }
if (appPackage._rev !== application._rev) {
newAppPackage._rev = application._rev
}


@ -20,6 +20,7 @@ import {
AutomationActionStepId,
AutomationResults,
UserCtx,
DeleteAutomationResponse,
} from "@budibase/types"
import { getActionDefinitions as actionDefs } from "../../automations/actions"
import sdk from "../../sdk"
@ -72,7 +73,9 @@ function cleanAutomationInputs(automation: Automation) {
return automation
}
export async function create(ctx: UserCtx) {
export async function create(
ctx: UserCtx<Automation, { message: string; automation: Automation }>
) {
const db = context.getAppDB()
let automation = ctx.request.body
automation.appId = ctx.appId
@ -207,7 +210,7 @@ export async function find(ctx: UserCtx) {
ctx.body = await db.get(ctx.params.id)
}
export async function destroy(ctx: UserCtx) {
export async function destroy(ctx: UserCtx<void, DeleteAutomationResponse>) {
const db = context.getAppDB()
const automationId = ctx.params.id
const oldAutomation = await db.get<Automation>(automationId)


@ -15,10 +15,14 @@ import {
FieldType,
RelationshipFieldMetadata,
SourceName,
UpdateDatasourceRequest,
UpdateDatasourceResponse,
UserCtx,
VerifyDatasourceRequest,
VerifyDatasourceResponse,
Table,
RowValue,
DynamicVariable,
} from "@budibase/types"
import sdk from "../../sdk"
import { builderSocket } from "../../websockets"
@ -90,8 +94,10 @@ async function invalidateVariables(
existingDatasource: Datasource,
updatedDatasource: Datasource
) {
const existingVariables: any = existingDatasource.config?.dynamicVariables
const updatedVariables: any = updatedDatasource.config?.dynamicVariables
const existingVariables: DynamicVariable[] =
existingDatasource.config?.dynamicVariables || []
const updatedVariables: DynamicVariable[] =
updatedDatasource.config?.dynamicVariables || []
const toInvalidate = []
if (!existingVariables) {
@ -103,9 +109,9 @@ async function invalidateVariables(
toInvalidate.push(...existingVariables)
} else {
// invaldate changed / removed
existingVariables.forEach((existing: any) => {
existingVariables.forEach(existing => {
const unchanged = updatedVariables.find(
(updated: any) =>
updated =>
existing.name === updated.name &&
existing.queryId === updated.queryId &&
existing.value === updated.value
@ -118,24 +124,32 @@ async function invalidateVariables(
await invalidateDynamicVariables(toInvalidate)
}
export async function update(ctx: UserCtx<any, UpdateDatasourceResponse>) {
export async function update(
ctx: UserCtx<UpdateDatasourceRequest, UpdateDatasourceResponse>
) {
const db = context.getAppDB()
const datasourceId = ctx.params.datasourceId
const baseDatasource = await sdk.datasources.get(datasourceId)
const auth = baseDatasource.config?.auth
await invalidateVariables(baseDatasource, ctx.request.body)
const isBudibaseSource =
baseDatasource.type === dbCore.BUDIBASE_DATASOURCE_TYPE
const dataSourceBody = isBudibaseSource
? { name: ctx.request.body?.name }
const dataSourceBody: Datasource = isBudibaseSource
? {
name: ctx.request.body?.name,
type: dbCore.BUDIBASE_DATASOURCE_TYPE,
source: SourceName.BUDIBASE,
}
: ctx.request.body
let datasource: Datasource = {
...baseDatasource,
...sdk.datasources.mergeConfigs(dataSourceBody, baseDatasource),
}
// this block is specific to GSheets, if no auth set, set it back
const auth = baseDatasource.config?.auth
if (auth && !ctx.request.body.auth) {
// don't strip auth config from DB
datasource.config!.auth = auth
@ -204,7 +218,7 @@ async function destroyInternalTablesBySourceId(datasourceId: string) {
const db = context.getAppDB()
// Get all internal tables
const internalTables = await db.allDocs(
const internalTables = await db.allDocs<Table>(
getTableParams(null, {
include_docs: true,
})
@ -212,8 +226,8 @@ async function destroyInternalTablesBySourceId(datasourceId: string) {
// Filter by datasource and return the docs.
const datasourceTableDocs = internalTables.rows.reduce(
(acc: any, table: any) => {
if (table.doc.sourceId == datasourceId) {
(acc: Table[], table) => {
if (table.doc?.sourceId == datasourceId) {
acc.push(table.doc)
}
return acc
@ -254,9 +268,9 @@ export async function destroy(ctx: UserCtx) {
if (datasource.type === dbCore.BUDIBASE_DATASOURCE_TYPE) {
await destroyInternalTablesBySourceId(datasourceId)
} else {
const queries = await db.allDocs(getQueryParams(datasourceId))
const queries = await db.allDocs<RowValue>(getQueryParams(datasourceId))
await db.bulkDocs(
queries.rows.map((row: any) => ({
queries.rows.map(row => ({
_id: row.id,
_rev: row.value.rev,
_deleted: true,


@ -1,7 +1,10 @@
import { getDefinition, getDefinitions } from "../../integrations"
import { SourceName, UserCtx } from "@budibase/types"
const DISABLED_EXTERNAL_INTEGRATIONS = [SourceName.AIRTABLE]
const DISABLED_EXTERNAL_INTEGRATIONS = [
SourceName.AIRTABLE,
SourceName.BUDIBASE,
]
export async function fetch(ctx: UserCtx) {
const definitions = await getDefinitions()


@ -1,9 +1,17 @@
import { EMPTY_LAYOUT } from "../../constants/layouts"
import { generateLayoutID, getScreenParams } from "../../db/utils"
import { events, context } from "@budibase/backend-core"
import { BBContext, Layout } from "@budibase/types"
import {
BBContext,
Layout,
SaveLayoutRequest,
SaveLayoutResponse,
UserCtx,
} from "@budibase/types"
export async function save(ctx: BBContext) {
export async function save(
ctx: UserCtx<SaveLayoutRequest, SaveLayoutResponse>
) {
const db = context.getAppDB()
let layout = ctx.request.body


@ -73,7 +73,7 @@ const _import = async (ctx: UserCtx) => {
}
export { _import as import }
export async function save(ctx: UserCtx) {
export async function save(ctx: UserCtx<Query, Query>) {
const db = context.getAppDB()
const query: Query = ctx.request.body


@ -7,6 +7,7 @@ import {
FilterType,
IncludeRelationship,
ManyToManyRelationshipFieldMetadata,
ManyToOneRelationshipFieldMetadata,
OneToManyRelationshipFieldMetadata,
Operation,
PaginationJson,
@ -18,6 +19,7 @@ import {
SortJson,
SortType,
Table,
isManyToOne,
} from "@budibase/types"
import {
breakExternalTableId,
@ -32,7 +34,9 @@ import { processObjectSync } from "@budibase/string-templates"
import { cloneDeep } from "lodash/fp"
import { processDates, processFormulas } from "../../../utilities/rowProcessor"
import { db as dbCore } from "@budibase/backend-core"
import AliasTables from "./alias"
import sdk from "../../../sdk"
import env from "../../../environment"
export interface ManyRelationship {
tableId?: string
@ -101,6 +105,39 @@ function buildFilters(
}
}
async function removeManyToManyRelationships(
rowId: string,
table: Table,
colName: string
) {
const tableId = table._id!
const filters = buildFilters(rowId, {}, table)
// safety check, if there are no filters on deletion bad things happen
if (Object.keys(filters).length !== 0) {
return getDatasourceAndQuery({
endpoint: getEndpoint(tableId, Operation.DELETE),
body: { [colName]: null },
filters,
})
} else {
return []
}
}
async function removeOneToManyRelationships(rowId: string, table: Table) {
const tableId = table._id!
const filters = buildFilters(rowId, {}, table)
// safety check, if there are no filters on deletion bad things happen
if (Object.keys(filters).length !== 0) {
return getDatasourceAndQuery({
endpoint: getEndpoint(tableId, Operation.UPDATE),
filters,
})
} else {
return []
}
}
/**
* This function checks the incoming parameters to make sure all the inputs are
* valid based on on the table schema. The main thing this is looking for is when a
@ -178,13 +215,13 @@ function generateIdForRow(
function getEndpoint(tableId: string | undefined, operation: string) {
if (!tableId) {
return {}
throw new Error("Cannot get endpoint information - no table ID specified")
}
const { datasourceId, tableName } = breakExternalTableId(tableId)
return {
datasourceId,
entityId: tableName,
operation,
datasourceId: datasourceId!,
entityId: tableName!,
operation: operation as Operation,
}
}
@ -304,6 +341,18 @@ export class ExternalRequest<T extends Operation> {
}
}
async getRow(table: Table, rowId: string): Promise<Row> {
const response = await getDatasourceAndQuery({
endpoint: getEndpoint(table._id!, Operation.READ),
filters: buildFilters(rowId, {}, table),
})
if (Array.isArray(response) && response.length > 0) {
return response[0]
} else {
throw new Error(`Cannot fetch row by ID "${rowId}"`)
}
}
inputProcessing(row: Row | undefined, table: Table) {
if (!row) {
return { row, manyRelationships: [] }
@ -571,7 +620,9 @@ export class ExternalRequest<T extends Operation> {
* information.
*/
async lookupRelations(tableId: string, row: Row) {
const related: { [key: string]: any } = {}
const related: {
[key: string]: { rows: Row[]; isMany: boolean; tableId: string }
} = {}
const { tableName } = breakExternalTableId(tableId)
if (!tableName) {
return related
@ -589,14 +640,26 @@ export class ExternalRequest<T extends Operation> {
) {
continue
}
const isMany = field.relationshipType === RelationshipType.MANY_TO_MANY
const tableId = isMany ? field.through : field.tableId
let tableId: string | undefined,
lookupField: string | undefined,
fieldName: string | undefined
if (isManyToMany(field)) {
tableId = field.through
lookupField = primaryKey
fieldName = field.throughTo || primaryKey
} else if (isManyToOne(field)) {
tableId = field.tableId
lookupField = field.foreignKey
fieldName = field.fieldName
}
if (!tableId || !lookupField || !fieldName) {
throw new Error(
"Unable to lookup relationships - undefined column properties."
)
}
const { tableName: relatedTableName } = breakExternalTableId(tableId)
// @ts-ignore
const linkPrimaryKey = this.tables[relatedTableName].primary[0]
const lookupField = isMany ? primaryKey : field.foreignKey
const fieldName = isMany ? field.throughTo || primaryKey : field.fieldName
if (!lookupField || !row[lookupField]) {
continue
}
@ -609,9 +672,12 @@ export class ExternalRequest<T extends Operation> {
},
})
// this is the response from knex if no rows found
const rows = !response[0].read ? response : []
const storeTo = isMany ? field.throughFrom || linkPrimaryKey : fieldName
related[storeTo] = { rows, isMany, tableId }
const rows: Row[] =
!Array.isArray(response) || response?.[0].read ? [] : response
const storeTo = isManyToMany(field)
? field.throughFrom || linkPrimaryKey
: fieldName
related[storeTo] = { rows, isMany: isManyToMany(field), tableId }
}
return related
}
@ -697,24 +763,43 @@ export class ExternalRequest<T extends Operation> {
continue
}
for (let row of rows) {
const filters = buildFilters(generateIdForRow(row, table), {}, table)
// safety check, if there are no filters on deletion bad things happen
if (Object.keys(filters).length !== 0) {
const op = isMany ? Operation.DELETE : Operation.UPDATE
const body = isMany ? null : { [colName]: null }
promises.push(
getDatasourceAndQuery({
endpoint: getEndpoint(tableId, op),
body,
filters,
})
)
const rowId = generateIdForRow(row, table)
const promise: Promise<any> = isMany
? removeManyToManyRelationships(rowId, table, colName)
: removeOneToManyRelationships(rowId, table)
if (promise) {
promises.push(promise)
}
}
}
await Promise.all(promises)
}
async removeRelationshipsToRow(table: Table, rowId: string) {
const row = await this.getRow(table, rowId)
const related = await this.lookupRelations(table._id!, row)
for (let column of Object.values(table.schema)) {
const relationshipColumn = column as RelationshipFieldMetadata
if (!isManyToOne(relationshipColumn)) {
continue
}
const { rows, isMany, tableId } = related[relationshipColumn.fieldName]
const table = this.getTable(tableId)!
await Promise.all(
rows.map(row => {
const rowId = generateIdForRow(row, table)
return isMany
? removeManyToManyRelationships(
rowId,
table,
relationshipColumn.fieldName
)
: removeOneToManyRelationships(rowId, table)
})
)
}
}
/**
* This function is a bit crazy, but the exact purpose of it is to protect against the scenario in which
* you have column overlap in relationships, e.g. we join a few different tables and they all have the
@ -804,7 +889,7 @@ export class ExternalRequest<T extends Operation> {
}
let json = {
endpoint: {
datasourceId,
datasourceId: datasourceId!,
entityId: tableName,
operation,
},
@ -826,17 +911,30 @@ export class ExternalRequest<T extends Operation> {
},
}
// can't really use response right now
const response = await getDatasourceAndQuery(json)
// handle many to many relationships now if we know the ID (could be auto increment)
// remove any relationships that could block deletion
if (operation === Operation.DELETE && id) {
await this.removeRelationshipsToRow(table, generateRowIdField(id))
}
// aliasing can be disabled fully if desired
let response
if (env.SQL_ALIASING_DISABLE) {
response = await getDatasourceAndQuery(json)
} else {
const aliasing = new AliasTables(Object.keys(this.tables))
response = await aliasing.queryWithAliasing(json)
}
const responseRows = Array.isArray(response) ? response : []
// handle many-to-many relationships now if we know the ID (could be auto increment)
if (operation !== Operation.READ) {
await this.handleManyRelationships(
table._id || "",
response[0],
responseRows[0],
processed.manyRelationships
)
}
const output = this.outputProcessing(response, table, relationships)
const output = this.outputProcessing(responseRows, table, relationships)
// if reading it'll just be an array of rows, return whole thing
if (operation === Operation.READ) {
return (


@ -0,0 +1,166 @@
import {
QueryJson,
SearchFilters,
Table,
Row,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import { getDatasourceAndQuery } from "../../../sdk/app/rows/utils"
import { cloneDeep } from "lodash"
class CharSequence {
static alphabet = "abcdefghijklmnopqrstuvwxyz"
counters: number[]
constructor() {
this.counters = [0]
}
getCharacter(): string {
const char = this.counters.map(i => CharSequence.alphabet[i]).join("")
for (let i = this.counters.length - 1; i >= 0; i--) {
if (this.counters[i] < CharSequence.alphabet.length - 1) {
this.counters[i]++
return char
}
this.counters[i] = 0
}
this.counters.unshift(0)
return char
}
}
export default class AliasTables {
aliases: Record<string, string>
tableAliases: Record<string, string>
tableNames: string[]
charSeq: CharSequence
constructor(tableNames: string[]) {
this.tableNames = tableNames
this.aliases = {}
this.tableAliases = {}
this.charSeq = new CharSequence()
}
getAlias(tableName: string) {
if (this.aliases[tableName]) {
return this.aliases[tableName]
}
const char = this.charSeq.getCharacter()
this.aliases[tableName] = char
this.tableAliases[char] = tableName
return char
}
aliasField(field: string) {
const tableNames = this.tableNames
if (field.includes(".")) {
const [tableName, column] = field.split(".")
const foundTableName = tableNames.find(name => {
const idx = tableName.indexOf(name)
if (idx === -1 || idx > 1) {
return
}
return Math.abs(tableName.length - name.length) <= 2
})
if (foundTableName) {
const aliasedTableName = tableName.replace(
foundTableName,
this.getAlias(foundTableName)
)
field = `${aliasedTableName}.${column}`
}
}
return field
}
reverse<T extends Row | Row[]>(rows: T): T {
const process = (row: Row) => {
const final: Row = {}
for (let [key, value] of Object.entries(row)) {
if (!key.includes(".")) {
final[key] = value
} else {
const [alias, column] = key.split(".")
const tableName = this.tableAliases[alias] || alias
final[`${tableName}.${column}`] = value
}
}
return final
}
if (Array.isArray(rows)) {
return rows.map(row => process(row)) as T
} else {
return process(rows) as T
}
}
aliasMap(tableNames: (string | undefined)[]) {
const map: Record<string, string> = {}
for (let tableName of tableNames) {
if (tableName) {
map[tableName] = this.getAlias(tableName)
}
}
return map
}
async queryWithAliasing(json: QueryJson): DatasourcePlusQueryResponse {
json = cloneDeep(json)
const aliasTable = (table: Table) => ({
...table,
name: this.getAlias(table.name),
})
// run through the query json to update anywhere a table may be used
if (json.resource?.fields) {
json.resource.fields = json.resource.fields.map(field =>
this.aliasField(field)
)
}
if (json.filters) {
for (let [filterKey, filter] of Object.entries(json.filters)) {
if (typeof filter !== "object") {
continue
}
const aliasedFilters: typeof filter = {}
for (let key of Object.keys(filter)) {
aliasedFilters[this.aliasField(key)] = filter[key]
}
json.filters[filterKey as keyof SearchFilters] = aliasedFilters
}
}
if (json.relationships) {
json.relationships = json.relationships.map(relationship => ({
...relationship,
aliases: this.aliasMap([
relationship.through,
relationship.tableName,
json.endpoint.entityId,
]),
}))
}
if (json.meta?.table) {
json.meta.table = aliasTable(json.meta.table)
}
if (json.meta?.tables) {
const aliasedTables: Record<string, Table> = {}
for (let [tableName, table] of Object.entries(json.meta.tables)) {
aliasedTables[this.getAlias(tableName)] = aliasTable(table)
}
json.meta.tables = aliasedTables
}
// invert and return
const invertedTableAliases: Record<string, string> = {}
for (let [key, value] of Object.entries(this.tableAliases)) {
invertedTableAliases[value] = key
}
json.tableAliases = invertedTableAliases
const response = await getDatasourceAndQuery(json)
if (Array.isArray(response)) {
return this.reverse(response)
} else {
return response
}
}
}
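To make the aliasing behaviour above easier to follow, here is a self-contained sketch: the `CharSequence` body is copied from the new file, while the surrounding demo (table names, field round-trip) is purely illustrative. Table names are mapped to short aliases a, b, …, z, aa, …; a field such as `employees.name` becomes `a.name` in the outgoing query, and the alias map is inverted again on the rows that come back.

```ts
// Same generator as in the new aliasing module: yields "a", "b", ..., "z", "aa", ...
class CharSequence {
  static alphabet = "abcdefghijklmnopqrstuvwxyz"
  counters: number[] = [0]
  getCharacter(): string {
    const char = this.counters.map(i => CharSequence.alphabet[i]).join("")
    for (let i = this.counters.length - 1; i >= 0; i--) {
      if (this.counters[i] < CharSequence.alphabet.length - 1) {
        this.counters[i]++
        return char
      }
      this.counters[i] = 0
    }
    this.counters.unshift(0)
    return char
  }
}

// Illustrative demo (not part of the diff): alias two tables and round-trip a field.
const seq = new CharSequence()
const aliases: Record<string, string> = {} // table name -> alias
const tableNames: Record<string, string> = {} // alias -> table name
for (const table of ["employees", "jobs"]) {
  const alias = seq.getCharacter()
  aliases[table] = alias
  tableNames[alias] = table
}
console.log(aliases) // { employees: "a", jobs: "b" }

const aliasedField = "employees.name".replace("employees", aliases["employees"])
console.log(aliasedField) // "a.name"

const [alias, column] = aliasedField.split(".")
console.log(`${tableNames[alias]}.${column}`) // "employees.name"
```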


@ -223,7 +223,8 @@ export const exportRows = async (
const format = ctx.query.format
const { rows, columns, query, sort, sortOrder } = ctx.request.body
const { rows, columns, query, sort, sortOrder, delimiter, customHeaders } =
ctx.request.body
if (typeof format !== "string" || !exporters.isFormat(format)) {
ctx.throw(
400,
@ -241,6 +242,8 @@ export const exportRows = async (
query,
sort,
sortOrder,
delimiter,
customHeaders,
})
ctx.attachment(fileName)
ctx.body = apiFileReturn(content)


@ -189,11 +189,12 @@ export async function fetchEnrichedRow(ctx: UserCtx) {
const tableId = utils.getTableId(ctx)
const rowId = ctx.params.rowId as string
// need table to work out where links go in row, as well as the link docs
const [table, row, links] = await Promise.all([
const [table, links] = await Promise.all([
sdk.tables.getTable(tableId),
utils.findRow(ctx, tableId, rowId),
linkRows.getLinkDocuments({ tableId, rowId, fieldName }),
])
let row = await utils.findRow(ctx, tableId, rowId)
row = await outputProcessing(table, row)
const linkVals = links as LinkDocumentValue[]
// look up the actual rows based on the ids


@ -7,7 +7,13 @@ import {
roles,
} from "@budibase/backend-core"
import { updateAppPackage } from "./application"
import { Plugin, ScreenProps, BBContext, Screen } from "@budibase/types"
import {
Plugin,
ScreenProps,
BBContext,
Screen,
UserCtx,
} from "@budibase/types"
import { builderSocket } from "../../websockets"
export async function fetch(ctx: BBContext) {
@ -31,7 +37,7 @@ export async function fetch(ctx: BBContext) {
)
}
export async function save(ctx: BBContext) {
export async function save(ctx: UserCtx<Screen, Screen>) {
const db = context.getAppDB()
let screen = ctx.request.body


@ -170,6 +170,7 @@ export const serveApp = async function (ctx: Ctx) {
if (!env.isJest()) {
const plugins = objectStore.enrichPluginURLs(appInfo.usedPlugins)
const { head, html, css } = AppComponent.render({
title: branding?.platformTitle || `${appInfo.name}`,
metaImage:
branding?.metaImageUrl ||
"https://res.cloudinary.com/daog6scxm/image/upload/v1698759482/meta-images/plain-branded-meta-image-coral_ocxmgu.png",


@ -1,7 +1,19 @@
import { Row, TableSchema } from "@budibase/types"
export function csv(headers: string[], rows: Row[]) {
let csv = headers.map(key => `"${key}"`).join(",")
function getHeaders(
headers: string[],
customHeaders: { [key: string]: string }
) {
return headers.map(header => `"${customHeaders[header] || header}"`)
}
export function csv(
headers: string[],
rows: Row[],
delimiter: string = ",",
customHeaders: { [key: string]: string } = {}
) {
let csv = getHeaders(headers, customHeaders).join(delimiter)
for (let row of rows) {
csv = `${csv}\n${headers
@ -15,7 +27,7 @@ export function csv(headers: string[], rows: Row[]) {
: ""
return val.trim()
})
.join(",")}`
.join(delimiter)}`
}
return csv
}
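A simplified, self-contained sketch of the exporter behaviour after this change (the real `csv` helper also escapes and serialises cell values; `toCsv` is an illustrative name): headers can be renamed via `customHeaders`, and cells are joined with an arbitrary delimiter.

```ts
type Row = Record<string, string>

// Simplified stand-in for the updated exporter above.
function toCsv(
  headers: string[],
  rows: Row[],
  delimiter = ",",
  customHeaders: Record<string, string> = {}
): string {
  const headerLine = headers
    .map(h => `"${customHeaders[h] || h}"`) // rename header if a custom label exists
    .join(delimiter)
  const dataLines = rows.map(row =>
    headers.map(h => `"${row[h] ?? ""}"`).join(delimiter)
  )
  return [headerLine, ...dataLines].join("\n")
}

console.log(
  toCsv(
    ["name", "startDate"],
    [{ name: "Ada", startDate: "2021-01-01" }],
    ";",
    { startDate: "Start date" }
  )
)
// "name";"Start date"
// "Ada";"2021-01-01"
```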


@ -4,7 +4,6 @@ import * as deploymentController from "../controllers/deploy"
import authorized from "../../middleware/authorized"
import { permissions } from "@budibase/backend-core"
import { applicationValidator } from "./utils/validators"
import { importToApp } from "../controllers/application"
const router: Router = new Router()


@ -11,65 +11,54 @@ jest.mock("../../../utilities/redis", () => ({
checkDebounce: jest.fn(),
shutdown: jest.fn(),
}))
import { clearAllApps, checkBuilderEndpoint } from "./utilities/TestFunctions"
import { checkBuilderEndpoint } from "./utilities/TestFunctions"
import * as setup from "./utilities"
import { AppStatus } from "../../../db/utils"
import { events, utils, context } from "@budibase/backend-core"
import env from "../../../environment"
jest.setTimeout(15000)
import type { App } from "@budibase/types"
import tk from "timekeeper"
describe("/applications", () => {
let request = setup.getRequest()
let config = setup.getConfig()
let app: App
afterAll(setup.afterAll)
beforeAll(async () => {
await config.init()
})
beforeAll(async () => await config.init())
beforeEach(async () => {
app = await config.api.application.create({ name: utils.newid() })
const deployment = await config.api.application.publish(app.appId)
expect(deployment.status).toBe("SUCCESS")
jest.clearAllMocks()
})
describe("create", () => {
it("creates empty app", async () => {
const res = await request
.post("/api/applications")
.field("name", utils.newid())
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._id).toBeDefined()
const app = await config.api.application.create({ name: utils.newid() })
expect(app._id).toBeDefined()
expect(events.app.created).toBeCalledTimes(1)
})
it("creates app from template", async () => {
const res = await request
.post("/api/applications")
.field("name", utils.newid())
.field("useTemplate", "true")
.field("templateKey", "test")
.field("templateString", "{}") // override the file download
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._id).toBeDefined()
const app = await config.api.application.create({
name: utils.newid(),
useTemplate: "true",
templateKey: "test",
templateString: "{}",
})
expect(app._id).toBeDefined()
expect(events.app.created).toBeCalledTimes(1)
expect(events.app.templateImported).toBeCalledTimes(1)
})
it("creates app from file", async () => {
const res = await request
.post("/api/applications")
.field("name", utils.newid())
.field("useTemplate", "true")
.set(config.defaultHeaders())
.attach("templateFile", "src/api/routes/tests/data/export.txt")
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._id).toBeDefined()
const app = await config.api.application.create({
name: utils.newid(),
useTemplate: "true",
templateFile: "src/api/routes/tests/data/export.txt",
})
expect(app._id).toBeDefined()
expect(events.app.created).toBeCalledTimes(1)
expect(events.app.fileImported).toBeCalledTimes(1)
})
@ -84,24 +73,21 @@ describe("/applications", () => {
})
it("migrates navigation settings from old apps", async () => {
const res = await request
.post("/api/applications")
.field("name", "Old App")
.field("useTemplate", "true")
.set(config.defaultHeaders())
.attach("templateFile", "src/api/routes/tests/data/old-app.txt")
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._id).toBeDefined()
expect(res.body.navigation).toBeDefined()
expect(res.body.navigation.hideLogo).toBe(true)
expect(res.body.navigation.title).toBe("Custom Title")
expect(res.body.navigation.hideLogo).toBe(true)
expect(res.body.navigation.navigation).toBe("Left")
expect(res.body.navigation.navBackground).toBe(
const app = await config.api.application.create({
name: utils.newid(),
useTemplate: "true",
templateFile: "src/api/routes/tests/data/old-app.txt",
})
expect(app._id).toBeDefined()
expect(app.navigation).toBeDefined()
expect(app.navigation!.hideLogo).toBe(true)
expect(app.navigation!.title).toBe("Custom Title")
expect(app.navigation!.hideLogo).toBe(true)
expect(app.navigation!.navigation).toBe("Left")
expect(app.navigation!.navBackground).toBe(
"var(--spectrum-global-color-blue-600)"
)
expect(res.body.navigation.navTextColor).toBe(
expect(app.navigation!.navTextColor).toBe(
"var(--spectrum-global-color-gray-50)"
)
expect(events.app.created).toBeCalledTimes(1)
@ -148,164 +134,106 @@ describe("/applications", () => {
})
describe("fetch", () => {
beforeEach(async () => {
// Clean all apps but the onde from config
await clearAllApps(config.getTenantId(), [config.getAppId()!])
})
it("lists all applications", async () => {
await config.createApp("app1")
await config.createApp("app2")
const res = await request
.get(`/api/applications?status=${AppStatus.DEV}`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
// two created apps + the inited app
expect(res.body.length).toBe(3)
const apps = await config.api.application.fetch({ status: AppStatus.DEV })
expect(apps.length).toBeGreaterThan(0)
})
})
describe("fetchAppDefinition", () => {
it("should be able to get an apps definition", async () => {
const res = await request
.get(`/api/applications/${config.getAppId()}/definition`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body.libraries.length).toEqual(1)
const res = await config.api.application.getDefinition(app.appId)
expect(res.libraries.length).toEqual(1)
})
})
describe("fetchAppPackage", () => {
it("should be able to fetch the app package", async () => {
const res = await request
.get(`/api/applications/${config.getAppId()}/appPackage`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body.application).toBeDefined()
expect(res.body.application.appId).toEqual(config.getAppId())
const res = await config.api.application.getAppPackage(app.appId)
expect(res.application).toBeDefined()
expect(res.application.appId).toEqual(config.getAppId())
})
})
describe("update", () => {
it("should be able to update the app package", async () => {
const res = await request
.put(`/api/applications/${config.getAppId()}`)
.send({
name: "TEST_APP",
})
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._rev).toBeDefined()
const updatedApp = await config.api.application.update(app.appId, {
name: "TEST_APP",
})
expect(updatedApp._rev).toBeDefined()
expect(events.app.updated).toBeCalledTimes(1)
})
})
describe("publish", () => {
it("should publish app with dev app ID", async () => {
const appId = config.getAppId()
await request
.post(`/api/applications/${appId}/publish`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.publish(app.appId)
expect(events.app.published).toBeCalledTimes(1)
})
it("should publish app with prod app ID", async () => {
const appId = config.getProdAppId()
await request
.post(`/api/applications/${appId}/publish`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.publish(app.appId.replace("_dev", ""))
expect(events.app.published).toBeCalledTimes(1)
})
})
describe("manage client library version", () => {
it("should be able to update the app client library version", async () => {
await request
.post(`/api/applications/${config.getAppId()}/client/update`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.updateClient(app.appId)
expect(events.app.versionUpdated).toBeCalledTimes(1)
})
it("should be able to revert the app client library version", async () => {
// We need to first update the version so that we can then revert
await request
.post(`/api/applications/${config.getAppId()}/client/update`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await request
.post(`/api/applications/${config.getAppId()}/client/revert`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.updateClient(app.appId)
await config.api.application.revertClient(app.appId)
expect(events.app.versionReverted).toBeCalledTimes(1)
})
})
describe("edited at", () => {
it("middleware should set edited at", async () => {
const headers = config.defaultHeaders()
headers["referer"] = `/${config.getAppId()}/test`
const res = await request
.put(`/api/applications/${config.getAppId()}`)
.send({
name: "UPDATED_NAME",
})
.set(headers)
.expect("Content-Type", /json/)
.expect(200)
expect(res.body._rev).toBeDefined()
// retrieve the app to check it
const getRes = await request
.get(`/api/applications/${config.getAppId()}/appPackage`)
.set(headers)
.expect("Content-Type", /json/)
.expect(200)
expect(getRes.body.application.updatedAt).toBeDefined()
it("middleware should set updatedAt", async () => {
const app = await tk.withFreeze(
"2021-01-01",
async () => await config.api.application.create({ name: utils.newid() })
)
expect(app.updatedAt).toEqual("2021-01-01T00:00:00.000Z")
const updatedApp = await tk.withFreeze(
"2021-02-01",
async () =>
await config.api.application.update(app.appId, {
name: "UPDATED_NAME",
})
)
expect(updatedApp._rev).toBeDefined()
expect(updatedApp.updatedAt).toEqual("2021-02-01T00:00:00.000Z")
const fetchedApp = await config.api.application.get(app.appId)
expect(fetchedApp.updatedAt).toEqual("2021-02-01T00:00:00.000Z")
})
})
describe("sync", () => {
it("app should sync correctly", async () => {
const res = await request
.post(`/api/applications/${config.getAppId()}/sync`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body.message).toEqual("App sync completed successfully.")
const { message } = await config.api.application.sync(app.appId)
expect(message).toEqual("App sync completed successfully.")
})
it("app should not sync if production", async () => {
const res = await request
.post(`/api/applications/app_123456/sync`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(400)
expect(res.body.message).toEqual(
const { message } = await config.api.application.sync(
app.appId.replace("_dev", ""),
{ statusCode: 400 }
)
expect(message).toEqual(
"This action cannot be performed for production apps"
)
})
it("app should not sync if sync is disabled", async () => {
env._set("DISABLE_AUTO_PROD_APP_SYNC", true)
const res = await request
.post(`/api/applications/${config.getAppId()}/sync`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
expect(res.body.message).toEqual(
const { message } = await config.api.application.sync(app.appId)
expect(message).toEqual(
"App sync disabled. You can reenable with the DISABLE_AUTO_PROD_APP_SYNC environment variable."
)
env._set("DISABLE_AUTO_PROD_APP_SYNC", false)
@ -313,51 +241,26 @@ describe("/applications", () => {
})
describe("unpublish", () => {
beforeEach(async () => {
// We want to republish as the unpublish will delete the prod app
await config.publish()
})
it("should unpublish app with dev app ID", async () => {
const appId = config.getAppId()
await request
.post(`/api/applications/${appId}/unpublish`)
.set(config.defaultHeaders())
.expect(204)
await config.api.application.unpublish(app.appId)
expect(events.app.unpublished).toBeCalledTimes(1)
})
it("should unpublish app with prod app ID", async () => {
const appId = config.getProdAppId()
await request
.post(`/api/applications/${appId}/unpublish`)
.set(config.defaultHeaders())
.expect(204)
await config.api.application.unpublish(app.appId.replace("_dev", ""))
expect(events.app.unpublished).toBeCalledTimes(1)
})
})
describe("delete", () => {
it("should delete published app and dev apps with dev app ID", async () => {
await config.createApp("to-delete")
const appId = config.getAppId()
await request
.delete(`/api/applications/${appId}`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.delete(app.appId)
expect(events.app.deleted).toBeCalledTimes(1)
expect(events.app.unpublished).toBeCalledTimes(1)
})
it("should delete published app and dev app with prod app ID", async () => {
await config.createApp("to-delete")
const appId = config.getProdAppId()
await request
.delete(`/api/applications/${appId}`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.delete(app.appId.replace("_dev", ""))
expect(events.app.deleted).toBeCalledTimes(1)
expect(events.app.unpublished).toBeCalledTimes(1)
})
@ -422,28 +325,18 @@ describe("/applications", () => {
describe("POST /api/applications/:appId/sync", () => {
it("should not sync automation logs", async () => {
// setup the apps
await config.createApp("testing-auto-logs")
const automation = await config.createAutomation()
await config.publish()
await context.doInAppContext(config.getProdAppId(), () => {
return config.createAutomationLog(automation)
})
await context.doInAppContext(app.appId, () =>
config.createAutomationLog(automation)
)
// do the sync
const appId = config.getAppId()
await request
.post(`/api/applications/${appId}/sync`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
await config.api.application.sync(app.appId)
// does exist in prod
const prodLogs = await config.getAutomationLogs()
expect(prodLogs.data.length).toBe(1)
// delete prod app so we revert to dev log search
await config.unpublish()
await config.api.application.unpublish(app.appId)
// doesn't exist in dev
const devLogs = await config.getAutomationLogs()
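A note on the publish/unpublish/delete tests above: several of them derive the production app ID from the development one by stripping the "_dev" marker, e.g. app.appId.replace("_dev", ""). A minimal sketch of that convention (the helper name is illustrative, not part of the codebase):

function toProdAppId(devAppId: string): string {
  // "app_dev_abc123" -> "app_abc123", mirroring the .replace("_dev", "") calls above
  return devAppId.replace("_dev", "")
}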

View file

@ -394,7 +394,7 @@ describe("/automations", () => {
it("deletes a automation by its ID", async () => {
const automation = await config.createAutomation()
const res = await request
.delete(`/api/automations/${automation.id}/${automation.rev}`)
.delete(`/api/automations/${automation._id}/${automation._rev}`)
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
@ -408,7 +408,7 @@ describe("/automations", () => {
await checkBuilderEndpoint({
config,
method: "DELETE",
url: `/api/automations/${automation.id}/${automation._rev}`,
url: `/api/automations/${automation._id}/${automation._rev}`,
})
})
})

View file

@ -44,7 +44,7 @@ describe("/backups", () => {
expect(headers["content-disposition"]).toEqual(
`attachment; filename="${
config.getApp()!.name
config.getApp().name
}-export-${mocks.date.MOCK_DATE.getTime()}.tar.gz"`
)
})

View file

@ -86,7 +86,7 @@ describe("/datasources", () => {
})
// check variables in cache
let contents = await checkCacheForDynamicVariable(
query._id,
query._id!,
"variable3"
)
expect(contents.rows.length).toEqual(1)
@ -102,7 +102,7 @@ describe("/datasources", () => {
expect(res.body.errors).toBeUndefined()
// check variables no longer in cache
contents = await checkCacheForDynamicVariable(query._id, "variable3")
contents = await checkCacheForDynamicVariable(query._id!, "variable3")
expect(contents).toBe(null)
})
})
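The "!" added to query._id in these cache checks is TypeScript's non-null assertion operator: it narrows an optional field (string | undefined) to string at compile time only, with no runtime check. A minimal self-contained illustration (idOf is a hypothetical helper, not from the codebase):

function idOf(doc: { _id?: string }): string {
  // compile-time assertion that _id is present; still undefined at runtime if it is not
  return doc._id!
}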

View file

@ -467,7 +467,10 @@ describe("/queries", () => {
queryString: "test={{ variable3 }}",
})
    // check it's in cache
const contents = await checkCacheForDynamicVariable(base._id, "variable3")
const contents = await checkCacheForDynamicVariable(
base._id!,
"variable3"
)
expect(contents.rows.length).toEqual(1)
const responseBody = await preview(datasource, {
path: "www.failonce.com",
@ -490,7 +493,7 @@ describe("/queries", () => {
queryString: "test={{ variable3 }}",
})
    // check it's in cache
let contents = await checkCacheForDynamicVariable(base._id, "variable3")
let contents = await checkCacheForDynamicVariable(base._id!, "variable3")
expect(contents.rows.length).toEqual(1)
// delete the query
@ -500,7 +503,7 @@ describe("/queries", () => {
.expect(200)
// check variables no longer in cache
contents = await checkCacheForDynamicVariable(base._id, "variable3")
contents = await checkCacheForDynamicVariable(base._id!, "variable3")
expect(contents).toBe(null)
})
})

View file

@ -110,7 +110,7 @@ describe.each([
config.api.row.get(tbl_Id, id, { expectStatus: status })
const getRowUsage = async () => {
const { total } = await config.doInContext(null, () =>
const { total } = await config.doInContext(undefined, () =>
quotas.getCurrentUsageValues(QuotaUsageType.STATIC, StaticQuotaName.ROWS)
)
return total

View file

@ -27,15 +27,17 @@ describe("/users", () => {
describe("fetch", () => {
it("returns a list of users from an instance db", async () => {
await config.createUser({ id: "uuidx" })
await config.createUser({ id: "uuidy" })
const id1 = `us_${utils.newid()}`
const id2 = `us_${utils.newid()}`
await config.createUser({ _id: id1 })
await config.createUser({ _id: id2 })
const res = await config.api.user.fetch()
expect(res.length).toBe(3)
const ids = res.map(u => u._id)
expect(ids).toContain(`ro_ta_users_us_uuidx`)
expect(ids).toContain(`ro_ta_users_us_uuidy`)
expect(ids).toContain(`ro_ta_users_${id1}`)
expect(ids).toContain(`ro_ta_users_${id2}`)
})
it("should apply authorization to endpoint", async () => {
@ -54,7 +56,7 @@ describe("/users", () => {
describe("update", () => {
it("should be able to update the user", async () => {
const user: UserMetadata = await config.createUser({
id: `us_update${utils.newid()}`,
_id: `us_update${utils.newid()}`,
})
user.roleId = roles.BUILTIN_ROLE_IDS.BASIC
delete user._rev

View file
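A note on the ID assertions above: user metadata rows are keyed by prefixing the global user ID, so a user created with _id us_<uuid> shows up in the fetch response as ro_ta_users_us_<uuid>. Illustrative sketch only (variable names are not from the codebase):

const globalUserId = `us_${utils.newid()}`
const metadataId = `ro_ta_users_${globalUserId}` // the shape asserted via res.map(u => u._id) above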

@ -4,6 +4,7 @@ import { AppStatus } from "../../../../db/utils"
import { roles, tenancy, context, db } from "@budibase/backend-core"
import env from "../../../../environment"
import Nano from "@budibase/nano"
import TestConfiguration from "src/tests/utilities/TestConfiguration"
class Request {
appId: any
@ -52,10 +53,10 @@ export const clearAllApps = async (
})
}
export const clearAllAutomations = async (config: any) => {
export const clearAllAutomations = async (config: TestConfiguration) => {
const automations = await config.getAllAutomations()
for (let auto of automations) {
await context.doInAppContext(config.appId, async () => {
await context.doInAppContext(config.getAppId(), async () => {
await config.deleteAutomation(auto)
})
}
@ -101,7 +102,12 @@ export const checkBuilderEndpoint = async ({
method,
url,
body,
}: any) => {
}: {
config: TestConfiguration
method: string
url: string
body?: any
}) => {
const headers = await config.login({
userId: "us_fail",
builder: false,

View file

@ -36,7 +36,7 @@ describe("/webhooks", () => {
const automation = await config.createAutomation()
const res = await request
.put(`/api/webhooks`)
.send(basicWebhook(automation._id))
.send(basicWebhook(automation._id!))
.set(config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
@ -145,7 +145,7 @@ describe("/webhooks", () => {
let automation = collectAutomation()
let newAutomation = await config.createAutomation(automation)
let syncWebhook = await config.createWebhook(
basicWebhook(newAutomation._id)
basicWebhook(newAutomation._id!)
)
// replicate changes before checking webhook

View file

@ -29,6 +29,6 @@ start().catch(err => {
throw err
})
export function getServer() {
export function getServer(): Server {
return server
}

View file

@ -1,9 +1,11 @@
import { Layout } from "@budibase/types"
export const BASE_LAYOUT_PROP_IDS = {
PRIVATE: "layout_private_master",
PUBLIC: "layout_public_master",
}
export const EMPTY_LAYOUT = {
export const EMPTY_LAYOUT: Layout = {
componentLibraries: ["@budibase/standard-components"],
title: "{{ name }}",
favicon: "./_shared/favicon.png",

View file

@ -1,5 +1,6 @@
import { roles } from "@budibase/backend-core"
import { BASE_LAYOUT_PROP_IDS } from "./layouts"
import { Screen } from "@budibase/types"
export function createHomeScreen(
config: {
@ -9,10 +10,8 @@ export function createHomeScreen(
roleId: roles.BUILTIN_ROLE_IDS.BASIC,
route: "/",
}
) {
): Screen {
return {
description: "",
url: "",
layoutId: BASE_LAYOUT_PROP_IDS.PRIVATE,
props: {
_id: "d834fea2-1b3e-4320-ab34-f9009f5ecc59",

View file

@ -1,8 +1,8 @@
import {
DEFAULT_BB_DATASOURCE_ID,
DEFAULT_INVENTORY_TABLE_ID,
DEFAULT_EMPLOYEE_TABLE_ID,
DEFAULT_EXPENSES_TABLE_ID,
DEFAULT_INVENTORY_TABLE_ID,
DEFAULT_JOBS_TABLE_ID,
} from "../../constants"
import { importToRows } from "../../api/controllers/table/utils"
@ -15,19 +15,21 @@ import { expensesImport } from "./expensesImport"
import { db as dbCore } from "@budibase/backend-core"
import {
AutoFieldSubType,
Datasource,
FieldType,
RelationshipType,
Row,
SourceName,
Table,
TableSchema,
TableSourceType,
} from "@budibase/types"
const defaultDatasource = {
const defaultDatasource: Datasource = {
_id: DEFAULT_BB_DATASOURCE_ID,
type: dbCore.BUDIBASE_DATASOURCE_TYPE,
name: "Sample Data",
source: "BUDIBASE",
source: SourceName.BUDIBASE,
config: {},
}

View file

@ -1,13 +1,15 @@
import newid from "./newid"
import { db as dbCore } from "@budibase/backend-core"
import {
FieldType,
DatabaseQueryOpts,
Datasource,
DocumentType,
FieldSchema,
RelationshipFieldMetadata,
VirtualDocumentType,
FieldType,
INTERNAL_TABLE_SOURCE_ID,
DatabaseQueryOpts,
RelationshipFieldMetadata,
SourceName,
VirtualDocumentType,
} from "@budibase/types"
export { DocumentType, VirtualDocumentType } from "@budibase/types"
@ -20,11 +22,11 @@ export const enum AppStatus {
DEPLOYED = "published",
}
export const BudibaseInternalDB = {
export const BudibaseInternalDB: Datasource = {
_id: INTERNAL_TABLE_SOURCE_ID,
type: dbCore.BUDIBASE_DATASOURCE_TYPE,
name: "Budibase DB",
source: "BUDIBASE",
source: SourceName.BUDIBASE,
config: {},
}

View file

@ -76,13 +76,16 @@ const environment = {
DEFAULTS.AUTOMATION_THREAD_TIMEOUT > QUERY_THREAD_TIMEOUT
? DEFAULTS.AUTOMATION_THREAD_TIMEOUT
: QUERY_THREAD_TIMEOUT,
SQL_MAX_ROWS: process.env.SQL_MAX_ROWS,
BB_ADMIN_USER_EMAIL: process.env.BB_ADMIN_USER_EMAIL,
BB_ADMIN_USER_PASSWORD: process.env.BB_ADMIN_USER_PASSWORD,
PLUGINS_DIR: process.env.PLUGINS_DIR || DEFAULTS.PLUGINS_DIR,
OPENAI_API_KEY: process.env.OPENAI_API_KEY,
MAX_IMPORT_SIZE_MB: process.env.MAX_IMPORT_SIZE_MB,
SESSION_EXPIRY_SECONDS: process.env.SESSION_EXPIRY_SECONDS,
// SQL
SQL_MAX_ROWS: process.env.SQL_MAX_ROWS,
SQL_LOGGING_ENABLE: process.env.SQL_LOGGING_ENABLE,
SQL_ALIASING_DISABLE: process.env.SQL_ALIASING_DISABLE,
// flags
ALLOW_DEV_AUTOMATIONS: process.env.ALLOW_DEV_AUTOMATIONS,
DISABLE_THREADING: process.env.DISABLE_THREADING,
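The block above groups the new SQL flags (SQL_LOGGING_ENABLE, SQL_ALIASING_DISABLE) with SQL_MAX_ROWS. A hedged sketch of toggling the logging flag inside a test, following the env._set() pattern used earlier in this diff for DISABLE_AUTO_PROD_APP_SYNC (the exact accepted values are an assumption; the logging code only checks truthiness):

env._set("SQL_LOGGING_ENABLE", "1") // enable query logging for this test
// ... run some queries ...
env._set("SQL_LOGGING_ENABLE", "") // falsy value switches logging back off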

View file

@ -1,11 +1,15 @@
import { QueryJson, Datasource } from "@budibase/types"
import {
QueryJson,
Datasource,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import { getIntegration } from "../index"
import sdk from "../../sdk"
export async function makeExternalQuery(
datasource: Datasource,
json: QueryJson
) {
): DatasourcePlusQueryResponse {
datasource = await sdk.datasources.enrich(datasource)
const Integration = await getIntegration(datasource.source)
// query is the opinionated function

View file

@ -17,7 +17,6 @@ const envLimit = environment.SQL_MAX_ROWS
: null
const BASE_LIMIT = envLimit || 5000
type KnexQuery = Knex.QueryBuilder | Knex
// these are invalid dates sent by the client, need to convert them to a real max date
const MIN_ISO_DATE = "0000-00-00T00:00:00.000Z"
const MAX_ISO_DATE = "9999-00-00T00:00:00.000Z"
@ -127,10 +126,15 @@ class InternalBuilder {
// right now we only do filters on the specific table being queried
addFilters(
query: KnexQuery,
query: Knex.QueryBuilder,
filters: SearchFilters | undefined,
opts: { relationship?: boolean; tableName?: string }
): KnexQuery {
tableName: string,
opts: { aliases?: Record<string, string>; relationship?: boolean }
): Knex.QueryBuilder {
function getTableName(name: string) {
const alias = opts.aliases?.[name]
return alias || name
}
function iterate(
structure: { [key: string]: any },
fn: (key: string, value: any) => void
@ -139,10 +143,11 @@ class InternalBuilder {
const updatedKey = dbCore.removeKeyNumbering(key)
const isRelationshipField = updatedKey.includes(".")
if (!opts.relationship && !isRelationshipField) {
fn(`${opts.tableName}.${updatedKey}`, value)
fn(`${getTableName(tableName)}.${updatedKey}`, value)
}
if (opts.relationship && isRelationshipField) {
fn(updatedKey, value)
const [filterTableName, property] = updatedKey.split(".")
fn(`${getTableName(filterTableName)}.${property}`, value)
}
}
}
@ -314,7 +319,7 @@ class InternalBuilder {
return query
}
addSorting(query: KnexQuery, json: QueryJson): KnexQuery {
addSorting(query: Knex.QueryBuilder, json: QueryJson): Knex.QueryBuilder {
let { sort, paginate } = json
const table = json.meta?.table
if (sort && Object.keys(sort || {}).length > 0) {
@ -330,16 +335,28 @@ class InternalBuilder {
return query
}
tableNameWithSchema(
tableName: string,
opts?: { alias?: string; schema?: string }
) {
let withSchema = opts?.schema ? `${opts.schema}.${tableName}` : tableName
if (opts?.alias) {
withSchema += ` as ${opts.alias}`
}
return withSchema
}
addRelationships(
query: KnexQuery,
query: Knex.QueryBuilder,
fromTable: string,
relationships: RelationshipsJson[] | undefined,
schema: string | undefined
): KnexQuery {
schema: string | undefined,
aliases?: Record<string, string>
): Knex.QueryBuilder {
if (!relationships) {
return query
}
const tableSets: Record<string, [any]> = {}
const tableSets: Record<string, [RelationshipsJson]> = {}
    // aggregate into table sets (all with the same "to" tables)
for (let relationship of relationships) {
const keyObj: { toTable: string; throughTable: string | undefined } = {
@ -358,10 +375,17 @@ class InternalBuilder {
}
for (let [key, relationships] of Object.entries(tableSets)) {
const { toTable, throughTable } = JSON.parse(key)
const toTableWithSchema = schema ? `${schema}.${toTable}` : toTable
const throughTableWithSchema = schema
? `${schema}.${throughTable}`
: throughTable
const toAlias = aliases?.[toTable] || toTable,
throughAlias = aliases?.[throughTable] || throughTable,
fromAlias = aliases?.[fromTable] || fromTable
let toTableWithSchema = this.tableNameWithSchema(toTable, {
alias: toAlias,
schema,
})
let throughTableWithSchema = this.tableNameWithSchema(throughTable, {
alias: throughAlias,
schema,
})
if (!throughTable) {
// @ts-ignore
query = query.leftJoin(toTableWithSchema, function () {
@ -369,7 +393,7 @@ class InternalBuilder {
const from = relationship.from,
to = relationship.to
// @ts-ignore
this.orOn(`${fromTable}.${from}`, "=", `${toTable}.${to}`)
this.orOn(`${fromAlias}.${from}`, "=", `${toAlias}.${to}`)
}
})
} else {
@ -381,9 +405,9 @@ class InternalBuilder {
const from = relationship.from
// @ts-ignore
this.orOn(
`${fromTable}.${fromPrimary}`,
`${fromAlias}.${fromPrimary}`,
"=",
`${throughTable}.${from}`
`${throughAlias}.${from}`
)
}
})
@ -392,7 +416,7 @@ class InternalBuilder {
const toPrimary = relationship.toPrimary
const to = relationship.to
// @ts-ignore
this.orOn(`${toTable}.${toPrimary}`, `${throughTable}.${to}`)
this.orOn(`${toAlias}.${toPrimary}`, `${throughAlias}.${to}`)
}
})
}
@ -400,12 +424,25 @@ class InternalBuilder {
return query.limit(BASE_LIMIT)
}
create(knex: Knex, json: QueryJson, opts: QueryOptions): KnexQuery {
const { endpoint, body } = json
let query: KnexQuery = knex(endpoint.entityId)
knexWithAlias(
knex: Knex,
endpoint: QueryJson["endpoint"],
aliases?: QueryJson["tableAliases"]
): Knex.QueryBuilder {
const tableName = endpoint.entityId
const tableAliased = aliases?.[tableName]
? `${tableName} as ${aliases?.[tableName]}`
: tableName
let query = knex(tableAliased)
if (endpoint.schema) {
query = query.withSchema(endpoint.schema)
}
return query
}
create(knex: Knex, json: QueryJson, opts: QueryOptions): Knex.QueryBuilder {
const { endpoint, body } = json
let query = this.knexWithAlias(knex, endpoint)
const parsedBody = parseBody(body)
// make sure no null values in body for creation
for (let [key, value] of Object.entries(parsedBody)) {
@ -422,12 +459,9 @@ class InternalBuilder {
}
}
bulkCreate(knex: Knex, json: QueryJson): KnexQuery {
bulkCreate(knex: Knex, json: QueryJson): Knex.QueryBuilder {
const { endpoint, body } = json
let query: KnexQuery = knex(endpoint.entityId)
if (endpoint.schema) {
query = query.withSchema(endpoint.schema)
}
let query = this.knexWithAlias(knex, endpoint)
if (!Array.isArray(body)) {
return query
}
@ -435,8 +469,10 @@ class InternalBuilder {
return query.insert(parsedBody)
}
read(knex: Knex, json: QueryJson, limit: number): KnexQuery {
let { endpoint, resource, filters, paginate, relationships } = json
read(knex: Knex, json: QueryJson, limit: number): Knex.QueryBuilder {
let { endpoint, resource, filters, paginate, relationships, tableAliases } =
json
const tableName = endpoint.entityId
// select all if not specified
if (!resource) {
@ -462,21 +498,20 @@ class InternalBuilder {
foundLimit = paginate.limit
}
// start building the query
let query: KnexQuery = knex(tableName).limit(foundLimit)
if (endpoint.schema) {
query = query.withSchema(endpoint.schema)
}
let query = this.knexWithAlias(knex, endpoint, tableAliases)
query = query.limit(foundLimit)
if (foundOffset) {
query = query.offset(foundOffset)
}
query = this.addFilters(query, filters, { tableName })
query = this.addFilters(query, filters, tableName, {
aliases: tableAliases,
})
// add sorting to pre-query
query = this.addSorting(query, json)
// @ts-ignore
let preQuery: KnexQuery = knex({
// @ts-ignore
[tableName]: query,
}).select(selectStatement)
const alias = tableAliases?.[tableName] || tableName
let preQuery = knex({
[alias]: query,
} as any).select(selectStatement) as any
// have to add after as well (this breaks MS-SQL)
if (this.client !== SqlClient.MS_SQL) {
preQuery = this.addSorting(preQuery, json)
@ -486,19 +521,22 @@ class InternalBuilder {
preQuery,
tableName,
relationships,
endpoint.schema
endpoint.schema,
tableAliases
)
return this.addFilters(query, filters, { relationship: true })
return this.addFilters(query, filters, tableName, {
relationship: true,
aliases: tableAliases,
})
}
update(knex: Knex, json: QueryJson, opts: QueryOptions): KnexQuery {
const { endpoint, body, filters } = json
let query: KnexQuery = knex(endpoint.entityId)
if (endpoint.schema) {
query = query.withSchema(endpoint.schema)
}
update(knex: Knex, json: QueryJson, opts: QueryOptions): Knex.QueryBuilder {
const { endpoint, body, filters, tableAliases } = json
let query = this.knexWithAlias(knex, endpoint, tableAliases)
const parsedBody = parseBody(body)
query = this.addFilters(query, filters, { tableName: endpoint.entityId })
query = this.addFilters(query, filters, endpoint.entityId, {
aliases: tableAliases,
})
// mysql can't use returning
if (opts.disableReturning) {
return query.update(parsedBody)
@ -507,13 +545,12 @@ class InternalBuilder {
}
}
delete(knex: Knex, json: QueryJson, opts: QueryOptions): KnexQuery {
const { endpoint, filters } = json
let query: KnexQuery = knex(endpoint.entityId)
if (endpoint.schema) {
query = query.withSchema(endpoint.schema)
}
query = this.addFilters(query, filters, { tableName: endpoint.entityId })
delete(knex: Knex, json: QueryJson, opts: QueryOptions): Knex.QueryBuilder {
const { endpoint, filters, tableAliases } = json
let query = this.knexWithAlias(knex, endpoint, tableAliases)
query = this.addFilters(query, filters, endpoint.entityId, {
aliases: tableAliases,
})
// mysql can't use returning
if (opts.disableReturning) {
return query.delete()
@ -537,10 +574,10 @@ class SqlQueryBuilder extends SqlTableQueryBuilder {
   * which for the sake of MySQL stops adding the returning statement to inserts, updates and deletes.
* @return the query ready to be passed to the driver.
*/
_query(json: QueryJson, opts: QueryOptions = {}) {
_query(json: QueryJson, opts: QueryOptions = {}): Knex.SqlNative | Knex.Sql {
const sqlClient = this.getSqlClient()
const client = knex({ client: sqlClient })
let query
let query: Knex.QueryBuilder
const builder = new InternalBuilder(sqlClient)
switch (this._operation(json)) {
case Operation.CREATE:
@ -565,8 +602,6 @@ class SqlQueryBuilder extends SqlTableQueryBuilder {
default:
throw `Operation type is not supported by SQL query builder`
}
// @ts-ignore
return query.toSQL().toNative()
}
@ -648,6 +683,18 @@ class SqlQueryBuilder extends SqlTableQueryBuilder {
}
return results.length ? results : [{ [operation.toLowerCase()]: true }]
}
log(query: string, values?: any[]) {
if (!environment.SQL_LOGGING_ENABLE) {
return
}
const sqlClient = this.getSqlClient()
let string = `[SQL] [${sqlClient.toUpperCase()}] query="${query}"`
if (values) {
string += ` values="${values.join(", ")}"`
}
console.log(string)
}
}
export default SqlQueryBuilder
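When SQL_LOGGING_ENABLE is set, the new log() method above prints one line per generated statement, built from the client name, query string and bindings. A rough usage sketch, constructing the builder the way the query-builder tests below do (the query and values shown are illustrative):

// assumes environment.SQL_LOGGING_ENABLE is truthy
new Sql(SqlClient.POSTGRES, 5000).log("select * from persons limit $1", [100])
// prints: [SQL] [POSTGRES] query="select * from persons limit $1" values="100"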

View file

@ -9,7 +9,7 @@ import {
Table,
FieldType,
} from "@budibase/types"
import { breakExternalTableId } from "../utils"
import { breakExternalTableId, SqlClient } from "../utils"
import SchemaBuilder = Knex.SchemaBuilder
import CreateTableBuilder = Knex.CreateTableBuilder
import { utils } from "@budibase/shared-core"
@ -135,7 +135,8 @@ function generateSchema(
// need to check if any columns have been deleted
if (oldTable) {
const deletedColumns = Object.entries(oldTable.schema).filter(
([key, column]) => isIgnoredType(column.type) && table.schema[key] == null
([key, column]) =>
!isIgnoredType(column.type) && table.schema[key] == null
)
deletedColumns.forEach(([key, column]) => {
if (renamed?.old === key || isIgnoredType(column.type)) {
@ -197,13 +198,14 @@ class SqlTableQueryBuilder {
return json.endpoint.operation
}
_tableQuery(json: QueryJson): any {
_tableQuery(json: QueryJson): Knex.Sql | Knex.SqlNative {
let client = knex({ client: this.sqlClient }).schema
if (json?.endpoint?.schema) {
client = client.withSchema(json.endpoint.schema)
let schemaName = json?.endpoint?.schema
if (schemaName) {
client = client.withSchema(schemaName)
}
let query
let query: Knex.SchemaBuilder
if (!json.table || !json.meta || !json.meta.tables) {
throw "Cannot execute without table being specified"
}
@ -215,6 +217,18 @@ class SqlTableQueryBuilder {
if (!json.meta || !json.meta.table) {
throw "Must specify old table for update"
}
// renameColumn does not work for MySQL, so return a raw query
if (this.sqlClient === SqlClient.MY_SQL && json.meta.renamed) {
const updatedColumn = json.meta.renamed.updated
const tableName = schemaName
? `\`${schemaName}\`.\`${json.table.name}\``
: `\`${json.table.name}\``
const externalType = json.table.schema[updatedColumn].externalType!
return {
sql: `alter table ${tableName} change column \`${json.meta.renamed.old}\` \`${updatedColumn}\` ${externalType};`,
bindings: [],
}
}
query = buildUpdateTable(
client,
json.table,

View file
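For the MySQL rename workaround above, _tableQuery bypasses Knex and returns a raw statement. Based on the MySQL rename test added later in this diff, the returned shape looks roughly like this (the table and column names are taken from that test, not a general guarantee):

// renaming "name" -> "first_name" where externalType is varchar(45)
const renameQuery = {
  sql: "alter table `test` change column `name` `first_name` varchar(45);",
  bindings: [],
}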

@ -16,6 +16,7 @@ import {
Table,
TableRequest,
TableSourceType,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import { OAuth2Client } from "google-auth-library"
import {
@ -334,7 +335,7 @@ class GoogleSheetsIntegration implements DatasourcePlus {
return { tables: externalTables, errors }
}
async query(json: QueryJson) {
async query(json: QueryJson): DatasourcePlusQueryResponse {
const sheet = json.endpoint.entityId
switch (json.endpoint.operation) {
case Operation.CREATE:
@ -384,7 +385,7 @@ class GoogleSheetsIntegration implements DatasourcePlus {
}
try {
await this.connect()
return await this.client.addSheet({ title: name, headerValues: [name] })
await this.client.addSheet({ title: name, headerValues: [name] })
} catch (err) {
console.error("Error creating new table in google sheets", err)
throw err
@ -450,7 +451,7 @@ class GoogleSheetsIntegration implements DatasourcePlus {
try {
await this.connect()
const sheetToDelete = this.client.sheetsByTitle[sheet]
return await sheetToDelete.delete()
await sheetToDelete.delete()
} catch (err) {
console.error("Error deleting table in google sheets", err)
throw err

View file

@ -37,6 +37,7 @@ const DEFINITIONS: Record<SourceName, Integration | undefined> = {
[SourceName.REDIS]: redis.schema,
[SourceName.SNOWFLAKE]: snowflake.schema,
[SourceName.ORACLE]: undefined,
[SourceName.BUDIBASE]: undefined,
}
const INTEGRATIONS: Record<SourceName, any> = {
@ -56,6 +57,7 @@ const INTEGRATIONS: Record<SourceName, any> = {
[SourceName.REDIS]: redis.integration,
[SourceName.SNOWFLAKE]: snowflake.integration,
[SourceName.ORACLE]: undefined,
[SourceName.BUDIBASE]: undefined,
}
// optionally add oracle integration if the oracle binary can be installed

View file

@ -13,6 +13,7 @@ import {
SourceName,
Schema,
TableSourceType,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import {
getSqlQuery,
@ -329,6 +330,7 @@ class SqlServerIntegration extends Sql implements DatasourcePlus {
operation === Operation.CREATE
? `${query.sql}; SELECT SCOPE_IDENTITY() AS id;`
: query.sql
this.log(sql, query.bindings)
return await request.query(sql)
} catch (err: any) {
let readableMessage = getReadableErrorMessage(
@ -492,7 +494,7 @@ class SqlServerIntegration extends Sql implements DatasourcePlus {
return response.recordset || [{ deleted: true }]
}
async query(json: QueryJson) {
async query(json: QueryJson): DatasourcePlusQueryResponse {
const schema = this.config.schema
await this.connect()
if (schema && schema !== DEFAULT_SCHEMA && json?.endpoint) {

View file

@ -12,7 +12,7 @@ import {
SourceName,
Schema,
TableSourceType,
FieldType,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import {
getSqlQuery,
@ -261,6 +261,7 @@ class MySQLIntegration extends Sql implements DatasourcePlus {
const bindings = opts?.disableCoercion
? baseBindings
: bindingTypeCoerce(baseBindings)
this.log(query.sql, bindings)
// Node MySQL is callback based, so we must wrap our call in a promise
const response = await this.client!.query(query.sql, bindings)
return response[0]
@ -380,7 +381,7 @@ class MySQLIntegration extends Sql implements DatasourcePlus {
return results.length ? results : [{ deleted: true }]
}
async query(json: QueryJson) {
async query(json: QueryJson): DatasourcePlusQueryResponse {
await this.connect()
try {
const queryFn = (query: any) =>

View file

@ -12,6 +12,8 @@ import {
ConnectionInfo,
Schema,
TableSourceType,
Row,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import {
buildExternalTableId,
@ -368,6 +370,7 @@ class OracleIntegration extends Sql implements DatasourcePlus {
const options: ExecuteOptions = { autoCommit: true }
const bindings: BindParameters = query.bindings || []
this.log(query.sql, bindings)
return await connection.execute<T>(query.sql, bindings, options)
} finally {
if (connection) {
@ -419,9 +422,9 @@ class OracleIntegration extends Sql implements DatasourcePlus {
: [{ deleted: true }]
}
async query(json: QueryJson) {
async query(json: QueryJson): DatasourcePlusQueryResponse {
const operation = this._operation(json)
const input = this._query(json, { disableReturning: true })
const input = this._query(json, { disableReturning: true }) as SqlQuery
if (Array.isArray(input)) {
const responses = []
for (let query of input) {
@ -443,7 +446,7 @@ class OracleIntegration extends Sql implements DatasourcePlus {
if (deletedRows?.rows?.length) {
return deletedRows.rows
} else if (response.rows?.length) {
return response.rows
return response.rows as Row[]
} else {
// get the last row that was updated
if (
@ -454,7 +457,7 @@ class OracleIntegration extends Sql implements DatasourcePlus {
const lastRow = await this.internalQuery({
sql: `SELECT * FROM \"${json.endpoint.entityId}\" WHERE ROWID = '${response.lastRowid}'`,
})
return lastRow.rows
return lastRow.rows as Row[]
} else {
return [{ [operation.toLowerCase()]: true }]
}

View file

@ -12,6 +12,7 @@ import {
SourceName,
Schema,
TableSourceType,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import {
getSqlQuery,
@ -268,7 +269,9 @@ class PostgresIntegration extends Sql implements DatasourcePlus {
}
}
try {
return await client.query(query.sql, query.bindings || [])
const bindings = query.bindings || []
this.log(query.sql, bindings)
return await client.query(query.sql, bindings)
} catch (err: any) {
await this.closeConnection()
let readableMessage = getReadableErrorMessage(
@ -417,9 +420,9 @@ class PostgresIntegration extends Sql implements DatasourcePlus {
return response.rows.length ? response.rows : [{ deleted: true }]
}
async query(json: QueryJson) {
async query(json: QueryJson): DatasourcePlusQueryResponse {
const operation = this._operation(json).toLowerCase()
const input = this._query(json)
const input = this._query(json) as SqlQuery
if (Array.isArray(input)) {
const responses = []
for (let query of input) {

View file

@ -1,5 +1,12 @@
const Sql = require("../base/sql").default
const { SqlClient } = require("../utils")
import { SqlClient } from "../utils"
import Sql from "../base/sql"
import {
Operation,
QueryJson,
TableSourceType,
Table,
FieldType,
} from "@budibase/types"
const TABLE_NAME = "test"
@ -17,7 +24,7 @@ function generateReadJson({
filters,
sort,
paginate,
}: any = {}) {
}: any = {}): QueryJson {
return {
endpoint: endpoint(table || TABLE_NAME, "READ"),
resource: {
@ -28,41 +35,51 @@ function generateReadJson({
paginate: paginate || {},
meta: {
table: {
type: "table",
sourceType: TableSourceType.EXTERNAL,
sourceId: "SOURCE_ID",
schema: {},
name: table || TABLE_NAME,
primary: ["id"],
},
} as any,
},
}
}
function generateCreateJson(table = TABLE_NAME, body = {}) {
function generateCreateJson(table = TABLE_NAME, body = {}): QueryJson {
return {
endpoint: endpoint(table, "CREATE"),
body,
}
}
function generateUpdateJson(table = TABLE_NAME, body = {}, filters = {}) {
function generateUpdateJson({
table = TABLE_NAME,
body = {},
filters = {},
meta = {},
}): QueryJson {
return {
endpoint: endpoint(table, "UPDATE"),
filters,
body,
meta,
}
}
function generateDeleteJson(table = TABLE_NAME, filters = {}) {
function generateDeleteJson(table = TABLE_NAME, filters = {}): QueryJson {
return {
endpoint: endpoint(table, "DELETE"),
filters,
}
}
function generateRelationshipJson(config: { schema?: string } = {}) {
function generateRelationshipJson(config: { schema?: string } = {}): QueryJson {
return {
endpoint: {
datasourceId: "Postgres",
entityId: "brands",
operation: "READ",
operation: Operation.READ,
schema: config.schema,
},
resource: {
@ -76,7 +93,6 @@ function generateRelationshipJson(config: { schema?: string } = {}) {
},
filters: {},
sort: {},
paginate: {},
relationships: [
{
from: "brand_id",
@ -240,17 +256,17 @@ describe("SQL query builder", () => {
it("should test an update statement", () => {
const query = sql._query(
generateUpdateJson(
TABLE_NAME,
{
generateUpdateJson({
table: TABLE_NAME,
body: {
name: "John",
},
{
filters: {
equal: {
id: 1001,
},
}
)
},
})
)
expect(query).toEqual({
bindings: ["John", 1001],
@ -502,7 +518,7 @@ describe("SQL query builder", () => {
const query = sql._query(generateRelationshipJson({ schema: "production" }))
expect(query).toEqual({
bindings: [500, 5000],
sql: `select "brands"."brand_id" as "brands.brand_id", "brands"."brand_name" as "brands.brand_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name", "products"."brand_id" as "products.brand_id" from (select * from "production"."brands" limit $1) as "brands" left join "production"."products" on "brands"."brand_id" = "products"."brand_id" limit $2`,
sql: `select "brands"."brand_id" as "brands.brand_id", "brands"."brand_name" as "brands.brand_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name", "products"."brand_id" as "products.brand_id" from (select * from "production"."brands" limit $1) as "brands" left join "production"."products" as "products" on "brands"."brand_id" = "products"."brand_id" limit $2`,
})
})
@ -510,7 +526,7 @@ describe("SQL query builder", () => {
const query = sql._query(generateRelationshipJson())
expect(query).toEqual({
bindings: [500, 5000],
sql: `select "brands"."brand_id" as "brands.brand_id", "brands"."brand_name" as "brands.brand_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name", "products"."brand_id" as "products.brand_id" from (select * from "brands" limit $1) as "brands" left join "products" on "brands"."brand_id" = "products"."brand_id" limit $2`,
sql: `select "brands"."brand_id" as "brands.brand_id", "brands"."brand_name" as "brands.brand_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name", "products"."brand_id" as "products.brand_id" from (select * from "brands" limit $1) as "brands" left join "products" as "products" on "brands"."brand_id" = "products"."brand_id" limit $2`,
})
})
@ -520,7 +536,7 @@ describe("SQL query builder", () => {
)
expect(query).toEqual({
bindings: [500, 5000],
sql: `select "stores"."store_id" as "stores.store_id", "stores"."store_name" as "stores.store_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name" from (select * from "production"."stores" limit $1) as "stores" left join "production"."stocks" on "stores"."store_id" = "stocks"."store_id" left join "production"."products" on "products"."product_id" = "stocks"."product_id" limit $2`,
sql: `select "stores"."store_id" as "stores.store_id", "stores"."store_name" as "stores.store_name", "products"."product_id" as "products.product_id", "products"."product_name" as "products.product_name" from (select * from "production"."stores" limit $1) as "stores" left join "production"."stocks" as "stocks" on "stores"."store_id" = "stocks"."store_id" left join "production"."products" as "products" on "products"."product_id" = "stocks"."product_id" limit $2`,
})
})
@ -682,4 +698,99 @@ describe("SQL query builder", () => {
sql: `insert into \"test\" (\"name\") values ($1) returning *`,
})
})
it("should be able to rename column for MySQL", () => {
const table: Table = {
type: "table",
sourceType: TableSourceType.EXTERNAL,
name: TABLE_NAME,
schema: {
first_name: {
type: FieldType.STRING,
name: "first_name",
externalType: "varchar(45)",
},
},
sourceId: "SOURCE_ID",
}
const oldTable: Table = {
...table,
schema: {
name: {
type: FieldType.STRING,
name: "name",
externalType: "varchar(45)",
},
},
}
const query = new Sql(SqlClient.MY_SQL, limit)._query({
table,
endpoint: {
datasourceId: "MySQL",
operation: Operation.UPDATE_TABLE,
entityId: TABLE_NAME,
},
meta: {
table: oldTable,
tables: { [oldTable.name]: oldTable },
renamed: {
old: "name",
updated: "first_name",
},
},
})
expect(query).toEqual({
bindings: [],
sql: `alter table \`${TABLE_NAME}\` change column \`name\` \`first_name\` varchar(45);`,
})
})
it("should be able to delete a column", () => {
const table: Table = {
type: "table",
sourceType: TableSourceType.EXTERNAL,
name: TABLE_NAME,
schema: {
first_name: {
type: FieldType.STRING,
name: "first_name",
externalType: "varchar(45)",
},
},
sourceId: "SOURCE_ID",
}
const oldTable: Table = {
...table,
schema: {
first_name: {
type: FieldType.STRING,
name: "first_name",
externalType: "varchar(45)",
},
last_name: {
type: FieldType.STRING,
name: "last_name",
externalType: "varchar(45)",
},
},
}
const query = sql._query({
table,
endpoint: {
datasourceId: "Postgres",
operation: Operation.UPDATE_TABLE,
entityId: TABLE_NAME,
},
meta: {
table: oldTable,
tables: [oldTable],
},
})
expect(query).toEqual([
{
bindings: [],
sql: `alter table "${TABLE_NAME}" drop column "last_name"`,
},
])
})
})

View file

@ -0,0 +1,204 @@
import { QueryJson } from "@budibase/types"
import { join } from "path"
import Sql from "../base/sql"
import { SqlClient } from "../utils"
import AliasTables from "../../api/controllers/row/alias"
import { generator } from "@budibase/backend-core/tests"
function multiline(sql: string) {
return sql.replace(/\n/g, "").replace(/ +/g, " ")
}
describe("Captures of real examples", () => {
const limit = 5000
const relationshipLimit = 100
function getJson(name: string): QueryJson {
return require(join(__dirname, "sqlQueryJson", name)) as QueryJson
}
describe("create", () => {
it("should create a row with relationships", () => {
const queryJson = getJson("createWithRelationships.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: ["A Street", 34, "London", "A", "B", "designer", 1990],
sql: multiline(`insert into "persons" ("address", "age", "city", "firstname", "lastname", "type", "year")
values ($1, $2, $3, $4, $5, $6, $7) returning *`),
})
})
})
describe("read", () => {
it("should handle basic retrieval with relationships", () => {
const queryJson = getJson("basicFetchWithRelationships.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [relationshipLimit, limit],
sql: multiline(`select "a"."year" as "a.year", "a"."firstname" as "a.firstname", "a"."personid" as "a.personid",
"a"."address" as "a.address", "a"."age" as "a.age", "a"."type" as "a.type", "a"."city" as "a.city",
"a"."lastname" as "a.lastname", "b"."executorid" as "b.executorid", "b"."taskname" as "b.taskname",
"b"."taskid" as "b.taskid", "b"."completed" as "b.completed", "b"."qaid" as "b.qaid",
"b"."executorid" as "b.executorid", "b"."taskname" as "b.taskname", "b"."taskid" as "b.taskid",
"b"."completed" as "b.completed", "b"."qaid" as "b.qaid"
from (select * from "persons" as "a" order by "a"."firstname" asc limit $1) as "a"
left join "tasks" as "b" on "a"."personid" = "b"."qaid" or "a"."personid" = "b"."executorid"
order by "a"."firstname" asc limit $2`),
})
})
it("should handle filtering by relationship", () => {
const queryJson = getJson("filterByRelationship.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [relationshipLimit, "assembling", limit],
sql: multiline(`select "a"."productname" as "a.productname", "a"."productid" as "a.productid",
"b"."executorid" as "b.executorid", "b"."taskname" as "b.taskname", "b"."taskid" as "b.taskid",
"b"."completed" as "b.completed", "b"."qaid" as "b.qaid"
from (select * from "products" as "a" order by "a"."productname" asc limit $1) as "a"
left join "products_tasks" as "c" on "a"."productid" = "c"."productid"
left join "tasks" as "b" on "b"."taskid" = "c"."taskid" where "b"."taskname" = $2
order by "a"."productname" asc limit $3`),
})
})
it("should handle fetching many to many relationships", () => {
const queryJson = getJson("fetchManyToMany.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [relationshipLimit, limit],
sql: multiline(`select "a"."productname" as "a.productname", "a"."productid" as "a.productid",
"b"."executorid" as "b.executorid", "b"."taskname" as "b.taskname", "b"."taskid" as "b.taskid",
"b"."completed" as "b.completed", "b"."qaid" as "b.qaid"
from (select * from "products" as "a" order by "a"."productname" asc limit $1) as "a"
left join "products_tasks" as "c" on "a"."productid" = "c"."productid"
left join "tasks" as "b" on "b"."taskid" = "c"."taskid"
order by "a"."productname" asc limit $2`),
})
})
it("should handle enrichment of rows", () => {
const queryJson = getJson("enrichRelationship.json")
const filters = queryJson.filters?.oneOf?.taskid as number[]
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [...filters, limit, limit],
sql: multiline(`select "a"."executorid" as "a.executorid", "a"."taskname" as "a.taskname",
"a"."taskid" as "a.taskid", "a"."completed" as "a.completed", "a"."qaid" as "a.qaid",
"b"."productname" as "b.productname", "b"."productid" as "b.productid"
from (select * from "tasks" as "a" where "a"."taskid" in ($1, $2) limit $3) as "a"
left join "products_tasks" as "c" on "a"."taskid" = "c"."taskid"
left join "products" as "b" on "b"."productid" = "c"."productid" limit $4`),
})
})
it("should manage query with many relationship filters", () => {
const queryJson = getJson("manyRelationshipFilters.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
const filters = queryJson.filters
const notEqualsValue = Object.values(filters?.notEqual!)[0]
const rangeValue = Object.values(filters?.range!)[0]
const equalValue = Object.values(filters?.equal!)[0]
expect(query).toEqual({
bindings: [
notEqualsValue,
relationshipLimit,
rangeValue.low,
rangeValue.high,
equalValue,
limit,
],
sql: multiline(`select "a"."executorid" as "a.executorid", "a"."taskname" as "a.taskname", "a"."taskid" as "a.taskid",
"a"."completed" as "a.completed", "a"."qaid" as "a.qaid", "b"."productname" as "b.productname",
"b"."productid" as "b.productid", "c"."year" as "c.year", "c"."firstname" as "c.firstname",
"c"."personid" as "c.personid", "c"."address" as "c.address", "c"."age" as "c.age", "c"."type" as "c.type",
"c"."city" as "c.city", "c"."lastname" as "c.lastname", "c"."year" as "c.year", "c"."firstname" as "c.firstname",
"c"."personid" as "c.personid", "c"."address" as "c.address", "c"."age" as "c.age", "c"."type" as "c.type",
"c"."city" as "c.city", "c"."lastname" as "c.lastname"
from (select * from "tasks" as "a" where not "a"."completed" = $1
order by "a"."taskname" asc limit $2) as "a"
left join "products_tasks" as "d" on "a"."taskid" = "d"."taskid"
left join "products" as "b" on "b"."productid" = "d"."productid"
left join "persons" as "c" on "a"."executorid" = "c"."personid" or "a"."qaid" = "c"."personid"
where "c"."year" between $3 and $4 and "b"."productname" = $5 order by "a"."taskname" asc limit $6`),
})
})
})
describe("update", () => {
it("should handle performing a simple update", () => {
const queryJson = getJson("updateSimple.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [1990, "C", "A Street", 34, "designer", "London", "B", 5],
sql: multiline(`update "persons" as "a" set "year" = $1, "firstname" = $2, "address" = $3, "age" = $4,
"type" = $5, "city" = $6, "lastname" = $7 where "a"."personid" = $8 returning *`),
})
})
it("should handle performing an update of relationships", () => {
const queryJson = getJson("updateRelationship.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: [1990, "C", "A Street", 34, "designer", "London", "B", 5],
sql: multiline(`update "persons" as "a" set "year" = $1, "firstname" = $2, "address" = $3, "age" = $4,
"type" = $5, "city" = $6, "lastname" = $7 where "a"."personid" = $8 returning *`),
})
})
})
describe("delete", () => {
it("should handle deleting with relationships", () => {
const queryJson = getJson("deleteSimple.json")
let query = new Sql(SqlClient.POSTGRES, limit)._query(queryJson)
expect(query).toEqual({
bindings: ["ddd", ""],
sql: multiline(`delete from "compositetable" as "a" where "a"."keypartone" = $1 and "a"."keyparttwo" = $2
returning "a"."keyparttwo" as "a.keyparttwo", "a"."keypartone" as "a.keypartone", "a"."name" as "a.name"`),
})
})
})
describe("check max character aliasing", () => {
it("should handle over 'z' max character alias", () => {
const tableNames = []
for (let i = 0; i < 100; i++) {
tableNames.push(generator.guid())
}
const aliasing = new AliasTables(tableNames)
let alias: string = ""
for (let table of tableNames) {
alias = aliasing.getAlias(table)
}
expect(alias).toEqual("cv")
})
})
describe("check some edge cases", () => {
const tableNames = ["hello", "world"]
it("should handle quoted table names", () => {
const aliasing = new AliasTables(tableNames)
const aliased = aliasing.aliasField(`"hello"."field"`)
expect(aliased).toEqual(`"a"."field"`)
})
it("should handle quoted table names with graves", () => {
const aliasing = new AliasTables(tableNames)
const aliased = aliasing.aliasField("`hello`.`world`")
expect(aliased).toEqual("`a`.`world`")
})
it("should handle table names in table names correctly", () => {
const tableNames = ["he", "hell", "hello"]
const aliasing = new AliasTables(tableNames)
const aliased1 = aliasing.aliasField("`he`.`world`")
const aliased2 = aliasing.aliasField("`hell`.`world`")
const aliased3 = aliasing.aliasField("`hello`.`world`")
expect(aliased1).toEqual("`a`.`world`")
expect(aliased2).toEqual("`b`.`world`")
expect(aliased3).toEqual("`c`.`world`")
})
})
})
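The aliasing checks above imply how AliasTables assigns aliases: each table gets "a", "b", ... in the order it is first seen, rolling over to two-letter aliases after "z", which is consistent with the 100th table resolving to "cv". A hedged usage sketch (return values are inferred from the expectations above, not read from the implementation):

const aliasing = new AliasTables(["products", "tasks", "products_tasks"])
aliasing.getAlias("products") // -> "a" (assumed: first table requested)
aliasing.aliasField(`"tasks"."taskid"`) // -> `"b"."taskid"`, as in the quoted-name cases above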

View file

@ -0,0 +1,183 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "persons",
"operation": "READ"
},
"resource": {
"fields": [
"a.year",
"a.firstname",
"a.personid",
"a.address",
"a.age",
"a.type",
"a.city",
"a.lastname",
"b.executorid",
"b.taskname",
"b.taskid",
"b.completed",
"b.qaid",
"b.executorid",
"b.taskname",
"b.taskid",
"b.completed",
"b.qaid"
]
},
"filters": {},
"sort": {
"firstname": {
"direction": "ASCENDING"
}
},
"paginate": {
"limit": 100,
"page": 1
},
"relationships": [
{
"tableName": "tasks",
"column": "QA",
"from": "personid",
"to": "qaid",
"aliases": {
"tasks": "b",
"persons": "a"
}
},
{
"tableName": "tasks",
"column": "executor",
"from": "personid",
"to": "executorid",
"aliases": {
"tasks": "b",
"persons": "a"
}
}
],
"extra": {
"idFilter": {}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__persons",
"primary": [
"personid"
],
"name": "a",
"schema": {
"year": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "year",
"constraints": {
"presence": false
}
},
"firstname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "firstname",
"constraints": {
"presence": false
}
},
"personid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "personid",
"constraints": {
"presence": false
}
},
"address": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "address",
"constraints": {
"presence": false
}
},
"age": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "age",
"constraints": {
"presence": false
}
},
"type": {
"type": "options",
"externalType": "USER-DEFINED",
"autocolumn": false,
"name": "type",
"constraints": {
"presence": false,
"inclusion": [
"support",
"designer",
"programmer",
"qa"
]
}
},
"city": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "city",
"constraints": {
"presence": false
}
},
"lastname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "lastname",
"constraints": {
"presence": false
}
},
"QA": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "QA",
"relationshipType": "many-to-one",
"fieldName": "qaid",
"type": "link",
"main": true,
"_id": "ccb68481c80c34217a4540a2c6c27fe46",
"foreignKey": "personid"
},
"executor": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "executor",
"relationshipType": "many-to-one",
"fieldName": "executorid",
"type": "link",
"main": true,
"_id": "c89530b9770d94bec851e062b5cff3001",
"foreignKey": "personid",
"tableName": "persons"
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "firstname",
"views": {}
}
},
"tableAliases": {
"persons": "a",
"tasks": "b"
}
}

View file

@ -0,0 +1,173 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "persons",
"operation": "CREATE"
},
"resource": {
"fields": [
"a.year",
"a.firstname",
"a.personid",
"a.address",
"a.age",
"a.type",
"a.city",
"a.lastname"
]
},
"filters": {},
"relationships": [
{
"tableName": "tasks",
"column": "QA",
"from": "personid",
"to": "qaid",
"aliases": {
"tasks": "b",
"persons": "a"
}
},
{
"tableName": "tasks",
"column": "executor",
"from": "personid",
"to": "executorid",
"aliases": {
"tasks": "b",
"persons": "a"
}
}
],
"body": {
"year": 1990,
"firstname": "A",
"address": "A Street",
"age": 34,
"type": "designer",
"city": "London",
"lastname": "B"
},
"extra": {
"idFilter": {}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__persons",
"primary": [
"personid"
],
"name": "a",
"schema": {
"year": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "year",
"constraints": {
"presence": false
}
},
"firstname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "firstname",
"constraints": {
"presence": false
}
},
"personid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "personid",
"constraints": {
"presence": false
}
},
"address": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "address",
"constraints": {
"presence": false
}
},
"age": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "age",
"constraints": {
"presence": false
}
},
"type": {
"type": "options",
"externalType": "USER-DEFINED",
"autocolumn": false,
"name": "type",
"constraints": {
"presence": false,
"inclusion": [
"support",
"designer",
"programmer",
"qa"
]
}
},
"city": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "city",
"constraints": {
"presence": false
}
},
"lastname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "lastname",
"constraints": {
"presence": false
}
},
"QA": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "QA",
"relationshipType": "many-to-one",
"fieldName": "qaid",
"type": "link",
"main": true,
"_id": "ccb68481c80c34217a4540a2c6c27fe46",
"foreignKey": "personid"
},
"executor": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "executor",
"relationshipType": "many-to-one",
"fieldName": "executorid",
"type": "link",
"main": true,
"_id": "c89530b9770d94bec851e062b5cff3001",
"foreignKey": "personid",
"tableName": "persons"
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "firstname",
"views": {}
}
},
"tableAliases": {
"persons": "a",
"tasks": "b"
}
}

View file

@ -0,0 +1,75 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "compositetable",
"operation": "DELETE"
},
"resource": {
"fields": [
"a.keyparttwo",
"a.keypartone",
"a.name"
]
},
"filters": {
"equal": {
"keypartone": "ddd",
"keyparttwo": ""
}
},
"relationships": [],
"extra": {
"idFilter": {
"equal": {
"keypartone": "ddd",
"keyparttwo": ""
}
}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__compositetable",
"primary": [
"keypartone",
"keyparttwo"
],
"name": "a",
"schema": {
"keyparttwo": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "keyparttwo",
"constraints": {
"presence": true
}
},
"keypartone": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "keypartone",
"constraints": {
"presence": true
}
},
"name": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "name",
"constraints": {
"presence": false
}
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "keypartone"
}
},
"tableAliases": {
"compositetable": "a"
}
}

View file

@ -0,0 +1,123 @@
{
"endpoint": {
"datasourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"entityId": "tasks",
"operation": "READ"
},
"resource": {
"fields": [
"a.executorid",
"a.taskname",
"a.taskid",
"a.completed",
"a.qaid",
"b.productname",
"b.productid"
]
},
"filters": {
"oneOf": {
"taskid": [
1,
2
]
}
},
"relationships": [
{
"tableName": "products",
"column": "products",
"through": "products_tasks",
"from": "taskid",
"to": "productid",
"fromPrimary": "taskid",
"toPrimary": "productid",
"aliases": {
"products_tasks": "c",
"products": "b",
"tasks": "a"
}
}
],
"extra": {
"idFilter": {}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__tasks",
"primary": [
"taskid"
],
"name": "a",
"schema": {
"executorid": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "executorid",
"constraints": {
"presence": false
}
},
"taskname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "taskname",
"constraints": {
"presence": false
}
},
"taskid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "taskid",
"constraints": {
"presence": false
}
},
"completed": {
"type": "boolean",
"externalType": "boolean",
"autocolumn": false,
"name": "completed",
"constraints": {
"presence": false
}
},
"qaid": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "qaid",
"constraints": {
"presence": false
}
},
"products": {
"tableId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products",
"name": "products",
"relationshipType": "many-to-many",
"through": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products_tasks",
"type": "link",
"_id": "c3b91d00cd36c4cc1a347794725b9adbd",
"fieldName": "productid",
"throughFrom": "productid",
"throughTo": "taskid"
}
},
"sourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"sourceType": "external",
"primaryDisplay": "taskname",
"sql": true,
"views": {}
}
},
"tableAliases": {
"tasks": "a",
"products": "b",
"products_tasks": "c"
}
}
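These snapshot files capture the query JSON produced when table aliasing is enabled: every table in "tableAliases" is mapped to a short alias, and each entry in "resource.fields" is prefixed with the alias of the table it belongs to. As a minimal sketch (not the Budibase implementation) of how such a map ties an aliased field back to its table:

// table name -> alias, e.g. { tasks: "a", products: "b" }
type TableAliases = Record<string, string>

function resolveAliasedField(field: string, aliases: TableAliases) {
  const [alias, column] = field.split(".")
  const table = Object.keys(aliases).find(name => aliases[name] === alias)
  if (!table) {
    throw new Error(`No table registered for alias "${alias}"`)
  }
  return { table, column }
}

// Using the alias map from the snapshot above:
// resolveAliasedField("b.productname", { tasks: "a", products: "b", products_tasks: "c" })
// => { table: "products", column: "productname" }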


@ -0,0 +1,109 @@
{
"endpoint": {
"datasourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"entityId": "products",
"operation": "READ"
},
"resource": {
"fields": [
"a.productname",
"a.productid",
"b.executorid",
"b.taskname",
"b.taskid",
"b.completed",
"b.qaid"
]
},
"filters": {
"string": {},
"fuzzy": {},
"range": {},
"equal": {},
"notEqual": {},
"empty": {},
"notEmpty": {},
"contains": {},
"notContains": {},
"oneOf": {},
"containsAny": {}
},
"sort": {
"productname": {
"direction": "ASCENDING"
}
},
"paginate": {
"limit": 100,
"page": 1
},
"relationships": [
{
"tableName": "tasks",
"column": "tasks",
"through": "products_tasks",
"from": "productid",
"to": "taskid",
"fromPrimary": "productid",
"toPrimary": "taskid",
"aliases": {
"products_tasks": "c",
"tasks": "b",
"products": "a"
}
}
],
"extra": {
"idFilter": {}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products",
"primary": [
"productid"
],
"name": "a",
"schema": {
"productname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "productname",
"constraints": {
"presence": false
}
},
"productid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "productid",
"constraints": {
"presence": false
}
},
"tasks": {
"tableId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__tasks",
"name": "tasks",
"relationshipType": "many-to-many",
"fieldName": "taskid",
"through": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products_tasks",
"throughFrom": "taskid",
"throughTo": "productid",
"type": "link",
"main": true,
"_id": "c3b91d00cd36c4cc1a347794725b9adbd"
}
},
"sourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"sourceType": "external",
"primaryDisplay": "productname"
}
},
"tableAliases": {
"products": "a",
"tasks": "b",
"products_tasks": "c"
}
}


@ -0,0 +1,94 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "products",
"operation": "READ"
},
"resource": {
"fields": [
"a.productname",
"a.productid",
"b.executorid",
"b.taskname",
"b.taskid",
"b.completed",
"b.qaid"
]
},
"filters": {
"equal": {
"1:tasks.taskname": "assembling"
},
"onEmptyFilter": "all"
},
"sort": {
"productname": {
"direction": "ASCENDING"
}
},
"paginate": {
"limit": 100,
"page": 1
},
"relationships": [
{
"tableName": "tasks",
"column": "tasks",
"through": "products_tasks",
"from": "productid",
"to": "taskid",
"fromPrimary": "productid",
"toPrimary": "taskid"
}
],
"tableAliases": {
"products_tasks": "c",
"tasks": "b",
"products": "a"
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__products",
"primary": [
"productid"
],
"name": "a",
"schema": {
"productname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "productname",
"constraints": {
"presence": false
}
},
"productid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "productid",
"constraints": {
"presence": false
}
},
"tasks": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "tasks",
"relationshipType": "many-to-many",
"fieldName": "taskid",
"through": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__products_tasks",
"throughFrom": "taskid",
"throughTo": "productid",
"type": "link",
"main": true,
"_id": "ca6862d9ba09146dd8a68e3b5b7055a09"
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "productname"
}
}
}


@ -0,0 +1,202 @@
{
"endpoint": {
"datasourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"entityId": "tasks",
"operation": "READ"
},
"resource": {
"fields": [
"a.executorid",
"a.taskname",
"a.taskid",
"a.completed",
"a.qaid",
"b.productname",
"b.productid",
"c.year",
"c.firstname",
"c.personid",
"c.address",
"c.age",
"c.type",
"c.city",
"c.lastname",
"c.year",
"c.firstname",
"c.personid",
"c.address",
"c.age",
"c.type",
"c.city",
"c.lastname"
]
},
"filters": {
"string": {},
"fuzzy": {},
"range": {
"1:persons.year": {
"low": 1990,
"high": 2147483647
}
},
"equal": {
"2:products.productname": "Computers"
},
"notEqual": {
"3:completed": true
},
"empty": {},
"notEmpty": {},
"contains": {},
"notContains": {},
"oneOf": {},
"containsAny": {},
"onEmptyFilter": "all"
},
"sort": {
"taskname": {
"direction": "ASCENDING"
}
},
"paginate": {
"limit": 100,
"page": 1
},
"relationships": [
{
"tableName": "products",
"column": "products",
"through": "products_tasks",
"from": "taskid",
"to": "productid",
"fromPrimary": "taskid",
"toPrimary": "productid",
"aliases": {
"products_tasks": "d",
"products": "b",
"tasks": "a"
}
},
{
"tableName": "persons",
"column": "tasksToExecute",
"from": "executorid",
"to": "personid",
"aliases": {
"persons": "c",
"tasks": "a"
}
},
{
"tableName": "persons",
"column": "tasksToQA",
"from": "qaid",
"to": "personid",
"aliases": {
"persons": "c",
"tasks": "a"
}
}
],
"extra": {
"idFilter": {}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__tasks",
"primary": [
"taskid"
],
"name": "a",
"schema": {
"executorid": {
"type": "number",
"externalType": "integer",
"name": "executorid",
"constraints": {
"presence": false
},
"autocolumn": true,
"autoReason": "foreign_key"
},
"taskname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "taskname",
"constraints": {
"presence": false
}
},
"taskid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "taskid",
"constraints": {
"presence": false
}
},
"completed": {
"type": "boolean",
"externalType": "boolean",
"autocolumn": false,
"name": "completed",
"constraints": {
"presence": false
}
},
"qaid": {
"type": "number",
"externalType": "integer",
"name": "qaid",
"constraints": {
"presence": false
}
},
"products": {
"tableId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products",
"name": "products",
"relationshipType": "many-to-many",
"through": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__products_tasks",
"type": "link",
"_id": "c3b91d00cd36c4cc1a347794725b9adbd",
"fieldName": "productid",
"throughFrom": "productid",
"throughTo": "taskid"
},
"tasksToExecute": {
"tableId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__persons",
"name": "tasksToExecute",
"relationshipType": "one-to-many",
"type": "link",
"_id": "c0f440590bda04f28846242156c1dd60b",
"foreignKey": "executorid",
"fieldName": "personid"
},
"tasksToQA": {
"tableId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81__persons",
"name": "tasksToQA",
"relationshipType": "one-to-many",
"type": "link",
"_id": "c5fdf453a0ba743d58e29491d174c974b",
"foreignKey": "qaid",
"fieldName": "personid"
}
},
"sourceId": "datasource_plus_44a967caf37a435f84fe01cd6dfe8f81",
"sourceType": "external",
"primaryDisplay": "taskname",
"sql": true,
"views": {}
}
},
"tableAliases": {
"tasks": "a",
"products": "b",
"persons": "c",
"products_tasks": "d"
}
}


@ -0,0 +1,181 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "persons",
"operation": "UPDATE"
},
"resource": {
"fields": [
"a.year",
"a.firstname",
"a.personid",
"a.address",
"a.age",
"a.type",
"a.city",
"a.lastname"
]
},
"filters": {
"equal": {
"personid": 5
}
},
"relationships": [
{
"tableName": "tasks",
"column": "QA",
"from": "personid",
"to": "qaid",
"aliases": {
"tasks": "b",
"persons": "a"
}
},
{
"tableName": "tasks",
"column": "executor",
"from": "personid",
"to": "executorid",
"aliases": {
"tasks": "b",
"persons": "a"
}
}
],
"body": {
"year": 1990,
"firstname": "C",
"address": "A Street",
"age": 34,
"type": "designer",
"city": "London",
"lastname": "B"
},
"extra": {
"idFilter": {
"equal": {
"personid": 5
}
}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__persons",
"primary": [
"personid"
],
"name": "a",
"schema": {
"year": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "year",
"constraints": {
"presence": false
}
},
"firstname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "firstname",
"constraints": {
"presence": false
}
},
"personid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "personid",
"constraints": {
"presence": false
}
},
"address": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "address",
"constraints": {
"presence": false
}
},
"age": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "age",
"constraints": {
"presence": false
}
},
"type": {
"type": "options",
"externalType": "USER-DEFINED",
"autocolumn": false,
"name": "type",
"constraints": {
"presence": false,
"inclusion": [
"support",
"designer",
"programmer",
"qa"
]
}
},
"city": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "city",
"constraints": {
"presence": false
}
},
"lastname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "lastname",
"constraints": {
"presence": false
}
},
"QA": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "QA",
"relationshipType": "many-to-one",
"fieldName": "qaid",
"type": "link",
"main": true,
"_id": "ccb68481c80c34217a4540a2c6c27fe46",
"foreignKey": "personid"
},
"executor": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "executor",
"relationshipType": "many-to-one",
"fieldName": "executorid",
"type": "link",
"main": true,
"_id": "c89530b9770d94bec851e062b5cff3001",
"foreignKey": "personid",
"tableName": "persons"
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "firstname",
"views": {}
}
},
"tableAliases": {
"persons": "a",
"tasks": "b"
}
}


@ -0,0 +1,181 @@
{
"endpoint": {
"datasourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"entityId": "persons",
"operation": "UPDATE"
},
"resource": {
"fields": [
"a.year",
"a.firstname",
"a.personid",
"a.address",
"a.age",
"a.type",
"a.city",
"a.lastname"
]
},
"filters": {
"equal": {
"personid": 5
}
},
"relationships": [
{
"tableName": "tasks",
"column": "QA",
"from": "personid",
"to": "qaid",
"aliases": {
"tasks": "b",
"persons": "a"
}
},
{
"tableName": "tasks",
"column": "executor",
"from": "personid",
"to": "executorid",
"aliases": {
"tasks": "b",
"persons": "a"
}
}
],
"body": {
"year": 1990,
"firstname": "C",
"address": "A Street",
"age": 34,
"type": "designer",
"city": "London",
"lastname": "B"
},
"extra": {
"idFilter": {
"equal": {
"personid": 5
}
}
},
"meta": {
"table": {
"type": "table",
"_id": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__persons",
"primary": [
"personid"
],
"name": "a",
"schema": {
"year": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "year",
"constraints": {
"presence": false
}
},
"firstname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "firstname",
"constraints": {
"presence": false
}
},
"personid": {
"type": "number",
"externalType": "integer",
"autocolumn": true,
"name": "personid",
"constraints": {
"presence": false
}
},
"address": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "address",
"constraints": {
"presence": false
}
},
"age": {
"type": "number",
"externalType": "integer",
"autocolumn": false,
"name": "age",
"constraints": {
"presence": false
}
},
"type": {
"type": "options",
"externalType": "USER-DEFINED",
"autocolumn": false,
"name": "type",
"constraints": {
"presence": false,
"inclusion": [
"support",
"designer",
"programmer",
"qa"
]
}
},
"city": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "city",
"constraints": {
"presence": false
}
},
"lastname": {
"type": "string",
"externalType": "character varying",
"autocolumn": false,
"name": "lastname",
"constraints": {
"presence": false
}
},
"QA": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "QA",
"relationshipType": "many-to-one",
"fieldName": "qaid",
"type": "link",
"main": true,
"_id": "ccb68481c80c34217a4540a2c6c27fe46",
"foreignKey": "personid"
},
"executor": {
"tableId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7__tasks",
"name": "executor",
"relationshipType": "many-to-one",
"fieldName": "executorid",
"type": "link",
"main": true,
"_id": "c89530b9770d94bec851e062b5cff3001",
"foreignKey": "personid",
"tableName": "persons"
}
},
"sourceId": "datasource_plus_8066e56456784eb2a00129d31be5c3e7",
"sourceType": "external",
"primaryDisplay": "firstname",
"views": {}
}
},
"tableAliases": {
"persons": "a",
"tasks": "b"
}
}


@ -13,7 +13,7 @@ describe("syncApps", () => {
afterAll(config.end)
it("runs successfully", async () => {
return config.doInContext(null, async () => {
return config.doInContext(undefined, async () => {
// create the usage quota doc and mock usages
await quotas.getQuotaUsage()
await quotas.setUsage(3, StaticQuotaName.APPS, QuotaUsageType.STATIC)


@ -12,8 +12,8 @@ describe("syncCreators", () => {
afterAll(config.end)
it("syncs creators", async () => {
return config.doInContext(null, async () => {
await config.createUser({ admin: true })
return config.doInContext(undefined, async () => {
await config.createUser({ admin: { global: true } })
await syncCreators.run()


@ -14,7 +14,7 @@ describe("syncRows", () => {
afterAll(config.end)
it("runs successfully", async () => {
return config.doInContext(null, async () => {
return config.doInContext(undefined, async () => {
// create the usage quota doc and mock usages
await quotas.getQuotaUsage()
await quotas.setUsage(300, StaticQuotaName.ROWS, QuotaUsageType.STATIC)


@ -12,7 +12,7 @@ describe("syncUsers", () => {
afterAll(config.end)
it("syncs users", async () => {
return config.doInContext(null, async () => {
return config.doInContext(undefined, async () => {
await config.createUser()
await syncUsers.run()


@ -40,7 +40,7 @@ describe("migrations", () => {
describe("backfill", () => {
it("runs app db migration", async () => {
await config.doInContext(null, async () => {
await config.doInContext(undefined, async () => {
await clearMigrations()
await config.createAutomation()
await config.createAutomation(structures.newAutomation())
@ -93,18 +93,18 @@ describe("migrations", () => {
})
it("runs global db migration", async () => {
await config.doInContext(null, async () => {
await config.doInContext(undefined, async () => {
await clearMigrations()
const appId = config.prodAppId
const appId = config.getProdAppId()
const roles = { [appId]: "role_12345" }
await config.createUser({
builder: false,
admin: true,
builder: { global: false },
admin: { global: true },
roles,
}) // admin only
await config.createUser({
builder: false,
admin: false,
builder: { global: false },
admin: { global: false },
roles,
}) // non admin non builder
await config.createTable()


@ -85,7 +85,9 @@ async function getImportableDocuments(db: Database) {
const docPromises = []
for (let docType of DocumentTypesToImport) {
docPromises.push(
db.allDocs(dbCore.getDocParams(docType, null, { include_docs: true }))
db.allDocs<Document>(
dbCore.getDocParams(docType, null, { include_docs: true })
)
)
}
// map the responses to the document itself


@ -43,8 +43,8 @@ async function createUser(email: string, roles: UserRoles, builder?: boolean) {
const user = await config.createUser({
email,
roles,
builder: builder || false,
admin: false,
builder: { global: builder || false },
admin: { global: false },
})
await context.doInContext(config.appId!, async () => {
await events.user.created(user)
@ -55,10 +55,10 @@ async function createUser(email: string, roles: UserRoles, builder?: boolean) {
async function removeUserRole(user: User) {
const final = await config.globalUser({
...user,
id: user._id,
_id: user._id,
roles: {},
builder: false,
admin: false,
builder: { global: false },
admin: { global: false },
})
await context.doInContext(config.appId!, async () => {
await events.user.updated(final)
@ -69,8 +69,8 @@ async function createGroupAndUser(email: string) {
groupUser = await config.createUser({
email,
roles: {},
builder: false,
admin: false,
builder: { global: false },
admin: { global: false },
})
group = await config.createGroup()
await config.addUserToGroup(group._id!, groupUser._id!)


@ -229,7 +229,7 @@ export async function removeSecretSingle(datasource: Datasource) {
}
export function mergeConfigs(update: Datasource, old: Datasource) {
if (!update.config) {
if (!update.config || !old.config) {
return update
}
// specific to REST datasources, fix the auth configs again if required


@ -3,12 +3,33 @@ import {
DatasourcePlus,
IntegrationBase,
Schema,
Table,
} from "@budibase/types"
import * as datasources from "./datasources"
import tableSdk from "../tables"
import { getIntegration } from "../../../integrations"
import { context } from "@budibase/backend-core"
function checkForSchemaErrors(schema: Record<string, Table>) {
const errors: Record<string, string> = {}
for (let [tableName, table] of Object.entries(schema)) {
if (tableName.includes(".")) {
errors[tableName] = "Table names containing dots are not supported."
} else {
const columnNames = Object.keys(table.schema)
const invalidColumnName = columnNames.find(columnName =>
columnName.includes(".")
)
if (invalidColumnName) {
errors[
tableName
] = `Column '${invalidColumnName}' is not supported as it contains a dot.`
}
}
}
return errors
}
export async function buildFilteredSchema(
datasource: Datasource,
filter?: string[]
@ -30,16 +51,19 @@ export async function buildFilteredSchema(
filteredSchema.errors[key] = schema.errors[key]
}
}
return filteredSchema
return {
...filteredSchema,
errors: {
...filteredSchema.errors,
...checkForSchemaErrors(filteredSchema.tables),
},
}
}
async function buildSchemaHelper(datasource: Datasource): Promise<Schema> {
const connector = (await getConnector(datasource)) as DatasourcePlus
const externalSchema = await connector.buildSchema(
datasource._id!,
datasource.entities!
)
return externalSchema
return await connector.buildSchema(datasource._id!, datasource.entities!)
}
export async function getConnector(

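The new checkForSchemaErrors guard rejects external tables whose names, or whose column names, contain a dot, and buildFilteredSchema now merges those messages into the returned errors map. A standalone sketch of the same check, using invented table and column names:

// Invented schema shape for illustration; mirrors the dot-name check above.
const tables: Record<string, { schema: Record<string, unknown> }> = {
  "public.users": { schema: { id: {} } },               // dotted table name
  orders: { schema: { "customer.name": {}, id: {} } },  // dotted column name
}

const errors: Record<string, string> = {}
for (const [tableName, table] of Object.entries(tables)) {
  if (tableName.includes(".")) {
    errors[tableName] = "Table names containing dots are not supported."
  } else {
    const invalidColumnName = Object.keys(table.schema).find(c => c.includes("."))
    if (invalidColumnName) {
      errors[tableName] = `Column '${invalidColumnName}' is not supported as it contains a dot.`
    }
  }
}
// errors =>
// { "public.users": "Table names containing dots are not supported.",
//   "orders": "Column 'customer.name' is not supported as it contains a dot." }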

@ -36,11 +36,13 @@ export async function search(options: SearchParams): Promise<{
export interface ExportRowsParams {
tableId: string
format: Format
delimiter?: string
rowIds?: string[]
columns?: string[]
query?: SearchFilters
sort?: string
sortOrder?: SortOrder
customHeaders?: { [key: string]: string }
}
export interface ExportRowsResult {

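delimiter and customHeaders are the new optional export options threaded through both row SDKs below. A toy sketch (not the Budibase exporter) of what they control, with invented row data:

const rows = [{ firstname: "C", lastname: "B" }]
const customHeaders: Record<string, string> = { firstname: "First name", lastname: "Last name" }
const delimiter = ";"

// Rename headers where a custom header is provided, then join with the chosen delimiter.
const headers = Object.keys(rows[0]).map(h => customHeaders[h] ?? h)
const csvContent = [
  headers.join(delimiter),
  ...rows.map(r => Object.values(r).join(delimiter)),
].join("\n")
// csvContent === "First name;Last name\nC;B"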

@ -101,7 +101,17 @@ export async function search(options: SearchParams) {
export async function exportRows(
options: ExportRowsParams
): Promise<ExportRowsResult> {
const { tableId, format, columns, rowIds, query, sort, sortOrder } = options
const {
tableId,
format,
columns,
rowIds,
query,
sort,
sortOrder,
delimiter,
customHeaders,
} = options
const { datasourceId, tableName } = breakExternalTableId(tableId)
let requestQuery: SearchFilters = {}
@ -153,12 +163,17 @@ export async function exportRows(
rows = result.rows
}
let exportRows = cleanExportRows(rows, schema, format, columns)
let exportRows = cleanExportRows(rows, schema, format, columns, customHeaders)
let content: string
switch (format) {
case exporters.Format.CSV:
content = exporters.csv(headers ?? Object.keys(schema), exportRows)
content = exporters.csv(
headers ?? Object.keys(schema),
exportRows,
delimiter,
customHeaders
)
break
case exporters.Format.JSON:
content = exporters.json(exportRows)


@ -84,7 +84,17 @@ export async function search(options: SearchParams) {
export async function exportRows(
options: ExportRowsParams
): Promise<ExportRowsResult> {
const { tableId, format, rowIds, columns, query, sort, sortOrder } = options
const {
tableId,
format,
rowIds,
columns,
query,
sort,
sortOrder,
delimiter,
customHeaders,
} = options
const db = context.getAppDB()
const table = await sdk.tables.getTable(tableId)
@ -124,11 +134,16 @@ export async function exportRows(
rows = result
}
let exportRows = cleanExportRows(rows, schema, format, columns)
let exportRows = cleanExportRows(rows, schema, format, columns, customHeaders)
if (format === Format.CSV) {
return {
fileName: "export.csv",
content: csv(headers ?? Object.keys(rows[0]), exportRows),
content: csv(
headers ?? Object.keys(rows[0]),
exportRows,
delimiter,
customHeaders
),
}
} else if (format === Format.JSON) {
return {


@ -81,7 +81,7 @@ describe("sdk >> rows >> internal", () => {
const response = await internalSdk.save(
table._id!,
row,
config.user._id
config.getUser()._id
)
expect(response).toEqual({
@ -129,7 +129,7 @@ describe("sdk >> rows >> internal", () => {
const response = await internalSdk.save(
table._id!,
row,
config.user._id
config.getUser()._id
)
expect(response).toEqual({
@ -190,15 +190,15 @@ describe("sdk >> rows >> internal", () => {
await config.doInContext(config.appId, async () => {
for (const row of makeRows(5)) {
await internalSdk.save(table._id!, row, config.user._id)
await internalSdk.save(table._id!, row, config.getUser()._id)
}
await Promise.all(
makeRows(10).map(row =>
internalSdk.save(table._id!, row, config.user._id)
internalSdk.save(table._id!, row, config.getUser()._id)
)
)
for (const row of makeRows(5)) {
await internalSdk.save(table._id!, row, config.user._id)
await internalSdk.save(table._id!, row, config.getUser()._id)
}
})


@ -1,12 +1,21 @@
import cloneDeep from "lodash/cloneDeep"
import validateJs from "validate.js"
import { FieldType, Row, Table, TableSchema } from "@budibase/types"
import {
FieldType,
QueryJson,
Row,
Table,
TableSchema,
DatasourcePlusQueryResponse,
} from "@budibase/types"
import { makeExternalQuery } from "../../../integrations/base/query"
import { Format } from "../../../api/controllers/view/exporters"
import sdk from "../.."
import { isRelationshipColumn } from "../../../db/utils"
export async function getDatasourceAndQuery(json: any) {
export async function getDatasourceAndQuery(
json: QueryJson
): DatasourcePlusQueryResponse {
const datasourceId = json.endpoint.datasourceId
const datasource = await sdk.datasources.get(datasourceId)
return makeExternalQuery(datasource, json)
@ -16,7 +25,8 @@ export function cleanExportRows(
rows: any[],
schema: TableSchema,
format: string,
columns?: string[]
columns?: string[],
customHeaders: { [key: string]: string } = {}
) {
let cleanRows = [...rows]
@ -44,11 +54,27 @@ export function cleanExportRows(
}
}
}
} else if (format === Format.JSON) {
// Replace row keys with custom headers
for (let row of cleanRows) {
renameKeys(customHeaders, row)
}
}
return cleanRows
}
function renameKeys(keysMap: { [key: string]: any }, row: any) {
for (const key in keysMap) {
Object.defineProperty(
row,
keysMap[key],
Object.getOwnPropertyDescriptor(row, key) || {}
)
delete row[key]
}
}
function isForeignKey(key: string, table: Table) {
const relationships = Object.values(table.schema).filter(isRelationshipColumn)
return relationships.some(

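For the JSON export path, cleanExportRows now applies customHeaders by renaming row keys in place. A short usage sketch of that behaviour, with invented values:

// Same logic as the renameKeys helper above.
function renameKeys(keysMap: Record<string, string>, row: Record<string, any>) {
  for (const key in keysMap) {
    Object.defineProperty(row, keysMap[key], Object.getOwnPropertyDescriptor(row, key) || {})
    delete row[key]
  }
}

const row: Record<string, any> = { firstname: "C", city: "London" }
renameKeys({ firstname: "First name" }, row)
// row is now { "First name": "C", city: "London" }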

@ -22,15 +22,18 @@ describe("syncGlobalUsers", () => {
expect(metadata).toHaveLength(1)
expect(metadata).toEqual([
expect.objectContaining({
_id: db.generateUserMetadataID(config.user._id),
_id: db.generateUserMetadataID(config.getUser()._id!),
}),
])
})
})
it("admin and builders users are synced", async () => {
const user1 = await config.createUser({ admin: true })
const user2 = await config.createUser({ admin: false, builder: true })
const user1 = await config.createUser({ admin: { global: true } })
const user2 = await config.createUser({
admin: { global: false },
builder: { global: true },
})
await config.doInContext(config.appId, async () => {
expect(await rawUserMetadata()).toHaveLength(1)
await syncGlobalUsers()
@ -51,7 +54,10 @@ describe("syncGlobalUsers", () => {
})
it("app users are not synced if not specified", async () => {
const user = await config.createUser({ admin: false, builder: false })
const user = await config.createUser({
admin: { global: false },
builder: { global: false },
})
await config.doInContext(config.appId, async () => {
await syncGlobalUsers()
@ -68,8 +74,14 @@ describe("syncGlobalUsers", () => {
it("app users are added when group is assigned to app", async () => {
await config.doInTenant(async () => {
const group = await proSdk.groups.save(structures.userGroups.userGroup())
const user1 = await config.createUser({ admin: false, builder: false })
const user2 = await config.createUser({ admin: false, builder: false })
const user1 = await config.createUser({
admin: { global: false },
builder: { global: false },
})
const user2 = await config.createUser({
admin: { global: false },
builder: { global: false },
})
await proSdk.groups.addUsers(group.id, [user1._id!, user2._id!])
await config.doInContext(config.appId, async () => {
@ -103,8 +115,14 @@ describe("syncGlobalUsers", () => {
it("app users are removed when app is removed from user group", async () => {
await config.doInTenant(async () => {
const group = await proSdk.groups.save(structures.userGroups.userGroup())
const user1 = await config.createUser({ admin: false, builder: false })
const user2 = await config.createUser({ admin: false, builder: false })
const user1 = await config.createUser({
admin: { global: false },
builder: { global: false },
})
const user2 = await config.createUser({
admin: { global: false },
builder: { global: false },
})
await proSdk.groups.updateGroupApps(group.id, {
appsToAdd: [
{ appId: config.prodAppId!, roleId: roles.BUILTIN_ROLE_IDS.BASIC },


@ -38,6 +38,7 @@ async function initRoutes(app: Koa) {
// api routes
app.use(api.router.routes())
app.use(api.router.allowedMethods())
}
async function initPro() {


@ -49,25 +49,31 @@ import {
AuthToken,
Automation,
CreateViewRequest,
Ctx,
Datasource,
FieldType,
INTERNAL_TABLE_SOURCE_ID,
Layout,
Query,
RelationshipFieldMetadata,
RelationshipType,
Row,
Screen,
SearchParams,
SourceName,
Table,
TableSourceType,
User,
UserRoles,
UserCtx,
View,
Webhook,
WithRequired,
} from "@budibase/types"
import API from "./api"
import { cloneDeep } from "lodash"
import jwt, { Secret } from "jsonwebtoken"
import { Server } from "http"
mocks.licenses.init(pro)
@ -82,27 +88,23 @@ export interface TableToBuild extends Omit<Table, "sourceId" | "sourceType"> {
}
export default class TestConfiguration {
server: any
request: supertest.SuperTest<supertest.Test> | undefined
server?: Server
request?: supertest.SuperTest<supertest.Test>
started: boolean
appId: string | null
allApps: any[]
appId?: string
allApps: App[]
app?: App
prodApp: any
prodAppId: any
user: any
userMetadataId: any
prodApp?: App
prodAppId?: string
user?: User
userMetadataId?: string
table?: Table
automation: any
automation?: Automation
datasource?: Datasource
tenantId?: string
api: API
csrfToken?: string
private get globalUserId() {
return this.user._id
}
constructor(openServer = true) {
if (openServer) {
// use a random port because it doesn't matter
@ -114,7 +116,7 @@ export default class TestConfiguration {
} else {
this.started = false
}
this.appId = null
this.appId = undefined
this.allApps = []
this.api = new API(this)
@ -125,46 +127,86 @@ export default class TestConfiguration {
}
getApp() {
if (!this.app) {
throw new Error("app has not been initialised, call config.init() first")
}
return this.app
}
getProdApp() {
if (!this.prodApp) {
throw new Error(
"prodApp has not been initialised, call config.init() first"
)
}
return this.prodApp
}
getAppId() {
if (!this.appId) {
throw "appId has not been initialised properly"
throw new Error(
"appId has not been initialised, call config.init() first"
)
}
return this.appId
}
getProdAppId() {
if (!this.prodAppId) {
throw new Error(
"prodAppId has not been initialised, call config.init() first"
)
}
return this.prodAppId
}
getUser(): User {
if (!this.user) {
throw new Error("User has not been initialised, call config.init() first")
}
return this.user
}
getUserDetails() {
const user = this.getUser()
return {
globalId: this.globalUserId,
email: this.user.email,
firstName: this.user.firstName,
lastName: this.user.lastName,
globalId: user._id!,
email: user.email,
firstName: user.firstName,
lastName: user.lastName,
}
}
getAutomation() {
if (!this.automation) {
throw new Error(
"automation has not been initialised, call config.init() first"
)
}
return this.automation
}
getDatasource() {
if (!this.datasource) {
throw new Error(
"datasource has not been initialised, call config.init() first"
)
}
return this.datasource
}
async doInContext<T>(
appId: string | null,
appId: string | undefined,
task: () => Promise<T>
): Promise<T> {
if (!appId) {
appId = this.appId
}
const tenant = this.getTenantId()
return tenancy.doInTenant(tenant, () => {
if (!appId) {
appId = this.appId
}
// check if already in a context
if (context.getAppId() == null && appId !== null) {
if (context.getAppId() == null && appId) {
return context.doInAppContext(appId, async () => {
return task()
})
@ -259,7 +301,11 @@ export default class TestConfiguration {
// UTILS
_req(body: any, params: any, controlFunc: any) {
_req<Req extends Record<string, any> | void, Res>(
handler: (ctx: UserCtx<Req, Res>) => Promise<void>,
body?: Req,
params?: Record<string, string | undefined>
): Promise<Res> {
// create a fake request ctx
const request: any = {}
const appId = this.appId
@ -278,63 +324,48 @@ export default class TestConfiguration {
throw new Error(`Error ${status} - ${message}`)
}
return this.doInContext(appId, async () => {
await controlFunc(request)
await handler(request)
return request.body
})
}
// USER / AUTH
async globalUser(
config: {
id?: string
firstName?: string
lastName?: string
builder?: boolean
admin?: boolean
email?: string
roles?: any
} = {}
): Promise<User> {
async globalUser(config: Partial<User> = {}): Promise<User> {
const {
id = `us_${newid()}`,
_id = `us_${newid()}`,
firstName = generator.first(),
lastName = generator.last(),
builder = true,
admin = false,
builder = { global: true },
admin = { global: false },
email = generator.email(),
roles,
tenantId = this.getTenantId(),
roles = {},
} = config
const db = tenancy.getTenantDB(this.getTenantId())
let existing
let existing: Partial<User> = {}
try {
existing = await db.get<any>(id)
existing = await db.get<User>(_id)
} catch (err) {
existing = { email }
// ignore
}
const user: User = {
_id: id,
_id,
...existing,
roles: roles || {},
tenantId: this.getTenantId(),
...config,
email,
roles,
tenantId,
firstName,
lastName,
builder,
admin,
}
await sessions.createASession(id, {
await sessions.createASession(_id, {
sessionId: "sessionid",
tenantId: this.getTenantId(),
csrfToken: this.csrfToken,
})
if (builder) {
user.builder = { global: true }
} else {
user.builder = { global: false }
}
if (admin) {
user.admin = { global: true }
} else {
user.admin = { global: false }
}
const resp = await db.put(user)
return {
_rev: resp.rev,
@ -342,38 +373,9 @@ export default class TestConfiguration {
}
}
async createUser(
user: {
id?: string
firstName?: string
lastName?: string
email?: string
builder?: boolean
admin?: boolean
roles?: UserRoles
} = {}
): Promise<User> {
const {
id,
firstName = generator.first(),
lastName = generator.last(),
email = generator.email(),
builder = true,
admin,
roles,
} = user
const globalId = !id ? `us_${Math.random()}` : `us_${id}`
const resp = await this.globalUser({
id: globalId,
firstName,
lastName,
email,
builder,
admin,
roles: roles || {},
})
await cache.user.invalidateUser(globalId)
async createUser(user: Partial<User> = {}): Promise<User> {
const resp = await this.globalUser(user)
await cache.user.invalidateUser(resp._id!)
return resp
}
@ -381,7 +383,7 @@ export default class TestConfiguration {
return context.doInTenant(this.tenantId!, async () => {
const baseGroup = structures.userGroups.userGroup()
baseGroup.roles = {
[this.prodAppId]: roleId,
[this.getProdAppId()]: roleId,
}
const { id, rev } = await pro.sdk.groups.save(baseGroup)
return {
@ -404,8 +406,18 @@ export default class TestConfiguration {
})
}
async login({ roleId, userId, builder, prodApp = false }: any = {}) {
const appId = prodApp ? this.prodAppId : this.appId
async login({
roleId,
userId,
builder,
prodApp,
}: {
roleId?: string
userId: string
builder: boolean
prodApp: boolean
}) {
const appId = prodApp ? this.getProdAppId() : this.getAppId()
return context.doInAppContext(appId, async () => {
userId = !userId ? `us_uuid1` : userId
if (!this.request) {
@ -414,9 +426,9 @@ export default class TestConfiguration {
// make sure the user exists in the global DB
if (roleId !== roles.BUILTIN_ROLE_IDS.PUBLIC) {
await this.globalUser({
id: userId,
builder,
roles: { [this.prodAppId]: roleId },
_id: userId,
builder: { global: builder },
roles: { [appId]: roleId || roles.BUILTIN_ROLE_IDS.BASIC },
})
}
await sessions.createASession(userId, {
@ -445,8 +457,9 @@ export default class TestConfiguration {
defaultHeaders(extras = {}, prodApp = false) {
const tenantId = this.getTenantId()
const user = this.getUser()
const authObj: AuthToken = {
userId: this.globalUserId,
userId: user._id!,
sessionId: "sessionid",
tenantId,
}
@ -498,7 +511,7 @@ export default class TestConfiguration {
builder = false,
prodApp = true,
} = {}) {
return this.login({ email, roleId, builder, prodApp })
return this.login({ userId: email, roleId, builder, prodApp })
}
// TENANCY
@ -521,18 +534,22 @@ export default class TestConfiguration {
this.tenantId = structures.tenant.id()
this.user = await this.globalUser()
this.userMetadataId = generateUserMetadataID(this.user._id)
this.userMetadataId = generateUserMetadataID(this.user._id!)
return this.createApp(appName)
}
doInTenant(task: any) {
doInTenant<T>(task: () => T) {
return context.doInTenant(this.getTenantId(), task)
}
// API
async generateApiKey(userId = this.user._id) {
async generateApiKey(userId?: string) {
const user = this.getUser()
if (!userId) {
userId = user._id!
}
const db = tenancy.getTenantDB(this.getTenantId())
const id = dbCore.generateDevInfoID(userId)
let devInfo: any
@ -552,29 +569,28 @@ export default class TestConfiguration {
async createApp(appName: string, url?: string): Promise<App> {
// create dev app
// clear any old app
this.appId = null
this.app = await context.doInTenant(this.tenantId!, async () => {
const app = await this._req(
{ name: appName, url },
null,
appController.create
)
this.appId = app.appId!
return app
})
return await context.doInAppContext(this.getAppId(), async () => {
this.appId = undefined
this.app = await context.doInTenant(
this.tenantId!,
async () =>
(await this._req(appController.create, {
name: appName,
})) as App
)
this.appId = this.app.appId
return await context.doInAppContext(this.app.appId!, async () => {
// create production app
this.prodApp = await this.publish()
this.allApps.push(this.prodApp)
this.allApps.push(this.app)
this.allApps.push(this.app!)
return this.app!
})
}
async publish() {
await this._req(null, null, deployController.publishApp)
await this._req(deployController.publishApp)
// @ts-ignore
const prodAppId = this.getAppId().replace("_dev", "")
this.prodAppId = prodAppId
@ -586,13 +602,11 @@ export default class TestConfiguration {
}
async unpublish() {
const response = await this._req(
null,
{ appId: this.appId },
appController.unpublish
)
this.prodAppId = null
this.prodApp = null
const response = await this._req(appController.unpublish, {
appId: this.appId,
})
this.prodAppId = undefined
this.prodApp = undefined
return response
}
@ -720,8 +734,7 @@ export default class TestConfiguration {
// ROLE
async createRole(config?: any) {
config = config || basicRole()
return this._req(config, null, roleController.save)
return this._req(roleController.save, config || basicRole())
}
// VIEW
@ -734,7 +747,7 @@ export default class TestConfiguration {
tableId: this.table!._id,
name: generator.guid(),
}
return this._req(view, null, viewController.v1.save)
return this._req(viewController.v1.save, view)
}
async createView(
@ -758,40 +771,38 @@ export default class TestConfiguration {
// AUTOMATION
async createAutomation(config?: any) {
async createAutomation(config?: Automation) {
config = config || basicAutomation()
if (config._rev) {
delete config._rev
}
this.automation = (
await this._req(config, null, automationController.create)
).automation
const res = await this._req(automationController.create, config)
this.automation = res.automation
return this.automation
}
async getAllAutomations() {
return this._req(null, null, automationController.fetch)
return this._req(automationController.fetch)
}
async deleteAutomation(automation?: any) {
async deleteAutomation(automation?: Automation) {
automation = automation || this.automation
if (!automation) {
return
}
return this._req(
null,
{ id: automation._id, rev: automation._rev },
automationController.destroy
)
return this._req(automationController.destroy, undefined, {
id: automation._id,
rev: automation._rev,
})
}
async createWebhook(config?: any) {
async createWebhook(config?: Webhook) {
if (!this.automation) {
throw "Must create an automation before creating webhook."
}
config = config || basicWebhook(this.automation._id)
config = config || basicWebhook(this.automation._id!)
return (await this._req(config, null, webhookController.save)).webhook
return (await this._req(webhookController.save, config)).webhook
}
// DATASOURCE
@ -813,7 +824,7 @@ export default class TestConfiguration {
return { ...this.datasource, _id: this.datasource!._id! }
}
async restDatasource(cfg?: any) {
async restDatasource(cfg?: Record<string, any>) {
return this.createDatasource({
datasource: {
...basicDatasource().datasource,
@ -870,26 +881,25 @@ export default class TestConfiguration {
// QUERY
async createQuery(config?: any) {
if (!this.datasource && !config) {
throw "No datasource created for query."
}
config = config || basicQuery(this.datasource!._id!)
return this._req(config, null, queryController.save)
async createQuery(config?: Query) {
return this._req(
queryController.save,
config || basicQuery(this.getDatasource()._id!)
)
}
// SCREEN
async createScreen(config?: any) {
async createScreen(config?: Screen) {
config = config || basicScreen()
return this._req(config, null, screenController.save)
return this._req(screenController.save, config)
}
// LAYOUT
async createLayout(config?: any) {
async createLayout(config?: Layout) {
config = config || basicLayout()
return await this._req(config, null, layoutController.save)
return await this._req(layoutController.save, config)
}
}
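TestConfiguration's user helpers now accept Partial<User>, so the old boolean builder/admin flags become the per-scope objects used throughout the updated specs. A minimal sketch of the new shape (values illustrative):

import { User } from "@budibase/types"

// The flag shape now expected by createUser/globalUser in the tests above.
const adminOnlyFlags: Partial<User> = {
  builder: { global: false },
  admin: { global: true },
}
// e.g. await config.createUser(adminOnlyFlags)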


@ -1,17 +1,96 @@
import { Response } from "supertest"
import { App } from "@budibase/types"
import {
App,
type CreateAppRequest,
type FetchAppDefinitionResponse,
type FetchAppPackageResponse,
} from "@budibase/types"
import TestConfiguration from "../TestConfiguration"
import { TestAPI } from "./base"
import { AppStatus } from "../../../db/utils"
import { constants } from "@budibase/backend-core"
export class ApplicationAPI extends TestAPI {
constructor(config: TestConfiguration) {
super(config)
}
create = async (app: CreateAppRequest): Promise<App> => {
const request = this.request
.post("/api/applications")
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
for (const key of Object.keys(app)) {
request.field(key, (app as any)[key])
}
if (app.templateFile) {
request.attach("templateFile", app.templateFile)
}
const result = await request
if (result.statusCode !== 200) {
throw new Error(JSON.stringify(result.body))
}
return result.body as App
}
delete = async (appId: string): Promise<void> => {
await this.request
.delete(`/api/applications/${appId}`)
.set(this.config.defaultHeaders())
.expect(200)
}
publish = async (
appId: string
): Promise<{ _id: string; status: string; appUrl: string }> => {
// While the publish endpoint does take an :appId parameter, it doesn't
// use it. It uses the appId from the context.
let headers = {
...this.config.defaultHeaders(),
[constants.Header.APP_ID]: appId,
}
const result = await this.request
.post(`/api/applications/${appId}/publish`)
.set(headers)
.expect("Content-Type", /json/)
.expect(200)
return result.body as { _id: string; status: string; appUrl: string }
}
unpublish = async (appId: string): Promise<void> => {
await this.request
.post(`/api/applications/${appId}/unpublish`)
.set(this.config.defaultHeaders())
.expect(204)
}
sync = async (
appId: string,
{ statusCode }: { statusCode: number } = { statusCode: 200 }
): Promise<{ message: string }> => {
const result = await this.request
.post(`/api/applications/${appId}/sync`)
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(statusCode)
return result.body
}
getRaw = async (appId: string): Promise<Response> => {
// While the appPackage endpoint does take an :appId parameter, it doesn't
// use it. It uses the appId from the context.
let headers = {
...this.config.defaultHeaders(),
[constants.Header.APP_ID]: appId,
}
const result = await this.request
.get(`/api/applications/${appId}/appPackage`)
.set(this.config.defaultHeaders())
.set(headers)
.expect("Content-Type", /json/)
.expect(200)
return result
@ -21,4 +100,94 @@ export class ApplicationAPI extends TestAPI {
const result = await this.getRaw(appId)
return result.body.application as App
}
getDefinition = async (
appId: string
): Promise<FetchAppDefinitionResponse> => {
const result = await this.request
.get(`/api/applications/${appId}/definition`)
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
return result.body as FetchAppDefinitionResponse
}
getAppPackage = async (appId: string): Promise<FetchAppPackageResponse> => {
const result = await this.request
.get(`/api/applications/${appId}/appPackage`)
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
return result.body
}
update = async (
appId: string,
app: { name?: string; url?: string }
): Promise<App> => {
const request = this.request
.put(`/api/applications/${appId}`)
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
for (const key of Object.keys(app)) {
request.field(key, (app as any)[key])
}
const result = await request
if (result.statusCode !== 200) {
throw new Error(JSON.stringify(result.body))
}
return result.body as App
}
updateClient = async (appId: string): Promise<void> => {
// While the updateClient endpoint does take an :appId parameter, it doesn't
// use it. It uses the appId from the context.
let headers = {
...this.config.defaultHeaders(),
[constants.Header.APP_ID]: appId,
}
const response = await this.request
.post(`/api/applications/${appId}/client/update`)
.set(headers)
.expect("Content-Type", /json/)
if (response.statusCode !== 200) {
throw new Error(JSON.stringify(response.body))
}
}
revertClient = async (appId: string): Promise<void> => {
// While the revertClient endpoint does take an :appId parameter, it doesn't
// use it. It uses the appId from the context.
let headers = {
...this.config.defaultHeaders(),
[constants.Header.APP_ID]: appId,
}
const response = await this.request
.post(`/api/applications/${appId}/client/revert`)
.set(headers)
.expect("Content-Type", /json/)
if (response.statusCode !== 200) {
throw new Error(JSON.stringify(response.body))
}
}
fetch = async ({ status }: { status?: AppStatus } = {}): Promise<App[]> => {
let query = []
if (status) {
query.push(`status=${status}`)
}
const result = await this.request
.get(`/api/applications${query.length ? `?${query.join("&")}` : ""}`)
.set(this.config.defaultHeaders())
.expect("Content-Type", /json/)
.expect(200)
return result.body as App[]
}
}


@ -22,6 +22,8 @@ import {
INTERNAL_TABLE_SOURCE_ID,
TableSourceType,
Query,
Webhook,
WebhookActionType,
} from "@budibase/types"
import { LoopInput, LoopStepType } from "../../definitions/automations"
@ -407,12 +409,12 @@ export function basicLayout() {
return cloneDeep(EMPTY_LAYOUT)
}
export function basicWebhook(automationId: string) {
export function basicWebhook(automationId: string): Webhook {
return {
live: true,
name: "webhook",
action: {
type: "automation",
type: WebhookActionType.AUTOMATION,
target: automationId,
},
}


@ -32,9 +32,7 @@ export interface FetchDatasourceInfoResponse {
tableNames: string[]
}
export interface UpdateDatasourceRequest extends Datasource {
datasource: Datasource
}
export interface UpdateDatasourceRequest extends Datasource {}
export interface BuildSchemaFromSourceRequest {
tablesFilter?: string[]


@ -37,6 +37,8 @@ export interface ExportRowsRequest {
query?: SearchFilters
sort?: string
sortOrder?: SortOrder
delimiter?: string
customHeaders?: { [key: string]: string }
}
export type ExportRowsResponse = ReadStream


@ -0,0 +1,29 @@
import type { PlanType } from "../../sdk"
import type { Layout, App, Screen } from "../../documents"
export interface CreateAppRequest {
name: string
url?: string
useTemplate?: string
templateName?: string
templateKey?: string
templateFile?: string
includeSampleData?: boolean
encryptionPassword?: string
templateString?: string
}
export interface FetchAppDefinitionResponse {
layouts: Layout[]
screens: Screen[]
libraries: string[]
}
export interface FetchAppPackageResponse {
application: App
licenseType: PlanType
screens: Screen[]
layouts: Layout[]
clientLibPath: string
hasLock: boolean
}
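For illustration, a minimal CreateAppRequest payload for the new application API types; only name is required and the values here are invented:

import { CreateAppRequest } from "@budibase/types"

const payload: CreateAppRequest = {
  name: "example-app",
  url: "/example-app",
  includeSampleData: true,
}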


@ -0,0 +1,3 @@
import { DocumentDestroyResponse } from "@budibase/nano"
export interface DeleteAutomationResponse extends DocumentDestroyResponse {}


@ -1,3 +1,4 @@
export * from "./application"
export * from "./analytics"
export * from "./auth"
export * from "./user"
@ -10,3 +11,5 @@ export * from "./global"
export * from "./pagination"
export * from "./searchFilter"
export * from "./cookies"
export * from "./automation"
export * from "./layout"


@ -0,0 +1,5 @@
import { Layout } from "../../documents"
export interface SaveLayoutRequest extends Layout {}
export interface SaveLayoutResponse extends Layout {}


@ -1,4 +1,4 @@
import { User, Document } from "../"
import { User, Document, Plugin } from "../"
import { SocketSession } from "../../sdk"
export type AppMetadataErrors = { [key: string]: string[] }
@ -24,6 +24,8 @@ export interface App extends Document {
icon?: AppIcon
features?: AppFeatures
automations?: AutomationSettings
usedPlugins?: Plugin[]
upgradableVersion?: string
}
export interface AppInstance {


@ -6,6 +6,9 @@ export interface Datasource extends Document {
type: string
name?: string
source: SourceName
// this is a googlesheets specific property which
// can be found in the GSheets schema - pertains to SSO creds
auth?: { type: string }
// the config is defined by the schema
config?: Record<string, any>
plus?: boolean
@ -36,6 +39,12 @@ export interface RestAuthConfig {
config: RestBasicAuthConfig | RestBearerAuthConfig
}
export interface DynamicVariable {
name: string
queryId: string
value: string
}
export interface RestConfig {
url: string
rejectUnauthorized: boolean
@ -47,11 +56,5 @@ export interface RestConfig {
staticVariables: {
[key: string]: string
}
dynamicVariables: [
{
name: string
queryId: string
value: string
}
]
dynamicVariables: DynamicVariable[]
}
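dynamicVariables on RestConfig is now a plain DynamicVariable[] rather than a single-element tuple type. An illustrative entry, assuming DynamicVariable is re-exported from @budibase/types like the other datasource types:

import { DynamicVariable } from "@budibase/types" // assumed re-export

const variables: DynamicVariable[] = [
  { name: "authToken", queryId: "query_123", value: "abc123" }, // invented values
]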


@ -1,6 +1,11 @@
import { Document } from "../document"
export interface Layout extends Document {
componentLibraries: string[]
title: string
favicon: string
stylesheets: string[]
props: any
layoutId?: string
name?: string
}


@ -22,4 +22,5 @@ export interface Screen extends Document {
routing: ScreenRouting
props: ScreenProps
name?: string
pluginAdded?: boolean
}


@ -5,15 +5,15 @@ export interface RowValue {
deleted: boolean
}
export interface RowResponse<T extends Document> {
export interface RowResponse<T extends Document | RowValue> {
id: string
key: string
error: string
value: T | RowValue
value: T
doc?: T
}
export interface AllDocsResponse<T extends Document> {
export interface AllDocsResponse<T extends Document | RowValue> {
offset: number
total_rows: number
rows: RowResponse<T>[]


@ -1,4 +1,5 @@
import { Table } from "../documents"
import { Table, Row } from "../documents"
import { QueryJson } from "./search"
export const PASSWORD_REPLACEMENT = "--secret-value--"
@ -56,6 +57,7 @@ export enum SourceName {
FIRESTORE = "FIRESTORE",
REDIS = "REDIS",
SNOWFLAKE = "SNOWFLAKE",
BUDIBASE = "BUDIBASE",
}
export enum IncludeRelationship {
@ -180,11 +182,24 @@ export interface Schema {
errors: Record<string, string>
}
// return these when an operation occurred but we got no response
enum DSPlusOperation {
CREATE = "create",
READ = "read",
UPDATE = "update",
DELETE = "delete",
}
export type DatasourcePlusQueryResponse = Promise<
Row[] | Record<DSPlusOperation, boolean>[] | void
>
export interface DatasourcePlus extends IntegrationBase {
// if the datasource supports the use of bindings directly (to protect against SQL injection)
// this returns the format of the identifier
getBindingIdentifier(): string
getStringConcat(parts: string[]): string
query(json: QueryJson): DatasourcePlusQueryResponse
buildSchema(
datasourceId: string,
entities: Record<string, Table>


@ -1,5 +1,11 @@
import type Nano from "@budibase/nano"
import { AllDocsResponse, AnyDocument, Document, ViewTemplateOpts } from "../"
import {
AllDocsResponse,
AnyDocument,
Document,
RowValue,
ViewTemplateOpts,
} from "../"
import { Writable } from "stream"
export enum SearchIndex {
@ -135,7 +141,7 @@ export interface Database {
opts?: DatabasePutOpts
): Promise<Nano.DocumentInsertResponse>
bulkDocs(documents: AnyDocument[]): Promise<Nano.DocumentBulkResponse[]>
allDocs<T extends Document>(
allDocs<T extends Document | RowValue>(
params: DatabaseQueryOpts
): Promise<AllDocsResponse<T>>
query<T extends Document>(

Some files were not shown because too many files have changed in this diff.