
created common package

This commit is contained in:
Michael Shanks 2020-04-08 17:00:27 +01:00 committed by Martin McKeaveney
parent 0430cd11b1
commit 1aacaea757
61 changed files with 8740 additions and 279 deletions

12
packages/common/.babelrc Normal file

@@ -0,0 +1,12 @@
{
"presets": ["@babel/preset-env"],
"sourceMaps": "inline",
"retainLines": true,
"plugins": [
["@babel/plugin-transform-runtime",
{
"regenerator": true
}
]
]
}

43
packages/common/.gitignore vendored Normal file

@@ -0,0 +1,43 @@
# Logs
logs
*.log
# Runtime data
pids
*.pid
*.seed
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# node-waf configuration
.lock-wscript
# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release
.eslintcache
# Dependency directory
# https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git
node_modules
node_modules_ubuntu
node_modules_windows
# OSX
.DS_Store
# flow-typed
flow-typed/npm/*
!flow-typed/npm/module_vx.x.x.js
.idea
npm-debug.log.*
dist


@@ -0,0 +1,2 @@
*
!dist/*


@@ -0,0 +1,11 @@
sudo: required
notifications:
slack: budibase:Nx2QNi9CP87Nn7ah2A4Qdzyy
script:
- npm install
- npm install -g jest
- node node_modules/eslint/bin/eslint src/**/*.js
- jest

14
packages/common/.vscode/launch.json vendored Normal file

@@ -0,0 +1,14 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"type": "node",
"request": "launch",
"name": "Launch Program",
"program": "${workspaceFolder}\\index.js"
}
]
}

0
packages/common/.vscode/settings.json vendored Normal file


@@ -0,0 +1,22 @@
### Contributing to budibase-core
* The contributors are listed in [AUTHORS.md](https://github.com/budibase/budibase-core/blob/master/AUTHORS.md) (add yourself).
* This project uses a modified version of the MPLv2 license, see [LICENSE](https://github.com/budibase/budibase-core/blob/master/LICENSE).
* We use the [C4 (Collective Code Construction Contract)](https://rfc.zeromq.org/spec:42/C4/) process for contributions.
Please read this if you are unfamiliar with it.
* Please maintain the existing code style.
* Please try to keep your commits small and focussed.
* If the project diverges from your branch, please rebase instead of merging. This makes the commit graph easier to read.
#### P.S.
I am using contribution guidelines from the fantastic [ZeroMQ](https://github.com/zeromq) community. If you are wondering why, it's because I believe in the ethos laid out by this community, written about in depth in the book ["Social Architecture"](https://www.amazon.com/Social-Architecture-Building-line-Communities/dp/1533112452) by Pieter Hintjens.
I am very much open to evolving this to suit our needs.
Love from [Mike](https://github.com/mikebudi).

373
packages/common/LICENSE Normal file

@@ -0,0 +1,373 @@
Mozilla Public License Version 2.0
==================================
1. Definitions
--------------
1.1. "Contributor"
means each individual or legal entity that creates, contributes to
the creation of, or owns Covered Software.
1.2. "Contributor Version"
means the combination of the Contributions of others (if any) used
by a Contributor and that particular Contributor's Contribution.
1.3. "Contribution"
means Covered Software of a particular Contributor.
1.4. "Covered Software"
means Source Code Form to which the initial Contributor has attached
the notice in Exhibit A, the Executable Form of such Source Code
Form, and Modifications of such Source Code Form, in each case
including portions thereof.
1.5. "Incompatible With Secondary Licenses"
means
(a) that the initial Contributor has attached the notice described
in Exhibit B to the Covered Software; or
(b) that the Covered Software was made available under the terms of
version 1.1 or earlier of the License, but not also under the
terms of a Secondary License.
1.6. "Executable Form"
means any form of the work other than Source Code Form.
1.7. "Larger Work"
means a work that combines Covered Software with other material, in
a separate file or files, that is not Covered Software.
1.8. "License"
means this document.
1.9. "Licensable"
means having the right to grant, to the maximum extent possible,
whether at the time of the initial grant or subsequently, any and
all of the rights conveyed by this License.
1.10. "Modifications"
means any of the following:
(a) any file in Source Code Form that results from an addition to,
deletion from, or modification of the contents of Covered
Software; or
(b) any new file in Source Code Form that contains any Covered
Software.
1.11. "Patent Claims" of a Contributor
means any patent claim(s), including without limitation, method,
process, and apparatus claims, in any patent Licensable by such
Contributor that would be infringed, but for the grant of the
License, by the making, using, selling, offering for sale, having
made, import, or transfer of either its Contributions or its
Contributor Version.
1.12. "Secondary License"
means either the GNU General Public License, Version 2.0, the GNU
Lesser General Public License, Version 2.1, the GNU Affero General
Public License, Version 3.0, or any later versions of those
licenses.
1.13. "Source Code Form"
means the form of the work preferred for making modifications.
1.14. "You" (or "Your")
means an individual or a legal entity exercising rights under this
License. For legal entities, "You" includes any entity that
controls, is controlled by, or is under common control with You. For
purposes of this definition, "control" means (a) the power, direct
or indirect, to cause the direction or management of such entity,
whether by contract or otherwise, or (b) ownership of more than
fifty percent (50%) of the outstanding shares or beneficial
ownership of such entity.
2. License Grants and Conditions
--------------------------------
2.1. Grants
Each Contributor hereby grants You a world-wide, royalty-free,
non-exclusive license:
(a) under intellectual property rights (other than patent or trademark)
Licensable by such Contributor to use, reproduce, make available,
modify, display, perform, distribute, and otherwise exploit its
Contributions, either on an unmodified basis, with Modifications, or
as part of a Larger Work; and
(b) under Patent Claims of such Contributor to make, use, sell, offer
for sale, have made, import, and otherwise transfer either its
Contributions or its Contributor Version.
2.2. Effective Date
The licenses granted in Section 2.1 with respect to any Contribution
become effective for each Contribution on the date the Contributor first
distributes such Contribution.
2.3. Limitations on Grant Scope
The licenses granted in this Section 2 are the only rights granted under
this License. No additional rights or licenses will be implied from the
distribution or licensing of Covered Software under this License.
Notwithstanding Section 2.1(b) above, no patent license is granted by a
Contributor:
(a) for any code that a Contributor has removed from Covered Software;
or
(b) for infringements caused by: (i) Your and any other third party's
modifications of Covered Software, or (ii) the combination of its
Contributions with other software (except as part of its Contributor
Version); or
(c) under Patent Claims infringed by Covered Software in the absence of
its Contributions.
This License does not grant any rights in the trademarks, service marks,
or logos of any Contributor (except as may be necessary to comply with
the notice requirements in Section 3.4).
2.4. Subsequent Licenses
No Contributor makes additional grants as a result of Your choice to
distribute the Covered Software under a subsequent version of this
License (see Section 10.2) or under the terms of a Secondary License (if
permitted under the terms of Section 3.3).
2.5. Representation
Each Contributor represents that the Contributor believes its
Contributions are its original creation(s) or it has sufficient rights
to grant the rights to its Contributions conveyed by this License.
2.6. Fair Use
This License is not intended to limit any rights You have under
applicable copyright doctrines of fair use, fair dealing, or other
equivalents.
2.7. Conditions
Sections 3.1, 3.2, 3.3, and 3.4 are conditions of the licenses granted
in Section 2.1.
3. Responsibilities
-------------------
3.1. Distribution of Source Form
All distribution of Covered Software in Source Code Form, including any
Modifications that You create or to which You contribute, must be under
the terms of this License. You must inform recipients that the Source
Code Form of the Covered Software is governed by the terms of this
License, and how they can obtain a copy of this License. You may not
attempt to alter or restrict the recipients' rights in the Source Code
Form.
3.2. Distribution of Executable Form
If You distribute Covered Software in Executable Form then:
(a) such Covered Software must also be made available in Source Code
Form, as described in Section 3.1, and You must inform recipients of
the Executable Form how they can obtain a copy of such Source Code
Form by reasonable means in a timely manner, at a charge no more
than the cost of distribution to the recipient; and
(b) You may distribute such Executable Form under the terms of this
License, or sublicense it under different terms, provided that the
license for the Executable Form does not attempt to limit or alter
the recipients' rights in the Source Code Form under this License.
3.3. Distribution of a Larger Work
You may create and distribute a Larger Work under terms of Your choice,
provided that You also comply with the requirements of this License for
the Covered Software. If the Larger Work is a combination of Covered
Software with a work governed by one or more Secondary Licenses, and the
Covered Software is not Incompatible With Secondary Licenses, this
License permits You to additionally distribute such Covered Software
under the terms of such Secondary License(s), so that the recipient of
the Larger Work may, at their option, further distribute the Covered
Software under the terms of either this License or such Secondary
License(s).
3.4. Notices
You may not remove or alter the substance of any license notices
(including copyright notices, patent notices, disclaimers of warranty,
or limitations of liability) contained within the Source Code Form of
the Covered Software, except that You may alter any license notices to
the extent required to remedy known factual inaccuracies.
3.5. Application of Additional Terms
You may choose to offer, and to charge a fee for, warranty, support,
indemnity or liability obligations to one or more recipients of Covered
Software. However, You may do so only on Your own behalf, and not on
behalf of any Contributor. You must make it absolutely clear that any
such warranty, support, indemnity, or liability obligation is offered by
You alone, and You hereby agree to indemnify every Contributor for any
liability incurred by such Contributor as a result of warranty, support,
indemnity or liability terms You offer. You may include additional
disclaimers of warranty and limitations of liability specific to any
jurisdiction.
4. Inability to Comply Due to Statute or Regulation
---------------------------------------------------
If it is impossible for You to comply with any of the terms of this
License with respect to some or all of the Covered Software due to
statute, judicial order, or regulation then You must: (a) comply with
the terms of this License to the maximum extent possible; and (b)
describe the limitations and the code they affect. Such description must
be placed in a text file included with all distributions of the Covered
Software under this License. Except to the extent prohibited by statute
or regulation, such description must be sufficiently detailed for a
recipient of ordinary skill to be able to understand it.
5. Termination
--------------
5.1. The rights granted under this License will terminate automatically
if You fail to comply with any of its terms. However, if You become
compliant, then the rights granted under this License from a particular
Contributor are reinstated (a) provisionally, unless and until such
Contributor explicitly and finally terminates Your grants, and (b) on an
ongoing basis, if such Contributor fails to notify You of the
non-compliance by some reasonable means prior to 60 days after You have
come back into compliance. Moreover, Your grants from a particular
Contributor are reinstated on an ongoing basis if such Contributor
notifies You of the non-compliance by some reasonable means, this is the
first time You have received notice of non-compliance with this License
from such Contributor, and You become compliant prior to 30 days after
Your receipt of the notice.
5.2. If You initiate litigation against any entity by asserting a patent
infringement claim (excluding declaratory judgment actions,
counter-claims, and cross-claims) alleging that a Contributor Version
directly or indirectly infringes any patent, then the rights granted to
You by any and all Contributors for the Covered Software under Section
2.1 of this License shall terminate.
5.3. In the event of termination under Sections 5.1 or 5.2 above, all
end user license agreements (excluding distributors and resellers) which
have been validly granted by You or Your distributors under this License
prior to termination shall survive termination.
************************************************************************
* *
* 6. Disclaimer of Warranty *
* ------------------------- *
* *
* Covered Software is provided under this License on an "as is" *
* basis, without warranty of any kind, either expressed, implied, or *
* statutory, including, without limitation, warranties that the *
* Covered Software is free of defects, merchantable, fit for a *
* particular purpose or non-infringing. The entire risk as to the *
* quality and performance of the Covered Software is with You. *
* Should any Covered Software prove defective in any respect, You *
* (not any Contributor) assume the cost of any necessary servicing, *
* repair, or correction. This disclaimer of warranty constitutes an *
* essential part of this License. No use of any Covered Software is *
* authorized under this License except under this disclaimer. *
* *
************************************************************************
************************************************************************
* *
* 7. Limitation of Liability *
* -------------------------- *
* *
* Under no circumstances and under no legal theory, whether tort *
* (including negligence), contract, or otherwise, shall any *
* Contributor, or anyone who distributes Covered Software as *
* permitted above, be liable to You for any direct, indirect, *
* special, incidental, or consequential damages of any character *
* including, without limitation, damages for lost profits, loss of *
* goodwill, work stoppage, computer failure or malfunction, or any *
* and all other commercial damages or losses, even if such party *
* shall have been informed of the possibility of such damages. This *
* limitation of liability shall not apply to liability for death or *
* personal injury resulting from such party's negligence to the *
* extent applicable law prohibits such limitation. Some *
* jurisdictions do not allow the exclusion or limitation of *
* incidental or consequential damages, so this exclusion and *
* limitation may not apply to You. *
* *
************************************************************************
8. Litigation
-------------
Any litigation relating to this License may be brought only in the
courts of a jurisdiction where the defendant maintains its principal
place of business and such litigation shall be governed by laws of that
jurisdiction, without reference to its conflict-of-law provisions.
Nothing in this Section shall prevent a party's ability to bring
cross-claims or counter-claims.
9. Miscellaneous
----------------
This License represents the complete agreement concerning the subject
matter hereof. If any provision of this License is held to be
unenforceable, such provision shall be reformed only to the extent
necessary to make it enforceable. Any law or regulation which provides
that the language of a contract shall be construed against the drafter
shall not be used to construe this License against a Contributor.
10. Versions of the License
---------------------------
10.1. New Versions
Mozilla Foundation is the license steward. Except as provided in Section
10.3, no one other than the license steward has the right to modify or
publish new versions of this License. Each version will be given a
distinguishing version number.
10.2. Effect of New Versions
You may distribute the Covered Software under the terms of the version
of the License under which You originally received the Covered Software,
or under the terms of any subsequent version published by the license
steward.
10.3. Modified Versions
If you create software not governed by this License, and you want to
create a new license for such software, you may create and use a
modified version of this License if you rename the license and remove
any references to the name of the license steward (except to note that
such modified license differs from this License).
10.4. Distributing Source Code Form that is Incompatible With Secondary
Licenses
If You choose to distribute Source Code Form that is Incompatible With
Secondary Licenses under the terms of this version of the License, the
notice described in Exhibit B of this License must be attached.
Exhibit A - Source Code Form License Notice
-------------------------------------------
This Source Code Form is subject to the terms of the Mozilla Public
License, v. 2.0. If a copy of the MPL was not distributed with this
file, You can obtain one at http://mozilla.org/MPL/2.0/.
If it is not possible or desirable to put the notice in a particular
file, then You may include the notice in a location (such as a LICENSE
file in a relevant directory) where a recipient would be likely to look
for such a notice.
You may add additional accurate notices of copyright ownership.
Exhibit B - "Incompatible With Secondary Licenses" Notice
---------------------------------------------------------
This Source Code Form is "Incompatible With Secondary Licenses", as
defined by the Mozilla Public License, v. 2.0.


@@ -0,0 +1,65 @@
{
"name": "@budibase/common",
"version": "0.0.32",
"description": "core javascript library for budibase",
"files": [
"dist/**",
"!dist/node_modules"
],
"directories": {
"test": "test"
},
"scripts": {
"test": "jest"
},
"keywords": [
"budibase"
],
"author": "Budibase",
"license": "MPL-2.0",
"jest": {
"testURL": "http://jest-breaks-if-this-does-not-exist",
"moduleNameMapper": {
"\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$": "<rootDir>/internals/mocks/fileMock.js",
"\\.(css|less|sass|scss)$": "identity-obj-proxy"
},
"moduleFileExtensions": [
"js",
"mjs"
],
"moduleDirectories": [
"node_modules"
],
"transform": {
"^.+\\.mjs$": "babel-jest",
"^.+\\.js$": "babel-jest"
}
},
"devDependencies": {
"@babel/cli": "^7.4.4",
"@babel/core": "^7.4.5",
"@babel/plugin-transform-runtime": "^7.4.4",
"@babel/preset-env": "^7.4.5",
"@babel/runtime": "^7.4.5",
"babel-jest": "^25.3.0",
"babel-plugin-transform-es2015-modules-commonjs": "^6.26.2",
"cross-env": "^5.1.4",
"jest": "^24.8.0",
"readable-stream": "^3.1.1",
"regenerator-runtime": "^0.11.1",
"rimraf": "^2.6.2"
},
"dependencies": {
"@nx-js/compiler-util": "^2.0.0",
"bcryptjs": "^2.4.3",
"date-fns": "^1.29.0",
"lodash": "^4.17.13",
"shortid": "^2.2.8"
},
"devEngines": {
"node": ">=7.x",
"npm": ">=4.x",
"yarn": ">=0.21.3"
},
"gitHead": "b1f4f90927d9e494e513220ef060af28d2d42455"
}

21
packages/common/readme.md Normal file

@@ -0,0 +1,21 @@
## Getting Started
Install packages:
`npm install`
Next, run the tests. First, install jest globally:
`npm install -g jest`
And finally, run:
`jest`
## Documentation
A work in progress; it lives here: https://github.com/Budibase/docs/blob/master/budibase-core.md


@@ -0,0 +1,128 @@
import { cloneDeep, isUndefined } from "lodash/fp"
import { generate } from "shortid"
import { UnauthorisedError } from "./errors"
export const apiWrapper = async (
app,
eventNamespace,
isAuthorized,
eventContext,
func,
...params
) => {
pushCallStack(app, eventNamespace)
if (!isAuthorized(app)) {
handleNotAuthorized(app, eventContext, eventNamespace)
return
}
const startDate = Date.now()
const elapsed = () => Date.now() - startDate
try {
await app.publish(eventNamespace.onBegin, eventContext)
const result = await func(...params)
await publishComplete(app, eventContext, eventNamespace, elapsed, result)
return result
} catch (error) {
await publishError(app, eventContext, eventNamespace, elapsed, error)
throw error
}
}
export const apiWrapperSync = (
app,
eventNamespace,
isAuthorized,
eventContext,
func,
...params
) => {
pushCallStack(app, eventNamespace)
if (!isAuthorized(app)) {
handleNotAuthorized(app, eventContext, eventNamespace)
return
}
const startDate = Date.now()
const elapsed = () => Date.now() - startDate
try {
app.publish(eventNamespace.onBegin, eventContext)
const result = func(...params)
publishComplete(app, eventContext, eventNamespace, elapsed, result)
return result
} catch (error) {
publishError(app, eventContext, eventNamespace, elapsed, error)
throw error
}
}
const handleNotAuthorized = (app, eventContext, eventNamespace) => {
const err = new UnauthorisedError(`Unauthorized: ${eventNamespace}`)
publishError(app, eventContext, eventNamespace, () => 0, err)
throw err
}
const pushCallStack = (app, eventNamespace, seedCallId) => {
const callId = generate()
const createCallStack = () => ({
seedCallId: !isUndefined(seedCallId) ? seedCallId : callId,
threadCallId: callId,
stack: [],
})
if (isUndefined(app.calls)) {
app.calls = createCallStack()
}
app.calls.stack.push({
namespace: eventNamespace,
callId,
})
}
const popCallStack = app => {
app.calls.stack.pop()
if (app.calls.stack.length === 0) {
delete app.calls
}
}
const publishError = async (
app,
eventContext,
eventNamespace,
elapsed,
err
) => {
const ctx = cloneDeep(eventContext)
ctx.error = err
ctx.elapsed = elapsed()
await app.publish(eventNamespace.onError, ctx)
popCallStack(app)
}
const publishComplete = async (
app,
eventContext,
eventNamespace,
elapsed,
result
) => {
const endcontext = cloneDeep(eventContext)
endcontext.result = result
endcontext.elapsed = elapsed()
await app.publish(eventNamespace.onComplete, endcontext)
popCallStack(app)
return result
}
export default apiWrapper
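The call flow above can be sketched standalone: a minimal "app" whose `publish` just records event names, and a simplified wrapper mirroring `apiWrapperSync` (the call stack is omitted, the namespace is a plain string rather than the namespace object from events.js, and `recordApi:save` is an illustrative name):

```javascript
// Minimal sketch of the apiWrapperSync flow: publish onBegin, run the
// function, publish onComplete (or onError), return the result.
const published = []
const app = {
  publish: eventName => published.push(eventName),
}

const apiWrapperSync = (app, ns, isAuthorized, ctx, func, ...params) => {
  if (!isAuthorized(app)) throw new Error(`Unauthorized: ${ns}`)
  app.publish(`${ns}:onBegin`, ctx)
  try {
    const result = func(...params)
    app.publish(`${ns}:onComplete`, { ...ctx, result })
    return result
  } catch (error) {
    app.publish(`${ns}:onError`, { ...ctx, error })
    throw error
  }
}

const sum = apiWrapperSync(app, "recordApi:save", () => true, {}, (a, b) => a + b, 2, 3)
console.log(sum, published)
// → 5 [ 'recordApi:save:onBegin', 'recordApi:save:onComplete' ]
```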


@@ -0,0 +1,26 @@
import { compileCode as cCode } from "@nx-js/compiler-util"
import { includes } from "lodash/fp"
export const compileCode = code => {
let func
let safeCode
if (includes("return ")(code)) {
safeCode = code
} else {
let trimmed = code.trim()
trimmed = trimmed.endsWith(";")
? trimmed.substring(0, trimmed.length - 1)
: trimmed
safeCode = `return (${trimmed})`
}
try {
func = cCode(safeCode)
} catch (e) {
e.message = `Error compiling code : ${code} : ${e.message}`
throw e
}
return func
}
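The return-wrapping logic above can be illustrated without the compiler library. This sketch substitutes the built-in `Function` constructor for `@nx-js/compiler-util` (which additionally sandboxes the compiled code), so it is an illustration of the wrapping only, not of the real sandbox:

```javascript
// Sketch: expressions lacking "return " are wrapped so they evaluate as
// the function body, mirroring the safeCode construction in compileCode.
const compileExpression = code => {
  let safeCode
  if (code.includes("return ")) {
    safeCode = code
  } else {
    let trimmed = code.trim()
    if (trimmed.endsWith(";")) trimmed = trimmed.slice(0, -1)
    safeCode = `return (${trimmed})`
  }
  // new Function stands in for @nx-js/compiler-util's compileCode here;
  // `with` exposes the context object's properties as variables.
  return new Function("context", `with (context) { ${safeCode} }`)
}

const fn = compileExpression("a + b;")
console.log(fn({ a: 2, b: 3 })) // → 5
```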


@@ -0,0 +1,34 @@
export class BadRequestError extends Error {
constructor(message) {
super(message)
this.httpStatusCode = 400
}
}
export class UnauthorisedError extends Error {
constructor(message) {
super(message)
this.httpStatusCode = 401
}
}
export class ForbiddenError extends Error {
constructor(message) {
super(message)
this.httpStatusCode = 403
}
}
export class NotFoundError extends Error {
constructor(message) {
super(message)
this.httpStatusCode = 404
}
}
export class ConflictError extends Error {
constructor(message) {
super(message)
this.httpStatusCode = 409
}
}
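A typical consumer of these typed errors maps `httpStatusCode` onto a response. The sketch below redeclares one of the classes so it is self-contained; `toResponse` is a hypothetical helper, not part of this package:

```javascript
// Errors carry their HTTP status; unknown errors fall back to 500.
class NotFoundError extends Error {
  constructor(message) {
    super(message)
    this.httpStatusCode = 404
  }
}

const toResponse = err => ({
  status: err.httpStatusCode || 500,
  body: { error: err.message },
})

const res = toResponse(new NotFoundError("no such record"))
console.log(res)
// → { status: 404, body: { error: 'no such record' } }
```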


@@ -0,0 +1,27 @@
import { has } from "lodash/fp"
const publish = handlers => async (eventName, context = {}) => {
if (!has(eventName)(handlers)) return
for (const handler of handlers[eventName]) {
await handler(eventName, context)
}
}
const subscribe = handlers => (eventName, handler) => {
if (!has(eventName)(handlers)) {
handlers[eventName] = []
}
handlers[eventName].push(handler)
}
export const createEventAggregator = () => {
const handlers = {}
const eventAggregator = {
publish: publish(handlers),
subscribe: subscribe(handlers),
}
return eventAggregator
}
export default createEventAggregator
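Usage of the aggregator looks like this. The sketch re-implements it inline (the real module uses lodash/fp's `has`) so the example runs on its own:

```javascript
// Handlers are stored per event name and invoked in subscription order
// on publish; publish awaits each handler before calling the next.
const createEventAggregator = () => {
  const handlers = {}
  return {
    subscribe: (eventName, handler) => {
      if (!handlers[eventName]) handlers[eventName] = []
      handlers[eventName].push(handler)
    },
    publish: async (eventName, context = {}) => {
      if (!handlers[eventName]) return
      for (const handler of handlers[eventName]) {
        await handler(eventName, context)
      }
    },
  }
}

const ea = createEventAggregator()
const log = []
ea.subscribe("recordApi:save:onComplete", (name, ctx) => log.push(ctx.result))
ea.publish("recordApi:save:onComplete", { result: "saved" })
console.log(log) // → [ 'saved' ] (the handler is synchronous, so it has already run)
```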


@@ -0,0 +1,85 @@
import { union, reduce } from "lodash/fp"
const commonPlus = extra => union(["onBegin", "onComplete", "onError"])(extra)
const common = () => commonPlus([])
const _events = {
recordApi: {
save: commonPlus(["onInvalid", "onRecordUpdated", "onRecordCreated"]),
delete: common(),
getContext: common(),
getNew: common(),
load: common(),
validate: common(),
uploadFile: common(),
downloadFile: common(),
},
indexApi: {
buildIndex: common(),
listItems: common(),
delete: common(),
aggregates: common(),
},
collectionApi: {
getAllowedRecordTypes: common(),
initialise: common(),
delete: common(),
},
authApi: {
authenticate: common(),
authenticateTemporaryAccess: common(),
createTemporaryAccess: common(),
createUser: common(),
enableUser: common(),
disableUser: common(),
loadAccessLevels: common(),
getNewAccessLevel: common(),
getNewUser: common(),
getNewUserAuth: common(),
getUsers: common(),
saveAccessLevels: common(),
isAuthorized: common(),
changeMyPassword: common(),
setPasswordFromTemporaryCode: common(),
scorePassword: common(),
isValidPassword: common(),
validateUser: common(),
validateAccessLevels: common(),
setUserAccessLevels: common(),
},
templateApi: {
saveApplicationHierarchy: common(),
saveActionsAndTriggers: common(),
},
actionsApi: {
execute: common(),
},
}
const _eventsList = []
const makeEvent = (area, method, name) => `${area}:${method}:${name}`
for (const areaKey in _events) {
for (const methodKey in _events[areaKey]) {
_events[areaKey][methodKey] = reduce((obj, s) => {
obj[s] = makeEvent(areaKey, methodKey, s)
return obj
}, {})(_events[areaKey][methodKey])
}
}
for (const areaKey in _events) {
for (const methodKey in _events[areaKey]) {
for (const name in _events[areaKey][methodKey]) {
_eventsList.push(_events[areaKey][methodKey][name])
}
}
}
export const events = _events
export const eventsList = _eventsList
export default { events: _events, eventsList: _eventsList }
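The loops above turn each array of event names into a lookup of fully qualified `area:method:name` strings. The same transformation, extracted into a small standalone helper:

```javascript
// Fully qualified event names take the form "area:method:name".
const makeEvent = (area, method, name) => `${area}:${method}:${name}`

// Equivalent of the reduce applied to each api method's name array.
const expand = (area, method, names) =>
  names.reduce((obj, name) => {
    obj[name] = makeEvent(area, method, name)
    return obj
  }, {})

const saveEvents = expand("recordApi", "save", ["onBegin", "onComplete", "onError"])
console.log(saveEvents.onComplete) // → recordApi:save:onComplete
```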


@@ -0,0 +1,307 @@
import {
head,
tail,
findIndex,
startsWith,
dropRight,
flow,
takeRight,
trim,
replace,
} from "lodash"
import {
some,
reduce,
isEmpty,
isArray,
join,
isString,
isInteger,
isDate,
toNumber,
isUndefined,
isNaN,
isNull,
constant,
split,
includes,
filter,
} from "lodash/fp"
import { events, eventsList } from "./events"
// this is the combinator function
export const $$ = (...funcs) => arg => flow(funcs)(arg)
// this is the pipe function
export const $ = (arg, funcs) => $$(...funcs)(arg)
export const keySep = "/"
const trimKeySep = str => trim(str, keySep)
const splitByKeySep = str => split(keySep)(str)
export const safeKey = key =>
replace(`${keySep}${trimKeySep(key)}`, `${keySep}${keySep}`, keySep)
export const joinKey = (...strs) => {
  const paramsOrArray = strs.length === 1 && isArray(strs[0]) ? strs[0] : strs
return $(paramsOrArray, [
filter(s => !isUndefined(s) && !isNull(s) && s.toString().length > 0),
join(keySep),
safeKey,
])
}
export const splitKey = $$(trimKeySep, splitByKeySep)
export const getDirFomKey = $$(splitKey, dropRight, p => joinKey(...p))
export const getFileFromKey = $$(splitKey, takeRight, head)
export const configFolder = `${keySep}.config`
export const fieldDefinitions = joinKey(configFolder, "fields.json")
export const templateDefinitions = joinKey(configFolder, "templates.json")
export const appDefinitionFile = joinKey(configFolder, "appDefinition.json")
export const dirIndex = folderPath =>
joinKey(configFolder, "dir", ...splitKey(folderPath), "dir.idx")
export const getIndexKeyFromFileKey = $$(getDirFomKey, dirIndex)
export const ifExists = (val, exists, notExists) =>
isUndefined(val)
? isUndefined(notExists)
? (() => {})()
: notExists()
: exists()
export const getOrDefault = (val, defaultVal) =>
ifExists(
val,
() => val,
() => defaultVal
)
export const not = func => val => !func(val)
export const isDefined = not(isUndefined)
export const isNonNull = not(isNull)
export const isNotNaN = not(isNaN)
export const allTrue = (...funcArgs) => val =>
reduce(
(result, conditionFunc) =>
(isNull(result) || result == true) && conditionFunc(val),
null
)(funcArgs)
export const anyTrue = (...funcArgs) => val =>
reduce(
(result, conditionFunc) => result == true || conditionFunc(val),
null
)(funcArgs)
export const insensitiveEquals = (str1, str2) =>
str1.trim().toLowerCase() === str2.trim().toLowerCase()
export const isSomething = allTrue(isDefined, isNonNull, isNotNaN)
export const isNothing = not(isSomething)
export const isNothingOrEmpty = v => isNothing(v) || isEmpty(v)
export const somethingOrGetDefault = getDefaultFunc => val =>
isSomething(val) ? val : getDefaultFunc()
export const somethingOrDefault = (val, defaultVal) =>
somethingOrGetDefault(constant(defaultVal))(val)
export const mapIfSomethingOrDefault = (mapFunc, defaultVal) => val =>
isSomething(val) ? mapFunc(val) : defaultVal
export const mapIfSomethingOrBlank = mapFunc =>
mapIfSomethingOrDefault(mapFunc, "")
export const none = predicate => collection => !some(predicate)(collection)
export const all = predicate => collection =>
none(v => !predicate(v))(collection)
export const isNotEmpty = ob => !isEmpty(ob)
export const isAsync = fn => fn.constructor.name === "AsyncFunction"
export const isNonEmptyArray = allTrue(isArray, isNotEmpty)
export const isNonEmptyString = allTrue(isString, isNotEmpty)
export const tryOr = failFunc => (func, ...args) => {
try {
return func.apply(null, ...args)
} catch (_) {
return failFunc()
}
}
export const tryAwaitOr = failFunc => async (func, ...args) => {
try {
return await func.apply(null, ...args)
} catch (_) {
return await failFunc()
}
}
export const defineError = (func, errorPrefix) => {
try {
return func()
} catch (err) {
err.message = `${errorPrefix} : ${err.message}`
throw err
}
}
export const tryOrIgnore = tryOr(() => {})
export const tryAwaitOrIgnore = tryAwaitOr(async () => {})
export const causesException = func => {
try {
func()
return false
} catch (e) {
return true
}
}
export const executesWithoutException = func => !causesException(func)
export const handleErrorWith = returnValInError =>
tryOr(constant(returnValInError))
export const handleErrorWithUndefined = handleErrorWith(undefined)
export const switchCase = (...cases) => value => {
const nextCase = () => head(cases)[0](value)
const nextResult = () => head(cases)[1](value)
if (isEmpty(cases)) return // undefined
if (nextCase() === true) return nextResult()
return switchCase(...tail(cases))(value)
}
export const isValue = val1 => val2 => val1 === val2
export const isOneOf = (...vals) => val => includes(val)(vals)
export const defaultCase = constant(true)
export const memberMatches = (member, match) => obj => match(obj[member])
export const StartsWith = searchFor => searchIn =>
startsWith(searchIn, searchFor)
export const contains = val => array => findIndex(array, v => v === val) > -1
export const getHashCode = s => {
let hash = 0
let i
let char
let l
if (s.length == 0) return hash
for (i = 0, l = s.length; i < l; i++) {
char = s.charCodeAt(i)
hash = (hash << 5) - hash + char
hash |= 0 // Convert to 32bit integer
}
// converting to string, but dont want a "-" prefixed
if (hash < 0) {
return `n${(hash * -1).toString()}`
}
return hash.toString()
}
// thanks to https://blog.grossman.io/how-to-write-async-await-without-try-catch-blocks-in-javascript/
export const awEx = async promise => {
try {
const result = await promise
return [undefined, result]
} catch (error) {
return [error, undefined]
}
}
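`awEx` follows the Go-style `[error, result]` convention from the linked blog post; a runnable sketch of how a call site consumes it:

```javascript
// Standalone sketch of awEx: await a promise, get [error, result] back
const awEx = async promise => {
  try {
    const result = await promise
    return [undefined, result]
  } catch (error) {
    return [error, undefined]
  }
}

// no try/catch needed at the call site
awEx(Promise.reject(new Error("boom"))).then(([err, result]) => {
  if (err) console.log("handled:", err.message) // prints "handled: boom"
})
```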
export const isSafeInteger = n =>
isInteger(n) &&
n <= Number.MAX_SAFE_INTEGER &&
n >= 0 - Number.MAX_SAFE_INTEGER
export const toDateOrNull = s =>
isNull(s) ? null : isDate(s) ? s : new Date(s)
export const toBoolOrNull = s => (isNull(s) ? null : s === "true" || s === true)
export const toNumberOrNull = s => (isNull(s) ? null : toNumber(s))
export const isArrayOfString = opts => isArray(opts) && all(isString)(opts)
export const pushAll = (target, items) => {
for (let i of items) target.push(i)
}
export const pause = async duration =>
new Promise(res => setTimeout(res, duration))
export const retry = async (fn, retries, delay, ...args) => {
try {
return await fn(...args)
} catch (err) {
if (retries > 1) {
return await pause(delay).then(
async () => await retry(fn, retries - 1, delay, ...args)
)
}
throw err
}
}
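`retry` recurses with a decremented counter, so a call with `retries = 3` makes at most three attempts. A self-contained sketch (the flaky function is illustrative):

```javascript
// Standalone sketch of pause/retry: re-attempt a failing async fn with a delay
const pause = duration => new Promise(res => setTimeout(res, duration))

const retry = async (fn, retries, delay, ...args) => {
  try {
    return await fn(...args)
  } catch (err) {
    if (retries > 1) {
      await pause(delay)
      return retry(fn, retries - 1, delay, ...args)
    }
    throw err // out of attempts - surface the last error
  }
}

// illustrative flaky function: fails twice, succeeds on the third attempt
let attempts = 0
const flaky = async () => {
  attempts++
  if (attempts < 3) throw new Error("not yet")
  return "done"
}

retry(flaky, 3, 10).then(result => console.log(result)) // prints "done"
```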
export { events } from "./events"
export default {
ifExists,
getOrDefault,
isDefined,
isNonNull,
isNotNaN,
allTrue,
isSomething,
mapIfSomethingOrDefault,
mapIfSomethingOrBlank,
configFolder,
fieldDefinitions,
isNothing,
not,
switchCase,
defaultCase,
StartsWith,
contains,
templateDefinitions,
handleErrorWith,
handleErrorWithUndefined,
tryOr,
tryOrIgnore,
tryAwaitOr,
tryAwaitOrIgnore,
dirIndex,
keySep,
$,
$$,
getDirFomKey,
getFileFromKey,
splitKey,
somethingOrDefault,
getIndexKeyFromFileKey,
joinKey,
somethingOrGetDefault,
appDefinitionFile,
isValue,
all,
isOneOf,
memberMatches,
defineError,
anyTrue,
isNonEmptyArray,
causesException,
executesWithoutException,
none,
getHashCode,
awEx,
events,
eventsList,
isNothingOrEmpty,
isSafeInteger,
toNumber,
toDate: toDateOrNull,
toBool: toBoolOrNull,
isArrayOfString,
insensitiveEquals,
pause,
retry,
pushAll,
}

View file

@ -0,0 +1,14 @@
import { filter, map } from "lodash/fp"
import { $, isSomething } from "./index"
export const stringNotEmpty = s => isSomething(s) && s.trim().length > 0
export const makerule = (field, error, isValid) => ({ field, error, isValid })
export const validationError = (rule, item) => ({ ...rule, item })
export const applyRuleSet = ruleSet => itemToValidate =>
$(ruleSet, [map(applyRule(itemToValidate)), filter(isSomething)])
export const applyRule = itemToValidate => rule =>
  rule.isValid(itemToValidate) ? null : validationError(rule, itemToValidate)
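A hypothetical end-to-end use of the rule helpers above, with the lodash/fp pipeline swapped for plain array methods so the sketch runs standalone (`contactRules` is illustrative, not part of this package):

```javascript
// Standalone sketch of makerule/applyRuleSet (lodash/fp replaced by array methods)
const isSomething = x => x !== null && x !== undefined
const makerule = (field, error, isValid) => ({ field, error, isValid })
const validationError = (rule, item) => ({ ...rule, item })
const applyRule = itemToValidate => rule =>
  rule.isValid(itemToValidate) ? null : validationError(rule, itemToValidate)
const applyRuleSet = ruleSet => itemToValidate =>
  ruleSet.map(applyRule(itemToValidate)).filter(isSomething)

// illustrative rule set
const contactRules = [
  makerule("name", "name is required", c => !!c.name && c.name.trim().length > 0),
  makerule("age", "age must be zero or more", c => c.age >= 0),
]

const errors = applyRuleSet(contactRules)({ name: "", age: -1 })
// errors contains one entry per failed rule, each carrying field, error and item
```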

View file

@ -0,0 +1,17 @@
import { generate } from "shortid"
import { getNewFieldValue } from "../schema/types"
export const getNewRecord = (schema, modelName) => {
const model = schema.findModel(modelName)
const record = {
_id: generate(),
_modelId: model.id,
}
for (let field of model.fields) {
record[field.name] = getNewFieldValue(field)
}
return record
}

View file

@ -0,0 +1,91 @@
import { map, reduce, filter, isEmpty, flatten, each } from "lodash/fp"
import { compileCode } from "../common/compileCode"
import _ from "lodash"
import { getExactNodeForKey } from "../templateApi/hierarchy"
import { validateFieldParse, validateTypeConstraints } from "../types"
import { $, isNothing, isNonEmptyString } from "../common"
import { _getContext } from "./getContext"
const fieldParseError = (fieldName, value) => ({
fields: [fieldName],
message: `Could not parse field ${fieldName}:${value}`,
})
const validateAllFieldParse = (record, recordNode) =>
$(recordNode.fields, [
map(f => ({ name: f.name, parseResult: validateFieldParse(f, record) })),
reduce((errors, f) => {
if (f.parseResult.success) return errors
errors.push(fieldParseError(f.name, f.parseResult.value))
return errors
}, []),
])
const validateAllTypeConstraints = async (record, recordNode, context) => {
const errors = []
for (const field of recordNode.fields) {
$(await validateTypeConstraints(field, record, context), [
filter(isNonEmptyString),
map(m => ({ message: m, fields: [field.name] })),
each(e => errors.push(e)),
])
}
return errors
}
const runRecordValidationRules = (record, recordNode) => {
const runValidationRule = rule => {
const isValid = compileCode(rule.expressionWhenValid)
const expressionContext = { record, _ }
return isValid(expressionContext)
? { valid: true }
: {
valid: false,
fields: rule.invalidFields,
message: rule.messageWhenInvalid,
}
}
return $(recordNode.validationRules, [
map(runValidationRule),
flatten,
filter(r => r.valid === false),
map(r => ({ fields: r.fields, message: r.message })),
])
}
export const validate = app => async (record, context) => {
context = isNothing(context) ? _getContext(app, record.key) : context
const recordNode = getExactNodeForKey(app.hierarchy)(record.key)
const fieldParseFails = validateAllFieldParse(record, recordNode)
// non parsing would cause further issues - exit here
if (!isEmpty(fieldParseFails)) {
return { isValid: false, errors: fieldParseFails }
}
const recordValidationRuleFails = runRecordValidationRules(record, recordNode)
  const typeConstraintFails = await validateAllTypeConstraints(
    record,
    recordNode,
    context
  )
  if (
    isEmpty(fieldParseFails) &&
    isEmpty(recordValidationRuleFails) &&
    isEmpty(typeConstraintFails)
  ) {
    return { isValid: true, errors: [] }
  }
  return {
    isValid: false,
    errors: _.union(
      fieldParseFails,
      typeConstraintFails,
      recordValidationRuleFails
    ),
  }
}

View file

@ -0,0 +1,22 @@
export const createTrigger = () => ({
actionName: "",
eventName: "",
// function, has access to event context,
// returns object that is used as parameter to action
// only used if triggered by event
optionsCreator: "",
// action runs if true,
// has access to event context
condition: "",
})
export const createAction = () => ({
name: "",
behaviourSource: "",
// name of function in actionSource
behaviourName: "",
// parameter passed into behaviour.
  // any other params passed at runtime e.g.
// by trigger, or manually, will be merged into this
initialOptions: {},
})

View file

@ -0,0 +1,96 @@
import { some, map, filter, keys, includes, countBy, flatten } from "lodash/fp"
import {
isSomething,
$,
isNonEmptyString,
isNothingOrEmpty,
isNothing,
} from "../common"
import { all, getDefaultOptions } from "./types/index.mjs"
import { applyRuleSet, makerule } from "../common/validationCommon"
import { BadRequestError } from "../common/errors"
import { generate } from "shortid"
export const fieldErrors = {
AddFieldValidationFailed: "Add field validation: ",
}
export const allowedTypes = () => keys(all)
export const getNewField = type => ({
id: generate(),
name: "", // how field is referenced internally
type,
typeOptions: getDefaultOptions(type),
label: "", // how field is displayed
getInitialValue: "default", // function that gets value when initially created
getUndefinedValue: "default", // function that gets value when field undefined on record
})
const fieldRules = allFields => [
makerule("name", "field name is not set", f => isNonEmptyString(f.name)),
makerule("type", "field type is not set", f => isNonEmptyString(f.type)),
makerule("label", "field label is not set", f => isNonEmptyString(f.label)),
makerule("getInitialValue", "getInitialValue function is not set", f =>
isNonEmptyString(f.getInitialValue)
),
makerule("getUndefinedValue", "getUndefinedValue function is not set", f =>
isNonEmptyString(f.getUndefinedValue)
),
makerule(
"name",
"field name is duplicated",
f => isNothingOrEmpty(f.name) || countBy("name")(allFields)[f.name] === 1
),
makerule(
"type",
"type is unknown",
f => isNothingOrEmpty(f.type) || some(t => f.type === t)(allowedTypes())
),
]
const typeOptionsRules = field => {
const type = all[field.type]
if (isNothing(type)) return []
const def = optName => type.optionDefinitions[optName]
return $(field.typeOptions, [
keys,
filter(o => isSomething(def(o)) && isSomething(def(o).isValid)),
map(o =>
makerule(`typeOptions.${o}`, `${def(o).requirementDescription}`, field =>
def(o).isValid(field.typeOptions[o])
)
),
])
}
export const validateField = allFields => field => {
const everySingleField = includes(field)(allFields)
? allFields
: [...allFields, field]
return applyRuleSet([
...fieldRules(everySingleField),
...typeOptionsRules(field),
])(field)
}
export const validateAllFields = recordNode =>
$(recordNode.fields, [map(validateField(recordNode.fields)), flatten])
export const addField = (recordTemplate, field) => {
if (isNothingOrEmpty(field.label)) {
field.label = field.name
}
const validationMessages = validateField([...recordTemplate.fields, field])(
field
)
if (validationMessages.length > 0) {
const errors = map(m => m.error)(validationMessages)
throw new BadRequestError(
`${fieldErrors.AddFieldValidationFailed} ${errors.join(", ")}`
)
}
recordTemplate.fields.push(field)
}

View file

@ -0,0 +1,25 @@
export const fullSchema = (models, views) => {
const findModel = idOrName =>
models.find(m => m.id === idOrName || m.name === idOrName)
const findView = idOrName =>
views.find(m => m.id === idOrName || m.name === idOrName)
const findField = (modelIdOrName, fieldName) => {
const model = models.find(
m => m.id === modelIdOrName || m.name === modelIdOrName
)
return model.fields.find(f => f.name === fieldName)
}
const viewsForModel = modelId => views.filter(v => v.modelId === modelId)
return {
models,
views,
findModel,
findField,
findView,
viewsForModel,
}
}
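A runnable sketch of the schema lookups this factory exposes (the model and view objects are illustrative shapes, not real Budibase data):

```javascript
// Standalone sketch of fullSchema, exercised with illustrative data
const fullSchema = (models, views) => {
  const findModel = idOrName =>
    models.find(m => m.id === idOrName || m.name === idOrName)
  const findView = idOrName =>
    views.find(v => v.id === idOrName || v.name === idOrName)
  const findField = (modelIdOrName, fieldName) =>
    findModel(modelIdOrName).fields.find(f => f.name === fieldName)
  const viewsForModel = modelId => views.filter(v => v.modelId === modelId)
  return { models, views, findModel, findField, findView, viewsForModel }
}

const models = [{ id: "m1", name: "Contact", fields: [{ name: "Name" }] }]
const views = [{ id: "v1", name: "All Contacts", modelId: "m1" }]
const schema = fullSchema(models, views)

schema.findModel("Contact").id // → "m1"
schema.findField("m1", "Name") // → { name: "Name" }
schema.viewsForModel("m1").length // → 1
```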

View file

@ -0,0 +1,37 @@
import { generate } from "shortid"
export const newModel = () => ({
id: generate(),
name: "",
fields: [],
validationRules: [],
primaryField: "",
views: [],
})
/**
 * Determines whether a model can safely be deleted
 * @param {Array} models - all models in the schema
 * @param {string} modelId - id of the model to delete
 * @returns {{ errors: string[], canDelete: boolean }}
 */
export const canDeleteModel = (models, modelId) => {
const errors = []
for (let model of models) {
const links = model.fields.filter(
f => f.type === "link" && f.typeOptions.modelId === modelId
)
for (let link of links) {
errors.push(
`The "${model.name}" model links to this model, via field "${link.name}"`
)
}
}
return {
errors,
    canDelete: errors.length === 0,
}
}

View file

@ -0,0 +1,68 @@
import { map, constant, isArray } from "lodash/fp"
import {
typeFunctions,
makerule,
parsedFailed,
getDefaultExport,
parsedSuccess,
} from "./typeHelpers"
import {
switchCase,
defaultCase,
toNumberOrNull,
$$,
isSafeInteger,
} from "../../common"
const arrayFunctions = () =>
typeFunctions({
default: constant([]),
})
const mapToParsedArray = type =>
  $$(
    map(i => type.safeParseValue(i)),
    parsedSuccess
  )
const arrayTryParse = type =>
  switchCase([isArray, mapToParsedArray(type)], [defaultCase, parsedFailed])
const typeName = type => `array<${type}>`
const options = {
maxLength: {
defaultValue: 10000,
isValid: isSafeInteger,
requirementDescription: "must be a positive integer",
parse: toNumberOrNull,
},
minLength: {
defaultValue: 0,
isValid: n => isSafeInteger(n) && n >= 0,
requirementDescription: "must be a positive integer",
parse: toNumberOrNull,
},
}
const typeConstraints = [
makerule(
async (val, opts) => val === null || val.length >= opts.minLength,
(val, opts) => `must choose ${opts.minLength} or more options`
),
makerule(
async (val, opts) => val === null || val.length <= opts.maxLength,
(val, opts) => `cannot choose more than ${opts.maxLength} options`
),
]
export default type =>
getDefaultExport(
typeName(type.name),
arrayTryParse(type),
arrayFunctions(type),
options,
typeConstraints,
[type.sampleValue],
JSON.stringify
)

View file

@ -0,0 +1,47 @@
import { constant, isBoolean, isNull } from "lodash/fp"
import {
typeFunctions,
makerule,
parsedFailed,
parsedSuccess,
getDefaultExport,
} from "./typeHelpers"
import { switchCase, defaultCase, isOneOf, toBoolOrNull } from "../../common"
const boolFunctions = typeFunctions({
default: constant(null),
})
const boolTryParse = switchCase(
[isBoolean, parsedSuccess],
[isNull, parsedSuccess],
[isOneOf("true", "1", "yes", "on"), () => parsedSuccess(true)],
[isOneOf("false", "0", "no", "off"), () => parsedSuccess(false)],
[defaultCase, parsedFailed]
)
const options = {
allowNulls: {
defaultValue: true,
isValid: isBoolean,
    requirementDescription: "must be true or false",
parse: toBoolOrNull,
},
}
const typeConstraints = [
makerule(
async (val, opts) => opts.allowNulls === true || val !== null,
() => "field cannot be null"
),
]
export default getDefaultExport(
"bool",
boolTryParse,
boolFunctions,
options,
typeConstraints,
true,
JSON.stringify
)

View file

@ -0,0 +1,82 @@
import { constant, isDate, isString, isNull } from "lodash/fp"
import {
makerule,
typeFunctions,
parsedFailed,
parsedSuccess,
getDefaultExport,
} from "./typeHelpers"
import { switchCase, defaultCase, toDateOrNull, isNonEmptyArray } from "../../common"
const dateFunctions = typeFunctions({
default: constant(null),
now: () => new Date(),
})
const isValidDate = d => d instanceof Date && !isNaN(d)
const parseStringToDate = s =>
switchCase(
[isValidDate, parsedSuccess],
[defaultCase, parsedFailed]
)(new Date(s))
const isNullOrEmpty = d => isNull(d) || (d || "").toString() === ""
const isDateOrEmpty = d => isDate(d) || isNullOrEmpty(d)
const dateTryParse = switchCase(
[isDateOrEmpty, parsedSuccess],
[isString, parseStringToDate],
[defaultCase, parsedFailed]
)
const options = {
maxValue: {
defaultValue: null,
//defaultValue: new Date(32503680000000),
isValid: isDateOrEmpty,
requirementDescription: "must be a valid date",
parse: toDateOrNull,
},
minValue: {
defaultValue: null,
//defaultValue: new Date(-8520336000000),
isValid: isDateOrEmpty,
requirementDescription: "must be a valid date",
parse: toDateOrNull,
},
}
const typeConstraints = [
makerule(
async (val, opts) =>
val === null || isNullOrEmpty(opts.minValue) || val >= opts.minValue,
(val, opts) =>
`value (${val.toString()}) must be greater than or equal to ${
opts.minValue
}`
),
makerule(
async (val, opts) =>
val === null || isNullOrEmpty(opts.maxValue) || val <= opts.maxValue,
    (val, opts) =>
      `value (${val.toString()}) must be less than or equal to ${
        opts.maxValue
      }`
),
]
export default getDefaultExport(
"datetime",
dateTryParse,
dateFunctions,
options,
typeConstraints,
new Date(1984, 4, 1),
date => JSON.stringify(date).replace(new RegExp('"', "g"), "")
)

View file

@ -0,0 +1,56 @@
import { last, has, isString, intersection, isNull, isNumber } from "lodash/fp"
import {
typeFunctions,
parsedFailed,
parsedSuccess,
getDefaultExport,
} from "./typeHelpers"
import { switchCase, defaultCase, none, $, splitKey } from "../../common"
const illegalCharacters = "*?\\/:<>|\0\b\f\v"
export const isLegalFilename = filePath => {
const fn = fileName(filePath)
return (
fn.length <= 255 &&
intersection(fn.split(""))(illegalCharacters.split("")).length === 0 &&
none(f => f === "..")(splitKey(filePath))
)
}
const fileNothing = () => ({ relativePath: "", size: 0 })
const fileFunctions = typeFunctions({
default: fileNothing,
})
const fileTryParse = v =>
switchCase(
[isValidFile, parsedSuccess],
[isNull, () => parsedSuccess(fileNothing())],
[defaultCase, parsedFailed]
)(v)
const fileName = filePath => $(filePath, [splitKey, last])
const isValidFile = f =>
!isNull(f) &&
has("relativePath")(f) &&
has("size")(f) &&
isNumber(f.size) &&
isString(f.relativePath) &&
isLegalFilename(f.relativePath)
const options = {}
const typeConstraints = []
export default getDefaultExport(
"file",
fileTryParse,
fileFunctions,
options,
typeConstraints,
{ relativePath: "some_file.jpg", size: 1000 },
JSON.stringify
)

View file

@ -0,0 +1,85 @@
import { assign, merge } from "lodash"
import {
map,
isString,
isNumber,
isBoolean,
isDate,
keys,
isObject,
isArray,
has,
} from "lodash/fp"
import { $ } from "../../common"
import { parsedSuccess } from "./typeHelpers"
import string from "./string"
import bool from "./bool"
import number from "./number"
import datetime from "./datetime"
import array from "./array"
import link from "./link"
import file from "./file"
import { BadRequestError } from "../../common/errors"
const allTypes = () => {
const basicTypes = {
string,
number,
datetime,
bool,
link,
file,
}
const arrays = $(basicTypes, [
keys,
map(k => {
const kvType = {}
const concreteArray = array(basicTypes[k])
kvType[concreteArray.name] = concreteArray
return kvType
}),
types => assign({}, ...types),
])
return merge({}, basicTypes, arrays)
}
export const all = allTypes()
export const getType = typeName => {
if (!has(typeName)(all))
throw new BadRequestError(`Do not recognise type ${typeName}`)
return all[typeName]
}
export const getSampleFieldValue = field => getType(field.type).sampleValue
export const getNewFieldValue = field => getType(field.type).getNew(field)
export const safeParseField = (field, record) =>
getType(field.type).safeParseField(field, record)
export const validateFieldParse = (field, record) =>
has(field.name)(record)
? getType(field.type).tryParse(record[field.name])
: parsedSuccess(undefined) // fields may be undefined by default
export const getDefaultOptions = type => getType(type).getDefaultOptions()
export const validateTypeConstraints = async (field, record, context) =>
await getType(field.type).validateTypeConstraints(field, record, context)
export const detectType = value => {
if (isString(value)) return string
if (isBoolean(value)) return bool
if (isNumber(value)) return number
if (isDate(value)) return datetime
if (isArray(value)) return array(detectType(value[0]))
if (isObject(value) && has("key")(value) && has("value")(value))
return link
if (isObject(value) && has("relativePath")(value) && has("size")(value))
return file
throw new BadRequestError(`cannot determine type: ${JSON.stringify(value)}`)
}

View file

@ -0,0 +1,91 @@
import { isString, isObjectLike, isNull, has, isEmpty } from "lodash/fp"
import {
typeFunctions,
makerule,
parsedSuccess,
getDefaultExport,
parsedFailed,
} from "./typeHelpers"
import {
switchCase,
defaultCase,
isNonEmptyString,
isArrayOfString,
} from "../../common"
const linkNothing = () => ({ key: "" })
const linkFunctions = typeFunctions({
default: linkNothing,
})
const hasStringValue = (ob, path) => has(path)(ob) && isString(ob[path])
const isObjectWithKey = v => isObjectLike(v) && hasStringValue(v, "key")
const tryParseFromString = s => {
try {
const asObj = JSON.parse(s)
    if (isObjectWithKey(asObj)) {
return parsedSuccess(asObj)
}
} catch (_) {
// EMPTY
}
return parsedFailed(s)
}
const linkTryParse = v =>
switchCase(
[isObjectWithKey, parsedSuccess],
[isString, tryParseFromString],
[isNull, () => parsedSuccess(linkNothing())],
[defaultCase, parsedFailed]
)(v)
const options = {
indexNodeKey: {
defaultValue: null,
isValid: isNonEmptyString,
requirementDescription: "must be a non-empty string",
parse: s => s,
},
displayValue: {
defaultValue: "",
isValid: isNonEmptyString,
requirementDescription: "must be a non-empty string",
parse: s => s,
},
reverseIndexNodeKeys: {
defaultValue: null,
isValid: v => isArrayOfString(v) && v.length > 0,
requirementDescription: "must be a non-empty array of strings",
parse: s => s,
},
}
const isEmptyString = s => isString(s) && isEmpty(s)
const ensurelinkExists = async (val, opts, context) =>
isEmptyString(val.key) || (await context.linkExists(opts, val.key))
const typeConstraints = [
makerule(
ensurelinkExists,
(val, opts) =>
`"${val[opts.displayValue]}" does not exist in options list (key: ${
val.key
})`
),
]
export default getDefaultExport(
"link",
linkTryParse,
linkFunctions,
options,
typeConstraints,
{ key: "key", value: "value" },
JSON.stringify
)

View file

@ -0,0 +1,94 @@
import { constant, isNumber, isString, isNull } from "lodash/fp"
import {
makerule,
typeFunctions,
parsedFailed,
parsedSuccess,
getDefaultExport,
} from "./typeHelpers"
import {
switchCase,
defaultCase,
toNumberOrNull,
isSafeInteger,
} from "../../common"
const numberFunctions = typeFunctions({
default: constant(null),
})
const parseStringtoNumberOrNull = s => {
const num = Number(s)
return isNaN(num) ? parsedFailed(s) : parsedSuccess(num)
}
const numberTryParse = switchCase(
[isNumber, parsedSuccess],
[isString, parseStringtoNumberOrNull],
[isNull, parsedSuccess],
[defaultCase, parsedFailed]
)
const options = {
maxValue: {
defaultValue: Number.MAX_SAFE_INTEGER,
isValid: isSafeInteger,
requirementDescription: "must be a valid integer",
parse: toNumberOrNull,
},
minValue: {
defaultValue: 0 - Number.MAX_SAFE_INTEGER,
isValid: isSafeInteger,
requirementDescription: "must be a valid integer",
parse: toNumberOrNull,
},
decimalPlaces: {
defaultValue: 0,
isValid: n => isSafeInteger(n) && n >= 0,
requirementDescription: "must be a positive integer",
parse: toNumberOrNull,
},
}
const getDecimalPlaces = val => {
const splitDecimal = val.toString().split(".")
if (splitDecimal.length === 1) return 0
return splitDecimal[1].length
}
const typeConstraints = [
makerule(
async (val, opts) =>
val === null || opts.minValue === null || val >= opts.minValue,
(val, opts) =>
`value (${val.toString()}) must be greater than or equal to ${
opts.minValue
}`
),
makerule(
async (val, opts) =>
val === null || opts.maxValue === null || val <= opts.maxValue,
    (val, opts) =>
      `value (${val.toString()}) must be less than or equal to ${
        opts.maxValue
      }`
),
makerule(
async (val, opts) =>
val === null || opts.decimalPlaces >= getDecimalPlaces(val),
(val, opts) =>
`value (${val.toString()}) must have ${
opts.decimalPlaces
} decimal places or less`
),
]
export default getDefaultExport(
"number",
numberTryParse,
numberFunctions,
options,
typeConstraints,
1,
num => num.toString()
)

View file

@ -0,0 +1,59 @@
import { keys, isObject, has, clone, map, isNull, constant } from "lodash/fp"
import {
  typeFunctions,
  parsedFailed,
  parsedSuccess,
  getDefaultExport,
} from "./typeHelpers"
import { switchCase, defaultCase, $ } from "../../common"
const objectFunctions = (definition, allTypes) =>
  typeFunctions({
    default: constant(null),
    initialise: () =>
      $(keys(definition), [
        map(() => {
          const defClone = clone(definition)
          for (const k in defClone) {
            defClone[k] = allTypes[k].getNew()
          }
          return defClone
        }),
      ]),
  })
const parseObject = (definition, allTypes) => record => {
  const defClone = clone(definition)
  for (const k in defClone) {
    const type = allTypes[defClone[k]]
    defClone[k] = has(k)(record)
      ? type.safeParseValue(record[k])
      : type.getNew()
  }
  return parsedSuccess(defClone)
}
const objectTryParse = (definition, allTypes) =>
switchCase(
[isNull, parsedSuccess],
[isObject, parseObject(definition, allTypes)],
[defaultCase, parsedFailed]
)
export default (
typeName,
definition,
allTypes,
defaultOptions,
typeConstraints,
sampleValue
) =>
getDefaultExport(
typeName,
objectTryParse(definition, allTypes),
objectFunctions(definition, allTypes),
defaultOptions,
typeConstraints,
sampleValue,
JSON.stringify
)

View file

@ -0,0 +1,74 @@
import { constant, isString, isNull, includes, isBoolean } from "lodash/fp"
import {
typeFunctions,
makerule,
parsedSuccess,
getDefaultExport,
} from "./typeHelpers"
import {
switchCase,
defaultCase,
toBoolOrNull,
toNumberOrNull,
isSafeInteger,
isArrayOfString,
} from "../../common"
const stringFunctions = typeFunctions({
default: constant(null),
})
const stringTryParse = switchCase(
[isString, parsedSuccess],
[isNull, parsedSuccess],
[defaultCase, v => parsedSuccess(v.toString())]
)
const options = {
maxLength: {
defaultValue: null,
isValid: n => n === null || (isSafeInteger(n) && n > 0),
    requirementDescription:
      "max length must be null (no limit) or an integer greater than zero",
parse: toNumberOrNull,
},
values: {
defaultValue: null,
isValid: v =>
v === null || (isArrayOfString(v) && v.length > 0 && v.length < 10000),
requirementDescription:
"'values' must be null (no values) or an array of at least one string",
parse: s => s,
},
allowDeclaredValuesOnly: {
defaultValue: false,
isValid: isBoolean,
requirementDescription: "allowDeclaredValuesOnly must be true or false",
parse: toBoolOrNull,
},
}
const typeConstraints = [
makerule(
async (val, opts) =>
val === null || opts.maxLength === null || val.length <= opts.maxLength,
(val, opts) => `value exceeds maximum length of ${opts.maxLength}`
),
makerule(
async (val, opts) =>
val === null ||
opts.allowDeclaredValuesOnly === false ||
includes(val)(opts.values),
val => `"${val}" does not exist in the list of allowed values`
),
]
export default getDefaultExport(
"string",
stringTryParse,
stringFunctions,
options,
typeConstraints,
"abcde",
str => str
)

View file

@ -0,0 +1,94 @@
import { merge } from "lodash"
import { constant, isUndefined, has, mapValues, cloneDeep } from "lodash/fp"
import { isNotEmpty } from "../../common"
export const getSafeFieldParser = (tryParse, defaultValueFunctions) => (
field,
record
) => {
if (has(field.name)(record)) {
return getSafeValueParser(
tryParse,
defaultValueFunctions
)(record[field.name])
}
return defaultValueFunctions[field.getUndefinedValue]()
}
export const getSafeValueParser = (
tryParse,
defaultValueFunctions
) => value => {
const parsed = tryParse(value)
if (parsed.success) {
return parsed.value
}
return defaultValueFunctions.default()
}
export const getNewValue = (tryParse, defaultValueFunctions) => field => {
const getInitialValue =
isUndefined(field) || isUndefined(field.getInitialValue)
? "default"
: field.getInitialValue
return has(getInitialValue)(defaultValueFunctions)
? defaultValueFunctions[getInitialValue]()
: getSafeValueParser(tryParse, defaultValueFunctions)(getInitialValue)
}
export const typeFunctions = specificFunctions =>
merge(
{
value: constant,
null: constant(null),
},
specificFunctions
)
export const validateTypeConstraints = validationRules => async (
field,
record,
context
) => {
const fieldValue = record[field.name]
const validateRule = async r =>
!(await r.isValid(fieldValue, field.typeOptions, context))
? r.getMessage(fieldValue, field.typeOptions)
: ""
const errors = []
for (const r of validationRules) {
const err = await validateRule(r)
if (isNotEmpty(err)) errors.push(err)
}
return errors
}
const getDefaultOptions = mapValues(v => v.defaultValue)
export const makerule = (isValid, getMessage) => ({ isValid, getMessage })
export const parsedFailed = val => ({ success: false, value: val })
export const parsedSuccess = val => ({ success: true, value: val })
export const getDefaultExport = (
name,
tryParse,
functions,
options,
validationRules,
sampleValue,
stringify
) => ({
getNew: getNewValue(tryParse, functions),
safeParseField: getSafeFieldParser(tryParse, functions),
safeParseValue: getSafeValueParser(tryParse, functions),
tryParse,
name,
getDefaultOptions: () => getDefaultOptions(cloneDeep(options)),
optionDefinitions: options,
validateTypeConstraints: validateTypeConstraints(validationRules),
sampleValue,
stringify: val => (val === null || val === undefined ? "" : stringify(val)),
getDefaultValue: functions.default,
})

View file

@ -0,0 +1,7 @@
import { generate } from "shortid"
export const newView = (modelId = null) => ({
id: generate(),
name: "",
modelId,
})

View file

@ -0,0 +1,216 @@
import common, { isOneOf } from "../src/common"
import _ from "lodash"
const lessThan = than => compare => compare < than
describe("common > switchCase", () => {
test("should return on first matching case", () => {
const result = common.switchCase(
[lessThan(1), _.constant("first")],
[lessThan(2), _.constant("second")],
[lessThan(3), _.constant("third")]
)(1)
expect(result).toBe("second")
})
test("should return undefined if case not matched", () => {
const result = common.switchCase(
[lessThan(1), _.constant("first")],
[lessThan(2), _.constant("second")],
[lessThan(3), _.constant("third")]
)(10)
expect(_.isUndefined(result)).toBeTruthy()
})
})
describe("common > allTrue", () => {
test("should only return true when all conditions are met", () => {
const result1 = common.allTrue(lessThan(3), lessThan(5), lessThan(10))(1)
expect(result1).toBeTruthy()
const result2 = common.allTrue(lessThan(3), lessThan(5), lessThan(10))(7)
expect(result2).toBeFalsy()
})
})
describe("common > anyTrue", () => {
test("should return true when one or more condition is met", () => {
const result1 = common.anyTrue(lessThan(3), lessThan(5), lessThan(10))(5)
expect(result1).toBeTruthy()
const result2 = common.anyTrue(lessThan(3), lessThan(5), lessThan(10))(4)
expect(result2).toBeTruthy()
})
test("should return false when no conditions are met", () => {
const result1 = common.anyTrue(lessThan(3), lessThan(5), lessThan(10))(15)
expect(result1).toBeFalsy()
})
})
const s = common.keySep
describe("common > getDirFromKey", () => {
test("should drop the final part of the path", () => {
const key = `${s}one${s}two${s}three${s}last`
    const expectedDir = `${s}one${s}two${s}three`
    const result = common.getDirFomKey(key)
    expect(result).toBe(expectedDir)
})
test("should add leading /", () => {
const key = `one${s}two${s}three${s}last`
    const expectedDir = `${s}one${s}two${s}three`
    const result = common.getDirFomKey(key)
    expect(result).toBe(expectedDir)
})
})
describe("common > getFileFromKey", () => {
test("should get the final part of the path", () => {
const key = `one${s}two${s}three${s}last`
const expectedFile = "last"
const result = common.getFileFromKey(key)
expect(result).toBe(expectedFile)
})
})
describe("common > getIndexKeyFromFileKey", () => {
test("should get the index key of the file's directory", () => {
const key = `one${s}two${s}three${s}file`
const expectedFile = common.dirIndex(`one${s}two${s}three`)
const result = common.getIndexKeyFromFileKey(key)
expect(result).toBe(expectedFile)
})
})
describe("common > somethingOrDefault", () => {
test("should use value if value is something", () => {
const result = common.somethingOrDefault("something", "default")
expect(result).toBe("something")
})
  test("should use value if value is empty string", () => {
const result = common.somethingOrDefault("", "default")
expect(result).toBe("")
})
test("should use value if value is empty array", () => {
const result = common.somethingOrDefault([], ["default"])
expect(result.length).toBe(0)
})
test("should use default if value is null", () => {
const result = common.somethingOrDefault(null, "default")
expect(result).toBe("default")
})
test("should use default if value is undefined", () => {
const result = common.somethingOrDefault({}.notDefined, "default")
expect(result).toBe("default")
})
})
describe("common > dirIndex", () => {
  it("should match /.config/dir/<path>/dir.idx to path", () => {
var result = common.dirIndex("some/path")
expect(result).toBe(`${s}.config${s}dir${s}some${s}path${s}dir.idx`)
})
})
describe("common > joinKey", () => {
it("should join an array with the key separator and leading separator", () => {
var result = common.joinKey("this", "is", "a", "path")
expect(result).toBe(`${s}this${s}is${s}a${s}path`)
})
})
describe("common > combinator ($$)", () => {
it("combines single params functions and returns a func", () => {
const f1 = str => str + " hello"
const f2 = str => str + " there"
const combined = common.$$(f1, f2)
const result = combined("mike says")
expect(result).toBe("mike says hello there")
})
})
describe("common > pipe ($)", () => {
it("combines single params functions and executes with given param", () => {
const f1 = str => str + " hello"
const f2 = str => str + " there"
const result = common.$("mike says", [f1, f2])
expect(result).toBe("mike says hello there")
})
})
describe("common > IsOneOf", () => {
it("should return true when supplied value is in list of given vals", () => {
expect(common.isOneOf("odo", "make")("odo")).toBe(true)
expect(common.isOneOf(1, 33, 9)(9)).toBe(true)
expect(common.isOneOf(true, false, "")(true)).toBe(true)
})
it("should return false when supplied value is not in list of given vals", () => {
expect(common.isOneOf("odo", "make")("bob")).toBe(false)
expect(common.isOneOf(1, 33, 9)(999)).toBe(false)
expect(common.isOneOf(1, false, "")(true)).toBe(false)
})
})
describe("defineError", () => {
  it("should prefix an exception with message, and rethrow", () => {
expect(() =>
common.defineError(() => {
throw new Error("there")
}, "hello")
).toThrowError("hello : there")
})
it("should return function value when no exception", () => {
const result = common.defineError(() => 1, "no error")
expect(result).toBe(1)
})
})
describe("retry", () => {
let counter = 0
it("should retry once", async () => {
var result = await common.retry(async () => 1, 3, 50)
expect(result).toBe(1)
})
it("should retry twice", async () => {
const result = await common.retry(
async () => {
counter++
if (counter < 2) throw "error"
return counter
},
3,
50
)
expect(result).toBe(2)
})
it("throws error after 3 retries", async () => {
expect(
common.retry(
async () => {
counter++
throw counter
},
3,
50
)
).rejects.toThrowError(4)
})
})
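A retry helper matching the behaviour these tests exercise might look like the sketch below. It is a hypothetical implementation, assuming the signature retry(fn, maxAttempts, delayMs) seen in the tests (the real helper also forwards extra arguments to fn):

```javascript
// Hypothetical sketch: keep invoking the async fn until it resolves,
// waiting delayMs between attempts; rethrow the last error once
// maxAttempts have all failed.
const pause = ms => new Promise(resolve => setTimeout(resolve, ms))

const retry = async (fn, maxAttempts, delayMs) => {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      if (attempt >= maxAttempts) throw err
      await pause(delayMs)
    }
  }
}
```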


@@ -0,0 +1,32 @@
import { testSchema } from "./testSchema.mjs"
import { isNonEmptyString } from "../src/common"
import { getNewRecord } from "../src/records/getNewRecord.mjs"
describe("getNewRecord", () => {
it("should get object with generated id and key (full path)", async () => {
const schema = testSchema()
const record = getNewRecord(schema, "Contact")
expect(record._id).toBeDefined()
expect(isNonEmptyString(record._id)).toBeTruthy()
expect(record._rev).not.toBeDefined()
expect(record._modelId).toBe(schema.findModel("Contact").id)
})
it("should create object with all declared fields, using default values", async () => {
const schema = testSchema()
const contact = getNewRecord(schema, "Contact")
expect(contact.Name).toBe(null)
expect(contact.Created).toBe(null)
expect(contact["Is Active"]).toBe(null)
})
it("should create object with all declared fields, and use inital values", async () => {
const schema = testSchema()
schema.findField("Contact", "Name").getInitialValue = "Default Name"
const contact = getNewRecord(schema, "Contact")
expect(contact.Name).toBe("Default Name")
})
})


@@ -0,0 +1,299 @@
import {
setupApphierarchy,
stubEventHandler,
basicAppHierarchyCreator_WithFields,
basicAppHierarchyCreator_WithFields_AndIndexes,
hierarchyFactory,
withFields,
} from "./specHelpers"
import { find } from "lodash"
import { addHours } from "date-fns"
import { events } from "../src/common"
describe("recordApi > validate", () => {
it("should return errors when any fields do not parse", async () => {
const { recordApi } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
record.isalive = "hello"
record.age = "nine"
record.createddate = "blah"
const validationResult = await recordApi.validate(record)
expect(validationResult.isValid).toBe(false)
expect(validationResult.errors.length).toBe(3)
})
it("should return errors when mandatory field is empty", async () => {
const withValidationRule = (hierarchy, templateApi) => {
templateApi.addRecordValidationRule(hierarchy.customerRecord)(
templateApi.commonRecordValidationRules.fieldNotEmpty("surname")
)
}
const hierarchyCreator = hierarchyFactory(withFields, withValidationRule)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.surname = ""
const validationResult = await recordApi.validate(record)
expect(validationResult.isValid).toBe(false)
expect(validationResult.errors.length).toBe(1)
})
it("should return error when string field is beyond maxLength", async () => {
const withFieldWithMaxLength = hierarchy => {
const surname = find(
hierarchy.customerRecord.fields,
f => f.name === "surname"
)
surname.typeOptions.maxLength = 5
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.surname = "more than 5 chars"
const validationResult = await recordApi.validate(record)
expect(validationResult.isValid).toBe(false)
expect(validationResult.errors.length).toBe(1)
})
it("should return error when number field is > maxValue", async () => {
const withFieldWithMaxLength = hierarchy => {
const age = find(hierarchy.customerRecord.fields, f => f.name === "age")
age.typeOptions.maxValue = 10
age.typeOptions.minValue = 5
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const tooOldRecord = recordApi.getNew("/customers", "customer")
tooOldRecord.age = 11
const tooOldResult = await recordApi.validate(tooOldRecord)
expect(tooOldResult.isValid).toBe(false)
expect(tooOldResult.errors.length).toBe(1)
})
it("should return error when number field is < minValue", async () => {
const withFieldWithMaxLength = hierarchy => {
const age = find(hierarchy.customerRecord.fields, f => f.name === "age")
age.typeOptions.minValue = 5
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const tooYoungRecord = recordApi.getNew("/customers", "customer")
tooYoungRecord.age = 3
const tooYoungResult = await recordApi.validate(tooYoungRecord)
expect(tooYoungResult.isValid).toBe(false)
expect(tooYoungResult.errors.length).toBe(1)
})
it("should return error when number has too many decimal places", async () => {
const withFieldWithMaxLength = (hierarchy, templateApi) => {
const age = find(hierarchy.customerRecord.fields, f => f.name === "age")
age.typeOptions.decimalPlaces = 2
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.age = 3.123
const validationResult = await recordApi.validate(record)
expect(validationResult.isValid).toBe(false)
expect(validationResult.errors.length).toBe(1)
})
it("should return error when datetime field is > maxValue", async () => {
const withFieldWithMaxLength = hierarchy => {
const createddate = find(
hierarchy.customerRecord.fields,
f => f.name === "createddate"
)
createddate.typeOptions.maxValue = new Date()
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.createddate = addHours(new Date(), 1)
const result = await recordApi.validate(record)
expect(result.isValid).toBe(false)
expect(result.errors.length).toBe(1)
})
it("should return error when number field is < minValue", async () => {
const withFieldWithMaxLength = hierarchy => {
const createddate = find(
hierarchy.customerRecord.fields,
f => f.name === "createddate"
)
createddate.typeOptions.minValue = addHours(new Date(), 1)
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.createddate = new Date()
const result = await recordApi.validate(record)
expect(result.isValid).toBe(false)
expect(result.errors.length).toBe(1)
})
it("should return error when string IS NOT one of declared values, and only declared values are allowed", async () => {
const withFieldWithMaxLength = hierarchy => {
const surname = find(
hierarchy.customerRecord.fields,
f => f.name === "surname"
)
surname.typeOptions.allowDeclaredValuesOnly = true
surname.typeOptions.values = ["thedog"]
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi } = await setupApphierarchy(hierarchyCreator)
const record = recordApi.getNew("/customers", "customer")
record.surname = "zeecat"
const result = await recordApi.validate(record)
expect(result.isValid).toBe(false)
expect(result.errors.length).toBe(1)
})
it("should not return error when string IS one of declared values, and only declared values are allowed", async () => {
const withFieldWithMaxLength = hierarchy => {
const surname = find(
hierarchy.customerRecord.fields,
f => f.name === "surname"
)
surname.typeOptions.allowDeclaredValuesOnly = true
surname.typeOptions.values = ["thedog"]
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi, appHierarchy } = await setupApphierarchy(
hierarchyCreator
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "thedog"
const result = await recordApi.validate(record)
expect(result.isValid).toBe(true)
expect(result.errors.length).toBe(0)
})
it("should not return error when string IS NOT one of declared values, but any values are allowed", async () => {
const withFieldWithMaxLength = (hierarchy, templateApi) => {
const surname = find(
hierarchy.customerRecord.fields,
f => f.name === "surname"
)
surname.typeOptions.allowDeclaredValuesOnly = false
surname.typeOptions.values = ["thedog"]
}
const hierarchyCreator = hierarchyFactory(
withFields,
withFieldWithMaxLength
)
const { recordApi, appHierarchy } = await setupApphierarchy(
hierarchyCreator
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "zeecat"
const result = await recordApi.validate(record)
expect(result.isValid).toBe(true)
expect(result.errors.length).toBe(0)
})
it("should return error when reference field does not exist in options index", async () => {
const { recordApi, appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields_AndIndexes
)
const partner = recordApi.getNew("/partners", "partner")
partner.businessName = "ACME Inc"
await recordApi.save(partner)
const customer = recordApi.getNew("/customers", "customer")
customer.partner = { key: "incorrect key", name: partner.businessName }
const result = await recordApi.validate(customer)
expect(result.isValid).toBe(false)
expect(result.errors.length).toBe(1)
})
it("should publish invalid events", async () => {
const withValidationRule = (hierarchy, templateApi) => {
templateApi.addRecordValidationRule(hierarchy.customerRecord)(
templateApi.commonRecordValidationRules.fieldNotEmpty("surname")
)
}
const hierarchyCreator = hierarchyFactory(withFields, withValidationRule)
const { recordApi, subscribe } = await setupApphierarchy(hierarchyCreator)
const handler = stubEventHandler()
subscribe(events.recordApi.save.onInvalid, handler.handle)
const record = recordApi.getNew("/customers", "customer")
record.surname = ""
try {
await recordApi.save(record)
} catch (e) {
// ignored - save is expected to throw for the invalid record
}
const onInvalid = handler.getEvents(events.recordApi.save.onInvalid)
expect(onInvalid.length).toBe(1)
expect(onInvalid[0].context.record).toBeDefined()
expect(onInvalid[0].context.record.key).toBe(record.key)
expect(onInvalid[0].context.validationResult).toBeDefined()
})
})


@@ -0,0 +1,11 @@
{
"spec_dir": "test",
"spec_files": [
"**/*[sS]pec.js"
],
"helpers": [
"helpers/**/*.js"
],
"stopSpecOnExpectationFailure": false,
"random": false
}


@@ -0,0 +1,110 @@
import { validateActions, validateTrigger } from "../src/templateApi/validate"
import { createValidActionsAndTriggers } from "./specHelpers"
describe("templateApi actions validation", () => {
it("should return no errors when all actions are valid", () => {
const { allActions } = createValidActionsAndTriggers()
const result = validateActions(allActions)
expect(result).toEqual([])
})
it("should return error for empty behaviourName", () => {
const { allActions, logMessage } = createValidActionsAndTriggers()
logMessage.behaviourName = ""
const result = validateActions(allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("behaviourName")
})
it("should return error for empty behaviourSource", () => {
const { allActions, logMessage } = createValidActionsAndTriggers()
logMessage.behaviourSource = ""
const result = validateActions(allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("behaviourSource")
})
it("should return error for empty name", () => {
const { allActions, logMessage } = createValidActionsAndTriggers()
logMessage.name = ""
const result = validateActions(allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("name")
})
it("should return error for duplicate name", () => {
const {
allActions,
logMessage,
measureCallTime,
} = createValidActionsAndTriggers()
logMessage.name = measureCallTime.name
const result = validateActions(allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("")
})
})
describe("tempalteApi triggers validation", () => {
it("should return error when actionName is empty", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.actionName = ""
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("actionName")
})
it("should return error when eventName is empty", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.eventName = ""
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("eventName")
})
it("should return error when eventName does not exist in allowed events", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.eventName = "non existant event name"
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("eventName")
})
it("should return error when actionName does not exist in supplied actions", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.actionName = "non existent action name"
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("actionName")
})
it("should return error when optionsCreator is invalid javascript", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.optionsCreator = "this is nonsense"
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("optionsCreator")
})
it("should return error when condition is invalid javascript", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.condition = "this is nonsense"
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(1)
expect(result[0].field).toEqual("condition")
})
it("should not return error when condition is empty", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.condition = ""
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(0)
})
it("should not return error when optionsCreator is empty", () => {
const { allActions, logOnErrorTrigger } = createValidActionsAndTriggers()
logOnErrorTrigger.optionsCreator = ""
const result = validateTrigger(logOnErrorTrigger, allActions)
expect(result.length).toBe(0)
})
})


@@ -0,0 +1,86 @@
import {
setupApphierarchy,
basicAppHierarchyCreator_WithFields,
stubEventHandler,
basicAppHierarchyCreator_WithFields_AndIndexes,
} from "./specHelpers"
import { canDeleteIndex } from "../src/templateApi/canDeleteIndex"
import { canDeleteRecord } from "../src/templateApi/canDeleteRecord"
describe("canDeleteIndex", () => {
it("should return no errors if deltion is valid", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const partnerIndex = appHierarchy.root.indexes.find(i => i.name === "partner_index")
const result = canDeleteIndex(partnerIndex)
expect(result.canDelete).toBe(true)
expect(result.errors).toEqual([])
})
it("should return errors if index is a lookup for a reference field", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const customerIndex = appHierarchy.root.indexes.find(i => i.name === "customer_index")
const result = canDeleteIndex(customerIndex)
expect(result.canDelete).toBe(false)
expect(result.errors.length).toBe(1)
})
it("should return errors if index is a manyToOne index for a reference field", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const referredToCustomersIndex = appHierarchy.customerRecord.indexes.find(i => i.name === "referredToCustomers")
const result = canDeleteIndex(referredToCustomersIndex)
expect(result.canDelete).toBe(false)
expect(result.errors.length).toBe(1)
})
})
describe("canDeleteRecord", () => {
it("should return no errors when deletion is valid", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
appHierarchy.root.indexes = appHierarchy.root.indexes.filter(i => !i.allowedRecordNodeIds.includes(appHierarchy.customerRecord.nodeId))
const result = canDeleteRecord(appHierarchy.customerRecord)
expect(result.canDelete).toBe(true)
expect(result.errors).toEqual([])
})
it("should return errors when record is referenced by hierarchal index", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const result = canDeleteRecord(appHierarchy.customerRecord)
expect(result.canDelete).toBe(false)
expect(result.errors.some(e => e.includes("customer_index"))).toBe(true)
})
it("should return errors when record has a child which cannot be deleted", async () => {
const { appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields_AndIndexes
)
const result = canDeleteRecord(appHierarchy.customerRecord)
expect(result.canDelete).toBe(false)
expect(result.errors.some(e => e.includes("Outstanding Invoices"))).toBe(true)
})
})


@@ -0,0 +1,124 @@
import { isDefined, join, fieldDefinitions, $ } from "../src/common"
import { getMemoryTemplateApi } from "./specHelpers"
import { fieldErrors } from "../src/templateApi/fields"
const getRecordTemplate = templateApi =>
$(templateApi.getNewRootLevel(), [templateApi.getNewRecordTemplate])
const getValidField = templateApi => {
const field = templateApi.getNewField("string")
field.name = "forename"
field.label = "forename"
return field
}
const testMemberIsNotSet = membername => async () => {
const { templateApi } = await getMemoryTemplateApi()
const field = getValidField(templateApi)
field[membername] = ""
const errorsNotSet = templateApi.validateField([field])(field)
expect(errorsNotSet.length).toBe(1)
expect(errorsNotSet[0].error.includes("is not set")).toBeTruthy()
}
const testMemberIsNotDefined = membername => async () => {
const { templateApi } = await getMemoryTemplateApi()
const field = getValidField(templateApi)
delete field[membername]
const errorsNotSet = templateApi.validateField([field])(field)
expect(errorsNotSet.length).toBe(1)
expect(errorsNotSet[0].error.includes("is not set")).toBeTruthy()
}
describe("validateField", () => {
it("should return error when name is not set", testMemberIsNotSet("name"))
it(
"should return error when name is not defined",
testMemberIsNotDefined("name")
)
it("should return error when type is not set", testMemberIsNotSet("type"))
it(
"should return error when type is not defined",
testMemberIsNotDefined("type")
)
it(
"should return error when label is not defined",
testMemberIsNotDefined("label")
)
it(
"should return error when getInitialValue is not defined",
testMemberIsNotDefined("getInitialValue")
)
it(
"should return error when getInitialValue is not set",
testMemberIsNotSet("getInitialValue")
)
it(
"should return error when getUndefinedValue is not defined",
testMemberIsNotDefined("getUndefinedValue")
)
it(
"should return error when getUndefinedValue is not set",
testMemberIsNotSet("getUndefinedValue")
)
it("should return no errors when valid field is supplied", async () => {
const { templateApi } = await getMemoryTemplateApi()
const field = getValidField(templateApi)
const errors = templateApi.validateField([field])(field)
expect(errors.length).toBe(0)
})
it("should return error when field with same name exists already", async () => {
const { templateApi } = await getMemoryTemplateApi()
const field1 = getValidField(templateApi)
field1.name = "surname"
const field2 = getValidField(templateApi)
field2.name = "surname"
const errors = templateApi.validateField([field1, field2])(field2)
expect(errors.length).toBe(1)
expect(errors[0].error).toBe("field name is duplicated")
expect(errors[0].field).toBe("name")
})
it("should return error when field is not one of allowed types", async () => {
const { templateApi } = await getMemoryTemplateApi()
const field = getValidField(templateApi)
field.type = "sometype"
const errors = templateApi.validateField([field])(field)
expect(errors.length).toBe(1)
expect(errors[0].error).toBe("type is unknown")
expect(errors[0].field).toBe("type")
})
})
describe("addField", () => {
it("should throw exception when field is invalid", async () => {
const { templateApi } = await getMemoryTemplateApi()
const record = getRecordTemplate(templateApi)
const field = getValidField(templateApi)
field.name = ""
expect(() => templateApi.addField(record, field)).toThrow(
new RegExp("^" + fieldErrors.AddFieldValidationFailed, "i")
)
})
it("should add field when field is valid", async () => {
const { templateApi } = await getMemoryTemplateApi()
const record = getRecordTemplate(templateApi)
const field = getValidField(templateApi)
field.name = "some_new_field"
templateApi.addField(record, field)
expect(record.fields.length).toBe(1)
expect(record.fields[0]).toBe(field)
})
})


@@ -0,0 +1,285 @@
import { getNewFieldValue, safeParseField } from "../src/types"
import { getNewField } from "../src/templateApi/fields"
import { isDefined } from "../src/common"
const getField = type => {
const field = getNewField(type)
return field
}
const nothingReference = { key: "" }
const nothingFile = { relativePath: "", size: 0 }
describe("types > getNew", () => {
const defaultAlwaysNull = type => () => {
const field = getField(type)
field.getInitialValue = "default"
const value = getNewFieldValue(field)
expect(value).toBe(null)
}
it(
"bool should return null when field's getInitialValue is 'default'",
defaultAlwaysNull("bool")
)
it(
"string should return null when field's getInitialValue is 'default'",
defaultAlwaysNull("string")
)
it(
"number should return null when field's getInitialValue is 'default'",
defaultAlwaysNull("number")
)
it(
"datetime should return null when field's getInitialValue is 'default'",
defaultAlwaysNull("datetime")
)
it("reference should return {key:''} when fields getInitialValue is 'default'", () => {
const field = getField("reference")
field.getInitialValue = "default"
const value = getNewFieldValue(field)
expect(value).toEqual(nothingReference)
})
it("file should return {relativePath:'', size:0} when fields getInitialValue is 'default'", () => {
const field = getField("file")
field.getInitialValue = "default"
const value = getNewFieldValue(field)
expect(value).toEqual(nothingFile)
})
it("array should return empty array when field getInitialValue is 'default'", () => {
const field = getField("array<string>")
field.getInitialValue = "default"
const value = getNewFieldValue(field)
expect(value).toEqual([])
})
it("datetime should return Now when getInitialValue is 'now'", () => {
const field = getField("datetime")
field.getInitialValue = "now"
const before = new Date()
const value = getNewFieldValue(field)
const after = new Date()
expect(value >= before && value <= after).toBeTruthy()
})
const test_getNewFieldValue = (type, val, expected) => () => {
const field = getField(type)
field.getInitialValue = val
const value = getNewFieldValue(field)
expect(value).toEqual(expected)
}
it("bool should parse value in getInitialValue if function not recognised", () => {
test_getNewFieldValue("bool", "true", true)()
test_getNewFieldValue("bool", "on", true)()
test_getNewFieldValue("bool", "1", true)()
test_getNewFieldValue("bool", "yes", true)()
test_getNewFieldValue("bool", "false", false)()
test_getNewFieldValue("bool", "off", false)()
test_getNewFieldValue("bool", "0", false)()
test_getNewFieldValue("bool", "no", false)()
})
it("bool should return null if function not recognised and value cannot be parsed", () => {
test_getNewFieldValue("bool", "blah", null)()
test_getNewFieldValue("bool", 111, null)()
})
it("number should parse value in getInitialValue if function not recognised", () => {
test_getNewFieldValue("number", "1", 1)()
test_getNewFieldValue("number", "45", 45)()
test_getNewFieldValue("number", "4.11", 4.11)()
})
it("number should return null if function not recognised and value cannot be parsed", () => {
test_getNewFieldValue("number", "blah", null)()
test_getNewFieldValue("number", true, null)()
})
it("string should parse value in getInitialValue if function not recognised", () => {
test_getNewFieldValue("string", "hello there", "hello there")()
test_getNewFieldValue("string", 45, "45")()
test_getNewFieldValue("string", true, "true")()
})
it("array should return empty array when function not recognised", () => {
test_getNewFieldValue("array<string>", "blah", [])()
test_getNewFieldValue("array<bool>", true, [])()
test_getNewFieldValue("array<number>", 1, [])()
test_getNewFieldValue("array<datetime>", "", [])()
test_getNewFieldValue("array<reference>", "", [])()
test_getNewFieldValue("array<file>", "", [])()
})
it("reference should {key:''} when function not recognised", () => {
test_getNewFieldValue("reference", "blah", nothingReference)()
})
it("file should return {relativePath:'',size:0} when function not recognised", () => {
test_getNewFieldValue("file", "blah", nothingFile)()
})
})
describe("types > getSafeFieldValue", () => {
const test_getSafeFieldValue = (type, member, value, expectedParse) => () => {
const field = getField(type)
field.getDefaultValue = "default"
field.name = member
const record = {}
if (isDefined(value)) record[member] = value
const parsedvalue = safeParseField(field, record)
expect(parsedvalue).toEqual(expectedParse)
}
it(
"should get default field value when member is undefined on record",
test_getSafeFieldValue("string", "forename", undefined, null)
)
it("should return null as null (except array and reference)", () => {
test_getSafeFieldValue("string", "forename", null, null)()
test_getSafeFieldValue("bool", "isalive", null, null)()
test_getSafeFieldValue("datetime", "created", null, null)()
test_getSafeFieldValue("number", "age", null, null)()
test_getSafeFieldValue("array<string>", "tags", null, [])()
test_getSafeFieldValue("reference", "moretags", null, nothingReference)()
test_getSafeFieldValue("file", "moretags", null, nothingFile)()
})
it("bool should parse a defined set of true/false aliases", () => {
test_getSafeFieldValue("bool", "isalive", true, true)()
test_getSafeFieldValue("bool", "isalive", "true", true)()
test_getSafeFieldValue("bool", "isalive", "on", true)()
test_getSafeFieldValue("bool", "isalive", "1", true)()
test_getSafeFieldValue("bool", "isalive", "yes", true)()
test_getSafeFieldValue("bool", "isalive", false, false)()
test_getSafeFieldValue("bool", "isalive", "false", false)()
test_getSafeFieldValue("bool", "isalive", "off", false)()
test_getSafeFieldValue("bool", "isalive", "0", false)()
test_getSafeFieldValue("bool", "isalive", "no", false)()
})
it(
"bool should parse invalid values as null",
test_getSafeFieldValue("bool", "isalive", "blah", null)
)
it("number should parse numbers and strings that are numbers", () => {
test_getSafeFieldValue("number", "age", 204, 204)()
test_getSafeFieldValue("number", "age", "1", 1)()
test_getSafeFieldValue("number", "age", "45", 45)()
test_getSafeFieldValue("number", "age", "4.11", 4.11)()
})
it(
"number should parse invalid values as null",
test_getSafeFieldValue("number", "age", "blah", null)
)
it(
"string should parse strings",
test_getSafeFieldValue("string", "forename", "bob", "bob")
)
it("string should parse any other basic type", () => {
test_getSafeFieldValue("string", "forename", true, "true")()
test_getSafeFieldValue("string", "forename", 1, "1")()
})
it("date should parse dates in various precisions", () => {
// don't forget that JS Date's month is zero-based
test_getSafeFieldValue(
"datetime",
"createddate",
"2018-02-14",
new Date(2018, 1, 14)
)()
test_getSafeFieldValue(
"datetime",
"createddate",
"2018-2-14",
new Date(2018, 1, 14)
)()
test_getSafeFieldValue(
"datetime",
"createddate",
"2018-02-14 11:00:00.000",
new Date(2018, 1, 14, 11)
)()
test_getSafeFieldValue(
"datetime",
"createddate",
"2018-02-14 11:30",
new Date(2018, 1, 14, 11, 30)
)()
})
it("date should parse invalid dates as null", () => {
// don't forget that JS Date's month is zero-based
test_getSafeFieldValue("datetime", "createddate", "2018-13-14", null)()
test_getSafeFieldValue("datetime", "createddate", "2018-2-33", null)()
test_getSafeFieldValue("datetime", "createddate", "bla", null)()
})
it("array should parse array", () => {
test_getSafeFieldValue(
"array<string>",
"tags",
["bob", "the", "dog"],
["bob", "the", "dog"]
)()
test_getSafeFieldValue(
"array<bool>",
"tags",
[true, false],
[true, false]
)()
test_getSafeFieldValue(
"array<number>",
"tags",
[1, 2, 3, 4],
[1, 2, 3, 4]
)()
test_getSafeFieldValue(
"array<reference>",
"tags",
[{ key: "/customer/1234", value: "bob" }],
[{ key: "/customer/1234", value: "bob" }]
)()
})
it("array should convert the generic's child type", () => {
test_getSafeFieldValue("array<string>", "tags", [1, true], ["1", "true"])()
test_getSafeFieldValue(
"array<bool>",
"tags",
["yes", "true", "no", "false", true, false],
[true, true, false, false, true, false]
)()
test_getSafeFieldValue("array<number>", "tags", ["1", 23], [1, 23])()
})
it("reference should parse reference", () => {
test_getSafeFieldValue(
"reference",
"customer",
{ key: "/customer/1234", value: "bob" },
{ key: "/customer/1234", value: "bob" }
)()
})
it("reference should parse reference", () => {
test_getSafeFieldValue(
"file",
"profilepic",
{ relativePath: "path/to/pic.jpg", size: 120 },
{ relativePath: "path/to/pic.jpg", size: 120 }
)()
})
})


@@ -0,0 +1,32 @@
import { newModel } from "../src/schema/models.mjs"
import { newView } from "../src/schema/views.mjs"
import { getNewField } from "../src/schema/fields.mjs"
import { fullSchema } from "../src/schema/fullSchema.mjs"
export function testSchema() {
const addFieldToModel = (model, { type, name }) => {
const field = getNewField(type || "string")
field.name = name
model.fields.push(field)
}
const contactModel = newModel()
contactModel.name = "Contact"
contactModel.primaryField = "Name"
addFieldToModel(contactModel, { name: "Name" })
addFieldToModel(contactModel, { name: "Is Active", type: "bool" })
addFieldToModel(contactModel, { name: "Created", type: "datetime" })
const activeContactsView = newView(contactModel.id)
activeContactsView.name = "Active Contacts"
activeContactsView.map = "if (doc['Is Active']) emit(doc.Name, doc)"
const dealModel = newModel()
dealModel.name = "Deal"
addFieldToModel(dealModel, { name: "Name" })
addFieldToModel(dealModel, { name: "Estimated Value", type: "number" })
addFieldToModel(dealModel, { name: "Contact", type: "link" })
return fullSchema([contactModel, dealModel], [activeContactsView])
}

packages/common/yarn.lock Normal file

File diff suppressed because it is too large


@@ -70,6 +70,7 @@
"date-fns": "^1.29.0",
"lodash": "^4.17.13",
"lunr": "^2.3.5",
"nano": "^8.2.2",
"safe-buffer": "^5.1.2",
"shortid": "^2.2.8"
},


@@ -1,57 +1,7 @@
import { retry } from "../common/index"
import { NotFoundError } from "../common/errors"
const createJson = originalCreateFile => async (
key,
obj,
retries = 2,
delay = 100
) => await retry(originalCreateFile, retries, delay, key, JSON.stringify(obj))
const createNewFile = originalCreateFile => async (
path,
content,
retries = 2,
delay = 100
) => await retry(originalCreateFile, retries, delay, path, content)
const loadJson = datastore => async (key, retries = 3, delay = 100) => {
try {
return await retry(
JSON.parse,
retries,
delay,
await datastore.loadFile(key)
)
} catch (err) {
const newErr = new NotFoundError(err.message)
newErr.stack = err.stack
throw newErr
}
}
const updateJson = datastore => async (key, obj, retries = 3, delay = 100) => {
try {
return await retry(
datastore.updateFile,
retries,
delay,
key,
JSON.stringify(obj)
)
} catch (err) {
const newErr = new NotFoundError(err.message)
newErr.stack = err.stack
throw newErr
}
}
export const setupDatastore = datastore => {
const originalCreateFile = datastore.createFile
datastore.loadJson = loadJson(datastore)
datastore.createJson = createJson(originalCreateFile)
datastore.updateJson = updateJson(datastore)
datastore.createFile = createNewFile(originalCreateFile)
datastore.loadJson = datastore.loadFile
datastore.createJson = datastore.createFile
datastore.updateJson = datastore.updateFile
if (datastore.createEmptyDb) {
delete datastore.createEmptyDb
}

@@ -21,21 +21,9 @@ export const initialiseData = async (
applicationDefinition,
accessLevels
) => {
if (!(await datastore.exists(configFolder)))
await datastore.createFolder(configFolder)
if (!(await datastore.exists(appDefinitionFile)))
await datastore.createJson(appDefinitionFile, applicationDefinition)
await initialiseRootCollections(datastore, applicationDefinition.hierarchy)
await initialiseRootIndexes(datastore, applicationDefinition.hierarchy)
if (!(await datastore.exists(TRANSACTIONS_FOLDER)))
await datastore.createFolder(TRANSACTIONS_FOLDER)
if (!(await datastore.exists(AUTH_FOLDER)))
await datastore.createFolder(AUTH_FOLDER)
if (!(await datastore.exists(USERS_LIST_FILE)))
await datastore.createJson(USERS_LIST_FILE, [])
@@ -48,17 +36,6 @@ export const initialiseData = async (
await initialiseRootSingleRecords(datastore, applicationDefinition.hierarchy)
}
const initialiseRootIndexes = async (datastore, hierarchy) => {
const flathierarchy = getFlattenedHierarchy(hierarchy)
const globalIndexes = $(flathierarchy, [filter(isGlobalIndex)])
for (const index of globalIndexes) {
if (!(await datastore.exists(index.nodeKey()))) {
await initialiseIndex(datastore, "", index)
}
}
}
const initialiseRootSingleRecords = async (datastore, hierarchy) => {
const app = {
publish: () => {},
@@ -71,9 +48,8 @@ const initialiseRootSingleRecords = async (datastore, hierarchy) => {
const singleRecords = $(flathierarchy, [filter(isSingleRecord)])
for (let record of singleRecords) {
if (await datastore.exists(record.nodeKey())) continue
await datastore.createFolder(record.nodeKey())
const result = _getNew(record, "")
result.key = record.nodeKey()
await _save(app, result)
}
}

@@ -21,22 +21,15 @@ export const deleteRecord = (app, disableCleanup = false) => async key => {
}
// called deleteRecord because delete is a keyword
export const _deleteRecord = async (app, key, disableCleanup) => {
export const _deleteRecord = async (app, key) => {
const recordInfo = getRecordInfo(app.hierarchy, key)
key = recordInfo.key
const node = getExactNodeForKey(app.hierarchy)(key)
const record = await _load(app, key)
await transactionForDeleteRecord(app, record)
for (const collectionRecord of node.children) {
const collectionKey = joinKey(key, collectionRecord.collectionName)
await _deleteCollection(app, collectionKey, true)
}
await app.datastore.deleteFolder(recordInfo.dir)
if (!disableCleanup) {
await app.cleanupTransactions()
}
await app.datastore.deleteFile(key)
}

@@ -1,5 +1,5 @@
import { keyBy, mapValues, filter, map, includes, last } from "lodash/fp"
import { getNode } from "../templateApi/hierarchy"
import { getNode, getExactNodeForKey } from "../templateApi/hierarchy"
import { safeParseField } from "../types"
import {
$,
@@ -12,7 +12,6 @@ import {
} from "../common"
import { mapRecord } from "../indexing/evaluate"
import { permission } from "../authApi/permissions"
import { getRecordInfo } from "./recordInfo"
export const getRecordFileName = key => joinKey(key, "record.json")
@@ -29,10 +28,9 @@ export const load = app => async key => {
)
}
export const _loadFromInfo = async (app, recordInfo, keyStack = []) => {
const key = recordInfo.key
const { recordNode, recordJson } = recordInfo
const storedData = await app.datastore.loadJson(recordJson)
export const _load = async (app, key, keyStack = []) => {
const recordNode = getExactNodeForKey(app.hierarchy)(key)
const storedData = await app.datastore.loadJson(key)
const loadedRecord = $(recordNode.fields, [
keyBy("name"),
@@ -66,15 +64,12 @@ export const _loadFromInfo = async (app, recordInfo, keyStack = []) => {
}
}
loadedRecord.transactionId = storedData.transactionId
loadedRecord.isNew = false
loadedRecord._rev = storedData._rev
loadedRecord._id = storedData._id
loadedRecord.key = key
loadedRecord.id = $(key, [splitKey, last])
loadedRecord.type = recordNode.name
return loadedRecord
}
export const _load = async (app, key, keyStack = []) =>
_loadFromInfo(app, getRecordInfo(app.hierarchy, key), keyStack)
export default load

@@ -1,30 +1,18 @@
import { cloneDeep, take, takeRight, flatten, map, filter } from "lodash/fp"
import { cloneDeep } from "lodash/fp"
import { validate } from "./validate"
import { _loadFromInfo } from "./load"
import { apiWrapper, events, $, joinKey } from "../common"
import {
getFlattenedHierarchy,
isModel,
getNode,
fieldReversesReferenceToNode,
} from "../templateApi/hierarchy"
import {
transactionForCreateRecord,
transactionForUpdateRecord,
} from "../transactions/create"
import { _load } from "./load"
import { apiWrapper, events } from "../common"
import { permission } from "../authApi/permissions"
import { initialiseIndex } from "../indexing/initialiseIndex"
import { BadRequestError } from "../common/errors"
import { getRecordInfo } from "./recordInfo"
import { initialiseChildren } from "./initialiseChildren"
import { getExactNodeForKey } from "../templateApi/hierarchy"
export const save = app => async (record, context) =>
apiWrapper(
app,
events.recordApi.save,
record.isNew
? permission.createRecord.isAuthorized(record.key)
: permission.updateRecord.isAuthorized(record.key),
record._rev
? permission.updateRecord.isAuthorized(record.key)
: permission.createRecord.isAuthorized(record.key),
{ record },
_save,
app,
@@ -48,73 +36,30 @@ export const _save = async (app, record, context, skipValidation = false) => {
}
}
const recordInfo = getRecordInfo(app.hierarchy, record.key)
const { recordNode, pathInfo, recordJson, files } = recordInfo
const recordNode = getExactNodeForKey(app.hierarchy)(record.key)
if (recordClone.isNew) {
recordClone.nodeKey = recordNode.nodeKey()
if (!record._rev) {
if (!recordNode) throw new Error("Cannot find node for " + record.key)
const transaction = await transactionForCreateRecord(app, recordClone)
recordClone.transactionId = transaction.id
await createRecordFolderPath(app.datastore, pathInfo)
await app.datastore.createFolder(files)
await app.datastore.createJson(recordJson, recordClone)
await initialiseChildren(app, recordInfo)
// FILES
// await app.datastore.createFolder(files)
await app.datastore.createJson(record.key, recordClone)
await app.publish(events.recordApi.save.onRecordCreated, {
record: recordClone,
})
} else {
const oldRecord = await _loadFromInfo(app, recordInfo)
const transaction = await transactionForUpdateRecord(
app,
oldRecord,
recordClone
)
recordClone.transactionId = transaction.id
await app.datastore.updateJson(recordJson, recordClone)
const oldRecord = await _load(app, record.key)
await app.datastore.updateJson(record.key, recordClone)
await app.publish(events.recordApi.save.onRecordUpdated, {
old: oldRecord,
new: recordClone,
})
}
await app.cleanupTransactions()
const returnedClone = cloneDeep(recordClone)
returnedClone.isNew = false
return returnedClone
}
const createRecordFolderPath = async (datastore, pathInfo) => {
const recursiveCreateFolder = async (
subdirs,
dirsThatNeedCreated = undefined
) => {
// iterate backwards through directory hierarchy
// until we get to a folder that exists, then create the rest
// e.g
// - some/folder/here
// - some/folder
// - some
const thisFolder = joinKey(pathInfo.base, ...subdirs)
if (await datastore.exists(thisFolder)) {
let creationFolder = thisFolder
for (let nextDir of dirsThatNeedCreated || []) {
creationFolder = joinKey(creationFolder, nextDir)
await datastore.createFolder(creationFolder)
}
} else if (!dirsThatNeedCreated || dirsThatNeedCreated.length > 0) {
dirsThatNeedCreated = !dirsThatNeedCreated ? [] : dirsThatNeedCreated
await recursiveCreateFolder(take(subdirs.length - 1)(subdirs), [
...takeRight(1)(subdirs),
...dirsThatNeedCreated,
])
}
}
await recursiveCreateFolder(pathInfo.subdirs)
return joinKey(pathInfo.base, ...pathInfo.subdirs)
// TODO: use nano.head to get _rev (saves loading whole doc)
const savedResult = await app.datastore.loadFile(record.key)
recordClone._rev = savedResult._rev
return recordClone
}
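The rewritten save above branches on CouchDB's `_rev` instead of an `isNew` flag: a record with no `_rev` takes the create path, a record carrying one takes the update path, and the stored revision is read back after the write. A rough sketch of that control flow, against a stand-in datastore (names here are illustrative, not the real API):

```javascript
// Stand-in datastore mimicking CouchDB's _rev behaviour: every write
// stamps the stored document with a fresh revision.
const makeRevStore = () => {
  const docs = {}
  let rev = 0
  return {
    createJson: async (key, doc) => {
      docs[key] = { ...doc, _rev: String(++rev) }
    },
    updateJson: async (key, doc) => {
      docs[key] = { ...doc, _rev: String(++rev) }
    },
    loadFile: async key => docs[key],
  }
}

// Mirrors the branching in the rewritten _save (validation and events omitted).
const save = async (datastore, record) => {
  const clone = { ...record }
  if (!record._rev) {
    await datastore.createJson(record.key, clone) // create path
  } else {
    await datastore.updateJson(record.key, clone) // update path
  }
  // The real code notes a TODO: use nano.head to fetch _rev without the doc.
  const saved = await datastore.loadFile(record.key)
  clone._rev = saved._rev
  return clone
}
```

A first save returns a record carrying a `_rev`; passing that record back in takes the update path and returns the next revision.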

@@ -0,0 +1,32 @@
import { includes } from "lodash/fp"
export const getCouchDbView = (hierarchy, indexNode) => {
const filter = codeAsFunction("filter", indexNode.filter)
const map = codeAsFunction("map", indexNode.map)
const allowedIdsFilter = undefined // TODO: restrict results to allowed record ids
const includeDocs = !map
const couchDbMap = `` // TODO: compose filter and map into the couchdb map function
}
const codeAsFunction = (name, code) => {
if ((code || "").trim().length === 0) return
let safeCode
if (includes("return ")(code)) {
safeCode = code
} else {
let trimmed = code.trim()
trimmed = trimmed.endsWith(";")
? trimmed.substring(0, trimmed.length - 1)
: trimmed
safeCode = `return (${trimmed})`
}
return `function ${name}() {
${safeCode}
}`
}
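`codeAsFunction` normalises the short code snippets stored on an index node into named function bodies: empty code yields nothing, code already containing `return` passes through unchanged, and a bare expression (with any trailing semicolon stripped) is wrapped in `return (...)`. A self-contained repro of that behaviour, with a tiny stand-in for lodash's curried `includes`:

```javascript
// Stand-in for lodash/fp includes, curried the same way.
const includes = needle => haystack => haystack.indexOf(needle) !== -1

const codeAsFunction = (name, code) => {
  if ((code || "").trim().length === 0) return
  let safeCode
  if (includes("return ")(code)) {
    safeCode = code
  } else {
    let trimmed = code.trim()
    trimmed = trimmed.endsWith(";")
      ? trimmed.substring(0, trimmed.length - 1)
      : trimmed
    safeCode = `return (${trimmed})`
  }
  return `function ${name}() {
${safeCode}
}`
}

// A bare expression gets wrapped in `return (...)` inside a named function;
// a snippet that already returns is emitted as-is.
codeAsFunction("filter", "doc.isalive;")
```

This is what lets a view definition like `"if (doc['Is Active']) emit(doc.Name, doc)"` and a plain filter expression share one storage format.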

@@ -29,6 +29,10 @@ export const _saveApplicationHierarchy = async (datastore, hierarchy) => {
)
}
if (hierarchy.getFlattenedHierarchy) {
delete hierarchy.getFlattenedHierarchy
}
if (await datastore.exists(appDefinitionFile)) {
const appDefinition = await datastore.loadJson(appDefinitionFile)
appDefinition.hierarchy = hierarchy

@@ -0,0 +1,107 @@
import { isUndefined, isString } from "lodash"
import initialiseNano from "nano"
export const getTestDb = async () => {
const nano = initialiseNano("http://admin:password@127.0.0.1:5984")
try {
await nano.db.destroy("unit_tests")
} catch (_) {
// do nothing
}
await nano.db.create("unit_tests")
const db = nano.use("unit_tests")
await db.insert({ _id: "/", folderMarker, items: [] })
return db
}
const folderMarker = "OH-YES-ITSA-FOLDER-"
const isFolder = val => {
if (isUndefined(val)) {
throw new Error("Passed undefined value for folder")
}
return val.folderMarker === folderMarker
}
export const createFile = db => async (key, content) => {
return await db.insert({ _id: key, ...content })
}
export const updateFile = db => async (key, content) => {
if (!content._rev) {
throw new Error("not an update: no _rev supplied")
}
return await db.insert({ _id: key, ...content })
}
export const writableFileStream = db => async key => {
throw new Error("WRITABLE STREAM: shouldn't need this")
}
export const readableFileStream = db => async key => {
throw new Error("READABLE STREAM: shouldn't need this")
}
export const getFileSize = data => async path => {
throw new Error("GET FILE SIZE: shouldn't need this")
}
export const renameFile = db => async (oldKey, newKey) => {
// used by indexing and Files - won't be needed
throw new Error(
"RENAME FILE: not clear how to do this in CouchDB - we probably don't need it"
)
}
export const loadFile = db => async key => {
return await db.get(key)
}
export const exists = db => async key => {
try {
await db.head(key)
return true
} catch (_) {
return false
}
}
export const deleteFile = db => async keyOrDoc => {
const doc = isString(keyOrDoc) ? await db.get(keyOrDoc) : keyOrDoc
const key = isString(keyOrDoc) ? keyOrDoc : doc._id
if (isFolder(doc))
throw new Error("DeleteFile: Path " + key + " is a folder, not a file")
await db.destroy(key)
}
export const createFolder = db => async key => {
await db.insert({ _id: key, folderMarker, items: [] })
}
export const deleteFolder = db => async keyOrDoc => {
throw new Error("DELETE FOLDER: should not be needed")
}
export const getFolderContents = db => async key => {
const doc = await db.get(key)
if (!isFolder(doc)) throw new Error("Not a folder: " + key)
return doc.items
}
export default db => {
return {
createFile: createFile(db),
updateFile: updateFile(db),
loadFile: loadFile(db),
exists: exists(db),
deleteFile: deleteFile(db),
createFolder: createFolder(db),
deleteFolder: deleteFolder(db),
readableFileStream: readableFileStream(db),
writableFileStream: writableFileStream(db),
renameFile: renameFile(db),
getFolderContents: getFolderContents(db),
getFileSize: getFileSize(db),
datastoreType: "couchdb",
datastoreDescription: "",
data: db,
}
}
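Because CouchDB has no directory tree, the adapter above fakes folders by tagging documents with a marker property and an `items` list. The convention can be shown with a plain `Map` standing in for the database (the marker string matches the test helper; everything else is illustrative):

```javascript
const folderMarker = "OH-YES-ITSA-FOLDER-"

// Plain Map standing in for the CouchDB connection; keys are document ids.
const db = new Map()

// A "folder" is just a document carrying the marker and an items list.
const createFolder = key => db.set(key, { _id: key, folderMarker, items: [] })
const createFile = (key, content) => db.set(key, { _id: key, ...content })

// Same check the adapter uses to tell folders from files.
const isFolder = doc => {
  if (doc === undefined) {
    throw new Error("Passed undefined value for folder")
  }
  return doc.folderMarker === folderMarker
}

createFolder("/customers")
createFile("/customers/1-abcd", { surname: "Ledog" })
```

Folder listing then reduces to loading the folder document and reading its `items`, which is what `getFolderContents` does.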

@@ -7,7 +7,7 @@ import { isFunction } from "lodash"
describe("getAppApis", () => {
const getMemoryAppApis = async () => {
const { templateApi } = getMemoryTemplateApi()
const { templateApi } = await getMemoryTemplateApi()
const rootNode = templateApi.getNewRootLevel()
await templateApi.saveApplicationHierarchy(rootNode)

@@ -75,7 +75,7 @@ describe("initialiseData", () => {
})
const getApplicationDefinition = async () => {
const { templateApi, app } = getMemoryTemplateApi()
const { templateApi, app } = await getMemoryTemplateApi()
const h = basicAppHierarchyCreator_WithFields_AndIndexes(templateApi)
return {
appDef: { hierarchy: h.root, actions: [], triggers: [] },

@@ -48,23 +48,6 @@ describe("recordApi > save then load", () => {
expect(saved.createddate).toEqual(record.createddate)
})
it("loaded record isNew() always return false", async () => {
const { recordApi } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.age = 9
record.createddate = new Date()
await recordApi.save(record)
const saved = await recordApi.load(record.key)
expect(saved.isNew).toBeDefined()
expect(saved.isNew).toBe(false)
})
it("loaded record id() and key() should work", async () => {
const { recordApi } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
@@ -133,7 +116,7 @@ describe("recordApi > save then load", () => {
referredByCustomer.age = 9
;(referredByCustomer.isalive = true),
(referredByCustomer.createdDate = new Date())
const savedReferredBy = await recordApi.save(referredByCustomer)
await recordApi.save(referredByCustomer)
const referredCustomer = recordApi.getNew("/customers", "customer")
referredCustomer.surname = "Zeecat"
@@ -143,6 +126,7 @@ describe("recordApi > save then load", () => {
referredCustomer.referredBy = referredByCustomer
await recordApi.save(referredCustomer)
const savedReferredBy = await recordApi.load(referredByCustomer.key)
savedReferredBy.surname = "Zeedog"
await recordApi.save(savedReferredBy)
@@ -152,16 +136,6 @@ describe("recordApi > save then load", () => {
})
describe("save", () => {
it("IsNew() should return false after save", async () => {
const { recordApi } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
const savedRecord = await recordApi.save(record)
expect(savedRecord.isNew).toBe(false)
})
it("should publish onbegin and oncomplete events", async () => {
const { recordApi, subscribe } = await setupApphierarchy(
@@ -197,13 +171,16 @@ describe("save", () => {
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
const savedRecord = await recordApi.save(record)
await recordApi.save(record)
const onCreate = handler.getEvents(events.recordApi.save.onRecordCreated)
expect(onCreate.length).toBe(1)
expect(onCreate[0].context.record).toBeDefined()
expect(onCreate[0].context.record.key).toBe(record.key)
const savedRecord = await recordApi.load(record.key)
savedRecord.surname = "Zeecat"
await recordApi.save(savedRecord)
const onUpdate = handler.getEvents(events.recordApi.save.onRecordUpdated)
@@ -216,63 +193,6 @@ describe("save", () => {
expect(onUpdate[0].context.new.surname).toBe("Zeecat")
})
it("should create folder and index for subcollection", async () => {
const { recordApi, appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
await recordApi.save(record)
const recordDir = getRecordInfo(appHierarchy.root, record.key).dir
expect(
await recordApi._storeHandle.exists(
`${recordDir}/invoice_index/index.csv`
)
).toBeTruthy()
expect(
await recordApi._storeHandle.exists(`${recordDir}/invoice_index`)
).toBeTruthy()
expect(
await recordApi._storeHandle.exists(`${recordDir}/invoices`)
).toBeTruthy()
})
it("should create index folder and shardMap for sharded reverse reference index", async () => {
const { recordApi, appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
await recordApi.save(record)
const recordDir = getRecordInfo(appHierarchy.root, record.key).dir
expect(
await recordApi._storeHandle.exists(
`${recordDir}/referredToCustomers/shardMap.json`
)
).toBeTruthy()
expect(
await recordApi._storeHandle.exists(`${recordDir}/referredToCustomers`)
).toBeTruthy()
})
it("should create folder for record", async () => {
const { recordApi, appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields
)
const record = recordApi.getNew("/customers", "customer")
record.surname = "Ledog"
await recordApi.save(record)
const recordDir = getRecordInfo(appHierarchy.root, record.key).dir
expect(await recordApi._storeHandle.exists(`${recordDir}`)).toBeTruthy()
expect(
await recordApi._storeHandle.exists(`${recordDir}/record.json`)
).toBeTruthy()
})
it("create should throw error, user user does not have permission", async () => {
const { recordApi, app, appHierarchy } = await setupApphierarchy(
basicAppHierarchyCreator_WithFields

@@ -5,7 +5,7 @@ import {
getIndexApi,
getActionsApi,
} from "../src"
import memory from "./memory"
import couchDb, { getTestDb } from "./couchDb"
import { setupDatastore } from "../src/appInitialise"
import {
configFolder,
@@ -38,11 +38,13 @@ export const testFieldDefinitionsPath = testAreaName =>
export const testTemplatesPath = testAreaName =>
path.join(testFileArea(testAreaName), templateDefinitions)
export const getMemoryStore = () => setupDatastore(memory({}))
export const getMemoryTemplateApi = store => {
export const getMemoryStore = async () =>
setupDatastore(couchDb(await getTestDb()))
export const getMemoryTemplateApi = async store => {
const app = {
datastore: store || getMemoryStore(),
publish: () => { },
datastore: store || (await getMemoryStore()),
publish: () => {},
getEpochTime: async () => new Date().getTime(),
user: { name: "", permissions: [permission.writeTemplates.get()] },
}
@@ -466,7 +468,7 @@ export const setupApphierarchy = async (
creator,
disableCleanupTransactions = false
) => {
const { templateApi } = getMemoryTemplateApi()
const { templateApi } = await getMemoryTemplateApi()
const hierarchy = creator(templateApi)
await initialiseData(templateApi._storeHandle, {
hierarchy: hierarchy.root,

@@ -866,6 +866,11 @@
dependencies:
"@babel/types" "^7.3.0"
"@types/caseless@*":
version "0.12.2"
resolved "https://registry.yarnpkg.com/@types/caseless/-/caseless-0.12.2.tgz#f65d3d6389e01eeb458bd54dc8f52b95a9463bc8"
integrity sha512-6ckxMjBBD8URvjB6J3NcnuAn5Pkl7t3TizAg+xdlzzQGSPSmBcXf8KoIH0ua/i+tio+ZRUHEXp0HEmvaR4kt0w==
"@types/estree@0.0.39":
version "0.0.39"
resolved "https://registry.yarnpkg.com/@types/estree/-/estree-0.0.39.tgz#e177e699ee1b8c22d23174caaa7422644389509f"
@@ -896,6 +901,16 @@
resolved "https://registry.yarnpkg.com/@types/node/-/node-12.7.2.tgz#c4e63af5e8823ce9cc3f0b34f7b998c2171f0c44"
integrity sha512-dyYO+f6ihZEtNPDcWNR1fkoTDf3zAK3lAABDze3mz6POyIercH0lEUawUFXlG8xaQZmm1yEBON/4TsYv/laDYg==
"@types/request@^2.48.4":
version "2.48.4"
resolved "https://registry.yarnpkg.com/@types/request/-/request-2.48.4.tgz#df3d43d7b9ed3550feaa1286c6eabf0738e6cf7e"
integrity sha512-W1t1MTKYR8PxICH+A4HgEIPuAC3sbljoEVfyZbeFJJDbr30guDspJri2XOaM2E+Un7ZjrihaDi7cf6fPa2tbgw==
dependencies:
"@types/caseless" "*"
"@types/node" "*"
"@types/tough-cookie" "*"
form-data "^2.5.0"
"@types/resolve@0.0.8":
version "0.0.8"
resolved "https://registry.yarnpkg.com/@types/resolve/-/resolve-0.0.8.tgz#f26074d238e02659e323ce1a13d041eee280e194"
@@ -908,6 +923,11 @@
resolved "https://registry.yarnpkg.com/@types/stack-utils/-/stack-utils-1.0.1.tgz#0a851d3bd96498fa25c33ab7278ed3bd65f06c3e"
integrity sha512-l42BggppR6zLmpfU6fq9HEa2oGPEI8yrSPL3GITjfRInppYFahObbIQOQK3UGxEnyQpltZLaPe75046NOZQikw==
"@types/tough-cookie@*":
version "4.0.0"
resolved "https://registry.yarnpkg.com/@types/tough-cookie/-/tough-cookie-4.0.0.tgz#fef1904e4668b6e5ecee60c52cc6a078ffa6697d"
integrity sha512-I99sngh224D0M7XgW1s120zxCt3VYQ3IQsuw3P3jbq5GG4yc79+ZjyKznyOGIQrflfylLgcfekeZW/vk0yng6A==
"@types/yargs-parser@*":
version "13.0.0"
resolved "https://registry.yarnpkg.com/@types/yargs-parser/-/yargs-parser-13.0.0.tgz#453743c5bbf9f1bed61d959baab5b06be029b2d0"
@@ -1399,6 +1419,11 @@ browser-process-hrtime@^0.1.2:
resolved "https://registry.yarnpkg.com/browser-process-hrtime/-/browser-process-hrtime-0.1.3.tgz#616f00faef1df7ec1b5bf9cfe2bdc3170f26c7b4"
integrity sha512-bRFnI4NnjO6cnyLmOV/7PVoDEMJChlcfN0z4s1YMBY989/SvlfMI1lgCnkFUs53e9gQF+w7qu7XdllSTiSl8Aw==
browser-request@~0.3.0:
version "0.3.3"
resolved "https://registry.yarnpkg.com/browser-request/-/browser-request-0.3.3.tgz#9ece5b5aca89a29932242e18bf933def9876cc17"
integrity sha1-ns5bWsqJopkyJC4Yv5M975h2zBc=
browser-resolve@^1.11.3:
version "1.11.3"
resolved "https://registry.yarnpkg.com/browser-resolve/-/browser-resolve-1.11.3.tgz#9b7cbb3d0f510e4cb86bdbd796124d28b5890af6"
@@ -1626,6 +1651,15 @@ clone@~0.1.9:
resolved "https://registry.yarnpkg.com/clone/-/clone-0.1.19.tgz#613fb68639b26a494ac53253e15b1a6bd88ada85"
integrity sha1-YT+2hjmyaklKxTJT4Vsaa9iK2oU=
cloudant-follow@^0.18.2:
version "0.18.2"
resolved "https://registry.yarnpkg.com/cloudant-follow/-/cloudant-follow-0.18.2.tgz#35dd7b29c5b9c58423d50691f848a990fbe2c88f"
integrity sha512-qu/AmKxDqJds+UmT77+0NbM7Yab2K3w0qSeJRzsq5dRWJTEJdWeb+XpG4OpKuTE9RKOa/Awn2gR3TTnvNr3TeA==
dependencies:
browser-request "~0.3.0"
debug "^4.0.1"
request "^2.88.0"
co@^4.6.0:
version "4.6.0"
resolved "https://registry.yarnpkg.com/co/-/co-4.6.0.tgz#6ea6bdf3d853ae54ccb8e47bfa0bf3f9031fb184"
@@ -1842,7 +1876,7 @@ debug@^3.2.6:
dependencies:
ms "^2.1.1"
debug@^4.1.0, debug@^4.1.1:
debug@^4.0.1, debug@^4.1.0, debug@^4.1.1:
version "4.1.1"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.1.1.tgz#3b72260255109c6b589cee050f1d516139664791"
integrity sha512-pYAIzeRo8J6KPEaJ0VWOh5Pzkbw/RetuzehGM7QRRX5he4fPHx2rdKMB256ehJCkX+XRQm16eZLqLNS8RSZXZw==
@@ -2013,6 +2047,11 @@ error-ex@^1.2.0, error-ex@^1.3.1:
dependencies:
is-arrayish "^0.2.1"
errs@^0.3.2:
version "0.3.2"
resolved "https://registry.yarnpkg.com/errs/-/errs-0.3.2.tgz#798099b2dbd37ca2bc749e538a7c1307d0b50499"
integrity sha1-eYCZstvTfKK8dJ5TinwTB9C1BJk=
es-abstract@^1.5.1:
version "1.13.0"
resolved "https://registry.yarnpkg.com/es-abstract/-/es-abstract-1.13.0.tgz#ac86145fdd5099d8dd49558ccba2eaf9b88e24e9"
@@ -2289,6 +2328,15 @@ forever-agent@~0.6.1:
resolved "https://registry.yarnpkg.com/forever-agent/-/forever-agent-0.6.1.tgz#fbc71f0c41adeb37f96c577ad1ed42d8fdacca91"
integrity sha1-+8cfDEGt6zf5bFd60e1C2P2sypE=
form-data@^2.5.0:
version "2.5.1"
resolved "https://registry.yarnpkg.com/form-data/-/form-data-2.5.1.tgz#f2cbec57b5e59e23716e128fe44d4e5dd23895f4"
integrity sha512-m21N3WOmEEURgk6B9GLOE4RuWOFf28Lhh9qGYeNlGq4VDXUlJy2th2slBNU8Gp8EzloYZOibZJ7t5ecIrFSjVA==
dependencies:
asynckit "^0.4.0"
combined-stream "^1.0.6"
mime-types "^2.1.12"
form-data@~2.3.2:
version "2.3.3"
resolved "https://registry.yarnpkg.com/form-data/-/form-data-2.3.3.tgz#dcce52c05f644f298c6a7ab936bd724ceffbf3a6"
@@ -2451,7 +2499,7 @@ har-schema@^2.0.0:
resolved "https://registry.yarnpkg.com/har-schema/-/har-schema-2.0.0.tgz#a94c2224ebcac04782a0d9035521f24735b7ec92"
integrity sha1-qUwiJOvKwEeCoNkDVSHyRzW37JI=
har-validator@~5.1.0:
har-validator@~5.1.0, har-validator@~5.1.3:
version "5.1.3"
resolved "https://registry.yarnpkg.com/har-validator/-/har-validator-5.1.3.tgz#1ef89ebd3e4996557675eed9893110dc350fa080"
integrity sha512-sNvOCzEQNr/qrvJgc3UG/kD4QtlHycrzwS+6mfTrrSq97BvaYcPZZI1ZSqGSPR73Cxn4LKTD4PttRwfU7jWq5g==
@@ -3837,6 +3885,17 @@ nan@^2.12.1:
resolved "https://registry.yarnpkg.com/nan/-/nan-2.14.0.tgz#7818f722027b2459a86f0295d434d1fc2336c52c"
integrity sha512-INOFj37C7k3AfaNTtX8RhsTw7qRy7eLET14cROi9+5HAVbbHuIWUHEauBv5qT4Av2tWasiTY1Jw6puUNqRJXQg==
nano@^8.2.2:
version "8.2.2"
resolved "https://registry.yarnpkg.com/nano/-/nano-8.2.2.tgz#4fdd48965cece51892cf41e78d433d1b772e6e40"
integrity sha512-1/rAvpd1J0Os0SazgutWQBx2buAq3KwJpmdIylPDqOwy73iQeAhTSCq3uzbGzvcNNW16Vv/BLXkk+DYcdcH+aw==
dependencies:
"@types/request" "^2.48.4"
cloudant-follow "^0.18.2"
debug "^4.1.1"
errs "^0.3.2"
request "^2.88.0"
nanoid@^2.0.0:
version "2.0.4"
resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-2.0.4.tgz#4889355c9ce8e24efad7c65945a4a2875ac3e8f4"
@@ -4712,6 +4771,32 @@ request@^2.87.0:
tunnel-agent "^0.6.0"
uuid "^3.3.2"
request@^2.88.0:
version "2.88.2"
resolved "https://registry.yarnpkg.com/request/-/request-2.88.2.tgz#d73c918731cb5a87da047e207234146f664d12b3"
integrity sha512-MsvtOrfG9ZcrOwAW+Qi+F6HbD0CWXEh9ou77uOb7FM2WPhwT7smM833PzanhJLsgXjN89Ir6V2PczXNnMpwKhw==
dependencies:
aws-sign2 "~0.7.0"
aws4 "^1.8.0"
caseless "~0.12.0"
combined-stream "~1.0.6"
extend "~3.0.2"
forever-agent "~0.6.1"
form-data "~2.3.2"
har-validator "~5.1.3"
http-signature "~1.2.0"
is-typedarray "~1.0.0"
isstream "~0.1.2"
json-stringify-safe "~5.0.1"
mime-types "~2.1.19"
oauth-sign "~0.9.0"
performance-now "^2.1.0"
qs "~6.5.2"
safe-buffer "^5.1.2"
tough-cookie "~2.5.0"
tunnel-agent "^0.6.0"
uuid "^3.3.2"
require-directory@^2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/require-directory/-/require-directory-2.1.1.tgz#8c64ad5fd30dab1c976e2344ffe7f792a6a6df42"
@@ -5299,7 +5384,7 @@ to-regex@^3.0.1, to-regex@^3.0.2:
regex-not "^1.0.2"
safe-regex "^1.1.0"
tough-cookie@^2.3.3, tough-cookie@^2.3.4:
tough-cookie@^2.3.3, tough-cookie@^2.3.4, tough-cookie@~2.5.0:
version "2.5.0"
resolved "https://registry.yarnpkg.com/tough-cookie/-/tough-cookie-2.5.0.tgz#cd9fb2a0aa1d5a12b473bd9fb96fa3dcff65ade2"
integrity sha512-nlLsUzgm1kfLXSXfRZMc1KLAugd4hqJHDTvc2hDIwS3mZAfMEuMbc03SujMF+GEcpaX/qboeycw6iO8JwVv2+g==

@@ -1,5 +1,6 @@
const nano = require("nano");
const nano = require("nano")
const COUCH_DB_URL = process.env.COUCH_DB_URL || "http://admin:password@localhost:5984";
const COUCH_DB_URL =
process.env.COUCH_DB_URL || "http://admin:password@localhost:5984"
module.exports = nano(COUCH_DB_URL);
module.exports = nano(COUCH_DB_URL)

@@ -0,0 +1,7 @@
const { testSchema } = require("../../common/test/testSchema")
describe("record persistence", () => {
it("should ")
})

@@ -1252,8 +1252,6 @@ error-inject@^1.0.0:
resolved "https://registry.yarnpkg.com/error-inject/-/error-inject-1.0.0.tgz#e2b3d91b54aed672f309d950d154850fa11d4f37"
integrity sha1-4rPZG1Su1nLzCdlQ0VSFD6EdTzc=
<<<<<<< HEAD
=======
errs@^0.3.2:
version "0.3.2"
resolved "https://registry.yarnpkg.com/errs/-/errs-0.3.2.tgz#798099b2dbd37ca2bc749e538a7c1307d0b50499"
@@ -1276,7 +1274,6 @@ es-abstract@^1.16.3, es-abstract@^1.17.0-next.1, es-abstract@^1.17.4:
string.prototype.trimleft "^2.1.1"
string.prototype.trimright "^2.1.1"
>>>>>>> building out new budibase API
es-abstract@^1.5.1:
version "1.13.0"
resolved "https://registry.yarnpkg.com/es-abstract/-/es-abstract-1.13.0.tgz#ac86145fdd5099d8dd49558ccba2eaf9b88e24e9"
@@ -3021,8 +3018,6 @@ nan@^2.12.1:
resolved "https://registry.yarnpkg.com/nan/-/nan-2.14.0.tgz#7818f722027b2459a86f0295d434d1fc2336c52c"
integrity sha512-INOFj37C7k3AfaNTtX8RhsTw7qRy7eLET14cROi9+5HAVbbHuIWUHEauBv5qT4Av2tWasiTY1Jw6puUNqRJXQg==
<<<<<<< HEAD
=======
nano@^8.2.2:
version "8.2.2"
resolved "https://registry.yarnpkg.com/nano/-/nano-8.2.2.tgz#4fdd48965cece51892cf41e78d433d1b772e6e40"
@@ -3039,7 +3034,6 @@ nanoid@^2.1.0:
resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-2.1.11.tgz#ec24b8a758d591561531b4176a01e3ab4f0f0280"
integrity sha512-s/snB+WGm6uwi0WjsZdaVcuf3KJXlfGl2LcxgwkEwJF0D/BWzVWAZW/XY4bFaiR7s0Jk3FPvlnepg1H1b1UwlA==
>>>>>>> building out new budibase API
nanomatch@^1.2.9:
version "1.2.13"
resolved "https://registry.yarnpkg.com/nanomatch/-/nanomatch-1.2.13.tgz#b87a8aa4fc0de8fe6be88895b38983ff265bd119"