diff --git a/README.md b/README.md
index 5ec69f6..28af22b 100644
--- a/README.md
+++ b/README.md
@@ -1,163 +1,4 @@
 # README
-This project is the answer to:
+From a QA Solution to Different Solutions
-```
-接口自动化测试平台需求:
-1、可对接 swagger 文档,自动/手动导入接口信息
-2、具备接口信息管理、测试用例管理、测试步骤管理、测试报告管理的功能
-3、具备接口调试的功能
-4、以测试集为单位执行多个测试用例,并生成测试报告
-
-API Automation Testing Platform Requirements:
-
-1. Capable of integrating with Swagger documentation and automatically/manually importing interface information.
-2. Provides interface information management, test case management, test step management, and test report management.
-3. Possesses the capability for interface debugging.
-4. Executes multiple test cases as a test suite and generates test reports.
-```
-
-Build a [restack](https://www.restack.io/)-like toolkit, but for software QA.
-
-## Features
-
-The FluentQA Workspace project is a Java project that includes:
-
-1. Toolkits for handling QA daily work
-2. A workspace server for QA daily work
-3. Learning Java from a QA perspective
-4. Java revisited by a QA engineer after several years of writing Java
-5. Easy to use, maintain, and extend
-6. All code is used for real cases
-7. Every lib has a purpose in QA daily work
-
-It is not only for QA; it is also for anyone who wants to create
-Java libs or applications.
-
-## 1. Software QA: Java Revisited Overview
-
-**Automation Language Perspective**:
-![img](qa-automation.png)
-
-**Different Libs Perspective**:
-
-![img](overall.png)
-
-## 1.1 Frequently used QA Java libs
-
-- [basic-libs](./components/fluent-builtin) basic Java utils
-- [excel-csv-operation](./components/fluent-excel) Java Excel/CSV lib
-- [mindmap-operation](./components/fluent-mindmap) Java mindmap lib
-- [openapi-operations](./components/fluent-openapi) Java OpenAPI lib
-- [quick-database-operations](./components/fluent-quickdao) Java database access lib
-- [markdown-operations](./components/fluentqa-md) Java Markdown lib
-- [testlibs](./components/fluent-testlibs) Java test libs
-
-## 2. Real Cases
-
-## 2.1 How to Build a Just-Working Test Case System in One Day
-
-- [slides](https://fluentqa-revistied.netlify.app/present/tc-mgt-one-day/#/1)
-
-## 3. Integration Libs
-
-- [integration with other applications](http://github.com/fluent-qa/fluent-integrations.git)
-
-## stats
-
-[//]: # (

) - -[//]: # (组件库) - -## 🧭 开源贡献指南 - -## 👨‍💻 Contributors - -![Alt](https://repobeats.axiom.co/api/embed/97e9207fda40115a8deccefedc421fa84d02eb17.svg "Repobeats analytics image") - -[//]: # ([![Contributors](https://contrib.rocks/image?repo=ihub-pub/libs)](https://github.com/ihub-pub/libs/graphs/contributors "Contributors")) \ No newline at end of file diff --git a/docs/0-setup/README.md b/docs/0-setup/README.md deleted file mode 100644 index 62546e2..0000000 --- a/docs/0-setup/README.md +++ /dev/null @@ -1,16 +0,0 @@ -# README - -- Setup By AI ChatGPT Prompt - -## 环境准备 - -- 操作系统管理/Docker: OrbStack -- 数据库/Database: Supabase -- 开发环境: JAVA/MAVEN - -## JAVA MAVEN Project Setup With Erupts - - - - -## Run it \ No newline at end of file diff --git a/docs/1-FEATURES/1-API-Sepc-Mgr.md b/docs/1-FEATURES/1-API-Sepc-Mgr.md deleted file mode 100644 index 31867a8..0000000 --- a/docs/1-FEATURES/1-API-Sepc-Mgr.md +++ /dev/null @@ -1,14 +0,0 @@ -## 1. 可对接 swagger 文档,自动/手动导入接口信息 - -A. 背景内容: 什么是swagger文档? - -```可对接 swagger 文档,自动/手动导入接口信息```,对这个功能进行拆解和转化为要实现的代码功能如下: -1. swagger文档-JSON文档,保存这个文档到数据库 -2. 将这个文档解析保存成更好理解的HTTP API结构形式到数据库 -3. 可以展示所有项目相关的接口信息 - - -## - -> prompt: -> 拆解功能***对接 swagger 文档,自动/手动导入接口信息*** 成需要实现的小功能,使用中文Markdown形式展示 \ No newline at end of file diff --git a/docs/1-FEATURES/README.md b/docs/1-FEATURES/README.md deleted file mode 100644 index 56c1a0d..0000000 --- a/docs/1-FEATURES/README.md +++ /dev/null @@ -1,31 +0,0 @@ -# README - -这个仓库用于回答一下论坛的问题有一定意义,This Project is the answer to : - -``` -接口自动化测试平台需求: -1、可对接 swagger 文档,自动/手动导入接口信息 -2、具备接口信息管理、测试用例管理、测试步骤管理、测试报告管理的功能 -3、具备接口调试的功能 -4、以测试集为单位执行多个测试用例,并生成测试报告 - -API Automation Testing Platform Requirements: - -1. Capable of integrating with Swagger documentation, with the ability to automatically/ manually import interface information. -2. with functions for managing interface information, test case management, test step management, and test report management. -3. Possesses the capability for interface debugging. -4. Executes multiple test cases as a test suite and generates test reports. -``` - -意义在哪里? -1. 最简单构建一个平台需要什么成本? -2. 在low-code工具比较多的情况下,如何结合不同的工具快速完成可以满足完成基本功能的系统 -3. 是不是有另外一些思路,如何利用好开源,已经有的工具就可以完成基本使用了? -4. 有了一定东西之后,在看到底需要什么东西? -5. 做东西的时候可以更客观的了解自己的技术不足和需要的帮助 -6. 如果需要快速完成一些内容需要沉淀哪些内容,日常需要积累哪些东西? - -## 开始动手完成功能 - - - diff --git a/docs/1-FEATURES/upload/code.png b/docs/1-FEATURES/upload/code.png deleted file mode 100644 index bc754bf..0000000 Binary files a/docs/1-FEATURES/upload/code.png and /dev/null differ diff --git a/docs/1-FEATURES/upload/feature.png b/docs/1-FEATURES/upload/feature.png deleted file mode 100644 index 342a5d9..0000000 Binary files a/docs/1-FEATURES/upload/feature.png and /dev/null differ diff --git a/docs/1-FEATURES/upload/upload-file.md b/docs/1-FEATURES/upload/upload-file.md deleted file mode 100644 index 8660348..0000000 --- a/docs/1-FEATURES/upload/upload-file.md +++ /dev/null @@ -1,17 +0,0 @@ -# Upload File Feature in One class - -根据[记一次前后端分离项目 Django 的图片代理](https://testerhome.com/topics/40020)我估计如果从头实现需要 -前后端各半天(乐观一点), 如果用一些好用的工具实现呢?大概是1个小时吧。 - -假设需要实现一个如下的文件上传功能需要多少开发量? - -![img.png](feature.png) - - -如果用一些好用工具实现,一个JAVA类吧: - -![img_1.png](code.png) - -我就是这么实现一个前后端都有的上传功能。 至少个人认为,有些东西确实没有必要什么重零开始,尤其是测试开发相关的东西. 
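The two screenshots referenced above (feature.png, code.png) are deleted with this file and cannot be reproduced here. As a rough illustration of the "one Java class" point, here is a minimal sketch. It assumes a plain Spring Boot web project and a hypothetical local storage directory; the actual class shown in code.png may rely on a different tool and different names:

```java
// Minimal sketch only. Assumptions: a plain Spring Boot web project and local disk storage;
// the upload directory and endpoint path below are hypothetical, not taken from code.png.
import java.nio.file.Files;
import java.nio.file.Path;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class UploadController {

    // Hypothetical target directory; in a real project this would come from configuration.
    private static final Path UPLOAD_DIR = Path.of("/tmp/uploads");

    @PostMapping("/api/files")
    public String upload(@RequestParam("file") MultipartFile file) throws Exception {
        Files.createDirectories(UPLOAD_DIR);                          // make sure the directory exists
        Path target = UPLOAD_DIR.resolve(file.getOriginalFilename()); // keep the original file name
        file.transferTo(target);                                      // write the uploaded bytes to disk
        return target.toString();                                     // return the stored path to the caller
    }
}
```

Spring Boot ships multipart support out of the box, so one controller class of roughly this size covers the backend half of such a feature; the frontend can be a plain form or a generated admin page, which is the point being made above.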
- - diff --git "a/docs/10-THOUGHTS/1-\345\244\226\345\214\205.md" "b/docs/10-THOUGHTS/1-\345\244\226\345\214\205.md" deleted file mode 100644 index 3ff7fef..0000000 --- "a/docs/10-THOUGHTS/1-\345\244\226\345\214\205.md" +++ /dev/null @@ -1,32 +0,0 @@ -## 关于外包 - -个人做过一段时间外包也找过一些外包人员,就讲讲我知道的外包里面有一些不同的情况。 - - -## 最好的外包情况: 顾问 - -我看到过有外包,人的头衔是顾问,但是他不是甲方公司的正式员工, 这种情况下,他的工资有可能比甲方的都高, -不过这种概率很小,不过我是在外企亲眼所见,所以我认为这是存在的。 - -## 人力外包 - -人力外包,可能就是类似于只管这种方式,就是外包公司不管任何和具体工作有关的事情,外包公司只管发你工资。 -你所有的事情都是有甲方公司的人指派或者安排,这里面也有两种情况: -1. 比较好的一种,就是你人被拍到甲方公司,你做的事情和对待上大体上和正式员工一样,当然你如果完全想一样那也是不切实际的,但是表面上基本看不出任何差别,开会,安排工作,报告进度等等没有任何差别,周会,分享会什么也会让你参加,你和正式员工有很多接触,只是大部分福利你没有,当然可能聚餐,团建叫你这种是可能的。这种情况我也是在外企亲眼所见, - 所以也是存在的,如果你的工作能力和结果被一些重要的人认可,转正还是有不小机率的,外包个2-3年,其实可能就转正了。 -2. 比较差的一种,你被派到甲方公司,你就是接受工作安排,然后做具体的事情,至于会议什么一般不会参加,甲方公司的人一般就是安排任务,你给他做就行,其他他不关心,所以有一些和甲方公司有一些接触,但是和第一种比起来,你就是完成任务的一个人,不要想太多,这种我见过的是一个国内企业,一般你就是一个代号,测试A,测试B,把任务给A,把任务给B,大概就是这么个状态。 这种情况转正概率相对比较少,工作完了,基本就会回外包公司了 - -以上两种情况是比较好和比较差的情况, 有些应该就是介于这两者中间,可能一些好点公司介于这中间,一些好的外企是和1这种情况比较接近。 - - -## 项目外包 - -项目外包的情况就是把一个项目直接给外包公司,然后你接触的所有人基本都是你所外外包公司的,可能会和甲方公司打一些交道,这种情况下,基本不太会和甲方的人打交道,而且甲方也可能不太想和一个测试打交道,所以这种情况下,转正去甲方可能性很小。 不过你可能对你外包公司比较重要,如果这个项目缺了你,没法做了,这也不是没有可能。 - -## 接近幻想的想法 - -以上是我了解的一些外包的情况,只是我之前遇到的一些体验,不知道现在情况怎么样了,在见过一些公司状况之后,个人判断有些可能更差,好一点的可能也就是保持着。举个例子来说,面试过一些软通给华为做外包的同学,基本反应就是给华为干活,对面管他的人基本就是派任务,不会有太多交集,个人觉得这种已经不错了,也比较正常,毕竟也是两家公司,你有不会提供太多价值给对方,而对方也只是期望你不出乱子就行,不会太期望有什么出彩的事情,如果这里得罪人,我道歉,我只是用普通人的眼光看这些问题。 - -最后希望的就是每个公司都能善待每个工作的人,因为毕竟大家都是打工的,你不去善待别人,哪天你就是被别人不善待,其实打工人本来就是一伙的,只是有些自命不凡的人表现的比较粗暴和不讲礼貌,导致外包打工人的生存环境不好,或者所有打工人的生存环境不好。这里面尤其是正常的打工人,不想跪舔,又比较认真工作的人,这种往往两头不讨好,一边得不到上边的支持,一边付出和得到不匹配,比如流行跪舔的时候,你不会好处轮不到你,像现在这种动不动吐槽老板的事情,你也不会太吐槽被更高的人当枪用,所以你也得不到好处;再加上你还比较认真负责,所以你付出和那些不负责的人比起来,只会多,但是在其他人眼里,你和其他外包没有任何区别,价钱是一样的。 总之,正常人就是最有可能吃亏的,做好心理准备就行。这不是你的问题,这也不是任何人的问题,这就是现状,哪里都是这样的。 - -最后说下,目前所在公司没有这种问题,以上我说的只是我看到的,经历的和面试时问到的,仅供参考。 或许可以来一个,***外包同学写出你的经历的活动***,***让更多的人了解更真实的外包工作***,其实每个人都是外包,能养活自己就行, 我们只是在体验经历感受,真实总比矫情强,看多了,经历多了社会职场的无情,只会让你平静和客观,想像中完美的地方并不存在,你要获得的东西更重要。 \ No newline at end of file diff --git "a/docs/10-THOUGHTS/2-\347\262\276\345\207\206\346\265\213\350\257\225.md" "b/docs/10-THOUGHTS/2-\347\262\276\345\207\206\346\265\213\350\257\225.md" deleted file mode 100644 index 831465d..0000000 --- "a/docs/10-THOUGHTS/2-\347\262\276\345\207\206\346\265\213\350\257\225.md" +++ /dev/null @@ -1,116 +0,0 @@ -# 关于精准测试的疑问 - -看了不少精准测试的文章,我收集如下,这些主要是大厂出品,内容也比较精彩和全面,是了解精准测试比较全面的文章. - -- [网易严选的精准测试实践](https://www.infoq.cn/article/xuu91crqa4hcjz8uomjs) -- [字节跳动精准测试实践,SmartEye 背后的设计逻辑](https://www.infoq.cn/article/uqnsuc3zm04ydwcjo132) -- [精准测试之过程与实践 | 京东云技术团队](https://juejin.cn/post/7230986641900683320) -- [走出回归测试困境,爱奇艺精准测试体系建设](https://juejin.cn/post/6995809238119514142?from=search-suggest) -- [关于智能化、精准化测试的一切,这50个问题我们帮你整理好了!](https://juejin.cn/post/7001534604343509022?from=search-suggest) - -我摘录一些这些文章中关于精准测试的概念目标的说明: - -网易文章的说明: - -- 精准测试的概念: 借助一定的技术手段、通过辅助算法对传统软件测试过程进行可视化、 - 分析及优化的过程,使得测试过程更加可视化、智能、可信和精准 -- 精准测试的目标: 非常精确和智能的软件来解决传统软件测试过程中存在的问题,在测试资源有限的前提下,将用例精简到更加有针对性,提高测试效率,有效的减少漏测风险 -- 精准测试的核心:双向追溯,代码和用例可以做到匹配和关联,从而实现测试用例的精准覆盖 - -或者用字节文章中的说明: -> 在日常的研发活动中,我们经常会遇到下列场景: -> - 这次需要研发自测保障了, 我的用例集是不是全都有效覆盖了? -> - 这次技术重构改动挺大的,会影响哪些已有功能? -> - 基础工具 SDK 有重大升级,我是涉及到的业务方,哪些功能需要测试验证? -> - 版本要上线了,大家都走一下全量回归 Case,测试重点在哪里?回归测试用例集全量执行是不是必要的? -> - 在项目研发团队中的每个同学质量标准是不是都统一了? 
- -京东云: -> 精准测试是中国自己有知识产权的完全的理论体系,它同时关注功能点和代码相关逻辑这样一个方法论,是一种灰盒的测试模式。 -> 最开始在 2014 年的国际软件测试大会上发布精准测试的时候,它叫穿线测试,英文名字叫 Threading Test, -> 表达了精准测试的本质,Threading 这个英文单词本身有两个含义,一个是穿线一个是线程, -> 建立用例和代码的关系,相当于把黑盒和白盒关联起来,做黑盒测试也能看到白盒数据,同时把开发和测试能够关联起来,测试一做完, -> 开发的逻辑马上就能自动生成。另一个层面,精准测试最本质就是线程测试,因为精准测试基于覆盖率白盒理论产生, -> 它跟白盒最大的区别是它的覆盖率是线程级的,也就是说要追溯到用例这个级别。 - - -## 京东云文章中Thread testing的疑问 - -对于京东云文章中说到精准测试一开始是threading testing,由于提到是精准测试的起源,想来一开始的目的是比较有意义的, -所以特意查了一下: - -1. thread testing这个术语(terminology/glossary)令人惊讶的在 -[istqb-Standard Glossary of Terms Used in Software Testing](https://glossary.istqb.org/en_US/search) -这个网站上查不到,那这个从哪里来?哪个 2014 年的国际软件测试大会上发布?再一查,还是没有查到,可能是我的搜索能力有限 -2. 再查了一些外网关于这个thread testing: -``` -Thread Testing is one such type of software testing that is usually -conducted during the early stages of System Integration Testing. - -A thread is the smallest unit of work that can be carried out by the system and -it is mainly used to verify the functional capabilities of a specific task or thread. -In thread testing, testers focus on testing individual logical execution paths -in context of the entire system. -``` - -- [professionalqa-thread-testing](https://www.professionalqa.com/thread-testing) -- [geeksforgeeks-thread-testing](https://www.geeksforgeeks.org/thread-testing-in-software-engineering/) - -综合以上,个人认为thread testing 不是精准测试的起源,而是如何尽可能尽早的进行主要功能的集成测试,用一个测试(thread test)尽可能 -覆盖足够好独立的功能. 你说他精准吧,可能也说的通,但是他的定义是集成测试的一种,所以还是集成测试. - -## 关于精准测试解决的问题的疑问 - -总结以上几个文章我总结的一些我自己认为精准测试要解决的问题: -1. 没有办法精确定义改动需要跑哪些Case,如果每次都全跑太浪费时间成本高 -2. 对改动进行测试之后,没有办法进行评估,评估所做的测试是否已经全部覆盖了变化 - -看了这些文章之后,这些精准测试系统主要实现的思路都是: -1. 收集代码变化,确定代码对哪些接口有影响 -2. 收集执行测试用例之后的代码覆盖率数据,确定哪些代码覆盖没有做到 -3. 使用可视化的方式展示哪些代码变化,哪些接口可能有影响,推荐哪些用例 - -以上我觉得都很好,但是在小公司,自己的疑问就是: -1. 做一个这个精准测试的平台成本是多少,收益是什么? -2. 关于代码变化,有哪些影响,人工沟通和检查是否可以确认? -3. 如果都是通过系统推送出来,然后人工确认,那么和直接人工对着代码确认相差的成本和收益是什么? -4. 如果单独到一个接口的变化,如果有全量的接口自动化,那么这个接口变化,是否可以自动触发全量接口的回归测试?成本在哪里? - 为什么还需要额外进行精准测试? -5. 精准测试平台对于有漏侧可以承担责任吗?如果每一个测试只是负责自己熟悉的一小块,代码review可以替代这些精准测试吗? -6. 还有通过代码分析的方式去了解调用链路可以现在已经有日志追踪系统了或者类似的skywallking这种可以追踪调用链的工具, -为什么还要自己写一套从代码开始的实现?不应该生产的调用链路更能表示实际使用情况,从而从实用和准确 - -考虑以上的一些问题不能得到回答,还是觉得放弃精准测试平台这个路,可能做好一下几点就足够了: -1. 接口变化: 通过追踪接口定义文件就可以了解,无论是swagger还是openapi格式,这个追踪变化的实现要比代码层的实现成本低很多 -2. 接口有变化,全量接口回归测试,这个是无脑操作,没有成本 -3. 新的测试用例的新加和影响范围的评估,通过和开发沟通,查看代码也可以达到,毕竟自己负责系统,也挺熟悉的,分析出来的结果可能也会进行再加工和讨论 - 并不能减少多少成本 -4. 用例管理起来,可以知道哪些是高优先级用例,如果需要全面重构,精准测试和全量回归工作量预计相差不会太大,而如果只是1,2个接口的改动, - 选择高优先级用例和review之后的结果需要的用例也能满足,或许精准测试会再提供一些提示,但是这些提示从文章中我没有看到哪些特殊情况特殊的点容易 - 遗忘的,所以我判断其实可能也不会太多这样的Case,因为可能特殊点再测试用例库里面就没有,如果测试用例库里面有,那么在选高优先级用例的时候也能包含 -5. 针对回归测试的代码覆盖率或许有用,但是文章中还是没有实际的例子来说服我代码覆盖率可以发现的Case不在高优先级用例场景覆盖中的 - -所以最后还是放弃精准测试平台这个路,管好测试用例,做好接口自动化,做好代码review和开发沟通或许是目前最合适的方法。 - -## 题外话 - -既然查了一下相关内容,然后一时对于代码插桩这个描述,是在不知道是个什么东西,最后查了一下,英文是: code instrumentation 这个 -不确定为什么要翻译成代码插桩,但是我也不知道翻译成什么好, 就是叫成代码插桩感觉哪里不对,不能叫代码测量吗? - -``` -In software engineering the need for secure and high quality software has spurred intense -research activity in the areas on software debugging, testing and constraint analysis. -Code instrumentation is a common technique used to track application behaviour. -The most popular usages for code instrumentation are software debugging, -performance analysis, monitoring, distributed computing and aspect oriented programming. -Typical instrumentation techniques provide information about code coverage during software testing activities. -Current approaches make use of instrumentation by inserting additional code that monitors the behavior of a specific component. 
-This thesis presents and applies two novel approaches that use an instrumentation technique: -(1) A Runtime Debugging approach is aimed at detecting and resolving runtime faults in object-oriented code. -The approach relies on bytecode instrumentation in order to provide code coverage for predefined unit tests. -The results are analysed using Reverse Engineered techniques. The approach consists in merging both succesfull and - faulty code execution traces and detecting the faults by analysing the differences in the output traces. - -(2) A Security Constraint Checking approach uses the notion of security consistency in designs. Byte code instrumentation techniques are used to provide code coverage for selected unit tests. Direct acyclic graphs are constructed from the output traces using reverse engineered techniques. The graphs contain object method calls in a similar manner to UML Sequence Diagrams. This approach uses the results of the instrumentation to check for consistency with design generated security constraints. Furthermore this approach analyzes these views for -security inconsistencies, and generates a set of recommendations. -``` \ No newline at end of file diff --git "a/docs/10-THOUGHTS/3-\344\270\272\344\273\200\344\271\210\350\247\211\345\276\227\345\255\246\344\270\215\345\210\260\344\270\234\350\245\277.md" "b/docs/10-THOUGHTS/3-\344\270\272\344\273\200\344\271\210\350\247\211\345\276\227\345\255\246\344\270\215\345\210\260\344\270\234\350\245\277.md" deleted file mode 100644 index 6d00205..0000000 --- "a/docs/10-THOUGHTS/3-\344\270\272\344\273\200\344\271\210\350\247\211\345\276\227\345\255\246\344\270\215\345\210\260\344\270\234\350\245\277.md" +++ /dev/null @@ -1,68 +0,0 @@ -# 测试为什么总觉得又学不到东西了? - -看到有人说为什么测试同学在经过了一段时间之后,往往会觉得学不到东西,或者学的差不多了. -有这种感觉其实不奇怪,但是为什么测试同学特别多呢?而且会发现往往没有很好的解决办法,有这种想法的人 -肯定自己会努力找东西学习,但是在努力找东西学习之后,发现还是能学到的东西差不多了?反反复复,最终拼命 -挖掘,就是找不到,就是觉得能学到的东西差不多了。 为什么?能学的东西那么多,为什么就是找不到?而且还是努力找的情况下? - -为什么?想过吗?是你所在的环境的问题?还是其他问题? - -## 测试为什么总觉得又学不到东西了 - -为什么测试为什么总觉得又学不到东西了?是你所在的环境的问题?还是其他问题? -这个问题没有答案,只说说自己的体会和经历,或许有点帮助. - -首先为什么会觉得学不到东西?我自己的体验是: -1. 事实: 能马上学到的东西,马上实际用起来的而且自我感觉有点硬核的,或者和周围人比起来的自己少的,没有想的那么多,所以会觉得很快就学的差不多了 -只要自己回顾一下测试中常用的一些东西: -- SQL: 常用的一个月能学会吧,select/insert/delete/update,一个月可能都多了 -- 业务测试用例设计: 测试用例设计方法,什么等价类,因果图等等,3个月也很熟悉了 -- 接口: HTTP协议常用的一个月能学会吧 -- 流程: 常用的3个月能学习会吧,什么敏捷,迭代开发,测试用例,回归测试各种术语,但是这些并没有不是你想要学的东西,不够硬核吧 -- 接口自动化: 如果在框架下面,就写个调用请求,获取返回,最多2个月也差不多了吧 -- 业务相关知识: 电商的业务知识其实就是订单买卖,支付,库存,物流等等,这些你说能满足你硬核的需求吗?3个月也差不多吧,但是为什么库存怎么设置,为什么 - 物流仓储怎么设置,物流仓储怎么建立才能挣钱,怎么设置才能合理?这是你接触不到的 -- 性能测试: JMETER用用,然后试试各种分析工具,大部分情况下也就是跑个压测看个结果,然后也不会太深入,2个月做几次,问下别人可能也差不多了 -- linux 命令: 常用的可能也就是15个左右,用的熟悉也就是2个月吧 -- 还有什么和测试相关的 .......,其实就是没有多少 -2. 学习的反馈:看了一堆网上资料,工资也没加,做的事情还是哪些. - 这些学的东西(看书/看网上文章算学东西吧)是没有任何反馈的?***没有反馈你当然觉得没学到东西*** - -感觉学到东西是需要反馈的,在工作中得到反馈的,你才能有体会,比如做这个事情明显熟练了,比如加工资了,比如以前做不来的,现在能做了。 -而第一点中说到的,能学的东西其实很快就熟练了,所以在获得一段反馈之后,就很难再得到正反馈了,这是自己的一些体会. - - -## 第一阶段:确实学到东西了 - -一开始走测试的时候确实觉得学到东西了,学到的其实就就就是我上面提到的这些,SQL,用例设计,流程,接口,linux命令,业务等等。 -但是2-3年知乎就觉得一直都是这些东西,在有个2-3年还是这些东西 而且就是寻找寻找,看文章看文章,程序语法看了一遍又一遍,但是还是老样子。。。。。。。, -我大体就是这么个情况。而我往周围一看,我可能还是知道最多的那一个测试, 那我学什么。。。。。。。。 - -## 开始觉得学到点东西了 - -第一阶段维持了很长实践,大概做了挺久业务测试之后,SQL已经非常熟练,一个人测试个100张表的系统也毫无压力之后, 在感觉很久没有学到新东西之后, -慢慢感觉开始觉得学到点新东西了?总结下来几点吧: - -1. 真的动手练习了很多,JAVA从马士宾视频自学开始,一行一行的写 -2. 熟练一些之后自己写点框架和小功能,慢慢能感受到自己的进步 -3. 
遇到不同的问题,自己想办法能解决了,英文文档阅读,对话都不成问题了,感觉还是学到点东西了 - -写代码写出来的东西至少对自己工作效率提高了,后来和开发沟通他们说出来的技术名次,一听就懂也能分辨处开发是真懂还是套用瞎说, -大概知道自己确实学到东西了。 大概是这样子的一个过程,真正觉得特别有用的理念吧,其实就一条: - -***动手,动手练习,唯手熟尔!*** 一旦动手多了,你会发现打开了另外一个窗户,至于能加多少工资,我觉得看运气和造化吧。但是肯定是加工资的,30%以上是必须的。 - -## 明明看了很多东西,确感觉没东西学到 - -为什么会觉得明明看了都东西,确感觉没东西学到?其实看的都是理念,理念当然好,但是没法落地,没有落地的能力,你不要说让别人落地,你自己想实现 -一个就自己用的东西都很困难,那你谈什么呢?你怎么可能从学东西中有正反馈来确认你学到了东西呢? 这是无用的焦虑,不动手,大部分情况下正常人是 -哪有办法感受到学到东西的,而且你确定你学到的东西有用吗?理念是没有用的,他不能直接证明正确,每一个能说到理念这个层面的,都是正确的,就是他 -解决不了任何现实问题。 说理念的人他的目标是贩卖理念,所以不会管有没有用的,而动手才是检验这些,哪里可行,哪里不可行。 - -不管怎么样,慢慢的,最少动手多了,你有自己想法了,你不会完全相信那些理念了,这个对我是有帮助的. -还是一点,***动手,动手练习,唯手熟尔!*** 是我解决自己为什么觉得学不到东西的最重要的一条体会. 慢慢自己开始分析,开始比对理念和现实, -开始找些问题,然后慢慢感觉管人,沟通,解决一些问题也慢慢顺利了. - -对于测试同学而言,***动手,动手练习,唯手熟尔!***,其实就是多练练代码,多用命令操作,多看看文档,我能想到的其实也就这些。 -如果学到东西就能发财的话,那我相信这些应该也不是,如果学到东西需要的正反馈是发财,那这些可能也不是什么学到的东西, -如果你的期望是可以做一些以前不知道的事情或者做不了的事情,那这些可能就是学到的东西了,我自己的体验就是必须要动手,否则也解决不了问题. \ No newline at end of file diff --git a/docs/README.md b/docs/README.md deleted file mode 100644 index 299ec48..0000000 --- a/docs/README.md +++ /dev/null @@ -1,58 +0,0 @@ -# README - - - -工具,平台都好,但是如何快速构建一些可以使用的应用。一般测试开发团队需要开发能力配置: -1. 前端开发 -2. 后端开发 -3. 数据库管理 -4. 需求编写/或者和业务测试进行沟通 - -带来的挑战是: -1. 成本不低,试错成本高 -2. 沟通不顺畅,业务测试和测试开发想的可能会有出入 -3. 有些可能不是在解决实际的问题 - -## 如何降低试错成本 - -- 把开发成本降到足够低 -- 业务测试也可以通过一些方式解决自己的问题 - -## 低代码解决方案 - -- 表格方式 -- 后端接口低代码 -- 前端页面低代码 -- 前后端一体化解决方案 -- 第三方组件形式 - -## 表格方式 - -- 飞书多维表格 -- Vika多维表格 -- 开源替代 - - nocodb - - rows - - 。。。。。。。。。。。 -## 后端低代码 -减少API编写开发工作量: - -1. ###Database###: - - Supabase - -## 一体化多代码 - -- erupt -- illabuilder -- 。。。。。。。。 - -## 第三方组件 - -- [] 登录授权组件 -- [] 开发框架 -- [] 成熟组件 - -## AI如何引入 - -AI能够带来好处 - diff --git a/docs/architecture-as-code/README.md b/docs/architecture-as-code/README.md deleted file mode 100644 index 86ac64f..0000000 --- a/docs/architecture-as-code/README.md +++ /dev/null @@ -1,36 +0,0 @@ -# jianmu-architecture-as-code - -#### 介绍 -建木架构即代码(architecture as code) - -#### 使用说明 - -本项目使用[Structurizr DSL](https://github.com/structurizr/dsl) 的Cli工具来生成C4 Model的架构图 - -该工具可以输出为PlantUML格式文件 - -Mac环境下可以使用Homebrew安装该工具 - -``` -brew install structurizr-cli -``` - -安装完成后,可以使用以下命令来生成.puml文件到c4文件夹下 - -``` -structurizr-cli export -workspace jianmu.dsl -format plantuml -output c4 -``` - -#### 生成效果 - -*建木容器图* -![容器图](out/c4/structurizr-jianmu-container/structurizr-jianmu-container.png) - -*主服务内部组件图* -![组件图](out/c4/structurizr-web-component/structurizr-web-component.png) - -*执行器组件图* -![执行器组件图](out/c4/structurizr-worker-component/structurizr-worker-component.png) - -*服务部署图* -![部署图](out/c4/structurizr-DevelopmentDeployment/structurizr-DevelopmentDeployment.png) \ No newline at end of file diff --git a/docs/architecture-as-code/c4/structurizr-DevelopmentDeployment.puml b/docs/architecture-as-code/c4/structurizr-DevelopmentDeployment.puml deleted file mode 100644 index caa430f..0000000 --- a/docs/architecture-as-code/c4/structurizr-DevelopmentDeployment.puml +++ /dev/null @@ -1,38 +0,0 @@ -@startuml -title 建木自动化集成平台 - Deployment - dev - -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Context.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Container.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Deployment.puml -LAYOUT_WITH_LEGEND() - -Deployment_Node(55, "Docker Container - Worker Server", "Docker") { - Deployment_Node(56, "Worker Server", "Golang") { - Container(57, 
"docker-worker", "容器化执行环境") - } - -} - -Deployment_Node(45, "Web Browser", "Chrome, Firefox, Safari, or Edge") { - Container(46, "SPA单页面应用", "TypeScript and Vue 3.0", "Provides all of the Jianmu functionality to customers via their web browser.") -} - -Deployment_Node(47, "Docker Container - Web Server", "Docker") { - Deployment_Node(48, "Spring boot", "Spring boot 2.x") { - Container(49, "主服务", "主服务") - } - -} - -Deployment_Node(51, "Docker Container - Database Server", "Docker") { - Deployment_Node(52, "Database Server", "Mysql 8.0") { - Container(53, "数据库", "Mysql 8.0") - } - -} - -Rel_D(46, 49, "启动流程或任务", "Rest API") -Rel_D(49, 53, "读写数据", "JDBC/SSL") -Rel_D(57, 49, "获取任务执行", "http/https") -@enduml \ No newline at end of file diff --git a/docs/architecture-as-code/c4/structurizr-jianmu-container.puml b/docs/architecture-as-code/c4/structurizr-jianmu-container.puml deleted file mode 100644 index 85ca52c..0000000 --- a/docs/architecture-as-code/c4/structurizr-jianmu-container.puml +++ /dev/null @@ -1,24 +0,0 @@ -@startuml -title 建木自动化集成平台 - Containers - -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Context.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Container.puml -LAYOUT_WITH_LEGEND() - -Person(1, "用户", "泛指用户") - -System_Boundary("2_boundary", "建木自动化集成平台") { - Container(11, "数据库", "Mysql 8.0") - Container(12, "docker-worker", "容器化执行环境") - Container(16, "shell-worker", "非容器化执行环境") - Container(3, "SPA单页面应用", "TypeScript and Vue 3.0", "Provides all of the Jianmu functionality to customers via their web browser.") - Container(4, "主服务", "主服务") -} - -Rel_D(1, 3, "操作或查看流程与任务", "Rest API") -Rel_D(4, 11, "读写数据", "JDBC/SSL") -Rel_D(12, 4, "获取任务执行", "http/https") -Rel_D(16, 4, "获取任务执行", "http/https") -Rel_D(3, 4, "启动流程或任务", "Rest API") -@enduml \ No newline at end of file diff --git a/docs/architecture-as-code/c4/structurizr-web-component.puml b/docs/architecture-as-code/c4/structurizr-web-component.puml deleted file mode 100644 index 54bb0dd..0000000 --- a/docs/architecture-as-code/c4/structurizr-web-component.puml +++ /dev/null @@ -1,35 +0,0 @@ -@startuml -title 建木自动化集成平台 - 主服务 - Components - -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Context.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Container.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Component.puml -LAYOUT_WITH_LEGEND() - -Container(3, "SPA单页面应用", "TypeScript and Vue 3.0", "Provides all of the Jianmu functionality to customers via their web browser.") - -Container_Boundary("4_boundary", "主服务") { - Component(10, "参数管理", "参数上下文") - Component(5, "DSL解析器", "DSL语法解析器") - Component(6, "触发器", "触发器上下文") - Component(7, "流程流转", "流程上下文") - Component(8, "el引擎", "表达式引擎") - Component(9, "任务分发", "任务上下文") -} - -Rel_D(3, 6, "启动流程或任务", "Rest API") -Rel_D(3, 5, "提交DSL定义", "Rest API") -Rel_D(5, 7, "保存流程定义", "Java API") -Rel_D(6, 7, "触发流程启动", "Java API") -Rel_D(6, 9, "直接触发任务启动", "Java API") -Rel_D(7, 9, "任务节点激活事件触发任务启动", "Java API") -Rel_D(7, 9, "任务节点中止事件触发任务中止", "Java API") -Rel_D(7, 8, "执行表达式", "Java API") -Rel_D(8, 7, "返回表达式结果", "Java API") -Rel_D(9, 7, "返回任务执行状态", "Java API") -Rel_D(7, 10, "读取参数信息", "Java API") -Rel_D(7, 10, "流程执行结果参数写入", "Java API") -Rel_D(9, 10, 
"读取参数信息", "Java API") -Rel_D(9, 10, "任务执行结果参数写入", "Java API") -@enduml \ No newline at end of file diff --git a/docs/architecture-as-code/c4/structurizr-worker-component.puml b/docs/architecture-as-code/c4/structurizr-worker-component.puml deleted file mode 100644 index 09acee5..0000000 --- a/docs/architecture-as-code/c4/structurizr-worker-component.puml +++ /dev/null @@ -1,22 +0,0 @@ -@startuml -title 建木自动化集成平台 - docker-worker - Components - -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Context.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Container.puml -!includeurl https://raw.githubusercontent.com/plantuml-stdlib/C4-PlantUML/master/C4_Component.puml -LAYOUT_WITH_LEGEND() - -Container(4, "主服务", "主服务") - -Container_Boundary("12_boundary", "docker-worker") { - Component(13, "daemon", "worker守护容器") - Component(14, "代码编译", "任务运行容器") - Component(15, "Ansible执行", "任务运行容器") -} - -Rel_D(13, 4, "获取任务执行", "http/https") -Rel_D(13, 4, "返回执行结果", "http/https") -Rel_D(13, 14, "启动容器", "Docker API") -Rel_D(13, 15, "启动容器", "Docker API") -@enduml \ No newline at end of file diff --git "a/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\232\344\271\211\347\261\273\345\233\276.png" "b/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\232\344\271\211\347\261\273\345\233\276.png" deleted file mode 100644 index 5bbdf86..0000000 Binary files "a/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\232\344\271\211\347\261\273\345\233\276.png" and /dev/null differ diff --git "a/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\236\344\276\213\347\261\273\345\233\276.png" "b/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\236\344\276\213\347\261\273\345\233\276.png" deleted file mode 100644 index 74297b6..0000000 Binary files "a/docs/architecture-as-code/class_ diagrams/\346\265\201\347\250\213\345\256\236\344\276\213\347\261\273\345\233\276.png" and /dev/null differ diff --git a/docs/architecture-as-code/jianmu.dsl b/docs/architecture-as-code/jianmu.dsl deleted file mode 100644 index b496139..0000000 --- a/docs/architecture-as-code/jianmu.dsl +++ /dev/null @@ -1,93 +0,0 @@ -workspace "Jianmu" "建木自动化集成平台" { - model { - user = person "用户" "泛指用户" - jianmu = softwareSystem "建木自动化集成平台" "建木自动化集成平台" { - singlePageApplication = container "SPA单页面应用" "Provides all of the Jianmu functionality to customers via their web browser." 
"TypeScript and Vue 3.0" "Web Browser" - web = container "主服务" "主服务" { - dsl = component "DSL解析器" "DSL语法解析器" - trigger = component "触发器" "触发器上下文" - workflow = component "流程流转" "流程上下文" - el = component "el引擎" "表达式引擎" - task = component "任务分发" "任务上下文" - parameter = component "参数管理" "参数上下文" - } - database = container "数据库" "Mysql 8.0" - worker1 = container "docker-worker" "容器化执行环境" { - daemon = component "daemon" "worker守护容器" - runner1 = component "代码编译" "任务运行容器" - runner2 = component "Ansible执行" "任务运行容器" - } - worker2 = container "shell-worker" "非容器化执行环境" { - daemon_process = component "daemon-process" "worker守护进程" - runner_process = component "runner-process" "任务运行进程" - } - } - # 容器之间关联关系 - user -> singlePageApplication "操作或查看流程与任务" "Rest API" - web -> database "读写数据" "JDBC/SSL" - daemon -> web "获取任务执行" "http/https" - daemon_process -> web "获取任务执行" "http/https" - daemon -> web "返回执行结果" "http/https" - daemon_process -> web "返回执行结果" "http/https" - - # worker内部组件关系 - daemon -> runner1 "启动容器" "Docker API" - daemon -> runner2 "启动容器" "Docker API" - - # 主服务内部组件关系 - singlePageApplication -> trigger "启动流程或任务" "Rest API" - singlePageApplication -> dsl "提交DSL定义" "Rest API" - dsl -> workflow "保存流程定义" "Java API" - trigger -> workflow "触发流程启动" "Java API" - trigger -> task "直接触发任务启动" "Java API" - workflow -> task "任务节点激活事件触发任务启动" "Java API" - workflow -> task "任务节点中止事件触发任务中止" "Java API" - workflow -> el "执行表达式" "Java API" - el -> workflow "返回表达式结果" "Java API" - task -> workflow "返回任务执行状态" "Java API" - workflow -> parameter "读取参数信息" "Java API" - workflow -> parameter "流程执行结果参数写入" "Java API" - task -> parameter "读取参数信息" "Java API" - task -> parameter "任务执行结果参数写入" "Java API" - - deploymentEnvironment "dev" { - deploymentNode "Web Browser" "" "Chrome, Firefox, Safari, or Edge" { - developerSinglePageApplicationInstance = containerInstance singlePageApplication - } - deploymentNode "Docker Container - Web Server" "" "Docker" { - deploymentNode "Spring boot" "" "Spring boot 2.x" { - developerWebApplicationInstance = containerInstance web - } - } - deploymentNode "Docker Container - Database Server" "" "Docker" { - deploymentNode "Database Server" "" "Mysql 8.0" { - developerDatabaseInstance = containerInstance database - } - } - deploymentNode "Docker Container - Worker Server" "" "Docker" { - deploymentNode "Worker Server" "" "Golang" { - developerWorkerInstance = containerInstance worker1 - } - } - } - } - - views { - container jianmu "jianmu-container" "建木容器图" { - include * - autoLayout - } - component web "web-component" "主服务组件图" { - include * - autoLayout lr 400 - } - component worker1 "worker-component" "执行器组件图" { - include * - autoLayout lr - } - deployment jianmu "dev" "DevelopmentDeployment" { - include * - autoLayout - } - } -} \ No newline at end of file diff --git a/docs/architecture-as-code/out/c4/structurizr-DevelopmentDeployment/structurizr-DevelopmentDeployment.png b/docs/architecture-as-code/out/c4/structurizr-DevelopmentDeployment/structurizr-DevelopmentDeployment.png deleted file mode 100644 index 166c97c..0000000 Binary files a/docs/architecture-as-code/out/c4/structurizr-DevelopmentDeployment/structurizr-DevelopmentDeployment.png and /dev/null differ diff --git a/docs/architecture-as-code/out/c4/structurizr-jianmu-container/structurizr-jianmu-container.png b/docs/architecture-as-code/out/c4/structurizr-jianmu-container/structurizr-jianmu-container.png deleted file mode 100644 index ee8841c..0000000 Binary files 
a/docs/architecture-as-code/out/c4/structurizr-jianmu-container/structurizr-jianmu-container.png and /dev/null differ diff --git a/docs/architecture-as-code/out/c4/structurizr-web-component/structurizr-web-component.png b/docs/architecture-as-code/out/c4/structurizr-web-component/structurizr-web-component.png deleted file mode 100644 index 24b6885..0000000 Binary files a/docs/architecture-as-code/out/c4/structurizr-web-component/structurizr-web-component.png and /dev/null differ diff --git a/docs/architecture-as-code/out/c4/structurizr-worker-component/structurizr-worker-component.png b/docs/architecture-as-code/out/c4/structurizr-worker-component/structurizr-worker-component.png deleted file mode 100644 index 181ec1a..0000000 Binary files a/docs/architecture-as-code/out/c4/structurizr-worker-component/structurizr-worker-component.png and /dev/null differ diff --git a/docs/features/database-client.md b/docs/features/database-client.md deleted file mode 100644 index 614dde3..0000000 --- a/docs/features/database-client.md +++ /dev/null @@ -1,3 +0,0 @@ -# README - -- [Simple SQL Client for lightweight data analysis.](https://github.com/bdash-app/bdash) \ No newline at end of file diff --git a/docs/features/ngrox.md b/docs/features/ngrox.md deleted file mode 100644 index 3f939c7..0000000 --- a/docs/features/ngrox.md +++ /dev/null @@ -1,6 +0,0 @@ -# ngrox - -```shell -brew install ngrok/ngrok/ngrok -``` - diff --git a/docs/features/tis.md b/docs/features/tis.md deleted file mode 100644 index cbcbdc8..0000000 --- a/docs/features/tis.md +++ /dev/null @@ -1,9 +0,0 @@ -# TIS - -一个完整的TIS应用,由以下三个子工程构成: - -TIS主干逻辑 https://github.com/datavane/tis -TIS插件 https://github.com/qlangtech/plugins -前端逻辑 https://github.com/qlangtech/ng-tis - - diff --git a/docs/guides.md b/docs/guides.md deleted file mode 100644 index a6ce63e..0000000 --- a/docs/guides.md +++ /dev/null @@ -1,118 +0,0 @@ -# README - -QA Automation/Coding Revisited MindMap: - -![img](img.png) - -> It is not about learn every piece of JAVA language first. -please spend less time on syntax, use more time on coding to -complete some your daily task. - -## Layers - -- Utilities Layers: Different Toolkits - - [X] Configuration - - File/IO Handler - - OS Handlers - - Database Accessor - - CSV/EXCEL/XML/JSON/YAML Handler - - Different Clients: - - HTTP Client - - Redis Client - - Database Client - - Meta - - Reflection - - Class/Method/Arguments Resolver - - DI Container - -- Service Layer: - - DATA Access/Repository Layer - - Service Domain: Compose Different Repositories and API Integration -- Integration/SDK Layer: - - Different SDKs to connect different Service - - Easy to Compose different integration and services -- API Layer: - - Easy to Create Different Protocol API to exposure - - Easy to do API Registration - -## FluentQA-Modules - -- [fluent-builtin](./fluentqa-builtin) based-on different tools - Use java for QA Daily Work: - 1. code to handle different data and client: - 1. json/map - 2. csv/excel - 3. redis/http/database - 2. build internal admin system - 3. integration testing - 4. automation testing -- [fluentqa-data](./fluentqa-modules/fluentqa-data) data access layers - - [quick-dao](./fluentqa-modules/fluentqa-data/fluent-quickdao) only for testing,data preparation - - [jpa-dao](./fluentqa-modules/fluentqa-data/fluent-jpa-data) jpa data operations - -## 2.Docs -## 2.1 Dao for tester -- [quick-dao usage](docs/qa-java-toolkits/data/quick-dao.md) - -## 3. 
To Do List -### 3.1 extensions: for daily use or integrated in an app -- [] Excel/CSV Toolkit -- [] Mindmap Toolkit -- [] openapi toolkit -- [] Database Operation Simplify - -## 3.2 Integrations: to integrate with third-party app by API -- [] feishu/飞书 -- [] vika 表格 -- [] seatable -- [] ....... - -## 4. Modularization - -1. modularization: compose different modules to build an app - - [] Generic DTO - - [] Generic Exceptions - - [] Logging Aspect - - [] spring starters - -## 5. QA-LowCode Thoughts - -1. [] Application Code Structures: -- Configuration -- API: 对外 -- Service: 对内部,细颗粒度 -- Repository: 数据访问层 -- Configuration: 系统配置 -- Core/Base: 常用方法和客户端组件调用方法/流程组合/控制反转/依赖注入 -2. [] Code Generate -- [] Database Table -> Entity/Repository/Service Code/Api Codes -- [] Different Templates Support/Configurable -- [] UI Code Generation - -## 6. Reference -- Hutool: https://hutool.cn/docs -- Excel: - - https://github.com/liaochong/myexcel.git - - https://ozlerhakan.github.io/poiji -- [manifold-system](http://manifold.systems/) - -## tools - -- [QR Code](QR code) - - -# To Do List - -- [X] built-in : Done - - [] meta: reflection/class utils - - [] meta: aop/proxy -- [X] SimpleDao : Done -- [X] Excel: Done -- [] JPA Simple -- [] Data Transform - - -## TODO: - -- https://github.com/domaframework/doma.git -- https://docs.jmix.cn/jmix/2/tutorial/index.html \ No newline at end of file diff --git a/docs/lessons/api-test/1-overview.md b/docs/lessons/api-test/1-overview.md deleted file mode 100644 index 0a36a72..0000000 --- a/docs/lessons/api-test/1-overview.md +++ /dev/null @@ -1,20 +0,0 @@ -# API - -## API lifecycle - -## API Definition - -- [] Postman Collection -- [] OpenAPI Specification - -## API List CRUD - -## API Testing Records - -## API Testing Cases - -## API Testing Scenarios - -## API Testing Result - -## API For Other Usages \ No newline at end of file diff --git a/docs/lessons/dashboard/dashboard.md b/docs/lessons/dashboard/dashboard.md deleted file mode 100644 index 42ffb52..0000000 --- a/docs/lessons/dashboard/dashboard.md +++ /dev/null @@ -1,6 +0,0 @@ -# Dashboard - -![img.png](img.png) -![img_1.png](img_1.png) -![img_2.png](img_2.png) -![img_3.png](img_3.png) \ No newline at end of file diff --git a/docs/lessons/dashboard/img.png b/docs/lessons/dashboard/img.png deleted file mode 100644 index 23e8c2a..0000000 Binary files a/docs/lessons/dashboard/img.png and /dev/null differ diff --git a/docs/lessons/dashboard/img_1.png b/docs/lessons/dashboard/img_1.png deleted file mode 100644 index eda878f..0000000 Binary files a/docs/lessons/dashboard/img_1.png and /dev/null differ diff --git a/docs/lessons/dashboard/img_2.png b/docs/lessons/dashboard/img_2.png deleted file mode 100644 index 7fc41de..0000000 Binary files a/docs/lessons/dashboard/img_2.png and /dev/null differ diff --git a/docs/lessons/dashboard/img_3.png b/docs/lessons/dashboard/img_3.png deleted file mode 100644 index 792ed2c..0000000 Binary files a/docs/lessons/dashboard/img_3.png and /dev/null differ diff --git a/docs/lessons/intro.md b/docs/lessons/intro.md deleted file mode 100644 index 8034e71..0000000 --- a/docs/lessons/intro.md +++ /dev/null @@ -1,74 +0,0 @@ -# 无废话构建极简测试管理系统 - - -无废话系列-极简测试管理系统-5分钟一个增删改查页面 - -看着各种测试大佬拿着高工资,其实他们都是写代码的,写测试平台难吗? -难也不难,无废话系列-极简测试管理系统主要就是告诉你不难的一面, - -短短几行代码,让你就实现一个小功能,成就感满满,有了反馈才能继续前进呀。 -要不然学了几个月代码,还是做不出一个功能,这代码是学不下去的。动手做点什么吧,真的不难! 
-视频就是为了展示这些都不难,10分钟左右一个小功能介绍,代码都不长。成就感满满。 - -## 资料参考 - -- erupts文档: https://www.yuque.com/erupts/erupt/sgx66o -- JAVA/MAVEN文档: 随便Baidu,最重要的是自己动手 -- 教程项目源码: https://github.com/fluent-qa/qabox-tutorials/tree/main/fluentqa-java-tutorials -- 数据库搭建: postgresql, docker - -## 起点 - -本系列文章只针对一个假设,就是突然有一天老板说你看外面好多做测试平台的,我们可以自己引入一个或者自己做一个来表示我们测试的能力吗? -至于做什么吗,可以围绕下面几个方面考虑: - -1. 自动化测试 -2. 测试用例管理 -3. 测试需求管理 -4. 测试执行管理 -5. 测试环境管理 -6. 应该还有好多好多,你可以结合实际情况自己想想 - -主要呢就是能够提高效率,展示能力, 你去看看可不可以?或者你先去调研下? - -自然你会做一些调研,然后你会和老板说: -1. metershpere可以直接用 -2. 还有好多开源的也可以直接搭建 - -老板可能就同意了,那就用开源的了,大家都开心,那么这系列文章大体你也不感兴趣了. -如果老板说,最好我们自己做吧(抄也好),我们就自己做一下,也当展示和锻炼下我们能力. 然后其实可能你们组也就几个人,忙着业务测试,忙着上线,真的没太多时间做这个, -那么或许下面这一些文章可能对你有点帮助,不需要花费太多精力就能做一个看起来是那么一回事的测试管理平台,而从中你也能获得一些收获比如: -1. 使用了一门语言JAVA,了解了框架SpringBoot,springboot-data-jpa -2. 做了一个后台管理系统,说不定哪天有人给个外包的活,你也能接一下 -3. 不需要花费太多成本,可以交差,也可以让你老板对你高看一眼,看起来你还挺能干的 - -好了,废话阶段结束,直接开始动手干活吧. - -## 搭建项目开始 - -- 一行代码,常规配置,就能看到一个后台管理页面 - -```java -package io.fluentqa.workspace; - -import org.springframework.boot.SpringApplication; -import org.springframework.boot.autoconfigure.SpringBootApplication; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.scheduling.annotation.EnableAsync; -import xyz.erupt.core.annotation.EruptScan; - -@SpringBootApplication -@EnableAsync -@EruptScan(value = {"io.fluentqa"}) -@EntityScan(value = {"io.fluentqa"}) -public class QAWorkspaceApp { - public static void main(String[] args) { - SpringApplication.run(QAWorkspaceApp.class); - } -} - -``` - - - - diff --git a/docs/lessons/master-data/1-master-data-simple.md b/docs/lessons/master-data/1-master-data-simple.md deleted file mode 100644 index 750b09d..0000000 --- a/docs/lessons/master-data/1-master-data-simple.md +++ /dev/null @@ -1,118 +0,0 @@ -# 极简测试管理系统: 第一个功能之增删改查列表 - -最简单功能其实就是一个增删改查表,实现一个系统字典表需要做哪些? -系统字典表主要用来记录一些系统常量: -1. 比如什么优先级P1,P2这种可以用来配置的东西 -2. 这个功能是最典型的后台增删改查功能,是学习erupt最好的入门案例 - -## 先说实现的效果 - -![img.png](master-data.png) - -实现以上内容需要哪些工作: -- 数据库设计? -- 前端? -- 后端? -- 其实只要一个JAVA类......, 认真的....... - -可以说都需要,也可以说都不需要. - -实际上需要做的是: -1. ***定义一个JAVA类和数据库表对应*** -2. 
***每个需要的页面操作的字段使注解来说明页面如何展示*** - - -## 一个JAVA文件一个增删改查页面 - -```java -package io.fluentqa.workspace.base.model; - -import io.fluentqa.workspace.base.handlers.SqlTagFetchHandler; -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; - -import javax.persistence.Entity; -import javax.persistence.Table; - - -@Erupt(name = "产品字典值配置", - power = @Power(importable = true, - export = true), - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30 - )) -@Table(name = "master_data") -@Entity -@Data -public class MasterData extends ModelWithValidFlagVo { - - @EruptField( - views = @View(title = "分类"), - edit = @Edit( - search = @Search(vague = true), - title = "获取可选种类", - type = EditType.TAGS, - desc = "动态获取可选种类", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct category from master_data where valid=true" - )) - ) - private String category; - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, - inputType = @InputType - ) - ) - private String detail; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String code; - -} -``` - -## 启动看效果 - -1. 一开始什么也没有 -2. 进行菜单配置 - -![img.png](add-menu.png) - diff --git a/docs/lessons/master-data/1-master-data.md b/docs/lessons/master-data/1-master-data.md deleted file mode 100644 index 548591c..0000000 --- a/docs/lessons/master-data/1-master-data.md +++ /dev/null @@ -1,123 +0,0 @@ -# 极简测试管理系统: 第一个功能之增删改查列表 - -了解学习一个东西可能没有什么好方法,就是下面几步: -1. 从简单入手 -2. 逐步复杂,理解更多概念 -3. 理解概念之间的逻辑,通过逻辑推导举一反三 - -## 从最简单入手 - -最简单功能其实就是一个增删改查表,实现一个系统字典表需要做哪些? -系统字典表主要用来记录一些系统常量: -1. 比如什么优先级P1,P2这种可以用来配置的东西 -2. 这个功能是最典型的后台增删改查功能,是学习erupt最好的入门案例 - -## 先说实现的效果 - -![img.png](master-data.png) - -实现以上内容需要哪些工作: -- 数据库设计? -- 前端? -- 后端? -- 其实只要一个JAVA类......, 认真的....... - -可以说都需要,也可以说都不需要. 
- -实际上需要做的是 - - -## 一个JAVA文件一个增删改查页面 - -```java -package io.fluentqa.workspace.base.model; - -import io.fluentqa.workspace.base.handlers.SqlTagFetchHandler; -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; - -import javax.persistence.Entity; -import javax.persistence.Table; - - -@Erupt(name = "产品字典值配置", - power = @Power(importable = true, - export = true), - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30 - )) -@Table(name = "master_data") -@Entity -@Data -public class MasterData extends ModelWithValidFlagVo { - - @EruptField( - views = @View(title = "分类"), - edit = @Edit( - search = @Search(vague = true), - title = "获取可选种类", - type = EditType.TAGS, - desc = "动态获取可选种类", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct category from master_data where valid=true" - )) - ) - private String category; - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, - inputType = @InputType - ) - ) - private String detail; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String code; - -} -``` - -## 启动看效果 - -1. 一开始什么也没有 -2. 
进行菜单配置 - -![img.png](add-menu.png) - diff --git a/docs/lessons/master-data/add-menu.png b/docs/lessons/master-data/add-menu.png deleted file mode 100644 index eea7979..0000000 Binary files a/docs/lessons/master-data/add-menu.png and /dev/null differ diff --git a/docs/lessons/master-data/master-data.png b/docs/lessons/master-data/master-data.png deleted file mode 100644 index aab90a1..0000000 Binary files a/docs/lessons/master-data/master-data.png and /dev/null differ diff --git a/docs/lessons/metersphere/api-testing/api-cases.png b/docs/lessons/metersphere/api-testing/api-cases.png deleted file mode 100644 index b6e7f14..0000000 Binary files a/docs/lessons/metersphere/api-testing/api-cases.png and /dev/null differ diff --git a/docs/lessons/metersphere/api-testing/api-scenarios.png b/docs/lessons/metersphere/api-testing/api-scenarios.png deleted file mode 100644 index 3f3ce2d..0000000 Binary files a/docs/lessons/metersphere/api-testing/api-scenarios.png and /dev/null differ diff --git a/docs/lessons/metersphere/api-testing/api-test-run-single.png b/docs/lessons/metersphere/api-testing/api-test-run-single.png deleted file mode 100644 index 5561da9..0000000 Binary files a/docs/lessons/metersphere/api-testing/api-test-run-single.png and /dev/null differ diff --git a/docs/lessons/metersphere/api-testing/api-testing.md b/docs/lessons/metersphere/api-testing/api-testing.md deleted file mode 100644 index 2c6a76b..0000000 --- a/docs/lessons/metersphere/api-testing/api-testing.md +++ /dev/null @@ -1,15 +0,0 @@ -# API Testing - -- API Dashbaord -![img.png](dashboard.png) - -- 接口测试 - -![img.png](api-cases.png) - -- api-test -![img.png](api-test.png) -![img.png](api-test-run-single.png) - -![img.png](api-scenarios.png) -![img.png](test-report.png) \ No newline at end of file diff --git a/docs/lessons/metersphere/api-testing/dashboard.png b/docs/lessons/metersphere/api-testing/dashboard.png deleted file mode 100644 index ef29b6d..0000000 Binary files a/docs/lessons/metersphere/api-testing/dashboard.png and /dev/null differ diff --git a/docs/lessons/metersphere/api-testing/test-report.png b/docs/lessons/metersphere/api-testing/test-report.png deleted file mode 100644 index 5a750f1..0000000 Binary files a/docs/lessons/metersphere/api-testing/test-report.png and /dev/null differ diff --git a/docs/lessons/metersphere/perf/img.png b/docs/lessons/metersphere/perf/img.png deleted file mode 100644 index 90ae1a9..0000000 Binary files a/docs/lessons/metersphere/perf/img.png and /dev/null differ diff --git a/docs/lessons/metersphere/perf/img_1.png b/docs/lessons/metersphere/perf/img_1.png deleted file mode 100644 index 90ae1a9..0000000 Binary files a/docs/lessons/metersphere/perf/img_1.png and /dev/null differ diff --git a/docs/lessons/metersphere/perf/perf-test.md b/docs/lessons/metersphere/perf/perf-test.md deleted file mode 100644 index df2d464..0000000 --- a/docs/lessons/metersphere/perf/perf-test.md +++ /dev/null @@ -1,4 +0,0 @@ - -# 性能测试 - -![img_1.png](img_1.png) \ No newline at end of file diff --git a/docs/lessons/metersphere/system-setting/env-group.png b/docs/lessons/metersphere/system-setting/env-group.png deleted file mode 100644 index 13d266e..0000000 Binary files a/docs/lessons/metersphere/system-setting/env-group.png and /dev/null differ diff --git a/docs/lessons/metersphere/system-setting/env.png b/docs/lessons/metersphere/system-setting/env.png deleted file mode 100644 index 661472b..0000000 Binary files a/docs/lessons/metersphere/system-setting/env.png and /dev/null differ diff --git 
a/docs/lessons/metersphere/system-setting/img.png b/docs/lessons/metersphere/system-setting/img.png deleted file mode 100644 index 5ed5a55..0000000 Binary files a/docs/lessons/metersphere/system-setting/img.png and /dev/null differ diff --git a/docs/lessons/metersphere/system-setting/logs.png b/docs/lessons/metersphere/system-setting/logs.png deleted file mode 100644 index bd0a5e9..0000000 Binary files a/docs/lessons/metersphere/system-setting/logs.png and /dev/null differ diff --git a/docs/lessons/metersphere/system-setting/project-settings.md b/docs/lessons/metersphere/system-setting/project-settings.md deleted file mode 100644 index 2a507ac..0000000 --- a/docs/lessons/metersphere/system-setting/project-settings.md +++ /dev/null @@ -1,3 +0,0 @@ -# Project Settings - -![img.png](img.png) \ No newline at end of file diff --git a/docs/lessons/metersphere/system-setting/project.png b/docs/lessons/metersphere/system-setting/project.png deleted file mode 100644 index 424ea0b..0000000 Binary files a/docs/lessons/metersphere/system-setting/project.png and /dev/null differ diff --git a/docs/lessons/metersphere/system-setting/workspace.md b/docs/lessons/metersphere/system-setting/workspace.md deleted file mode 100644 index baeca1b..0000000 --- a/docs/lessons/metersphere/system-setting/workspace.md +++ /dev/null @@ -1,17 +0,0 @@ -# Workspace - -- 项目管理 - -![img.png](project.png) - -- 环境管理 - -![img.png](env.png) - -- 环境组 - -![img.png](env-group.png) - -- 操作日志 - -![img.png](logs.png) \ No newline at end of file diff --git a/docs/lessons/metersphere/test-track/bug-list.png b/docs/lessons/metersphere/test-track/bug-list.png deleted file mode 100644 index 706e1d2..0000000 Binary files a/docs/lessons/metersphere/test-track/bug-list.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/dashboard.png b/docs/lessons/metersphere/test-track/dashboard.png deleted file mode 100644 index 302658d..0000000 Binary files a/docs/lessons/metersphere/test-track/dashboard.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/img.png b/docs/lessons/metersphere/test-track/img.png deleted file mode 100644 index 97ca58b..0000000 Binary files a/docs/lessons/metersphere/test-track/img.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/test-case-detail.png b/docs/lessons/metersphere/test-track/test-case-detail.png deleted file mode 100644 index f930d4d..0000000 Binary files a/docs/lessons/metersphere/test-track/test-case-detail.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/test-case-repo.png b/docs/lessons/metersphere/test-track/test-case-repo.png deleted file mode 100644 index 499ea33..0000000 Binary files a/docs/lessons/metersphere/test-track/test-case-repo.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/test-case-review.png b/docs/lessons/metersphere/test-track/test-case-review.png deleted file mode 100644 index c64b00a..0000000 Binary files a/docs/lessons/metersphere/test-track/test-case-review.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/test-plan.png b/docs/lessons/metersphere/test-track/test-plan.png deleted file mode 100644 index c2d0f29..0000000 Binary files a/docs/lessons/metersphere/test-track/test-plan.png and /dev/null differ diff --git a/docs/lessons/metersphere/test-track/test-report.png b/docs/lessons/metersphere/test-track/test-report.png deleted file mode 100644 index 5ab31a4..0000000 Binary files a/docs/lessons/metersphere/test-track/test-report.png and /dev/null 
differ diff --git a/docs/lessons/metersphere/test-track/test-track.md b/docs/lessons/metersphere/test-track/test-track.md deleted file mode 100644 index 822ca84..0000000 --- a/docs/lessons/metersphere/test-track/test-track.md +++ /dev/null @@ -1,31 +0,0 @@ -# Test Track - -- Dashboard - -![img.png](dashboard.png) -![img.png](img.png) - -- 测试用例 - -![img.png](test-case-repo.png) - -- 用例详细 - -![img.png](test-case-detail.png) - -- 用例评审 - -![img.png](test-case-review.png) - -- 测试计划 - -![img.png](test-plan.png) - -- 缺陷报告 -![img.png](bug-list.png) - -- 测试报告 - -![img.png](test-report.png) - - diff --git a/docs/lessons/metersphere/ui/img.png b/docs/lessons/metersphere/ui/img.png deleted file mode 100644 index 99c2feb..0000000 Binary files a/docs/lessons/metersphere/ui/img.png and /dev/null differ diff --git a/docs/lessons/metersphere/ui/ui-automation.md b/docs/lessons/metersphere/ui/ui-automation.md deleted file mode 100644 index 6ddcd0b..0000000 --- a/docs/lessons/metersphere/ui/ui-automation.md +++ /dev/null @@ -1,3 +0,0 @@ -# UI Automation - -![img.png](img.png) \ No newline at end of file diff --git a/docs/lessons/metersphere/workspace/dashboard.png b/docs/lessons/metersphere/workspace/dashboard.png deleted file mode 100644 index 1b07dc6..0000000 Binary files a/docs/lessons/metersphere/workspace/dashboard.png and /dev/null differ diff --git a/docs/lessons/metersphere/workspace/my-todo.png b/docs/lessons/metersphere/workspace/my-todo.png deleted file mode 100644 index f21af7c..0000000 Binary files a/docs/lessons/metersphere/workspace/my-todo.png and /dev/null differ diff --git a/docs/lessons/metersphere/workspace/watch.png b/docs/lessons/metersphere/workspace/watch.png deleted file mode 100644 index 7614578..0000000 Binary files a/docs/lessons/metersphere/workspace/watch.png and /dev/null differ diff --git a/docs/lessons/metersphere/workspace/workspace.md b/docs/lessons/metersphere/workspace/workspace.md deleted file mode 100644 index 458ad8f..0000000 --- a/docs/lessons/metersphere/workspace/workspace.md +++ /dev/null @@ -1,12 +0,0 @@ -# workspace - -- dashboard -![img.png](dashboard.png) - -- todo -![img.png](my-todo.png) - -- 关注 -![img.png](watch.png) - - diff --git a/docs/lessons/product/product-page.png b/docs/lessons/product/product-page.png deleted file mode 100644 index f2a3ac1..0000000 Binary files a/docs/lessons/product/product-page.png and /dev/null differ diff --git a/docs/lessons/product/product-view.png b/docs/lessons/product/product-view.png deleted file mode 100644 index b040ed2..0000000 Binary files a/docs/lessons/product/product-view.png and /dev/null differ diff --git a/docs/lessons/product/product.md b/docs/lessons/product/product.md deleted file mode 100644 index 0996e9e..0000000 --- a/docs/lessons/product/product.md +++ /dev/null @@ -1,117 +0,0 @@ -# 产品-模块配置 - -测试管理系统中需要给测试用例,测试任务,自动化用例归类到产品或者模块。 -那么我们就来进行产品模块分类的开发,主要用来给后续测试用例管理做配置使用。 - -## 产品/模块配置信息 - -一般实际过程中,产品模块配置信息是一个树形结构,也就是: -1. 产品A -2. 产品A下面有模块A,模块B -3. 
模块A下面有子模块A1,A2 - -所以他就是一个树形结构,需要实现的样子是: - -![img.png](product-view.png) - -看起来有点复杂,实现起来呢?***不复杂***,还是只要一个JAVA类 - -## 实现产品/模块配置信息树形结构页面 - -***一个JAVA类实现*** - -```java -@Erupt(name = "产品模块配置", - power = @Power(importable = true, export = true), - tree = @Tree(pid = "parent.id"), - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30 - )) -@Entity -@Table(name = "products") -public class ProductModuleModel extends ModelWithValidFlagVo { - - @ManyToOne - @EruptField( - edit = @Edit( - title = "上级树节点", - type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(pid = "parent.id") - ) - ) - private ProductModuleModel parent; - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String code; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String details; - - @EruptField( - views = @View(title = "类型"), - edit = @Edit( - search = @Search, - title = "获取可选类型", - type = EditType.CHOICE, - desc = "动态获取可选类型", - choiceType = @ChoiceType( - fetchHandler = SqlChoiceFetchHandler.class, - fetchHandlerParams = "select id,name from master_data where category='PRODUCT'" - )) - ) - private String productType; -} -``` - -- 菜单配置 - -## 树形页面的中重点 - -![img.png](product-page.png) - -数字库表结构: - -![img_1.png](table.png) - -这是一张递归结构的表,parent_id是父节点id,id是当前节点id,name是当前节点名称。 - -## 总结 - -层级结构/树形结构页面的实现: -1. Model 定义中有一个 parent的字段,表示父节点id,类型就是这个类本身 -2. 
数据库的递归结构: 通过id和parent.id关联就可以查找所有的子节点,这些数据本身都在一张表里面 - - diff --git a/docs/lessons/product/table.png b/docs/lessons/product/table.png deleted file mode 100644 index 6d89378..0000000 Binary files a/docs/lessons/product/table.png and /dev/null differ diff --git a/docs/lessons/project/README.md b/docs/lessons/project/README.md deleted file mode 100644 index 52e8a39..0000000 --- a/docs/lessons/project/README.md +++ /dev/null @@ -1,8 +0,0 @@ -# Project Management Design - -![img.png](img.png) - -## Project Details - -![img_2.png](img_2.png) -![img_1.png](img_1.png) diff --git a/docs/lessons/project/img.png b/docs/lessons/project/img.png deleted file mode 100644 index 24a6073..0000000 Binary files a/docs/lessons/project/img.png and /dev/null differ diff --git a/docs/lessons/project/img_1.png b/docs/lessons/project/img_1.png deleted file mode 100644 index 48fc1ac..0000000 Binary files a/docs/lessons/project/img_1.png and /dev/null differ diff --git a/docs/lessons/project/img_2.png b/docs/lessons/project/img_2.png deleted file mode 100644 index 54f2394..0000000 Binary files a/docs/lessons/project/img_2.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/defect-requirement-milestone.md b/docs/lessons/qa-products-learning/defect-requirement-milestone.md deleted file mode 100644 index a7eb852..0000000 --- a/docs/lessons/qa-products-learning/defect-requirement-milestone.md +++ /dev/null @@ -1 +0,0 @@ -![img_6.png](img_6.png) \ No newline at end of file diff --git a/docs/lessons/qa-products-learning/env.md b/docs/lessons/qa-products-learning/env.md deleted file mode 100644 index 1e4c5c2..0000000 --- a/docs/lessons/qa-products-learning/env.md +++ /dev/null @@ -1,3 +0,0 @@ -![img_5.png](img_5.png) - - diff --git a/docs/lessons/qa-products-learning/img.png b/docs/lessons/qa-products-learning/img.png deleted file mode 100644 index a2747da..0000000 Binary files a/docs/lessons/qa-products-learning/img.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_1.png b/docs/lessons/qa-products-learning/img_1.png deleted file mode 100644 index c30d6c8..0000000 Binary files a/docs/lessons/qa-products-learning/img_1.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_2.png b/docs/lessons/qa-products-learning/img_2.png deleted file mode 100644 index 497cbbe..0000000 Binary files a/docs/lessons/qa-products-learning/img_2.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_3.png b/docs/lessons/qa-products-learning/img_3.png deleted file mode 100644 index b426401..0000000 Binary files a/docs/lessons/qa-products-learning/img_3.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_4.png b/docs/lessons/qa-products-learning/img_4.png deleted file mode 100644 index 9d3f592..0000000 Binary files a/docs/lessons/qa-products-learning/img_4.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_5.png b/docs/lessons/qa-products-learning/img_5.png deleted file mode 100644 index dadf37e..0000000 Binary files a/docs/lessons/qa-products-learning/img_5.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/img_6.png b/docs/lessons/qa-products-learning/img_6.png deleted file mode 100644 index 1da7a2e..0000000 Binary files a/docs/lessons/qa-products-learning/img_6.png and /dev/null differ diff --git a/docs/lessons/qa-products-learning/test-plan.md b/docs/lessons/qa-products-learning/test-plan.md deleted file mode 100644 index 78f9ce6..0000000 --- 
a/docs/lessons/qa-products-learning/test-plan.md +++ /dev/null @@ -1,12 +0,0 @@ -# Test Plan - -![img_1.png](img_1.png) - -![img_2.png](img_2.png) - -![img_3.png](img_3.png) - - -## Test run - -![img_4.png](img_4.png) diff --git a/docs/lessons/qa-products-learning/workspace.md b/docs/lessons/qa-products-learning/workspace.md deleted file mode 100644 index 5b75176..0000000 --- a/docs/lessons/qa-products-learning/workspace.md +++ /dev/null @@ -1,3 +0,0 @@ -# Workpsace - -![img.png](img.png) \ No newline at end of file diff --git a/docs/lessons/test-plan/test-plan.md b/docs/lessons/test-plan/test-plan.md deleted file mode 100644 index 23c80c7..0000000 --- a/docs/lessons/test-plan/test-plan.md +++ /dev/null @@ -1,8 +0,0 @@ -# Test Plan 测试计划功能 - -测试计划功能包含了: -1. 测试计划的开始/结束时间 -2. 测试计划负责人 -3. 测试计划包含了计划执行的测试用例 -4. 测试计划包含了计划执行的测试场景 - diff --git a/docs/lessons/testcase/bug-list.png b/docs/lessons/testcase/bug-list.png deleted file mode 100644 index 706e1d2..0000000 Binary files a/docs/lessons/testcase/bug-list.png and /dev/null differ diff --git a/docs/lessons/testcase/dashboard.png b/docs/lessons/testcase/dashboard.png deleted file mode 100644 index 302658d..0000000 Binary files a/docs/lessons/testcase/dashboard.png and /dev/null differ diff --git a/docs/lessons/testcase/edit-test-case.png b/docs/lessons/testcase/edit-test-case.png deleted file mode 100644 index d64bd37..0000000 Binary files a/docs/lessons/testcase/edit-test-case.png and /dev/null differ diff --git a/docs/lessons/testcase/img.png b/docs/lessons/testcase/img.png deleted file mode 100644 index 0cc40b0..0000000 Binary files a/docs/lessons/testcase/img.png and /dev/null differ diff --git a/docs/lessons/testcase/test-case-detail.png b/docs/lessons/testcase/test-case-detail.png deleted file mode 100644 index f930d4d..0000000 Binary files a/docs/lessons/testcase/test-case-detail.png and /dev/null differ diff --git a/docs/lessons/testcase/test-case-menu.png b/docs/lessons/testcase/test-case-menu.png deleted file mode 100644 index ccff4d6..0000000 Binary files a/docs/lessons/testcase/test-case-menu.png and /dev/null differ diff --git a/docs/lessons/testcase/test-case-repo.png b/docs/lessons/testcase/test-case-repo.png deleted file mode 100644 index 499ea33..0000000 Binary files a/docs/lessons/testcase/test-case-repo.png and /dev/null differ diff --git a/docs/lessons/testcase/test-case-review.png b/docs/lessons/testcase/test-case-review.png deleted file mode 100644 index c64b00a..0000000 Binary files a/docs/lessons/testcase/test-case-review.png and /dev/null differ diff --git a/docs/lessons/testcase/test-execution.png b/docs/lessons/testcase/test-execution.png deleted file mode 100644 index 97ca58b..0000000 Binary files a/docs/lessons/testcase/test-execution.png and /dev/null differ diff --git a/docs/lessons/testcase/test-plan.png b/docs/lessons/testcase/test-plan.png deleted file mode 100644 index c2d0f29..0000000 Binary files a/docs/lessons/testcase/test-plan.png and /dev/null differ diff --git a/docs/lessons/testcase/test-report.png b/docs/lessons/testcase/test-report.png deleted file mode 100644 index 5ab31a4..0000000 Binary files a/docs/lessons/testcase/test-report.png and /dev/null differ diff --git a/docs/lessons/testcase/test-scenarios.md b/docs/lessons/testcase/test-scenarios.md deleted file mode 100644 index 710c64b..0000000 --- a/docs/lessons/testcase/test-scenarios.md +++ /dev/null @@ -1,8 +0,0 @@ -# 测试场景 - -测试场景有一系列的测试用例组成,所以他包含了不同关联的测试用例. - -使用场景如下: -1. 如果进行模式回归测试,可以选择不同模块的P1/P2的测试用例加入到测试场景,进行测试计划排期 -2. 
如果某次测试需要复用已有的测试用例进行测试计划排期,可以直接引用 - diff --git a/docs/lessons/testcase/testcase-upload.md b/docs/lessons/testcase/testcase-upload.md deleted file mode 100644 index cdfafcc..0000000 --- a/docs/lessons/testcase/testcase-upload.md +++ /dev/null @@ -1,16 +0,0 @@ -# 批量上传测试用例 - -实际情况: -1. 通过EXCEL编写测试用例批量上传或者REVIEW -2. 通过脑图编写测试要点批量上传或者REVIEW - -## 快速实现 - -辅助处理EXCEL/Mingmap的库 - -## 完成页面 - -1. 测试用例上传 -2. 测试要点上传 -3. 测试用例批量上传 -4. 测试要点批量 \ No newline at end of file diff --git a/docs/lessons/testcase/testcase.md b/docs/lessons/testcase/testcase.md deleted file mode 100644 index d118db1..0000000 --- a/docs/lessons/testcase/testcase.md +++ /dev/null @@ -1,194 +0,0 @@ -# 极简测试用例管理实现-用例增删改查 - -关于极简系列的说明: - -> 难者不会,会者不难 - -事实: 不管这个事情难还是容易,如果你不会,你就是不会; 这个和这个事情有没有技术含量没有关系 -写代码: 不管写代码难不难,你如果不能熟练快速的写出一些可以用的,你就是不会;这和写代码时候有技术含量没关系 -技术含量: 技术含量这个词没有任何意义,就是一个主观判断,会的人觉得没技术含量,不会的人觉有有技术含量,看着简单不代表没有技术含量 -熟练: 熟练也是技术含量,熟练就是生产力,生产力提高就是有技术含量 - -极简系列的目的: - -- 极简系列教程只用最简单的说明,讲述如何实现,不存在任何技术含量. -- 极简系列教程主要的价值就是是个人二次实践和二次加工过的内容,能起到的作用就是上你看文档实践,作出一个东西的实践少点 -- 没有不得了的技术,但是可以降低你做一个能用东西的成本 - - -## 测试用例用例增删改查 - -测试用例管理是测试平台常见功能,Metersphere技术用例功能 - -- 测试用例模块 - -![img.png](test-case-repo.png) - -- 用例详细的添加编辑 -![img.png](test-case-detail.png) - -总结以上功能主要实现是: -1. 测试用例的增删改查 -2. 测试用例关联到产品/模块/测试用例集合 - -问题: 如果单独实现这样一个刚刚够用的类似功能(前端交互一般),需要多久? -答案: 4个小时足够,如果做参考下面 - -## 自制测试用例管理页面 - -完成以下两个页面基本就完成Metersphere的测试用例管理中的测试增删改查和 -测试用例模块关联的功能了. - -- 用例展示 -![img.png](img.png) -- 用例编辑 -![img_1.png](edit-test-case.png) - - -## 30分钟实现 - -如果要实现以上功能如果一个熟练工需要话多久: 30分钟实现差不多, -如何实现: -1. **还是一个JAVA类** 对应测试用例的数据库表 -2. **这个JAVA类中关联产品模块**表 -3. 系统配置菜单 - -以上三步完成就能完成一个极简测试管理用例. - -## 最终的JAVA类 - -```java -@Data -@Entity -@Erupt(name = "测试用例", - power = @Power(export = true), - orderBy = "TestCase.updateTime desc", - linkTree = @LinkTree(field = "module")) -@Table(name = "test_cases") -public class TestCase extends ModelWithValidFlagVo { - - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "产品名称", column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel module; - - @EruptField( - views = @View( - title = "用例ID" - ) - ) - private String uuid = UUID.randomUUID().toString(); - - @EruptField( - views = @View( - title = "用例描述" - ), - edit = @Edit( - title = "用例描述", - type = EditType.INPUT, notNull = true, inputType = @InputType(fullSpan = true) - ) - ) - private String summary; - - @EruptField( - views = @View( - title = "用例优先级" - ), - edit = @Edit( - title = "用例优先级", - type = EditType.CHOICE, - choiceType = @ChoiceType( - fetchHandler = SqlChoiceFetchHandler.class, - fetchHandlerParams = {"select distinct code " + - "from master_data where category ='优先级' and valid =true"} - ) - ) - ) - private String priority = "P2"; - - @EruptField( - views = @View( - title = "用例前提条件" - ), - edit = @Edit( - title = "用例前提条件", - type = EditType.TEXTAREA) - ) - private String precondition; - - @EruptField( - views = @View( - title = "测试步骤" - ), - edit = @Edit( - title = "测试步骤", - type = EditType.TEXTAREA - ) - ) - private String steps; - @EruptField( - views = @View( - title = "用例期望结果" - ), - edit = @Edit( - title = "用例期望结果", - type = EditType.TEXTAREA - ) - ) - private String expectedResult; -} -``` - -是的其实就是这样一个类就完成了所有的功能,并不会有任何夸张. - -## 关于这个类的说明 - -这个类的说明其实为: -1. 测试用例需要记录哪些字段,这个例子中使用的是最简单的字段 -2. 
每个字段使用什么页面元素展示,怎么用JAVA来表示,这个例子中主要有以下几种: - * EditType.INPUT: 文本框, 比如用例描述这个字段 - * EditType.CHOICE: 下拉框, 比如优先级这个字段, 问题:下拉选项从哪里来? - * linkTree = @LinkTree(field = "module"): 这个表示树形结构 - * EditType.REFERENCE_TREE: 树形结构, 比如模块这个字段, 问题:树形结构的数据从哪里来? -3. 关于两表关联, 使用了如下注解,这些注解在实际情况如何功能 - ```java - @ManyToOne - @JoinColumn(name = "product_id") - ``` - -如果你对以上问题都比较熟练了,那么做这个功能可能只需要20分钟,就是设计以下表结构. - -## 最后运行程序配置菜单 - -![img_1.png](test-case-menu.png) - -自此类似于一个Meterspher的测试用例增删改查,关联的功能就完成了. -你说这个测试有什么用,目前看其实没有实际意义,因为实际情况是: -1. 不会怎么写用例,用例一般是脑图或者EXCEL批量完成 -2. 即使保存下来了,那么如何更新呢? - -那么做这么一个功能的好处是什么: -1. 作为一个最基础功能 -2. 有了一个统一保存测试用例的地方 -3. 可以在上面进行优化,来解决如何让批量的用例进入系统和进行方便更新 -4. 可以在上面叠加功能,比如: - - 测试用例REVIEW - - 一个大功能迭代可以将保存的测试用例精确分工,将用例分配到人 - - 根据测试用例的分配情况进行测试进度跟踪,可以追踪到具体的人 - - 可以在这个基础上进行一些统计 - - 可以用测试用例的方式记录需求,让需求不只是在人的脑子里面 - - 可能还有不少 - -## 下集预告 - -***上传脑图和EXCEL批量添加测试用例的功能*** diff --git a/docs/lessons/tips/gamma.md b/docs/lessons/tips/gamma.md deleted file mode 100644 index d814a04..0000000 --- a/docs/lessons/tips/gamma.md +++ /dev/null @@ -1,4 +0,0 @@ -## Gamma APP - -![img.png](img.png) -![img_1.png](img_1.png) \ No newline at end of file diff --git a/docs/lessons/tips/img.png b/docs/lessons/tips/img.png deleted file mode 100644 index 12a0252..0000000 Binary files a/docs/lessons/tips/img.png and /dev/null differ diff --git a/docs/lessons/tips/img_1.png b/docs/lessons/tips/img_1.png deleted file mode 100644 index b955c62..0000000 Binary files a/docs/lessons/tips/img_1.png and /dev/null differ diff --git a/docs/more/1977 - Lamport - Concurrent Reading and Writing.pdf b/docs/more/1977 - Lamport - Concurrent Reading and Writing.pdf deleted file mode 100644 index 2842653..0000000 Binary files a/docs/more/1977 - Lamport - Concurrent Reading and Writing.pdf and /dev/null differ diff --git a/docs/more/2010 - Pisa - SPSC Queues on Shared Cache Multi-Core Systems.pdf b/docs/more/2010 - Pisa - SPSC Queues on Shared Cache Multi-Core Systems.pdf deleted file mode 100644 index e2d6ce2..0000000 Binary files a/docs/more/2010 - Pisa - SPSC Queues on Shared Cache Multi-Core Systems.pdf and /dev/null differ diff --git a/docs/more/2011 - Dice - MultiLane - A Concurrent Blocking Multiset.pdf b/docs/more/2011 - Dice - MultiLane - A Concurrent Blocking Multiset.pdf deleted file mode 100644 index a6ab0af..0000000 Binary files a/docs/more/2011 - Dice - MultiLane - A Concurrent Blocking Multiset.pdf and /dev/null differ diff --git a/docs/more/2011 - Technion - CAFE - Scalable Task Pools with Adjustable Fairness and Contention.pdf b/docs/more/2011 - Technion - CAFE - Scalable Task Pools with Adjustable Fairness and Contention.pdf deleted file mode 100644 index 10cf02c..0000000 Binary files a/docs/more/2011 - Technion - CAFE - Scalable Task Pools with Adjustable Fairness and Contention.pdf and /dev/null differ diff --git "a/docs/more/2012 - Junchang- BQueue- Ef\357\254\201cient and Practical Queuing.pdf" "b/docs/more/2012 - Junchang- BQueue- Ef\357\254\201cient and Practical Queuing.pdf" deleted file mode 100644 index 44fe633..0000000 Binary files "a/docs/more/2012 - Junchang- BQueue- Ef\357\254\201cient and Practical Queuing.pdf" and /dev/null differ diff --git a/docs/more/2012 - Salzburg - Fast and Scalable k-FIFO Queues.pdf b/docs/more/2012 - Salzburg - Fast and Scalable k-FIFO Queues.pdf deleted file mode 100644 index 02a3716..0000000 Binary files a/docs/more/2012 - Salzburg - Fast and Scalable k-FIFO Queues.pdf and /dev/null differ diff --git a/docs/more/2012 - Technion - SALSA - NUMA-aware Algorithm for Producer-Consumer Pools.pdf 
b/docs/more/2012 - Technion - SALSA - NUMA-aware Algorithm for Producer-Consumer Pools.pdf deleted file mode 100644 index 2fce738..0000000 Binary files a/docs/more/2012 - Technion - SALSA - NUMA-aware Algorithm for Producer-Consumer Pools.pdf and /dev/null differ diff --git a/docs/more/2013 - Afek - Fast Concurrent Queues for x86 Processors.pdf b/docs/more/2013 - Afek - Fast Concurrent Queues for x86 Processors.pdf deleted file mode 100644 index 392cb1e..0000000 Binary files a/docs/more/2013 - Afek - Fast Concurrent Queues for x86 Processors.pdf and /dev/null differ diff --git a/docs/more/2013 - Salzburg - Distributed Queues in Shared Memory.pdf b/docs/more/2013 - Salzburg - Distributed Queues in Shared Memory.pdf deleted file mode 100644 index cbc12a5..0000000 Binary files a/docs/more/2013 - Salzburg - Distributed Queues in Shared Memory.pdf and /dev/null differ diff --git a/docs/more/2014 - Afek - Fence-Free Work Stealing on Bounded TSO Processors.pdf b/docs/more/2014 - Afek - Fence-Free Work Stealing on Bounded TSO Processors.pdf deleted file mode 100644 index 0fcfebb..0000000 Binary files a/docs/more/2014 - Afek - Fence-Free Work Stealing on Bounded TSO Processors.pdf and /dev/null differ diff --git a/docs/more/api/_data/design.yml b/docs/more/api/_data/design.yml deleted file mode 100644 index 18f6158..0000000 --- a/docs/more/api/_data/design.yml +++ /dev/null @@ -1,1502 +0,0 @@ ---- - -info: - title: Lifecycle Visualization - description: - body: This is an opinionated visual and accompanying outline to help demonstrate what a complete digital product can be when you adopt an API-first strategy, providing a common approach to defining the API lifecycle, governance, tooling, and which roles are needed to move APIs forward. - tip: This is one approach to delivering a consistent API across a common lifecycle while considering governance, and the roles in place. This blueprint will not reflect the needs of every organization, but provides a nice template to consider as you are building your own strategy. - type: - body: HTTP / Web / REST API - tip: This lifecycle has been validated against existing web APIs, providing your most simplest, but also most broad approach to how you can do APIs. - approach: - body: Design-Led - tip: This blueprint is opinionated about following a design-led approach to delivering an API, iterating on the contract amongst stakeholders before any code gets written. - visibility: - body: Public - tip: This is a blueprint for a publicly available API, and we should take into consideration what happens when anyone around the world can access it. - maturity: - body: Production - tip: This blueprint is meant to cover the needs of an API in production, requiring the most comprehensive quality, security, and governance. - lifecycle: - image: producer-consumer-lifecycle.png - tip: Forward motion with any API requires a balanced relationship between the API producer and the consumer, let's stand this lifecycle on its side and walk through all of the elements. -# Producer Lifecycle -producer: - - name: Define - tip: Each API should be well-defined by a mix of business and technical stakeholders. - type: business - elements: - - name: Domains - tip: Each API should be part of a well-defined domain that is in alignment with business goals. - text: What domain will an API be operating in? Define the vocabulary, standards, and other patterns that developers at design and development teams will use. 
- tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: Regions - tip: Which regions an API will be running in should be well-defined from the beginning. - text: Identify the region(s) where an API will operate so that you can comply with regulations and other business requirements and ensure that APIs are as close to consumers as possible. - tools: - one: - name: AWS API Gateway - description: Using AWS API Gateway to deploy APIs into different regions around the world. - link: https://aws.amazon.com/api-gateway/ - two: - three: - four: - - name: Requirements - tip: The business, security, and other requirements for an API should always be well-defined. - text: What are the requirements for the API? You need to define the business value it will bring to help you guide development and operation. - tools: - one: - name: Github - description: Leveraging the Github README to outline the technical and business requirements. - link: https://github.com - two: - three: - four: - - name: Workspaces - tip: Each API should have a dedicated workspace where you can stay up to speed on what is happening. - text: Set up the workspaces where teams will be designing, developing, and managing APIs, then iterating upon them and managing multiple versions. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: Source - tip: Each API should have a source control repository, ensuring all artifacts and code are available. - text: Ensuring that every API is using source control and has a repository for managing code, but also API artifacts, and other elements needed to manage the API lifecycle. - tools: - one: - name: Github - description: Setting up a GitHub repository within the organization for the enterprise, domain, or team, providing a place for the API source of truth. - link: https://github.com - two: - three: - four: - - name: Teams - tip: The teams behind an API should be well-defined and documented as part of the API lifecycle. - text: Line up who will be working on an API, bringing together designers, developers, technical writers, QA specialists, and other roles who will be involved in moving your APIs forward. - tools: - one: - name: Postman - description: Postman provides the ability to manage API teams, organizing API operations across domains and teams. - link: https://postman.com - two: - three: - four: - - name: Stakeholders - tip: All business and technical stakeholders should be identified and made part of the process. - text: Identify business and technical stakeholders, including any external partners and consumers who might need to be involved. - tools: - one: - name: Postman - description: Private, partner, and public networks and workspaces open up the ability to work with all stakeholders for an API. - link: https://postman.com - two: - three: - four: - - name: Design - tip: This is where we design an aPI before we get to work writing any code. - type: business - elements: - - name: Source - tip: Source control should be used as the foundation for designing API contracts. - text: Using source control to manage the artifacts involved with the API design process, providing a single source of truth that grounds design across teams. 
- tools: - one: - name: Github - description: Cloning and syncing with the GitHub repository, keeping all API contracts as part of source control for use across the lifecycle. - link: https://example.com - two: - three: - four: - - name: Contracts - tip: Each API should have a well-defined and complete machine readable contract available. - text: Put contracts like OpenAPI and AsyncAPI to work in defining the surface area of each API, providing a machine-readable contract to guide teams’ work. - tools: - one: - name: Postman - description: The Postman API builder allows you to manage the contracts you are designing as part of delivering APIs. - link: https://postman.com - two: - name: OpenAPI - description: The OpenAPI specification is used to provide a contract that guides, and defines the design of this HTTP / Web / REST API being developed. - link: https://example.com - three: - four: - - name: Schema - tip: The schema for all objects that an API depends on should be part of the contracts. - text: Establish a schema for all of the objects used as part of requests, responses, publishing, and subscribing when integrating an API into applications. - tools: - one: - name: Postman - description: In API builder you can manage the JSON Schema via OpenAPI contracts used for APIs, or simply used to manage the schema using the domain pattern. - link: https://postman.com - two: - name: JSON Schema - description: The JSON Schema specification is used to define the models applied in API requests and responses, allowing them to be validated. - link: https://example.com - three: - four: - - name: Reference - tip: Reference documentation should exist for each API. - text: Generating reference documentation using the API contract, providing a complete set of documentation for everything that is possible with an API. - tools: - one: - name: Postman - description: Postman is used to generate the reference documentation from the OpenAPI contract, keeping a complete reference available for each API. - link: https://postman.com - two: - name: OpenAPI - description: The OpenAPI specification is used as the source of truth for reference documentation, allowing it to be auto generated as needed. - link: https://example.com - three: - name: Collection - description: The Postman collection is used as a generated, portable, and sharable set of reference documentation for each API being developed. - link: https://example.com - four: - - name: Mocks - tip: Mock servers should be used as part of the API design process. - text: Generate a mock representation of an API using its contract, providing an example of what the API will do in production. That will help with the design process and later on, with onboarding. - tools: - one: - name: Postman - description: You can use Postman to generate mock servers, leveraging OpenAPI as the source of truth, and collections as derivative of the truth. - link: https://postman.com - two: - name: OpenAPI - description: The OpenAPI specification is used as the source of truth for mock servers, allowing it to be auto generated as needed. - link: https://openapis.org - three: - name: Collection - description: postman collections are used as a container for each mocked representation, providing portable and sharable virtualized instances of an API. - link: https://postman.com - four: - - name: Variables - tip: The common values, secrets and other elements of APIs should be abstracted as variables. 
- text: Variables should be used to abstract away common patterns and secrets that are used across reference documents, mock servers, and eventually API testing. - tools: - one: - name: Postman - description: You can develop and apply different scopes of variables using Postman, which can be applied across the API lifecycle across environments. - link: https://postman.com - two: - name: Collection - description: Postman collections can utilize global, collection, and environment variables to apply common values and secrets where APIs are being used. - link: https://example.com - three: - four: - - name: Environments - tip: A machine readable environment should be available for all API environments. - text: Environments allow development, staging, production, and other environments needed to develop and deliver APIs in a machine readable and executable way. - tools: - one: - name: Postman - description: Postman environments are used to define the environments teams will use to deliver APIs, standardizing them across APIs being produced. - link: https://postman.com - two: - three: - four: - - name: Develop - tip: The development stage of the API lifecycle should be well-defined, even if it is happening locally. - type: technical - elements: - - name: Source - tip: Source control is the base of local development and provides access to the API source truth. - text: The source control repository is cloned locally, providing access to artifacts that are produced as part of the API design process to develop API. - tools: - one: - name: Github - description: The GitHub repository is used to keep API artifacts synced, so that they can be used throughout the development process. - link: https://github.com - two: - three: - four: - - name: IDE - tip: Developers love their IDE, and will be using it to develop APIs locally. - text: An integrated development environment is used to develop the code and other artifacts that will be used as part of deploying each API. - tools: - one: - name: Visual Studio - description: VSCode is used to develop the code and artifacts behind each API, cloning the repository used as the source of truth. - link: https://code.visualstudio.com/ - two: - three: - four: - - name: CLI - tip: The command line interface (CLI) is an integral part of the local development of APIs. - text: A Command Line Interface (CLI) can be used as part of local development, making Postman artifacts and capability as part of the development process. - tools: - one: - name: Postman - description: The Postman CLI is used as part of the development process, making the platform available locally. - link: https://postman.com - two: - three: - four: - - name: Server - tip: The server behind the API will need to be generated and then developed. - text: Establish a baseline for the underlying API compute, choosing among virtual servers, containers, and serverless to power each API. - tools: - one: - name: AWS Lambda - description: Using AWS Lambda as the serverless backend server for the API, providing scalable compute to power the API. - link: https://aws.amazon.com/lambda/ - two: - three: - four: - - name: Gateway - tip: The gateway will need to be set up and configured for deploying the API. - text: Set up the gateway to prepare for an API deployment. Do the initial work to set up everything needed at the gateway layer, shifting as much of this work as far to the left in the life cycle as possible. 
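The Server element above settles on AWS Lambda as the serverless backend behind each API. As a minimal sketch of what such a backend can look like (the class name, route, and response body are illustrative placeholders, not part of this blueprint), an API Gateway proxy handler in Java might be:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;

import java.util.Map;

// Hypothetical handler wired behind AWS API Gateway; the payload shown here is illustrative only.
public class HelloApiHandler implements RequestHandler<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> {

    @Override
    public APIGatewayProxyResponseEvent handleRequest(APIGatewayProxyRequestEvent request, Context context) {
        // Log which operation was invoked; real handlers would dispatch per path/method.
        context.getLogger().log("Handling " + request.getHttpMethod() + " " + request.getPath());
        return new APIGatewayProxyResponseEvent()
                .withStatusCode(200)
                .withHeaders(Map.of("Content-Type", "application/json"))
                .withBody("{\"message\":\"hello from the serverless backend\"}");
    }
}
```

In this blueprint the CI/CD pipeline later packages and publishes code like this behind each individual API operation.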
- tools: - one: - name: AWS API Gateway - description: The AWS API Gateway is used for development, staging, and production APIs, providing a regional gateway for accessing each API. - link: https://aws.amazon.com/api-gateway/ - two: - three: - four: - - name: Test - tip: All APIs should be tested, shifting the process as far left in the lifecycle as possible. - type: technical - elements: - - name: Contract - tip: The contract for each API should be used as the source of truth for testing. - text: Use OpenAPI and AsyncAPI contracts to ensure that 100% of the surface area of an API is tested and behavior reflects the contract between producer and consumer. - tools: - one: - name: Postman - description: Postman is used to test the contract for each API. - link: https://postman.com - two: - name: Collection - description: Contract test collections are generated for testing aPIs. - link: https://postman.com - three: - four: - - name: Performance - tip: Performance tests should exist for each API to meet SLAs. - text: Test specific paths for each API in multiple regions to make sure the API, gateway, and network provide the desired performance. - tools: - one: - name: Postman - description: Postman is used to test the performance of each API. - link: https://postman.com - two: - name: Collection - description: Performance test collections are generated for testing APIs. - link: https://postman.com - three: - four: - - name: Monitors - tip: Contract and performance testing should be monitored on a schedule. - text: Schedule testing monitors to run on a schedule reflecting the business needs of the API, but also the type of test being run, allowing teams to automate testing. - tools: - one: - name: Postman - description: Postman is used to generate monitors across the tests being automated. - link: https://postman.com - two: - name: Collection - description: A monitor can be used to schedule the run of each test across regions. - link: https://postman.com - three: - four: - - name: Secure - tip: The security of APIs should be shifted left as far as possible in the API lifecycle. - type: technical - elements: - - name: Authentication - tip: Standardized authentication for APIs should always be used. - text: Authentication helps ensure APIs are accessed only by those who should have access, allowing API producers and consumers to easily apply rules consistently. - tools: - one: - name: AWS API Gateway - description: Using AWS API Gateway to deploy APIs into different regions around the world. - link: https://aws.amazon.com/api-gateway/ - two: - three: - four: - - name: Authorization - tip: The fine-grain access of digital resources should be considered. - text: Once a user is authenticated, the authorization layer will make sure they only have access to approved resources. - tools: - one: - name: AWS API Gateway - description: Using AWS API Gateway to deploy APIs into different regions around the world. - link: https://aws.amazon.com/api-gateway/ - two: - three: - four: - - name: Environments - tip: Environments help standardize how we manage API environments. - text: Have a solid map of the development, staging, and production environments across all APIs in operation. That will help you manage API deployment more consistently. - tools: - one: - name: Postman - description: Postman environments are used to define and execute across development, staging, and production. 
- link: https://postman.com - two: - three: - four: - - name: RBAC - tip: ROle-based access control should be applied to APIs and operations. - text: Role-based access controls should be applied at the authorization layer of an API and to the API operations around it, helping govern who has access to operations. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: Fuzzing - tip: The entire surface area of APIs should be fuzzed as part of the life cycle. - text: Providing invalid, unexpected, or random data as inputs to an API as part of manual and automated testing, helping check for the most common vulnerabilities. - tools: - one: - name: Postman - description: You can use Postman combined with network partners who offer API-driven fuzzing solutions. - link: https://postman.com - two: - name: Collection - description: Fuzzing collections can be defined and used to test for common vulnerabilities across APis. - link: https://example.com - three: - four: - - name: Deploy - tip: This is when an API is being deployed in a repeatable way across teams. - type: technical - elements: - - name: Source - tip: Source control is the foundation of how APIs are being deployed. - text: Use source control to manage code and artifacts used to deploy an API, providing a single location where you can find everything behind each version, ideally with multiple branches to accommodate many API contributions. - tools: - one: - name: Github - description: The GitHub repository is used as the source of truth when deploying each API. - link: https://github.com - two: - three: - four: - - name: Artifacts - tip: All artifacts should be synced and made available via source control. - text: OpenAPI, JSON Schema, and collections are committed to the source control so that they can be used as part of the CI/CD pipelines, but also anywhere else across the API lifecycle. - tools: - one: - name: Github - description: A folder is created within the GitHub repository for storing artifacts. - link: https://github.com - two: - name: OpenAPI - description: OpenAPI contracts for each API are committed to the Github repository. - link: https://example.com - three: - name: Collection - description: Collections for each API are committed to the Github repository. - link: https://example.com - four: - name: JSON Schema - description: JSON Schema are published to the GitHub repository as part of OpenAPI contracts. - link: https://example.com - - name: CI/CD - tip: CI/CD is used to deploy and test APIs as they are being built. - text: The pipeline ensures that the deployment of an API to each stage is as repeatable as possible, with tests and other essential needs of the API build process making API deployment as repeatable as possible across teams. - tools: - one: - name: Github - description: GitHub actions are used to make the deployment of each API repeatable. - link: https://github.com - two: - three: - four: - - name: Environments - tip: Machine readable environments are needed to apply testing and policies. - text: Apply commonly managed environments, with a coordinated variable strategy for testing and automating configuration as part of the pipeline, helping to abstract away the technical details and secrets of API environments. - tools: - one: - name: Postman - description: Staging and production environments are used as part of the CI/CD pipeline. 
- link: https://postman.com - two: - three: - four: - - name: Server - tip: The server code will get deployed as part of the build process. - text: The CI/CD pipeline publishes the code behind each individual API operation, levering a serverless backend for this API, helping automate and scale the backend resources needed to power APIs. - tools: - one: - name: AWS Lambda - description: AWS Lambda is leveraged to ensure the API is scalable, providing an elastic backend for each aPI. - link: https://example.com - two: - three: - four: - - name: Gateway - tip: The gateway configuration will be updated and deployed. - text: Publish contracts, extensions, and other configurations to the API gateway, deploying an API into a staging, then a production environment if all tests pass in the pipeline. This gives you a repeatable way of managing gateways. - tools: - one: - name: AWS API Gateway - description: AWS API Gateway is being used as the gateway for each API, ensuring deployment is automated. - link: https://example.com - two: - three: - four: - - name: Contract Test - tip: Contract test collections will be executed. - text: Contract tests are executed for staging and production environments, ensuring that the contract for each API is met in staging before it is published to production for consumers to use. - tools: - one: - name: Postman - description: Postman is used to execute contract tests as part of the CI/CD pipeline. - link: https://postman.com - two: - name: Collection - description: Contract test collections are executed within the CI/CD pipeline. - link: https://postman.com - three: - four: - - name: Security Test - tip: Security test collections will be executed. - text: Security tests are executed for staging and production environments, ensuring that there are no vulnerabilities for each API that is met in staging before it is published to production for consumers to use. - tools: - one: - name: Postman - description: Postman is used to execute security tests as part of the CI/CD pipeline. - link: https://postman.com - two: - name: Collection - description: Security test collections are executed within the CI/CD pipeline. - link: https://postman.com - three: - four: - - name: Observe - tip: The observability of APIs is important to understand the state of each API. - type: business - elements: - - name: Usage - tip: The usage of APIs needs to be readily observable by teams. - text: The usage of APIs by consumers is observable, leveraging the gateway to understand which APIs are accessed, how much consumers are using APIs, and what errors and other considerations are. - tools: - one: - name: AWS API Gateway - description: Usage data is provided by the AWS API Gateway logging, helping define user access to APis. - link: https://example.com - two: - three: - four: - - name: Monitors - tip: Monitors should exist for all collections, scheduling their execution. - text: Establish monitors for all contract, performance, security, and governance tests. These tests provide the results you need to understand the state of APIs and API operations. Use collections to define all of the outputs you need to understand the state of your APIs. - tools: - one: - name: Postman - description: Postman monitors are used to observe each element of the producer lifecycle and routed into APM, messaging, CDNs, and other infrastructure. 
- link: https://postman.com - two: - name: Collection - description: Collections are the executable unit of execution that can be used by each monitor to execute the desired business outcomes. - link: https://example.com - three: - four: - - name: APM - tip: All collection runs can be piped into APM solutions for wider observability. - text: Route all outputs across API operations into your existing APM solutions, tapping every output across the API life cycle to understand the health and state of the platform through the infrastructure you have already invested in. - tools: - one: - name: New Relic - description: All monitors have the option to route run results into New Relic, providing dashboards for each API. - link: https://example.com - two: - three: - four: - - name: Notifications - tip: Teams can receive notifications via web, desktop, and email about API operations. - text: Use notifications and alerts to observe changes with each API, as well as events that occur across the life cycle. They will help you understand consumer activity, as well as what is happening across teams and other stakeholders. - tools: - one: - name: Postman - description: Notifications regarding regular API activity are made available via desktop, web, and email notifications. - link: https://postman.com - two: - three: - four: - - name: Reports - tip: Reports are available to understand teams, APIs, and operations. - text: Provide team, API, documentation, testing, and other reporting, showing what teams are doing across API operations and how the lifecycle is unfolding across teams. Use native platform reporting that speaks specifically to API operations. - tools: - one: - name: Postman - description: Postman provides a rich set of documentation for observing the usage and activity across aPis. - link: https://postman.com - two: - three: - four: - - name: Watches - tip: Teams can watch APIs and collections to understand operations. - text: API engagement can be observed using API and collection watches, tracking how API consumers are watching the APIs, documentation, mocks, testing, and other aspects of API operations. - tools: - one: - name: Postman - description: All APIs and collections within Postman workspaces are able to be watched. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPI contracts can be watched using API builder. - link: https:/openapis.org - three: - name: Collection - description: Postman collections can be watched within workspaces. - link: https://postman.com - four: - - name: Forks - tip: The number of forks paint a picture about what is happening. - text: Any Postman collection can be forked by consumers, allowing documentation, mock servers, tests, and other types of collecitons to be forked and used as part of integration and automations. - tools: - one: - name: Postman - description: Postman provides watch counts, and shows consumers who are watching for API producers. - link: https://postman.com - two: - name: Collection - description: Each collection has a watch count, allowing consumers to engage with elements of API operations. - link: https://postman.com - three: - four: - - name: Feedback - tip: The feedback loop for APIs is important for iterating and evolving APIs. - text: Feedback loops connecting API consumers with API producers allow teams to gather feedback about how well APIs are meeting the needs of consumers, providing valuable information that can be used to iterate upon each API. 
- tools: - one: - name: Github - description: GitHub issues are used to gather and organize feedback from consumers. - link: https://github.com - two: - name: OpenAPI - description: Platform comment capabilities are used to gather feedback on OpenAPI. - link: https://example.com - three: - four: - name: Collection - description: postman comment capabilities are used to gather feedback on collections. - link: https://example.com - - name: Distribute - tip: Once ready, APIs need to be properly distributed to consumers for use. - type: business - elements: - - name: Network - tip: Networks connect producers and consumers across API operations. - text: APIs should be published to the private, partner, and public networks where API consumers are, via the platforms they are already using. Tap into existing network effects for API consumers so your developers can meet them where they already are. - tools: - one: - name: Postman - description: Postman networks are used to distribute APIs to consumers. - link: https://postman.com - two: - name: Collection - description: Collections help make onboarding with APIs via networks easier. - link: https://example.com - three: - four: - - name: Portal - tip: APIs should be published to relevant private, partner, or public portals. - text: An API deployed into production should be published to the central portal, providing centralized access for internal or external consumers through a single doorway that can be supported as part of overall API operations. - tools: - one: - name: Postman - description: Postman networks are used to distribute APIs to consumers. - link: https://postman.com - two: - name: Collection - description: Collections help make onboarding with APIs via networks easier. - link: https://example.com - three: - four: - - name: Documentation - tip: Reference documentation should be distributed for all APIs. - text: Reference documentation is published for the API, leveraging OpenAPI contracts to generate reference collections that provide a complete picture of what is possible for consumers. - tools: - one: - name: Postman - description: Postman is used to generate and publish documentation. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPI is used as the source of truth for documentation. - link: https://example.com - three: - four: - name: Collection - description: Reference collections are used to publish all documentation. - link: https://postman.com - - name: Client SDKs - tip: Client SDKs and snippets should always be available for APIs by default. - text: Generating client SDKs in a variety of programming languages, helping do the redundant work for your consumers when it comes to integration. - tools: - one: - name: Postman - description: Postman is used to produce client snippets or SDKs. - link: https://postman.com - two: - name: OpenAPI - description: The OpenAPI contract can be used as code generation source. - link: https://example.com - three: - four: - name: Collection - description: Workflow collections can be used to produce SDKs. - link: https://postman.com - - name: Buttons - tip: Embeddbable buttons make onboarding with APis easier. - text: The documentation, tests, and workspace behind an API should be made available via blog posts, videos, wikis, and other resources to support APIs. Use embeddable and actionable buttons that consumers can activate with one click. - tools: - one: - name: Postman - description: Postman is used to produce the HTML / JavaScript buttons. 
- link: https://postman.com - two: - name: Collection - description: Buttons are associated with specific API collections. - link: https://example.com - three: - four: - - name: Blog Post - tip: Each API should enjoy an internal, partner, or private blog post. - text: Each API should at least have one blog post published, helping announce the availability of each API, ensuring that business and technical are made aware of the value that is available. - tools: - one: - name: Postman - description: The Postman blog is used to publish blog posts to the community. - link: https://postman.com - two: - three: - four: - - name: Video - tip: A walk-through video should be recorded to show the value of an API. - text: Each API should have at least one video that helps walk through documentation, onboarding, and using the API as part of workflows, applications, and integrations. - tools: - one: - name: Youtube - description: YouTube is used for publishing API videos and making them available to consumers. - link: https://youtube.com - two: - three: - four: - - name: Education - tip: Educational resources should be made available across all aPIs. - text: Contracts, collections, blog posts, videos, and other sources should be made available via workspaces and repositories, helping ensure the required education for consumers is available. - tools: - one: - name: Postman - description: The Postman blog is used to publish blog posts to the community. - link: https://postman.com - two: - three: - four: - - - -# Roles involved in the life cycle -roles: - - name: Define - tip: The API definition should be a partnership between business and IT. - type: business - elements: - - name: Product Manager - tip: Every API should have a product owner in charge of the definition. - text: The product manager or owner is leading the definition of each API. - - name: Software Architect - tip: The big picture should be defined by software architects. - text: Software architects are providing the support the product managers need. - - name: Design - tip: Not all developers will have the design skills needed. - type: business - elements: - - name: Software Architect - tip: Software architects should lay the foundation for design. - text: The software architect participates and helps guide the design process. - - name: Product Manager - tip: API product managers should be leading the design. - text: Product managers participate in the design of each API product delivered. - - name: Designer - tip: Ideally, there are dedicated designers in charge of this stage. - text: The API designer is taking the lead on the design of each API delivered. - - name: Develop - tip: This stage of the lifecycle is usually a local affair. - type: technical - elements: - - name: Developers - tip: This work is done by developers in their local environment. - text: A developer generally develops the backend for each API locally. - - name: Test - tip: Teams are increasingly brought into handle this stage. - type: technical - elements: - - name: Test Engineer - tip: Engineers with the test experience should be owning this. - text: Test engineer and QA teams are brought in to help ensure the quality of APIs. - - name: Secure - tip: Security is being shifted left, but still defined centrally. - type: technical - elements: - - name: Information Security - tip: The InfoSec teams are coming in to help. - text: InfoSec teams are brought in to help make sure all APIs delivered are secure. 
- - name: Deploy - tip: The deployment of APIs is increasingly a team sport. - type: technical - elements: - - name: Developers - tip: Developers are still playing a key role. - text: Developers may play a role in deploying each API into some or all stages. - - name: Release Management - tip: Release management teams are helping stabilize things. - text: The release management team is tapped to actually move APIs into production. - - name: Observe - tip: We are seeing observability begin shifting towards business needs. - type: business - elements: - - name: Site reliability Engineering (SRE) - tip: This role is important to help improve reliability. - text: SRE are brought in to help manage the technical observability for each API. - - name: DevOps - tip: Specialization in this area is increasingly important. - text: DevOps teams are leveraged to help make the API lifecycle more observable. - - name: Platform - tip: The existence of a platform team shows an organization is further along in their journey. - text: PlatformOps teams are increasingly owning this stage of the API lifecycle. - - name: Distribute - tip: The distribution of APIs is increasingly not the developer. - type: business - elements: - - name: Product Marketing - tip: Marketing teams are increasingly taking a lead here. - text: Product marketing teams are in charge of actually distributed APis. - - name: Product Manager - tip: The product manager is helping drive the distribution of APIs. - text: The product manager or owner is leading the distribution of each API. - - name: Advocacy - tip: Developer advocacy is becoming commonplace in distribution. - text: Dev relations and advocacy are leverage to bring attention to APis. - -# Governance of API Operations -governance: - - name: Define - tip: Having things well-defined is essential to API governance. - type: business - elements: - - name: Guidelines - tip: Guidelines provide teams with an overview of governance. - text: Human readable guidelines are crafted and made available, helping educate teams about policies and practices used across the API lifecycle. - tools: - one: - name: Github - description: The README for the GitHub repository is used for publishing guidelines. - link: https://github.com - two: - three: - four: - - name: Rules - tip: Spectral rules codify what is needed for governance. - text: Spectral rules are used to define the design, security, and other policies that are expected to be applied to APIs across the lifecycle. - tools: - one: - name: Postman - description: Spectral rules are centrally defined as part of the Postman platform. - link: https://postman.com - two: - name: Spectral - description: Spectral rules are defined to drive automation of governance across the lifecycle. - three: - four: - - name: Standards - tip: Common standards should be made available. - text: Common standards are defined, providing templates that can be applied by designers and developers when it comes to delivering each API. - tools: - one: - name: Postman - description: Standards can be defined as OpenAPI, and applied automatically using collections. - link: https://postman.com - two: - three: - four: - - name: Design - tip: This is where we shift governance left in the life cycle. - type: business - elements: - - name: Errors - tip: Validate and reveal errors as early on in the process as possible. - text: Errors encountered when editing each API contract are reflected inline when designing an API, helping understand when mistakes are being made. 
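The Errors element above is about surfacing contract mistakes inline while an API is being designed. Below is a hand-rolled sketch of the kind of rule such linting enforces; it is not Spectral, and both the checked rule and the openapi.yaml file name are assumptions made only for illustration:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import java.io.File;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;

// Minimal contract check: every operation should declare an operationId.
// Requires jackson-databind and jackson-dataformat-yaml on the classpath.
public class ContractLintSketch {

    private static final Set<String> HTTP_METHODS =
            Set.of("get", "put", "post", "delete", "patch", "options", "head");

    public static void main(String[] args) throws Exception {
        ObjectMapper yaml = new ObjectMapper(new YAMLFactory());
        JsonNode root = yaml.readTree(new File(args.length > 0 ? args[0] : "openapi.yaml"));

        int errors = 0;
        Iterator<Map.Entry<String, JsonNode>> paths = root.path("paths").fields();
        while (paths.hasNext()) {
            Map.Entry<String, JsonNode> path = paths.next();
            Iterator<Map.Entry<String, JsonNode>> ops = path.getValue().fields();
            while (ops.hasNext()) {
                Map.Entry<String, JsonNode> op = ops.next();
                if (HTTP_METHODS.contains(op.getKey()) && !op.getValue().has("operationId")) {
                    System.out.println("error: " + op.getKey().toUpperCase() + " " + path.getKey()
                            + " is missing an operationId");
                    errors++;
                }
            }
        }
        System.out.println(errors == 0 ? "contract check passed" : errors + " error(s) found");
    }
}
```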
- tools: - one: - name: Postman - description: The API builder provides linting errors encountered when designing each API. - link: https://postman.com - two: - name: Spectral - description: Spectral rules are executing returning severe errors via API builder. - three: - four: - - name: Warning - tip: Make the API design process a learning experience. - text: Warnings encountered when editing each API contract are reflected inline when designing an API, helping provide common warnings - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - name: Spectral - description: Spectral is used to lint contracts. - three: - four: - - name: Design Review - tip: Have a conversation with stakeholders about each API. - text: Having a design review between developer, designer, architects, and product manager to help improve the design of the API, but also help improve the policies, rules, and guidance provided as part of API governance. - tools: - one: - name: Postman - description: Design reviews can exist within a workspace, either synchronously or asynchronously with all the stakeholders. - link: https://postman.com - two: - three: - four: - - name: Develop - tip: Governance should be localized for developers during development. - type: business - elements: - - name: Errors - tip: Make all errors known locally and allow devs to fix early on. - text: Developers should be able to lint their APIs locally as they develop, seeing the errors they are making in the design of an API in real time. - tools: - one: - name: Postman - description: Postman on the desktop can be used to edit API contracts and return inline linting warnings triggered by Spectral rules. - link: https://postman.com - two: - name: Spectral - description: Spectral is used to lint contracts, and return errors. - three: - four: - - name: Warning - tip: Make development a learning experience for developers. - text: Developers should receive warnings as part of their local development for the most common policies which are automated as part of rules. - tools: - one: - name: Postman - description: Postman on the desktop can be used to edit API contracts and return inline linting warnings triggered by Spectral rules. - link: https://postman.com - two: - name: Spectral - description: Spectral is used to lint contracts, and return warnings. - three: - four: - - name: CLI - tip: Making linting available via CLI for local use. - text: Developers should have access to governance linting via the command line interface as they are developing APIs locally. - tools: - one: - name: Postman - description: The Postman CLI can be used to validate API contracts during local development. - link: https://postman.com - two: - three: - four: - - name: IDE - tip: Governance enablement should be available via IDE. - text: Developers should have access to governance linting via their chosen IDE as they are developing APIs locally. - tools: - one: - name: VSCode - description: VSCode is being used to develop the API locally. - link: https://example.com - two: - three: - four: - - name: Test - tip: Testing is an essential aspect of API governance. - type: technical - elements: - - name: Quality Review - tip: Coming together to review the quality of an API. - text: A review of contract, performance and other tests, as well as documentation and mock servers can be conducted to help ensure a baseline of quality across teams, and the APIs they are developing. 
- tools: - one: - name: Postman - description: Quality reviews can be conducted via private, partner, or public workspaces across teams. - link: https://postman.com - two: - three: - four: - - name: Secure - tip: Security is a critical aspect of API governance. - type: technical - elements: - - name: Security Review - tip: Coming together to review the security of an API. - text: A review of each API being developed can be conducted by security teams, helping evaluate the decisions made by teams when it comes to security, and ensure there are no vulnerabilities as the API is being deployed. - tools: - one: - name: Postman - description: Security reviews can be conducted via private, partner, or public workspaces across teams. - link: https://postman.com - two: - three: - four: - - name: Deploy - tip: This is where governance can be informed and enforced. - type: technical - elements: - - name: Errors - tip: Expose governance errors during build time. - text: Governance linting should be applied as part of the CI/CD pipeline when deploying an API, returning errors as part of the reporting and build failure to staging or production. - tools: - one: - name: Postman - description: The Postman CLI provides linting errors as part of the CI/CD pipeline. - link: https://postman.com - two: - name: Spectral - description: Spectral is used to lint contracts. - three: - four: - - name: Warning - tip: Expose warnings to developers during build time. - text: Governance linting should be applied as part of the CI/CD pipeline when deploying an API, returning warnings as part of the reporting, allowing teams to learn as they iterate upon their APIs. - tools: - one: - name: Postman - description: The Postman CLI provides linting errors as part of the CI/CD pipeline. - link: https://postman.com - two: - name: Spectral - description: Spectral is used to lint contracts. - three: - four: - - name: Policies - tip: Automate the application of gateway and other policies. - text: Security, gateway, and other policies are applied as part of the CI/CD pipeline, ensuring that all policies are tested for, and in some cases automatically applied as part of the API deployment process. - tools: - one: - name: Postman - description: Postman can be used to test and apply policies. - link: https://postman.com - two: - name: Collection - description: Postman collections are used to define, test, and apply policies. - link: https://postman.com - three: - four: - - name: Observe - tip: Governance can be made observable too! - type: business - elements: - - name: APM - tip: Pipe your governance into your APM. - text: Governance test results can be fed into APM solutions, providing a way to report across the different aspects of governance for all APIs. - tools: - one: - name: New Relic - description: Each governance collection can be monitored and piped into New Relic. - link: https://example.com - two: - three: - four: - - name: Reports - tip: Take advantage of centralized governance reporting. - text: Platform reporting shows information about how government rules are being applied across teams and APIs, helping architects, product owners, and other roles see the impact of governance work. - tools: - one: - name: Postman - description: Postman reporting allows you to see governance as it is being applied across teams and APIs. - link: https://postman.com - two: - three: - four: - - name: Distribute - tip: Discover is the most important aspect of governance. 
- type: business - elements: - - name: Search - tip: Make sure your API operations are discoverable. - text: Governance guidance, collections, and reporting should be discoverable as part of operations, allowing stakeholders to understand the state of one or many APIs by engaging with operations. - tools: - one: - name: Postman - description: Postman discovery allows you to find workspaces, APIs, collections, environments, and monitors for APIs. - link: https://postman.com - two: - three: - four: - -# The consumer side of the life cycle -consumer: - - name: Discover - tip: Make it simple for your consumers to find what they need. - type: business - elements: - - name: Search - tip: All of your APIs should be available via the search your consumers use. - text: Consumers should be able to search for APIs on the interfaces they are already using, allowing for discovery of API information in a manner relevant to their work. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - name: OpenAPI - description: An OpenAPI can be used to make an API more discoverazble. - link: https://example.com - three: - four: - name: Collection - description: Documentation, mock, and test collections can be used to make an API more discoverazble. - link: https://example.com - - name: Networks - tip: Consumers will find what they need via networks. - text: Publishing APIs to private, partner, and public networks for discovery. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: Teams - tip: Your teams need to be discoverable. - text: The teams behind APIs, and any partner or public contributors, should be made discoverable alongside documentation and other data, encouraging engagement. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: Workspaces - tip: The workspaces behind APIs should be discoverable. - text: Alongside Git repositories, you should include private, partner, and public workspaces as part of discovery, indexing the places where all work occurs for each API. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - three: - four: - - name: APIs - tip: All APIs should be discoverable. - text: The contracts and other artifacts that define the surface area of an API, including authentication and authorization, should be discoverable as part of regular operations. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - name: OpenAPI - description: The OpenAPI is how you make an API more discoverable. - link: https://example.com - three: - four: - - name: Documentation - tip: Docs make your APIs discoverable. - text: Up-to-date and accurate documentation for all APIs should be easily discoverable by teams, with human-readable details describing what is possible with each API. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPI is used to generate documentation. 
- link: https://example.com - three: - four: - - name: Access - tip: Make sure things are truly accessible! - text: The APIs, documentation, and the operations around APIs should be accessible to targeted consumers, ensuring that access is handled consistently across all APIs and removing barriers to entry for all consumers. - tools: - one: - name: AWS API Gateway - description: The AWS API Gateway is used to manage users, keys, plans, and APIs made accessible via workspaces. - link: https://example.com - two: - three: - four: - - name: Evaluate - tip: Consumers are picky and want to evaluate whether an API meets their needs. - type: business - elements: - - name: Explore - tip: Allow consumers to explore each API. - text: Enable consumers to explore as much of the surface area of an API as possible, perhaps without even authenticating, so they can learn what is possible. - tools: - one: - name: Postman - description: Reference and onboarding collections available via private, partner, and public workspaces help consumers explore APIs. - link: https://postman.com - two: - three: - four: - - name: Documentation - tip: Make documentation support evaluation. - text: Provide rich documentation with useful descriptions, examples, and other information to help consumers get started using each API. - tools: - one: - name: Postman - description: Postman collections can be used to publish documentation for APIs. - link: https://postman.com - two: - name: OpenAPI - description: Documentation can be generated using valid OpenAPI contracts for APIs. - link: https://example.com - three: - four: - - name: Examples - tip: Provide rich examples for all APIs. - text: Make sure there are always examples for each element of an API, allowing API contracts to be mocked and providing rich documentation showing how they work. - tools: - one: - name: Postman - description: You can manage examples for APIs using Postman collections. - link: https://postman.com - two: - name: OpenAPI - description: Examples can be added to OpenAPI contracts as part of API builder. - link: https://example.com - three: - four: - - name: Onboarding - tip: Make onboarding frictionless for consumers. - text: It should be easy for a consumer to go from exploring to executing and integrating an API, making it as easy as possible to receive keys and understand how to make API requests. - tools: - one: - name: Postman - description: Postman network and workspaces make onboarding developers frictionless. - link: https://postman.com - two: - name: Collection - description: Postman collections make it easier for consumers to onboard with APIs. - link: https://example.com - three: - four: - - name: Execute - tip: Make things hands-on and executable. - text: Provide the ability to execute each request, response, publish, and subscribe, making sure that learning about an API is as hands-on as it can possibly be. - tools: - one: - name: Postman - description: Postman provides an automated environment for working with APIs. - link: https://postman.com - two: - name: Collection - description: Postman collections provide an executable unit of API value. - link: https://example.com - three: - four: - - name: Integrate - tip: This is where consumers move from exploration to integration. - type: technical - elements: - - name: Authentication - tip: Keep authentication standardized. - text: Authentication with APIs needs to be made as simple and standardized as possible for developers, using API keys, OAuth, and OpenID.
- tools: - one: - name: Postman - description: Postman helps manage authentication across the API lifecycle for APIs. - link: https://postman.com - two: - name: AWS API Gateway - description: The AWS API Gateway is used to issue keys and manage authentication and logging for APIs. - link: https://example.com - three: - four: - - name: Capabilities Collections - tip: Reduce capabilities into bite-size collections. - text: Small modular collections can be made available for automating the integration of APIs, applying environments and making integration easier for developers. - tools: - one: - name: Postman - description: Postman provides a modular way to integrate with APIs. - link: https://postman.com - two: - name: Collection - description: Collections can be defined for any single API-driven functionality. - link: https://example.com - three: - four: - - name: Contracts - tip: Make contracts available to your consumers. - text: Machine-readable contracts like OpenAPI and AsyncAPI make integration as simple as importing the contract for an API, authenticating, and making the API calls you need to move data between systems, providing an artifact that defines consumption. - tools: - one: - name: Postman - description: Postman can be used to manage the source of truth for contracts. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPIs can be found in API builder and synced with repositories. - link: https://example.com - three: - four: - - name: Client SDKs - tip: Provide the code consumers need. - text: Software development kits in a variety of programming languages help minimize the time it takes for developers to integrate with an API in the language they are most familiar with. - tools: - one: - name: Postman - description: Postman auto-generates code snippets for you in a variety of languages. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPI can be used to generate client SDKs as part of the build process. - link: https://example.com - three: - four: - - name: Automation - tip: Make API automation easy for your consumers. - text: Take advantage of automation opportunities, allowing collections to be scheduled and baked into the CI/CD pipelines and common business capabilities to be executed. Business and technical stakeholders can do more with less through API automation. - tools: - one: - name: Postman - description: Postman allows you to automate integration using CI/CD and monitors. - link: https://postman.com - two: - three: - four: - - name: Workflows - tip: Offer the most meaningful business workflows. - text: Provide ready-to-go low-code and no-code options for executing common business workflows, allowing multiple internal, partner, and public APIs to be daisy-chained into solutions that will help business and technical stakeholders integrate better. - tools: - one: - name: Postman - description: Postman allows you to build workflows across one or many APIs. - link: https://postman.com - two: - name: Collection - description: Collections can be used to develop workflows for API integration. - link: https://example.com - three: - four: - - name: Test - tip: Your API consumers will care a lot about API testing. - type: technical - elements: - - name: Availability - tip: Share the availability of your APIs.
- text: Share uptime and availability information with consumers on a dashboard, providing transparency around the operation of the platform they will depend on for their applications and helping provide a historical accounting of availability. - tools: - one: - name: Postman - description: Postman can be used to monitor the uptime and availability of APIs. - link: https://postman.com - two: - three: - four: - - name: Contract - tip: Share your contract tests with consumers. - text: Exposing contract tests and even the results of scheduled contract test runs will increase consumer awareness of API contracts and how their validation can be used in applications and integrations. - tools: - one: - name: Postman - description: API contracts can be made available to users via workspaces so they can test APIs. - link: https://postman.com - two: - name: OpenAPI - description: OpenAPI provides a machine-readable contract between API producer and consumer. - link: https://example.com - three: - four: - name: Collection - description: Collections can be used to provide contract tests that consumers can run on their own. - link: https://example.com - - name: Performance - tip: Share your performance tests with consumers. - text: Share performance tests and results of scheduled performance test runs, increasing transparency about API performance while demonstrating how your platform has considered performance and is taking steps to improve upon it. - tools: - one: - name: Postman - description: You can use Postman to understand the performance of APIs. - link: https://postman.com - two: - name: Collection - description: Performance test collections can be provided to consumers for running. - link: https://postman.com - three: - four: - - name: Security - tip: Share your security tests with consumers. - text: Share your overall security policy and security test results, creating trust with consumers. - tools: - one: - name: Postman - description: Postman workspaces can make security collections and reporting available to consumers. - link: https://postman.com - two: - name: Collection - description: You can provide security collections that consumers can execute on their own to understand security. - link: https://example.com - three: - four: - - name: Deploy - tip: Your consumers need to deploy their integrations. - type: technical - elements: - - name: Source - tip: Source control powers integrations. - text: Place all manually developed or automatically generated client code in a repository, providing a source of truth for the code and for any artifacts that are needed to define the deployment and operation of any API integration. - tools: - one: - name: Github - description: GitHub source control can be used to deploy API integrations. - link: https://github.com - two: - three: - four: - - name: CI/CD Pipelines - tip: CI/CD makes integrations repeatable. - text: Implement the continuous integration portion of CI/CD, automating how applications and integrations are deployed, making the deployment of API integrations, applications, and other use cases something that is always repeatable. - tools: - one: - name: Github - description: GitHub Actions can be used to automate and make API deployment repeatable. - link: https://github.com - two: - three: - four: - - name: Collection - tip: Collection integrations are easy!
- text: Leverage a Postman collection as a modular, shareable, and executable definition of an application, stitching together many different API calls across internal and external API sources to apply digital resources and capabilities in a specific way. - tools: - one: - name: Postman - description: Postman can be used to define and automate the integration of APIs. - link: https://postman.com - two: - name: Collection - description: Collections can be used to automate the integration of many different APIs. - link: https://example.com - three: - four: - - name: Serverless - tip: Provide simple compute for consumers. - text: Use serverless layers for deploying integrations, orchestrations, and automating API resources and capabilities, tapping into ephemeral compute to deploy integration code that accomplishes specific business outcomes. - tools: - one: - name: AWS Lambda - description: AWS Lambda can be used as the serverless compute behind API operations. - link: https://example.com - two: - three: - four: - - name: Runners - tip: Offer consumers collection runners. - text: Acknowledge that some collection applications will be manually run by different team members using runners, organizing different types of integrations and applications by workspaces and letting different stakeholders manually put them to work. - tools: - one: - name: Postman - description: You can organize teams and workspaces by domain or group using Postman. - link: https://postman.com - two: - name: Collection - description: Business workflows can be defined as collections and automated using runners. - link: https://example.com - three: - four: - - name: Observe - tip: Your consumers need to observe APIs too. - type: technical - elements: - - name: Usage - tip: Keep consumers informed of usage. - text: A consumer's usage of an API should always be made available to them so they always understand how they are putting resources to work. - tools: - one: - name: AWS API Gateway - description: AWS API Gateway is used to understand and report upon the usage of APIs by consumers. - link: https://example.com - two: - three: - four: - - name: Watches - tip: Consumers can watch APIs. - text: Keep track of the watches on workspaces, APIs, and collections to understand who is tuned into what is happening. Use watches as a metric for the number of consumers, contributors, and internal and external stakeholders who are tuned in. - tools: - one: - name: Postman - description: Postman provides the ability to watch APIs and elements of operations. - link: https://postman.com - two: - name: OpenAPI - description: Consumers can watch OpenAPI contracts that are made available to consumers. - link: https://example.com - three: - four: - name: Collection - description: Consumers can watch collections that are made available to consumers. - link: https://example.com - - name: Forks - tip: Consumers can fork and submit pull requests. - text: Track who is forking repositories and collections, using the fork count as a metric for engagement and knowing who your consumers are. Track engagement via workspaces, repositories, and collections to learn how consumers are using your APIs. - tools: - one: - name: Postman - description: Postman allows for forking of collections used to define API operations. - link: https://postman.com - two: - name: Collection - description: Each collection can be forked, and pull requests can be submitted to make changes.
- link: https://example.com - three: - four: - - name: Feedback - tip: Provide feedback loops for consumers. - text: Engage with API producers and consumers, understanding the conversation is around each API or group of APIs. Observe discussions about digital resources and capabilities. - tools: - one: - name: Github - description: GitHub issues are used to gather feedback. - link: https://example.com - two: - name: OpenAPI - description: Comments on the OpenAPI definition is used to gather inline feedback. - link: https://example.com - three: - four: - name: Collection - description: Comments on documentation, mock, and test collection is used to gather inline feedback. - link: https://example.com - - name: Monitors - tip: Allow consumers to monitor your APIs. - text: Monitors can be used by consumers to monitor uptime, contracts, performance, and security of the APIs that they are depending on. - tools: - one: - name: Postman - description: Postman enables consumers to set up monitors on any APIs they are consuming. - link: https://postman.com - two: - three: - four: - - name: Notifications - tip: Keep consumers informed with notifications. - text: Use in-app, email, or SMS notifications to engage with a platform and keep consumers part of the forward motion of an API, collecting metrics for observability. - tools: - one: - name: Postman - description: Postman provides notifications regarding the APIs consumers are watching. - link: https://postman.com - two: - three: - four: \ No newline at end of file diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-18-02.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-18-02.png deleted file mode 100644 index 560e51e..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-18-02.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-18-35.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-18-35.png deleted file mode 100644 index 95b7e06..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-18-35.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-01.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-19-01.png deleted file mode 100644 index 6973732..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-01.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-13.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-19-13.png deleted file mode 100644 index 00c595d..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-13.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-32.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-19-32.png deleted file mode 100644 index d59c858..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-32.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-43.png b/docs/more/api/api-driven/0-overview/2022-11-14-16-19-43.png deleted file mode 100644 index 6602ae3..0000000 Binary files a/docs/more/api/api-driven/0-overview/2022-11-14-16-19-43.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/README.md b/docs/more/api/api-driven/0-overview/README.md deleted file mode 100644 index ede082a..0000000 --- a/docs/more/api/api-driven/0-overview/README.md +++ /dev/null @@ -1,33 +0,0 @@ -# Application Dev Lifecycle - -- Requirement/UI/UX -- API Design -- Code Generation -- Coding -- Integration 
Testing -- Testing -- Release -- Smoke Testing - - -## Mono APP - -![](mono-app.png) - - -## Microservice - -![](microservice.png) - -## Cloud Native - -![](2022-11-14-16-18-02.png) -![](2022-11-14-16-18-35.png) -![](2022-11-14-16-19-01.png) -![](2022-11-14-16-19-13.png) -![](2022-11-14-16-19-32.png) -![](2022-11-14-16-19-43.png) - -## Reference - -- https://learn.microsoft.com/en-us/dotnet/architecture/cloud-native/definition \ No newline at end of file diff --git a/docs/more/api/api-driven/0-overview/microservice.png b/docs/more/api/api-driven/0-overview/microservice.png deleted file mode 100644 index 4499c1a..0000000 Binary files a/docs/more/api/api-driven/0-overview/microservice.png and /dev/null differ diff --git a/docs/more/api/api-driven/0-overview/mono-app.png b/docs/more/api/api-driven/0-overview/mono-app.png deleted file mode 100644 index a12aafb..0000000 Binary files a/docs/more/api/api-driven/0-overview/mono-app.png and /dev/null differ diff --git a/docs/more/api/api-driven/1-definition/2022-11-14-16-23-43.png b/docs/more/api/api-driven/1-definition/2022-11-14-16-23-43.png deleted file mode 100644 index c82dbc3..0000000 Binary files a/docs/more/api/api-driven/1-definition/2022-11-14-16-23-43.png and /dev/null differ diff --git a/docs/more/api/api-driven/1-definition/README.md b/docs/more/api/api-driven/1-definition/README.md deleted file mode 100644 index e2560ae..0000000 --- a/docs/more/api/api-driven/1-definition/README.md +++ /dev/null @@ -1,40 +0,0 @@ -# API 定义阶段 - -1. 需求分析 -2. 前后台交互分析 -3. 接口定义文档 - - -## 接口定义文档格式 - -- openapi3 -- swagger -- postman -- skel -- protobuff - -## 工具使用 - -1. postman -2. insomina -3. Stoplight Studio - API Design -4. Intellj Plugin - Skel - -## API 定义 - -![](2022-11-14-16-23-43.png) - -## API 定义 - -HTTP 协议: -- request url -- method -- request header -- request body -- response body -- response header - - -## Reference - -- [todo-api-spec](https://github.com/gavincornwell/todo-api-spec.git) \ No newline at end of file diff --git a/docs/more/api/api-driven/1-definition/todo.png b/docs/more/api/api-driven/1-definition/todo.png deleted file mode 100644 index 6d03784..0000000 Binary files a/docs/more/api/api-driven/1-definition/todo.png and /dev/null differ diff --git a/docs/more/api/api-driven/1-definition/todo.puml b/docs/more/api/api-driven/1-definition/todo.puml deleted file mode 100644 index 1d52229..0000000 --- a/docs/more/api/api-driven/1-definition/todo.puml +++ /dev/null @@ -1,58 +0,0 @@ -@startuml - -Title To Do Architecture - -skinparam defaultFontSize 16 -skinparam linetype ortho -skinparam titleFontSize 22 -skinparam backgroundColor #EEEEEE - -actor "User" as U -actor "Webhook" as W - -node "Queue" as Q #D1C4E9{ -} - -node "API Gateway" as APIG #C8E6C9 - -node "Tasks" as T #E3F2FD { - node "/tasks" as tasks #FFF9C4 - database "DB" as tasksDB -} - -node "Identity" as I #E3F2FD { - node "/users" as users #FFF9C4 - database "DB" as usersDB -} - -node "Attachments" as A #E3F2FD { - node "/attachments" as attachments #FFF9C4 - database "DB" as attachmentsDB - database "Content" as attachmentsContent -} - -node "Notifications" as N #E3F2FD { - node "/notifications" as notifications #FFF9C4 - database "DB" as notificationsDB -} - -U->APIG -APIG->T -APIG->I -APIG->A -APIG->N - -tasks<->Q -users<->Q -attachments<->Q -notifications<->Q - -tasks->tasksDB -users->usersDB -attachments->attachmentsDB -attachments->attachmentsContent -notifications->notificationsDB - -N->W - -@enduml \ No newline at end of file diff --git 
a/docs/more/api/api-driven/1-definition/todo.yaml b/docs/more/api/api-driven/1-definition/todo.yaml deleted file mode 100644 index 8f71e76..0000000 --- a/docs/more/api/api-driven/1-definition/todo.yaml +++ /dev/null @@ -1,477 +0,0 @@ -swagger: '2.0' -info: - description: API providing access to a task management system. - version: '1' - title: To Do -basePath: /v1 -tags: - - name: tasks - description: Retrieve and manage tasks - - name: users - description: Retrieve and manage users - - name: notifications - description: Register for notifications -parameters: - taskIdParam: - name: taskId - in: path - description: The identifier of a task - required: true - type: string - skipCountParam: - name: skipCount - in: query - description: The number of items to skip - required: false - type: integer - pageSizeParam: - name: pageSize - in: query - description: The maximum number of items to return in the list - required: false - type: integer -paths: - /tasks: - get: - tags: - - tasks - summary: Retrieve tasks - description: Retrieve tasks for the current user - operationId: retrieveTasks - produces: - - application/json - parameters: - - $ref: '#/parameters/skipCountParam' - - $ref: '#/parameters/pageSizeParam' - responses: - '200': - description: Successful response - schema: - $ref: '#/definitions/TaskList' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - post: - tags: - - tasks - summary: Create task - description: Creates a new task owned by the current user - operationId: createTask - consumes: - - application/json - produces: - - application/json - parameters: - - in: body - name: taskBody - description: The task details - required: true - schema: - $ref: '#/definitions/Task' - responses: - '201': - description: Successful response - schema: - $ref: '#/definitions/Task' - '400': - description: If the task can not be created - schema: - $ref: '#/definitions/Error' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - - /tasks/{taskId}: - get: - tags: - - tasks - summary: Retrieve a task - description: Retrieve details of a specific task - operationId: retrieveTask - consumes: - - application/json - produces: - - application/json - parameters: - - $ref: '#/parameters/taskIdParam' - responses: - '200': - description: Successful response - schema: - $ref: '#/definitions/Task' - '404': - description: If **taskId** does not exist - schema: - $ref: '#/definitions/Error' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - put: - tags: - - tasks - summary: Update a task - description: Updates an existing task - operationId: updateTask - produces: - - application/json - parameters: - - $ref: '#/parameters/taskIdParam' - - in: body - name: taskBody - description: The task details - required: true - schema: - $ref: '#/definitions/Task' - responses: - '200': - description: Successful response - schema: - $ref: '#/definitions/Task' - '400': - description: If the task can not be updated - schema: - $ref: '#/definitions/Error' - '404': - description: If taskId does not exist - schema: - $ref: '#/definitions/Error' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - delete: - tags: - - tasks - summary: Delete a task - description: Deletes an existing task - operationId: deleteTask - produces: - - application/json - parameters: - - $ref: '#/parameters/taskIdParam' - responses: - '204': - description: Successful response - '404': - description: If taskId does not exist - 
schema: - $ref: '#/definitions/Error' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - - /tasks/{taskId}/attachments: - get: - tags: - - tasks - summary: Retrieve attachments - description: Returns a list of attachments for a task - operationId: retrieveAttachments - produces: - - application/json - parameters: - - $ref: '#/parameters/taskIdParam' - responses: - '200': - description: Successful response - schema: - $ref: '#/definitions/AttachmentList' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - post: - tags: - - tasks - summary: Add attachment - description: Adds a new attachment to the task - operationId: addAttachment - consumes: - - application/json - produces: - - application/json - parameters: - - $ref: '#/parameters/taskIdParam' - - in: body - name: attachmentBody - description: The attachment details - required: true - schema: - $ref: '#/definitions/Attachment' - responses: - '201': - description: Successful response - schema: - $ref: '#/definitions/Attachment' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - - /users: - get: - tags: - - users - summary: Retrieve users - description: Returns a list of users defined in the system - operationId: retrieveUsers - produces: - - application/json - parameters: - - $ref: '#/parameters/skipCountParam' - - $ref: '#/parameters/pageSizeParam' - responses: - '200': - description: Successful response - schema: - $ref: '#/definitions/UserList' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - post: - tags: - - users - summary: Create user - description: Creates a new user - operationId: createUser - consumes: - - application/json - produces: - - application/json - parameters: - - in: body - name: userBody - description: The users details - required: true - schema: - $ref: '#/definitions/User' - responses: - '201': - description: Successful response - schema: - $ref: '#/definitions/User' - '400': - description: If the user can not be created - schema: - $ref: '#/definitions/Error' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - - /notifications: - post: - tags: - - notifications - summary: Register webhook - description: Registers a webhook callback with the system - operationId: registerNotification - consumes: - - application/json - produces: - - application/json - parameters: - - in: body - name: notificationBody - description: The webhook details - required: true - schema: - $ref: '#/definitions/Notification' - responses: - '201': - description: Successful response - schema: - $ref: '#/definitions/Notification' - default: - description: Unexpected error - schema: - $ref: '#/definitions/Error' - -definitions: - Error: - type: object - required: - - statusCode - - message - - messageKey - properties: - id: - type: string - description: Unique identifier of the error, can be used to look up the error in the logs - statusCode: - type: string - description: The HTTP status code - message: - type: string - description: The error message - messageKey: - type: string - description: The key of the message, can be used for localising the error message - description: Error response - - Paging: - type: object - required: - - count - - hasMore - - pageSize - - skipCount - properties: - count: - type: integer - format: int32 - description: The number of items returned - hasMore: - type: boolean - default: false - pageSize: - type: integer - format: int32 - description: The 
requested maximum number of items to return - skipCount: - type: integer - format: int32 - description: The requested number of items to skip - totalItems: - type: integer - format: int32 - description: If present, indicated the total number of items there are in the system - description: Pagination information - - TaskList: - type: object - required: - - pagination - - data - properties: - pagination: - $ref: '#/definitions/Paging' - data: - type: array - items: - $ref: '#/definitions/Task' - description: List of tasks - - Task: - type: object - required: - - id - - owner - - state - - title - properties: - id: - type: string - description: Unique identifier - readOnly: true - title: - type: string - description: The title of the task - description: - type: string - description: Description of the task - dueDate: - type: string - format: date-time - description: The date the task is due to be done by - owner: - type: string - description: The id of the owner of the task - state: - type: string - description: The state of the task - tags: - type: array - description: List of tags - items: - type: string - description: A task - - UserList: - type: object - required: - - pagination - - data - properties: - pagination: - $ref: '#/definitions/Paging' - data: - type: array - items: - $ref: '#/definitions/User' - description: List of users - - User: - type: object - required: - - id - - firstName - - email - properties: - id: - type: string - description: Unique identifier for the user - readOnly: true - firstName: - type: string - description: The users first name - lastName: - type: string - description: The users last (family) name - email: - type: string - description: The email address of the user - password: - type: string - description: The password for the user - description: A user - - AttachmentList: - type: object - required: - - pagination - - data - properties: - pagination: - $ref: '#/definitions/Paging' - data: - type: array - items: - $ref: '#/definitions/Attachment' - description: List of attachments - - Attachment: - type: object - description: Represents a task attachment - required: - - id - - url - properties: - id: - type: string - description: Unique identifier for the attachment - readOnly: true - url: - type: string - description: URL of the attachment - - Notification: - type: object - description: Represents a notification - required: - - id - - url - properties: - id: - type: string - description: Unique identifier for the notification - readOnly: true - url: - type: string - description: URL of the webhook - \ No newline at end of file diff --git a/docs/more/api/api-driven/2-code-gen/README.md b/docs/more/api/api-driven/2-code-gen/README.md deleted file mode 100644 index 6e54605..0000000 --- a/docs/more/api/api-driven/2-code-gen/README.md +++ /dev/null @@ -1,30 +0,0 @@ -## Code Generation - -安装: -```sh -npm install @openapitools/openapi-generator-cli -g -``` ---- - -## 生成Python-FastAPI 代码 - -```sh -openapi-generator-cli generate -i spec/openapi3-todo.yaml -g python-fastapi -o todo-simple --skip-validate-spec -``` ---- - -```sh -make todo-py -poetry init -poetry add fastapi-code-generator -D -poetry run fastapi-codegen --input ../../spec/openapi3-todo.yaml --output app -``` ---- - -## 生成springboot 代码 - -```sh -openapi-generator-cli generate -i 1-definition/openapi3-todo.yaml -g spring --skip-validate-spec -o todo-java -``` - - diff --git a/docs/more/api/api-driven/README.md b/docs/more/api/api-driven/README.md deleted file mode 100644 index 7343fde..0000000 --- 
a/docs/more/api/api-driven/README.md +++ /dev/null @@ -1,32 +0,0 @@ -# API 定义阶段 - -## 接口定义文档格式 - - -## 工具使用 - -## Code Generation - -安装: -```sh -npm install @openapitools/openapi-generator-cli -g -``` - -## 生成Python-FastAPI 代码 - -```sh -openapi-generator-cli generate -i api-definition/openapi3-todo.yaml -g python-fastapi -o todo -``` - -```sh -make todo-py -poetry init -poetry add fastapi-code-generator -D -fastapi-codegen --input openapi3-todo.yaml --output app -``` - -## 生成springboot 代码 - -```sh -openapi-generator-cli generate -i api-definition/openapi3-todo.yaml -g spring -o todo-java -``` diff --git a/docs/more/api/api-driven/openapitools.json b/docs/more/api/api-driven/openapitools.json deleted file mode 100644 index c871d87..0000000 --- a/docs/more/api/api-driven/openapitools.json +++ /dev/null @@ -1,7 +0,0 @@ -{ - "$schema": "./node_modules/@openapitools/openapi-generator-cli/config.schema.json", - "spaces": 2, - "generator-cli": { - "version": "6.2.1" - } -} diff --git a/docs/more/api/api-driven/reference/petshop-store.yaml b/docs/more/api/api-driven/reference/petshop-store.yaml deleted file mode 100644 index b9d4817..0000000 --- a/docs/more/api/api-driven/reference/petshop-store.yaml +++ /dev/null @@ -1,128 +0,0 @@ -openapi: "3.0.0" -info: - version: 1.0.0 - title: Swagger Petstore - license: - name: MIT -servers: - - url: http://petstore.swagger.io/v1 -paths: - /pets: - get: - summary: List all pets - operationId: listPets - tags: - - pets - parameters: - - name: limit - in: query - description: How many items to return at one time (max 100) - required: false - schema: - type: integer - format: int32 - responses: - '200': - description: A paged array of pets - headers: - x-next: - description: A link to the next page of responses - schema: - type: string - content: - application/json: - schema: - $ref: "#/components/schemas/Pets" - default: - description: unexpected error - content: - application/json: - schema: - $ref: "#/components/schemas/Error" - x-amazon-apigateway-integration: - uri: - Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${PythonVersionFunction.Arn}/invocations - passthroughBehavior: when_no_templates - httpMethod: POST - type: aws_proxy - post: - summary: Create a pet - operationId: createPets - tags: - - pets - responses: - '201': - description: Null response - default: - description: unexpected error - content: - application/json: - schema: - $ref: "#/components/schemas/Error" - x-amazon-apigateway-integration: - uri: - Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${PythonVersionFunction.Arn}/invocations - passthroughBehavior: when_no_templates - httpMethod: POST - type: aws_proxy - /pets/{petId}: - get: - summary: Info for a specific pet - operationId: showPetById - tags: - - pets - parameters: - - name: petId - in: path - required: true - description: The id of the pet to retrieve - schema: - type: string - responses: - '200': - description: Expected response to a valid request - content: - application/json: - schema: - $ref: "#/components/schemas/Pets" - default: - description: unexpected error - content: - application/json: - schema: - $ref: "#/components/schemas/Error" - x-amazon-apigateway-integration: - uri: - Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${PythonVersionFunction.Arn}/invocations - passthroughBehavior: when_no_templates - httpMethod: POST - type: aws_proxy -components: - schemas: - Pet: - required: - - id - - name - properties: - id: - type: integer 
- format: int64 - name: - type: string - tag: - type: string - Pets: - type: array - description: list of pet - items: - $ref: "#/components/schemas/Pet" - Error: - required: - - code - - message - properties: - code: - type: integer - format: int32 - message: - type: string \ No newline at end of file diff --git a/docs/more/api/api-driven/spec/openapi3-todo.yaml b/docs/more/api/api-driven/spec/openapi3-todo.yaml deleted file mode 100644 index 0bf1d11..0000000 --- a/docs/more/api/api-driven/spec/openapi3-todo.yaml +++ /dev/null @@ -1,262 +0,0 @@ -openapi: 3.0.2 -info: - title: Todos API - description: Documentation of a simple Todos API - version: 0.0.2 -servers: - - url: 'https://test-api.temp.com' - description: Production - - url: localhost - description: Development server -tags: - - name: todo - description: Todo related end-points - - name: tag - description: Tag related end-points - - name: healthcheck - description: Tag for internal healthcheck routes -paths: - /v1/todos: - get: - summary: List all the todos - description: List all the todos of the user provided in the query parameters - operationId: listTodos - tags: - - todo - parameters: - - name: ownerUuid - in: query - required: true - schema: - type: string - responses: - '200': - description: Successfully retrieved all the todos - content: - application/json: - schema: - type: array - items: - $ref: '#/components/schemas/Todo' - example: - - uuid: 3d780d09-c520-4817-b430-ce849bcc5423 - ownerUuid: 535d6711-2ec0-4ba7-9f34-3d13f25de822 - title: Groceries - state: ACTIVE - examples: {} - '400': - $ref: '#/components/responses/BadRequest' - '429': - $ref: '#/components/responses/RateLimited' - post: - summary: Create a new todo - description: Create a new todo - operationId: createTodo - tags: - - todo - requestBody: - description: New todo payload - content: - application/json: - schema: - $ref: '#/components/schemas/Todo' - responses: - '200': - description: Successfully created a new todo - content: - application/json: - schema: - $ref: '#/components/schemas/Todo' - links: - GetTodoByUuid: - $ref: '#/components/links/GetTodoByUuid' - '400': - $ref: '#/components/responses/BadRequest' - '429': - $ref: '#/components/responses/RateLimited' - '/v1/todos/{todoUuid}': - get: - summary: Get a todo - description: Get a todo by providing its uuid - operationId: getTodo - tags: - - todo - parameters: - - name: ownerUuid - in: query - required: true - schema: - type: string - - name: todoUuid - required: true - in: path - schema: - type: string - responses: - '200': - description: Successfully retrieved the todo - content: - application/json: - schema: - $ref: '#/components/schemas/Todo' - '400': - $ref: '#/components/responses/BadRequest' - '404': - description: Couldn't find the todo with the provided uuid - '429': - $ref: '#/components/responses/RateLimited' - put: - summary: Update a todo - description: Update a todo by providing its uuid and the updated todo content - operationId: updateTodo - tags: - - todo - parameters: - - name: ownerUuid - in: query - required: true - schema: - type: string - - name: todoUuid - required: true - in: path - schema: - type: string - requestBody: - description: New todo payload - content: - application/json: - schema: - $ref: '#/components/schemas/Todo' - responses: - '200': - description: Successful response - content: - application/json: - schema: - $ref: '#/components/schemas/Todo' - '400': - $ref: '#/components/responses/BadRequest' - '429': - $ref: '#/components/responses/RateLimited' - delete: 
- summary: Delete a todo - description: Delete a todo by providing its uuid - operationId: deleteTodo - tags: - - todo - parameters: - - name: todoUuid - required: true - in: path - schema: - type: string - - name: ownerUuid - in: query - required: true - schema: - type: string - - name: hard - description: | - Defines if the deletion is a "hard" delete (true) or a "soft" delete (false or not present) - in: query - required: false - schema: - type: boolean - responses: - '204': - description: Successful response - '400': - $ref: '#/components/responses/BadRequest' - /ready: - get: - summary: Get the readiness of the service - operationId: isReady - tags: - - healthcheck - responses: - '200': - description: Service is ready - '503': - description: Service isn't ready yet - /healthy: - get: - summary: Get the healthiness of the service - operationId: isHealthy - tags: - - healthcheck - responses: - '200': - description: Service is healthy - '503': - description: Service isn't healthy -components: - schemas: - Todo: - type: object - properties: - uuid: - description: Unique identifier of the todo - type: string - ownerUuid: - description: Unique identifier of the owner of the todo - type: string - title: - description: Title/short summary of the todo - type: string - state: - description: State of the todo - type: string - enum: - - ACTIVE - - COMPLETED - - DELETED - description: - description: The lengthy description of this todo - type: string - required: - - uuid - - ownerUuid - - title - - state - example: - uuid: 3d780d09-c520-4817-b430-ce849bcc5423 - ownerUuid: 535d6711-2ec0-4ba7-9f34-3d13f25de822 - title: Groceries - state: ACTIVE - Error: - type: object - properties: - statusCode: - description: Status code of the response - type: number - error: - description: Name of the error - type: string - message: - description: Explanation of the error - type: string - example: - statusCode: 400 - error: Bad Request - message: querystring must have required property 'ownerUuid' - responses: - BadRequest: - description: Request badly formatted - content: - application/json: - schema: - $ref: '#/components/schemas/Error' - RateLimited: - description: Too many requests were sent - content: - application/json: - schema: - $ref: '#/components/schemas/Error' - links: - GetTodoByUuid: - operationId: getTodo - parameters: - todoUuid: $response.body#/uuid - description: | - The `uuid` value returned in the response can be used as the `todoUuid` parameter in `GET /todos/{todoUuid}`. 
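
A quick way to smoke-test the Todos contract above is to call it directly from the command line. This is a minimal sketch that assumes the `https://test-api.temp.com` server listed in the spec is reachable; the UUIDs are the example values taken from the contract itself, not real identifiers.

```sh
# List all todos for an owner (ownerUuid is a required query parameter)
curl -s "https://test-api.temp.com/v1/todos?ownerUuid=535d6711-2ec0-4ba7-9f34-3d13f25de822" \
  -H "Accept: application/json"

# Create a new todo; the body must satisfy the Todo schema (uuid, ownerUuid, title, state are required)
curl -s -X POST "https://test-api.temp.com/v1/todos" \
  -H "Content-Type: application/json" \
  -d '{"uuid":"3d780d09-c520-4817-b430-ce849bcc5423","ownerUuid":"535d6711-2ec0-4ba7-9f34-3d13f25de822","title":"Groceries","state":"ACTIVE"}'
```

The same contract drives the code generation shown earlier, so generated FastAPI or Spring stubs should accept these same requests.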
diff --git a/docs/more/api/api-driven/spec/openapitools.json b/docs/more/api/api-driven/spec/openapitools.json deleted file mode 100644 index 27e6d53..0000000 --- a/docs/more/api/api-driven/spec/openapitools.json +++ /dev/null @@ -1,7 +0,0 @@ -{ - "$schema": "./node_modules/@openapitools/openapi-generator-cli/config.schema.json", - "spaces": 2, - "generator-cli": { - "version": "6.2.0" - } -} diff --git a/docs/more/api/api-driven/toc.json b/docs/more/api/api-driven/toc.json deleted file mode 100644 index aaf5627..0000000 --- a/docs/more/api/api-driven/toc.json +++ /dev/null @@ -1,13 +0,0 @@ -{ - "items": [ - { - "type": "divider", - "title": "APIS" - }, - { - "type": "item", - "title": "Todos API", - "uri": "/openapi3-todo.yaml" - } - ] -} \ No newline at end of file diff --git a/docs/more/api/blueprints/ad-hoc-landscape.md b/docs/more/api/blueprints/ad-hoc-landscape.md deleted file mode 100644 index 247706c..0000000 --- a/docs/more/api/blueprints/ad-hoc-landscape.md +++ /dev/null @@ -1,13 +0,0 @@ -# An Ad Hoc Landscape -In the fast-paced business environment that exists within almost every enterprise organization, there is a never-ending attraction towards defining digital resources and capabilities in a one-off fashion to support specific projects, applications, and integrations, leaving a legacy trail of APIs that are often redundant and overlapping, but all in need of equal investment to support, keep reliable, and keep delivering value for the business. - -## Common Distractions -There are plenty of distractions when it comes to enterprise operations, but there is a common set of ways in which organizations lose their way when it comes to delivering, sustaining, and evolving their API landscape. - -- **One Project** - It is easy to see APIs as just a subset of a project with clear start and completion dates, rather than seeing those APIs as part of a larger enterprise system and something that should be discoverable and usable beyond just the project. -- **One Application** - The needs of a specific web or mobile application are often prioritized over seeing the APIs behind them as part of a larger enterprise system, resulting in often redundant APIs or shadow APIs that are not seen because they are obfuscated by apps. -- **One Partner** - It is common to feel the need to respond to partner requests individually rather than seeing them as part of a larger system, resulting in repetitive work occurring, when a standardized set of partner resources would help speed up interactions. -- **One Integration** - Integrations between two internal or external systems provide another common way in which similar business needs end up with one-off solutions that contribute to an ad hoc landscape rather than a ready-to-go integration toolbox for all business needs. -- **One Customer** - Some customers are louder than others and may enjoy an outsized role in road map decisions, resulting in APIs that do not serve the wider needs of a customer base, resulting in more overhead when it comes to supporting an ad hoc API business landscape. - -In the moment, each potential distraction might seem like a priority; however, with the gravity of a centralized API strategy and a well-known API lifecycle and governance strategy, they can collectively be harnessed as part of the wider enterprise motion, resulting in fewer distractions along the way.
diff --git a/docs/more/api/blueprints/api-economy.md b/docs/more/api/blueprints/api-economy.md deleted file mode 100644 index a36cdd5..0000000 --- a/docs/more/api/blueprints/api-economy.md +++ /dev/null @@ -1,21 +0,0 @@ -# The API Economy -The phrase “API Economy” is often used to describe the direct revenue opportunities associated with making APIs available to 3rd party developers. While this represents one slice of this new lens to look at the economy through, there is a much larger opportunity with what these API providers will be enabling. We are seeing ongoing waves of startups and large enterprises leveraging an API-first approach to providing the solutions consumers are looking for, and even defining entirely new layers to the global economy. - -## Companies - -- **Uber** - In December 2008, after Garrett Camp and Travis Kalanick attended the LeWeb tech conference in Paris, they found it difficult to get a taxi. This experience inspired them to design a mobile app that allowed people to find a ride by tapping a button. In 2009, they developed the UberCab app. At first, they tested UberCab using 3 cars in New York City. A rider opened UberCab on their phone, the app used their phone's GPS location to let drivers know where to find them, and a black luxury car picked them up. The rider paid their fare within the app, so drivers did not have to handle payments. In 2010, the UberCab app launched and the first official UberCab ride was requested in San Francisco. In 2011, UberCab changed its name to Uber to distance itself from traditional taxi services. In 2012, Uber launched UberX, which allowed drivers to use non-luxury vehicles for less expensive fares. In 2013, Uber began to allow drivers to use their own personal vehicles to drive for UberX. -- **Grubhub** - In 2004, Mike Evans and Matt Maloney founded Grubhub in Chicago, hoping to improve the experience of ordering food. Evans and Maloney were looking for a way to order meals without calling the restaurant, looking at a paper menu, reading their credit card number, and dictating their address. They collected menus from restaurants around their neighborhood and offered to include them on their website for a 10% commission. Customers could open the Grubhub website to see which restaurants around them had a delivery option, and they could also view the menu and place their order. In 2010, Grubhub launched their mobile app. In 2012, Grubhub launched a tablet app for restaurants to use when taking orders. In 2014, Grubhub began offering delivery service for restaurants that do not have their own delivery drivers. -- **DoorDash** - In 2012, Tony Xu and Evan Moore were pitching their startup ideas to small businesses in Palo Alto, California. After talking with the owner of a macaroon shop, they found out that the shop was turning down delivery requests because they were unable to fulfill them. Xu and Moore went on to interview about 200 other small business owners and learned that deliveries were a pain point for most of them. They brought in Stanford classmates Andy Fang and Stanley Tang, and the four of them developed the Palo Alto Delivery website and began doing deliveries themselves around the Stanford campus. They charged a $6 delivery fee for each order, which was their only revenue. The website contained menus for local restaurants and a Google Voice number where customers would call them to place their orders. 
In June 2013, they renamed the website to DoorDash and began to hire additional drivers and expand their delivery area. -- **Drizly** - In 2012, Boston College students Nick Rellas and Justin Robinson were chatting over text, wishing that they could get beer delivered. After doing some research, Rellas and Robinson learned that alcohol delivery was legal. As long as they did not charge a commission, no liquor license was required. In 2013, along with Rellas's cousin Cory Rellas and with input from information systems professor John Gallaugher, they founded Drizly, a website and iOS app that made it possible to order beer, wine, and liquor from local stores for delivery within one hour. Either in the app or on the Drizly website, a customer can browse the available options at the stores in their area, place an order, and pay. The store receives an alert and then sends a request to their delivery partner. When the delivery partner reaches the customer, they must scan the barcode on the customer's ID to verify the customer's age, and then they deliver the order. -- **Instacart** - In 2012, Apoorva Mehta, a former employee of Amazon's fulfillment services department, developed the Instacart app in San Francisco so that he could avoid shopping for groceries. Users can open the Instacart app, browse and shop for groceries at local grocery stores, and select a delivery time. A personal shopper goes to the store, picks out the items, purchases and packs them up, and delivers the order to the customer's home at the time they specified. Using an early version of the app, Mehta placed an order at a local grocery store, went to the store, shopped the order himself, and then delivered it to his home. Mehta had founded several businesses, but Instacart gained immediate demand and he had to quickly hire personal shoppers to fulfill orders. In the summer of 2012, Mehta brought in co-founders Max Mullen and Brandon Leonardo. By 2014, Instacart had expanded to 10 additional cities, and by 2017, Instacart had partnered with grocery stores nationwide. - -## Solutions - -- **Ridesharing** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. -- **Food Delivery** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. -- **Grocery Delivery** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. -- **Alcohol Delivery** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. - -## Concepts - -- **Gig Marketplace** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. diff --git a/docs/more/api/blueprints/apis-as-a-product.md b/docs/more/api/blueprints/apis-as-a-product.md deleted file mode 100644 index c92a114..0000000 --- a/docs/more/api/blueprints/apis-as-a-product.md +++ /dev/null @@ -1,13 +0,0 @@ -# APIs as a Product -Treating APIs as a product means doing the hard work to establish user empathy and ensure your APIs are easy to use and possess the shortest time to value. Providing a complete API experience with quality documentation and well-used feedback loops and support channels where you are able to effectively gather and measure the insights needed to iterate upon each version of API, ensuring that each release meets the needs of the widest possible audience you can. - -- **Consumer-Centered** - Making sure that the design, development, and operation of your APis is as consumer centered as possible, ensuring your APIs are serving their needs. 
-- **Experience** - Your API operations focuses on creating the most meaningful experience possible for your consumers, and transcends the resources and capabilities being offered. -- **Use Cases** - Make sure that your APIs are designed for specific business use cases that matter to your consumers, making the investment to understand how those uses work. -- **Feedback Loops** - Invest in your feedback loops with consumers, making it as easy as possible for them to provide feedback that can be considered as part of your road map. -- **Value Generation** - Ensure that your APIs focus on generating value for your consumers as well as your platform, making sure that API operations is benefitting both of you. -- **Measurement- Define your metrics, gather the data, and use it to make sense of how yoru consumers are putting your APIs to work, and how they are made part of the API lifecycle. -- **Revenue** - Have a clear strategy for how your APIs will generate revenue, or indirectly support the generation of business revenue, justifying their existence along the way. -- **Road Map** - Establish a road map on day one, making sure you are keeping consumers informed regarding what the future will hold, helping ensure they are along for the ride. - -Treating your APIs as a product is how you go from technically good APIs to products that meet the needs of consumer, blending both the technology and business of APIs to find the right blaance between API producer and consumer–doing the dance with each version of an API, while keeping producer and consumer in sync as a unit. diff --git a/docs/more/api/blueprints/apm.md b/docs/more/api/blueprints/apm.md deleted file mode 100644 index bb4f03d..0000000 --- a/docs/more/api/blueprints/apm.md +++ /dev/null @@ -1,11 +0,0 @@ -# Application Performance Management -Tapping into existing APM solutions to monitor and manage the performance of APIs, the infrastructure, and operations around them. Ensuring that all outputs from across every stage of a well-defined API lifecycle, and the software development lifecycle beneath it are fed into existing APM solutions. Leaving no part of API operations untapped when it comes to detecting and diagnosing the performance and quality of API operations. - -- Collections -- Environments -- Monitors -- Results -- Dashboards -- Reporting -- Operations -- Coverage diff --git a/docs/more/api/blueprints/applications.md b/docs/more/api/blueprints/applications.md deleted file mode 100644 index fa90c14..0000000 --- a/docs/more/api/blueprints/applications.md +++ /dev/null @@ -1,13 +0,0 @@ -# Applications -APIs are used to deliver resources and digital capabilities across multiple types of applications, providing what is needed to power many different web, mobile, and device applications, as well as system-to-system integration and automation. How the enterprise sees applications and integrations evolves once they begin shifting towards an API-first way of looking at operations, demonstrating how APIs are behind every digital application we depend on today. - -## Types -APIs have powered applications since the beginning of compute, but the current breed of web APIs found its roots in aggregating content for web applications, but then quickly expanded to mobile, devices, back to the desktop, and even the network beneath our applications. 
- -- **Desktop** - APIs are how our desktop applications on our Windows and Mac desktops send and receive data back and forth with the server, making hundreds or thousands of calls each day to create, read, update, and delete the information needed as part of regular usage. -- **Web** - Websites began as simple HTML documents, but have become a dynamic mix of many internal and external API calls being made to stitch together the desired online experience within a specific domain, providing a richer web than was possible before. -- **Mobile** - Mobile phones allow websites and applications to be accessible in our hands, turning what used to just be a device for voice or messaging into a rich application ecosystem that is sending and receiving data from across many different cloud platforms. -- **Device** - Once developers realized that APIs could power mobile applications, they began seeing what other types of mobile devices could be made Internet-enabled, moving our televisions, thermostats, automobiles, and almost every other device in our life online. -- **Network** - As more of the infrastructure we depend on for the web moved into the clouds, the network that connected servers and clients together has also become API-enabled, making the network composable and configurable, changing how we operate applications. - -These constructs shape what many think of as an “application”, but APIs and the automation and orchestration they power are rapidly changing the notion of how we “apply” our digital resources and capabilities online and offline. diff --git a/docs/more/api/blueprints/asyncapi.md b/docs/more/api/blueprints/asyncapi.md deleted file mode 100644 index 407afd4..0000000 --- a/docs/more/api/blueprints/asyncapi.md +++ /dev/null @@ -1,12 +0,0 @@ -# AsyncAPI -The AsyncAPI specification provides the ability to describe the surface area of your multi-protocol APIs using JSON or YAML. The open source specification provides a robust way to describe what is possible with each API, defining the surface area of messages and channels, which can then be used as the source of truth for what is possible when publishing and subscribing to each asynchronous API. - -- **Info** - Provides a place to define common metadata for an API like a name, description, licensing, terms of service, and contact information, helping ensure all APIs have enough metadata available so that their purpose can be articulated across the API lifecycle. -- **Application** - An application is any kind of computer program or a group of them, allowing for the view of either a producer or a consumer, a microservice, IoT device (sensor), or possibly a mainframe process that will be publishing and subscribing to messages. -- **Producer** - A producer is a type of application, connected to a server, that is creating messages and addressing them to a channel or publishing to multiple channels depending on the server, protocol, and use-case pattern being applied as part of an API implementation. -- **Consumer** - A consumer is a type of application, connected to a server via a supported protocol, that is consuming messages from a channel or from multiple channels depending on the server, protocol, and the use-case pattern in an API implementation. -- **Message** - A message is the mechanism by which information is exchanged via a channel between servers and applications, with the payload containing the data, defined by the application, which MUST be serialized into JSON, XML, Avro, binary, or other format.
-- **Channel** - A channel is an addressable component, made available by the server, for the organization of messages, enabling producer applications to send messages to channels and consumer applications to consume messages from channels.
-- **Protocol** - A protocol is the mechanism (wireline protocol OR API) by which messages are exchanged between the application and the channel. Example protocols include, but are not limited to, AMQP, HTTP, JMS, Kafka, MQTT, STOMP, and WebSocket.
-
-AsyncAPI provides business contracts for the many different channels you can publish or subscribe to across the enterprise, defining the events that matter to operations, then helping ensure these events and the messages passed along are well-defined.
diff --git a/docs/more/api/blueprints/asynchronous.md b/docs/more/api/blueprints/asynchronous.md
deleted file mode 100644
index cb6d8dc..0000000
--- a/docs/more/api/blueprints/asynchronous.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# Asynchronous
-Enabling more real-time or asynchronous interactions, the web and other network technologies have been adapted to publish and subscribe to large volumes of digital resources and capabilities via highly reliable connections. Using APIs to establish an always-connected set of channels or topics that publish and push digital messages where they need to be to support business functions.
-
-## Publish
-
-- **Protocols** - Selecting from HTTP, HTTP/2, TCP, MQTT, or some other common protocol that is used by applications to asynchronously publish data to internal systems for wider usage.
-- **Channel** - Publishing messages to a specific topic, sometimes called a channel, providing a context for each message being published as part of the application using each API.
-- **Message** - The JSON or XML message being published, submitting information to a system asynchronously, making it available for consumption by other systems via an asynchronous API.
-- **Schema** - The structure of the message, standardizing how data is being published to the asynchronous API, ensuring that there is a standardized schema available to validate.
-
-## Subscribe
-
-- **Protocols** - Selecting from HTTP, HTTP/2, TCP, MQTT, or some other common protocol that is used by applications to asynchronously subscribe to data that is published and available.
-- **Channel** - Subscribing to messages published to a specific topic, sometimes called a channel, providing a context for each message being subscribed to by each application.
-- **Message** - The JSON or XML message being received, providing information from a system asynchronously, making it available for consumption by an application.
-- **Schema** - The structure of the message, standardizing how data is consumed from the asynchronous API, ensuring that there is a standardized schema available to validate.
-
diff --git a/docs/more/api/blueprints/automated-landscape.md b/docs/more/api/blueprints/automated-landscape.md
deleted file mode 100644
index 27c3722..0000000
--- a/docs/more/api/blueprints/automated-landscape.md
+++ /dev/null
@@ -1,8 +0,0 @@
-# Automated Landscape
-It is impossible to keep up with the pace of API operations today with humans alone, requiring an ever-increasing investment in the automation of the API operations behind our web, mobile, and device applications.
Luckily there is a very modular, collaborative, and executable way that API operations can be automated across teams, providing the unit of automation needed to define and execute anything across operations that can be done via an API. - -- Collections -- Runners -- Pipelines -- Newman -- Monitors diff --git a/docs/more/api/blueprints/business-in-the-clouds.md b/docs/more/api/blueprints/business-in-the-clouds.md deleted file mode 100644 index 7ae2443..0000000 --- a/docs/more/api/blueprints/business-in-the-clouds.md +++ /dev/null @@ -1,8 +0,0 @@ -# Business in the Clouds -Web APIs were increasingly being used for more industrial-grade purposes, making common building blocks of information technology (IT) available as simple XML or JSON APIs. Further commoditizing essential technical resources that businesses were depending on, shifting how we deploy and pay for our IT infrastructure while making them more elastic and scalable to meet the increasing demands being placed on our enterprises by web and mobile applications. - -## Early Cloud APIs - -- **S3**: In the early 2000s, Amazon was already a successful e-commerce platform, and around that time the company began to focus on making their internal development more efficient. Developers frequently built new databases, compute resources, and storage resources every time they worked on a new project. To save time and resources, developers began to build common resources that any internal team could consume and use for their own projects. Over time, this practice turned the Amazon platform into a set of APIs that combined infrastructure services with intuitive developer tools. In 2002, Amazon released an early version of their web service, which included SOAP and XML interfaces for the Amazon product catalog along with the same infrastructure services they were using internally. In March 2006, Amazon officially released Amazon Web Services (AWS) and the API for their Simple Storage Service (S3). S3 is an object storage solution that provides near-infinite storage for developers, allowing them to pay for each gigabyte they use. By storing objects in S3, developers can access their data from anywhere on the Internet at any time, instead of relying on static HTTP objects that depend on the resources and uptime of a specific server. S3 made it possible to use an API to create and manage infrastructure, instead of using APIs for data management only. -- **EC2**: In 2003, following the release of an early version of AWS, Amazon was working to improve their internal infrastructure. A group of engineers at Amazon proposed the idea of provisioning virtual servers that developers could create on demand instead of using their traditional servers and datacenters. Maintaining and scaling traditional infrastructure can be very expensive, so creating ephemeral servers in the cloud had the potential to save Amazon a significant amount of money. This internal service became Elastic Compute Cloud (EC2). In August 2006, a few months after Amazon launched AWS and S3 publicly, Amazon also released EC2. Similar to S3, the EC2 service allows companies to decentralize their infrastructure by storing computing resources in the cloud. -- **RDS**: In October 2009, Amazon introduced their Relational Database Service (RDS). Developers can use RDS to provision relational databases for applications in the cloud. Tasks such as backups, recovery, and software patching are automatic. Developers resize the database and manage its computing resources using an API call. 
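-
-To make the "manage infrastructure with an API call" idea above concrete, here is a minimal, hypothetical sketch using the AWS SDK for Java v2; the bucket name, object key, and payload are placeholders, and valid AWS credentials are assumed to be available in the environment:
-
-```java
-// Hypothetical example: a single API call that stores an object in S3.
-// Bucket, key, and payload below are illustrative placeholders only.
-import software.amazon.awssdk.core.sync.RequestBody;
-import software.amazon.awssdk.regions.Region;
-import software.amazon.awssdk.services.s3.S3Client;
-import software.amazon.awssdk.services.s3.model.PutObjectRequest;
-
-public class S3UploadSketch {
-    public static void main(String[] args) {
-        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
-            PutObjectRequest request = PutObjectRequest.builder()
-                    .bucket("example-bucket")           // placeholder bucket name
-                    .key("reports/2024/usage.json")     // placeholder object key
-                    .build();
-            // Durability, replication, and scaling are handled by the service
-            // behind the API; the developer pays only for what is stored.
-            s3.putObject(request, RequestBody.fromString("{\"hello\":\"s3\"}"));
-        }
-    }
-}
-```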
diff --git a/docs/more/api/blueprints/business.md b/docs/more/api/blueprints/business.md
deleted file mode 100644
index ce0e183..0000000
--- a/docs/more/api/blueprints/business.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Business
-
-APIs are no longer just an IT-led initiative behind applications and support projects. APIs are central to how a business operates and provides the digital capabilities and resources to do business in a digital age. Expanding participation by business stakeholders across the API lifecycle is essential to moving from API-early to API-first, and to defining the success of your digital transformation.
-
-- **Delivering Products** - Treating your APIs as a product has proven to be the winning strategy for companies over the last decade, ranging from Twilio for messaging to Stripe for payments. Elevating APIs from being just a checkbox on a digital product to being products in their own right, finding more alignment with the business and the consumer.
-- **Jobs to Be Done** - Jobs Theory provides a rich way to look at the delivery of APIs as a product, and a framework to set the tone for the feedback loop between producer and consumer. Helping speed up the gathering and prioritization of feedback, data, and other needs surrounding an existing API, then translating that into the next version of the API within days, not weeks or months.
-- **Roles** - [Need Content]
-- **Consumer Experience** - Making the improvement of discovery, learning, onboarding, support, and other dimensions of the relationship between API producer and consumer a priority. Spending more resources to plan, implement, measure, and iterate upon different approaches to improving the experience for consumers, reducing friction, and making the lives of consumers easier by helping them succeed faster.
-- **Low Code / No Code** - Investing in low-code/no-code options for producing and consuming APIs, abstracting away the complexity involved with working with digital resources and capabilities. Providing ways of expressing business intent using APIs, then going from intent to integration and business value without having to understand a programming language and how APIs work.
-- **Innovation** - Ensuring there is always time for innovation and experimentation when it comes to producing and consuming APIs, encouraging team members and consumers to explore new ways of putting APIs to work, and abstracting away working with them at scale. Putting a priority on the potential that exists when you operate in an API-first way, where new and experimental digital resources and capabilities might be the competitive edge a company needs when it comes to reaching the right market.
-
-The divide between business and IT needs to become a relic of the past. Today’s digital factory floor requires all hands on deck, helping find the optimal balance needed to deliver the composable digital products tomorrow’s consumers are going to need or want.
diff --git a/docs/more/api/blueprints/category-defining.md b/docs/more/api/blueprints/category-defining.md
deleted file mode 100644
index ada9f9a..0000000
--- a/docs/more/api/blueprints/category-defining.md
+++ /dev/null
@@ -1,10 +0,0 @@
-# Category Defining
-The agility associated with high-performing API-first approaches to business was demonstrated in how Amazon changed doing business with Amazon Web Services, and with what API-first startups like Stripe and Twilio set into motion by enabling the development of ridesharing, food delivery, and other gig economy businesses.
APIs allow startups, but also enterprise organizations, to rapidly iterate in a way that possesses a feedback loop with consumers, keeping the trajectory and velocity of APIs moving in the directions that matter, increasing the potential for creating entirely new ways of doing business.
-
-- **Domains** - Well-defined domains within the enterprise don’t just result in more productivity, quality, and a governed API lifecycle, they can transcend their platforms and communities, and create entirely new industry-level domains. Allowing enterprise organizations to not just lead, but set the rules when it comes to how business works.
-- **Products** - An API platform factory floor that is capable of delivering the entirely new products that markets want, need, or are completely unaware that they need. Iterating upon, and evolving, the products consumers are needing, responding rapidly to changing market demands through rapid iteration on APIs, and the applications they support.
-- **Standards** - The design, schema, and approaches adopted across the enterprise become so mature and useful to business that they begin to be emulated and adopted in other external APIs, becoming the de facto standard within an industry, not through a traditional standards body, but by leading by example and delivering what is needed.
-- **Velocity** - Achieving levels of enterprise-wide velocity so that other competitors can’t keep pace, or stand no chance of catching up. Optimizing the digital factory floor in such a perpetual and nimble way that large enterprises are able to build up so much momentum that it starts to set the tone for an entirely new way of doing business.
-- **Pulse** - Through an extensive network of feedback loops with consumers across thousands of internal, partner, and public APIs, the enterprise is able to maintain a fix on the pulse of what is needed across an industry, or multiple industries, allowing a single business to redefine existing industries or create entirely new ways of doing business.
-
-Being API-first is how Amazon invented the cloud, and it is how Stripe and Twilio helped enable the gig or sharing economy. These were entirely new approaches to doing business that didn’t exist before, establishing entirely new categories of value creation for markets and consumers. Being API-first is how the categories of the next fifty years will be shaped and dominated, with the enterprise organizations that are able to internalize and act on their API-first state of operations being the ones that bring us the next big thing.
diff --git a/docs/more/api/blueprints/centralization.md b/docs/more/api/blueprints/centralization.md
deleted file mode 100644
index b47eeea..0000000
--- a/docs/more/api/blueprints/centralization.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Centralization
-The centralization of practices and information within an enterprise organization helps streamline and standardize what is happening across domains and teams, providing the consistent nutrients that teams will need to be successful, while also helping keep disparate teams producing consistent and valuable services.
-
-- **Excellence** - Setting up a center of excellence, bringing together all the necessary knowledge, skills, and practices, then working to evangelize them across domains and teams, helping bring more awareness, participation, and feedback from different teams.
-- **Expertise** - Centralizing business and technical leadership, architects, and other expertise into a single group that meets regularly to help identify and evolve knowledge and practices that teams need to be successful in an ongoing and centralized manner.
-- **Leadership** - Establishing clear and engaging leadership for centralized governance, taking a lead role in demonstrating how and why API governance matters without ever having to say the word governance, or being seen as an enforcing centralized entity.
-- **Domains** - Thoughtfully carve the enterprise into logical domains, reflecting, but also transcending, the tribal boundaries that have emerged from lines of business, business vs IT, and legacy acquisitions to establish clear articulations of business domains.
-- **Vocabulary** - Define the common vocabulary that is used within domains, providing the language that teams will use when designing APIs, but also when engaging with consumers of those APIs, helping make producing and consuming APIs as intuitive as it possibly can be.
-- **Rules** - Crafting, evolving, educating, and helping incentivize the application of linting rules across domain contracts and artifacts, even applied as part of policies used across gateways and other stops along the lifecycle to help stabilize API operations.
-- **Enablement** - Provide the best possible services, tooling, standards, and other resources for teams, helping enable them to do the right thing, making it easy for them to deliver and operate consistent APIs, no matter which team created and owns them.
-- **Feedback Loop** - Foster an active two-way feedback loop with teams, encouraging feedback on governance, allowing teams to help dictate and own how centralized guidance is evolved and applied on the ground, across federated API teams.
-
-How much of governance becomes centralized will vary depending on an organization, the culture that exists across teams, and the industries they operate in, which is why having a conversation about what should and shouldn’t be centralized amongst leadership, but also across teams, is very important.
diff --git a/docs/more/api/blueprints/challenges-advantage.md b/docs/more/api/blueprints/challenges-advantage.md
deleted file mode 100644
index 5eeb8b8..0000000
--- a/docs/more/api/blueprints/challenges-advantage.md
+++ /dev/null
@@ -1,7 +0,0 @@
-# Challenges - Advantage
-
-- **Reactive (A1)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Missed Opportunities (A2)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Stagnation (A3)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Unknown Resources (A4)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Unknown Capabilities (A5)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. diff --git a/docs/more/api/blueprints/challenges-motion.md b/docs/more/api/blueprints/challenges-motion.md deleted file mode 100644 index 1798f6f..0000000 --- a/docs/more/api/blueprints/challenges-motion.md +++ /dev/null @@ -1,8 +0,0 @@ -# Challenges - Motion - -- **Not Discoverable (M1)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **No Change (M2)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **No Momentum (M3)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Awkward (M4)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **No Feedback (M5)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. - diff --git a/docs/more/api/blueprints/challenges-stability.md b/docs/more/api/blueprints/challenges-stability.md deleted file mode 100644 index 2222879..0000000 --- a/docs/more/api/blueprints/challenges-stability.md +++ /dev/null @@ -1,7 +0,0 @@ -# Challenges - Stability - -- **Outages (S1)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Slow Performance (S2)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Not Scalable (S3)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Low Quality (S4)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Breaking Change (S5)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. diff --git a/docs/more/api/blueprints/challenges-trust.md b/docs/more/api/blueprints/challenges-trust.md deleted file mode 100644 index a987e0e..0000000 --- a/docs/more/api/blueprints/challenges-trust.md +++ /dev/null @@ -1,8 +0,0 @@ -# Challenges - Trust - -- **Unreliable (T1)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut. -- **Risk (T2)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. 
Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Non-Compliant (T3)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Vulnerable (T4)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **Incidents (T5)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
-- **No Communication (T6)** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut.
diff --git a/docs/more/api/blueprints/challenges-usability.md b/docs/more/api/blueprints/challenges-usability.md
deleted file mode 100644
index 18ac40b..0000000
--- a/docs/more/api/blueprints/challenges-usability.md
+++ /dev/null
@@ -1,9 +0,0 @@
-# Challenges - Usability
-The overall usability of the APIs behind enterprise applications and integrations contributes to the efficiency and velocity of an organization. As a result of ad hoc API operations, we see many common challenges when it comes to the usability of APIs across an organization, which if addressed would significantly shift the speed at which an organization can move.
-
-- **Onboarding Friction (U1)** - APIs require approval before they can be used, or impose other known or unknown challenges when it comes to going from discovery of an API to making the first request to a sandbox or live API resource.
-- **No Documentation (U2)** - The lack of up-to-date and complete documentation for APIs is the number one pain point faced by API consumers, increasing the time it takes to put digital resources and capabilities to good use.
-- **Inconsistent (U3)** - Even the small differences between how APIs work can introduce significant time and money downstream when it comes to putting APIs to work, slowing consumers of digital resources.
-- **Not Repeatable (U4)** - Behaviors made available via APIs aren’t always repeatable, with unexpected functionality emerging as API consumers work across different APIs, making the development of applications more difficult.
-- **Complexity (U5)** - The complexity of how each API works plays a significant role in the usability of enterprise digital resources and capabilities, exponentially slowing down operations when not addressed at scale.
-- **Duplication (U6)** - Duplication and overlapping capabilities across teams, domains, and an organization can further blur the usability of digital resources and capabilities, leaving consumers unsure how to put APIs to work.
diff --git a/docs/more/api/blueprints/changing-landscape.md b/docs/more/api/blueprints/changing-landscape.md
deleted file mode 100644
index 54987e0..0000000
--- a/docs/more/api/blueprints/changing-landscape.md
+++ /dev/null
@@ -1,8 +0,0 @@
-## Changing Landscape
-The API landscape is always changing, and when you are API-first this is a good thing. You have control over the rate of change and are able to effectively communicate change with consumers.
Leveraging change to your advantage, iterating upon your APIs as needed to not just keep up with the pace of change, but actually use it to move ahead.
-
-- Versioning
-- Source of Truth
-- Source Control
-- Contract-Driven
-- Releases
diff --git a/docs/more/api/blueprints/ci-cd.md b/docs/more/api/blueprints/ci-cd.md
deleted file mode 100644
index 8f4b62b..0000000
--- a/docs/more/api/blueprints/ci-cd.md
+++ /dev/null
@@ -1,10 +0,0 @@
-# CI/CD
-Tapping into continuous integration and continuous deployment, better known as CI/CD, to help build, test, and automate integrations with internal and external APIs, but also deploy APIs that can be used for applications and integrations. Leveraging the native pipelines of source control or adding on a commercial or open source CI/CD solution, then seamlessly tying how APIs are produced and consumed with the repeatability that is available to us via modern CI/CD solutions.
-
-- **Pipelines** - Utilize CI/CD pipelines to ensure that the API lifecycle is always repeatable, and something that always builds the highest quality API possible.
-- **Variables** - Tapping into a lifecycle-wide strategy for defining variables that get applied as part of the pipeline build process, ensuring that the naming, use, and evolution of variables occurs at a strategic level.
-- **Collections** - Running contract, performance, security, and other types of collections as part of the pipeline using open source Newman, standardizing and validating as APIs are being built.
-- **Environments** - Leveraging standardized guidance for teams to have development, staging, and production environments available for running as part of CI/CD pipelines alongside collections.
-- **Observability** - Tapping into the outputs that exist for our CI/CD pipelines and piping data into our API platforms, APM, and other observability and reporting systems.
-
-A modern API lifecycle is built on top of our existing enterprise investment in continuous integration and continuous deployment infrastructure and processes, layering in the usage of common artifacts like OpenAPI, collections, and environments to ensure quality while maintaining velocity across operations.
diff --git a/docs/more/api/blueprints/code-first.md b/docs/more/api/blueprints/code-first.md
deleted file mode 100644
index 7cb50af..0000000
--- a/docs/more/api/blueprints/code-first.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Code-First
-The development of a new API, or adding functionality to an existing API, by writing code to shape how an API will work. Producing the desired behavior by writing code in a local or shared development environment, but then once the agreed upon shape and behavior of the API is achieved, you produce a machine readable contract, and provide a set of tests that will automate the validation of assertions made by all stakeholders throughout the development process.
-
-- **Workspaces** - Even when an API already exists or is being developed in a code-first manner, a dedicated workspace should be set up to provide a single place to find all work.
-- **Annotations** - Using programming language annotations to add the necessary metadata to the code behind an API that can be used to generate a machine readable contract for APIs.
-- **CI/CD Pipeline** - Using your CI/CD pipeline to turn annotations into machine readable contracts, translating your code into a contract to power documentation, and much more.
-- **Contract** - Producing a machine readable contract for your API, leveraging code and your existing software development lifecycle to produce the contract needed to govern each API.
-- **Document** - Generating human readable documentation from an API’s contract, ensuring there is accurate and up-to-date documentation for each API as it is being designed.
-- **Feedback** - Gathering feedback on the design of an API, tapping into existing channels to understand what consumers and other stakeholders are needing from the API being built.
-- **Iterate** - Update the code base based upon feedback, adding capabilities to the API, building and updating the machine readable contract, which then updates elements downstream.
-- **Test** - Produce contract tests for APIs, adding tests to your CI/CD pipeline to ensure each API continues to reflect the agreed upon contract, with no surprises pushed to production (a minimal test sketch follows the Collaborative section below).
-
diff --git a/docs/more/api/blueprints/collaborative.md b/docs/more/api/blueprints/collaborative.md
deleted file mode 100644
index 92c153b..0000000
--- a/docs/more/api/blueprints/collaborative.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Collaborative
-Doing business in the cloud means you can collaborate with internal teams, partner stakeholders, and even the public when it comes to producing and consuming APIs. Connecting technical and non-technical stakeholders when it comes to delivering and applying the digital resources and capabilities that are needed to do business. Allowing tens, hundreds, or even thousands of people to collaborate when it comes to building and using the digital resources and capabilities that shape how we do business today, but also tomorrow.
-
-- **Teams** - API-first means teams are well defined within their domain and groups, roles are clear, and discovery and access control is well-defined to enable desired outcomes.
-- **Workspaces** - Work on APIs is available via dedicated workspaces, where stakeholders can find artifacts, documentation, and collaborate with teams to put APIs to work.
-- **Watch** - Team members, stakeholders, and when logical, even public consumers can choose to watch an API and the work going on behind it, receiving regular notifications.
-- **Forking** - API artifacts, documentation, environments, and other elements should be forkable so that other team members and consumers can fork, use, and help evolve them.
-- **Pull Requests** - Allowing team members and other consumers to submit pull requests across API operations to contribute potential changes that can be merged by teams.
-- **Sharing** - Each element of the API lifecycle is shareable, allowing API artifacts, documentation, tests, mock servers, and other elements to be easily shared by teams.
-- **Publishing** - Documentation, buttons, and other embeddables can be published to networks, catalogs, directories, portals, or even somewhere as simple as a blog or social media post.
-- **Conversations** - There are active conversations around each API, and across API operations, ensuring there are feedback loops present for teams and consumers.
-
-
-APIs are very technical and abstract, but only flourish when there are active relationships present between API producers and consumers, and at scale across an entire organization, or possibly within a specific industry. Making collaboration the essential nutrient present in the most vibrant enterprise API operations, and the companies we see leading the way in their industries and markets.
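-
-Circling back to the **Test** step of the Code-First approach above, a contract test usually boils down to a small set of executable checks run in the CI/CD pipeline. The following is a minimal, hypothetical sketch using JUnit 5 and the JDK's built-in HTTP client; the URL and the asserted fields are illustrative only, and a real suite would be derived from the API's machine readable contract:
-
-```java
-// Hypothetical contract test: the endpoint and the expected fields are placeholders.
-import java.net.URI;
-import java.net.http.HttpClient;
-import java.net.http.HttpRequest;
-import java.net.http.HttpResponse;
-import org.junit.jupiter.api.Test;
-import static org.junit.jupiter.api.Assertions.assertEquals;
-import static org.junit.jupiter.api.Assertions.assertTrue;
-
-class OrdersContractTest {
-
-    private final HttpClient client = HttpClient.newHttpClient();
-
-    @Test
-    void getOrderMatchesAgreedContract() throws Exception {
-        HttpRequest request = HttpRequest.newBuilder(URI.create("https://api.example.com/orders/1")) // placeholder URL
-                .header("Accept", "application/json")
-                .GET()
-                .build();
-
-        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
-
-        // The assertions mirror what the contract promises: status code,
-        // media type, and the presence of the agreed upon fields.
-        assertEquals(200, response.statusCode());
-        assertTrue(response.headers().firstValue("Content-Type").orElse("").contains("application/json"));
-        assertTrue(response.body().contains("\"id\""));
-        assertTrue(response.body().contains("\"status\""));
-    }
-}
-```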
diff --git a/docs/more/api/blueprints/collections.md b/docs/more/api/blueprints/collections.md
deleted file mode 100644
index c2fbb1c..0000000
--- a/docs/more/api/blueprints/collections.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Collections
-Postman collections are a machine-readable specification for saving API requests in a portable, executable, and documented way, allowing one or many API requests to be organized by folder, then shared or published for use by others. Providing an executable unit of value as defined by each API’s source of truth, but available in a format that can be used to document, mock, test, secure, and automate with APIs.
-
-- **Folders** - Each collection has the ability to define one or many folders and organize API requests into each folder, making collections more intuitive and easier to use.
-- **Authentication** - Collections allow the authentication to be defined for any API being consumed, providing most of the top authentication formats used to secure APIs.
-- **Documentation** - You can document your APIs, as well as the tests, automation, visualizations, and other use cases for collections, ensuring API operations is documented.
-- **Parameters** - The parameters and default values can be provided for each API request, helping define query and path parameters that shape each API request and response.
-- **Headers** - Collections allow for HTTP headers to be passed along with each request, shaping the transport of each API request and response being made with a collection.
-- **Body** - Enabling the ability to add JSON, XML, Text, and other types of data payloads as the body of a request, securely sending (when encrypted) data as part of a request.
-- **Responses** - The details of a response, including the HTTP status codes, headers, network information, response time, and the other technical details of the API response for APIs.
-- **Scripts** - Collections allow for folder level, as well as pre-request and post-request scripts to be applied, providing scripts that get executed when collections are executed manually.
-
-OpenAPI and AsyncAPI provide a source of truth for each API, and a collection provides a derivative of that truth for a specific stop along the API lifecycle, providing a versatile way of defining each stop.
diff --git a/docs/more/api/blueprints/composable.md b/docs/more/api/blueprints/composable.md
deleted file mode 100644
index 29d6b19..0000000
--- a/docs/more/api/blueprints/composable.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Composable
-Modern software is composable, assembling, evolving, and decommissioning business resources and capabilities that meet the ever-changing needs of the enterprise. Perpetually building with, but also inventing, modular industrial-grade Lego building blocks that power the web, mobile, and device applications we need to do business at the pace of the Internet.
-
-- **Resources** - Every digital resource produced and consumed by the enterprise is available for use. Mapping out all of the atoms of your digital transformation, keeping the business equipped with what it needs to operate and compete globally in a digital landscape.
-- **Capabilities** - Essential workflows and algorithms are well-defined using many private, partner, and public APIs, documenting what the enterprise is capable of at any moment, while also being ready to adapt, change, and respond to entirely new needs of the market.
-- **Modular** - Enterprise resources and capabilities are as modular as possible, driving re-use and collectively applying at scale.
Reducing business value down to the smallest, most reliable form, then productizing and making it available to consumers via APIs.
-- **Distributed** - The enterprise is distributed geographically and organizationally, giving domains, groups, and teams the agency they need to deliver essential enterprise resources and capabilities while minimizing their dependency on other enterprise teams.
-- **Discoverable** - All of the building blocks of the enterprise are discoverable, making digital resources and capabilities available to both business and technical stakeholders and consumers, ensuring they have what they need to conduct business and compete.
-- **Self-Service** - Enterprise digital resources and capabilities are available in a self-service way with visibility available only to intended audiences. Allowing those who should have access the ability to sign up and begin putting APIs to work with as few steps as possible.
-- **Reliable** - Every digital resource and capability is reliable, providing what teams need without friction. Establishing and maintaining trust with consumers, doing exactly what they need in a way they can depend on without concern.
-- **Observable** - The ability to observe the usage of any digital resource or capability is the default, no matter how many times something is reused and bundled with other services, providing awareness of how resources and capabilities are applied at scale.
-
-Modern software is composable because APIs exist just beneath the surface. APIs are how massive enterprise operations are made composable, reusable, and scalable, through doing the hard work of reshaping the enterprise into its most valuable and reusable form.
-
diff --git a/docs/more/api/blueprints/consumer-lifecycle-deploy.md b/docs/more/api/blueprints/consumer-lifecycle-deploy.md
deleted file mode 100644
index 3d56bd1..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-deploy.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Deploy
-The deployment of API integrations comes in many shapes and sizes today, and the notion of what an application is has shifted over time. Putting APIs to work used to be simply about web and mobile applications, or system-to-system integrations. The concept of deployment in the world of the API consumer can range from writing custom code for an application to low-code/no-code syncing between two separate API platforms. What is being deployed, and how it runs and is maintained, is becoming as versatile and modular as APIs themselves.
-
-- **Source Control** - Having all manually developed or automatically generated client code in a repository, providing a source of truth for the code being deployed, but also for any artifacts that are needed to define the deployment and operation of any API integration.
-- **CI/CD Pipelines** - Implementing the continuous integration portion of CI/CD, automating how applications and integrations are deployed, making the deployment of API integrations, applications, and other use cases something that is always repeatable.
-- **Collections** - Leveraging Postman collections as a modular, sharable, and executable definition of an application, stitching together many different API calls across internal and external API sources to apply digital resources and capabilities in a specific way.
-- **Serverless** - Utilizing serverless layers for deploying integrations, orchestrations, and different ways to automate API resources and capabilities, tapping into ephemeral compute to deploy integration code that accomplishes specific business outcomes.
-- **Runners** - Acknowledging that some collection applications will just be manually run by different team members using runners, organizing different types of integrations and applications by workspaces and letting different stakeholders manually put them to work.
-- **Monitors** - Scheduling the run of collection-defined integrations, using the collection, combined with environments that employ a variable strategy, to accomplish any API-driven applications and integrations on a schedule from any cloud region.
-- **Workflows** - Take advantage of complex workflows to put APIs to work, iterating through multiple series of API calls to enable business and technical stakeholders to design, save, and execute the scenarios they need to accomplish business each day.
-- **Webhooks** - Responding to different API events using webhooks to trigger collection-defined integrations, applications, and workflows, engaging with users via different platforms while responding to their activity using modular API deployments.
-
diff --git a/docs/more/api/blueprints/consumer-lifecycle-develop.md b/docs/more/api/blueprints/consumer-lifecycle-develop.md
deleted file mode 100644
index f417334..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-develop.md
+++ /dev/null
@@ -1 +0,0 @@
-# Consumer Lifecycle - Develop
diff --git a/docs/more/api/blueprints/consumer-lifecycle-discover.md b/docs/more/api/blueprints/consumer-lifecycle-discover.md
deleted file mode 100644
index 2513d50..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-discover.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Discover
-The regular practice of an API consumer being able to find exactly the API that they need for their application or integration. Enabling API consumers to be able to search, browse, and discover the API they need, but also everything they will need to onboard with an API, be able to assess the overall quality and reliability of the API, and get to work quickly integrating it into their use case, no matter what language or platform they are working with.
-
-- **Search** - Consumers should be able to search for APIs using the interfaces they are already using, allowing for discovery of API information in a manner that is more relevant to their work.
-- **APIs** - The contracts and other artifacts that define the surface area of an API, including authentication and authorization, should be discoverable as part of regular operations.
-- **Documentation** - Up-to-date and accurate API documentation for all APIs should be easily discoverable by teams, providing human readable details of what is possible with each API.
-- **Tests** - The contract, performance, integration, and even user acceptance tests should be made searchable by consumers, helping make not just APIs discoverable, but their tests.
-- **Workspaces** - Alongside Git repositories, private, partner, and public workspaces can be included as part of discovery, indexing the places where all work is occurring for each API.
-- **Teams** - The teams behind APIs, and the partner or public contributors, should be made discoverable alongside documentation and other data, encouraging human engagement.
-- **Workflows** - Common workflows using APIs should be made discoverable, helping business and technical stakeholders implement the business workflows they need in their work.
-- **Changes** - Multiple versions of each API should be made discoverable, indexing each API release and the communications around them, helping consumers easily get up to speed.
-
-Every moment spent looking for an API, trying to find the latest version, and understanding what is possible contributes to or detracts from the overall momentum of a team, domain, and organization, making consumer discovery a priority aspect of the API lifecycle to invest in.
diff --git a/docs/more/api/blueprints/consumer-lifecycle-evaluate.md b/docs/more/api/blueprints/consumer-lifecycle-evaluate.md
deleted file mode 100644
index aef3be2..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-evaluate.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Evaluate
-Once a consumer has discovered an API they will always need time to evaluate whether or not it will meet their needs. An experience that is significantly streamlined when it is a hands-on experience, allowing consumers to play with real functionality, learning about resources and capabilities by doing something rather than just reading. Giving API consumers the information they need to make an educated decision about whether or not an API will satisfy their needs, helping them feel empowered to understand how an API fits into the business solution they are looking to build.
-
-- **Explore** - Providing consumers with the ability to explore as much of the surface area of an API as possible, possibly without even authenticating, helping them learn about what is possible.
-- **Execute** - Providing the ability to execute each request, response, publish, and subscribe, making sure that learning about an API is as hands on as it possibly can be for consumers.
-- **Examples** - Making sure there are always examples for each element of an API, allowing API contracts to be mocked and providing rich documentation showing how it works.
-- **Documentation** - Ensure there is always rich documentation with useful descriptions, examples, and other information that helps consumers get started using each API.
-- **Workflows** - Go beyond reference documentation and provide actual business workflows that consumers can use to accomplish scenarios they will need in their work.
-- **Demonstrate** - Show consumers how it all works, providing tutorials, videos, and other stories that demonstrate what is possible when you put APIs to work in applications.
-- **Forking** - Make your documentation, mock servers, tests, and workflows forkable, enabling consumers to fork artifacts that help them onboard and use an API faster.
-- **Feedback** - Make it easy to provide feedback as consumers are evaluating an API, capturing as much feedback from the experience as possible to inform the API roadmap.
-
-Investing in the evaluate stage of a consumer-centric API lifecycle helps reduce consumers’ time to first call, or more importantly, time to first transaction, making it as easy as possible for consumers to begin generating value in their applications, but also on the API platform.
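-
-As a concrete illustration of the hands-on **Explore**, **Execute**, and **Examples** ideas above, a mock server can serve the examples from an API's contract so consumers can evaluate an API before committing to it. The following is a minimal, hypothetical sketch using WireMock; the tool choice, port, endpoint, and payload are assumptions for illustration only:
-
-```java
-// Hypothetical mock of a single endpoint, returning a canned example response
-// that consumers can explore without touching the real API.
-import com.github.tomakehurst.wiremock.WireMockServer;
-import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
-import static com.github.tomakehurst.wiremock.client.WireMock.get;
-import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;
-
-public class EvaluationMockServer {
-    public static void main(String[] args) {
-        WireMockServer server = new WireMockServer(8089); // placeholder port
-        server.start();
-
-        server.stubFor(get(urlEqualTo("/pets/1"))          // placeholder endpoint
-                .willReturn(aResponse()
-                        .withStatus(200)
-                        .withHeader("Content-Type", "application/json")
-                        .withBody("{\"id\":1,\"name\":\"Bella\",\"status\":\"available\"}")));
-
-        // Consumers can now make real requests against http://localhost:8089/pets/1
-        // and see the documented example exactly as the contract describes it.
-    }
-}
-```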
diff --git a/docs/more/api/blueprints/consumer-lifecycle-integrate.md b/docs/more/api/blueprints/consumer-lifecycle-integrate.md
deleted file mode 100644
index c5a8e06..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-integrate.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Integrate
-Once ready, there are numerous ways in which API consumers will be developing against an API. It used to be enough to provide a variety of snippets or libraries in a variety of programming languages, but increasingly consumers are needing access to artifacts, collections, and other ways of visually developing their integrations or applications, pushing the boundaries of what is traditionally considered to be a desktop, web, or mobile application.
-
-- **Contracts** - Machine readable contracts like OpenAPI and AsyncAPI help make integration as simple as importing the contract for an API, authenticating, and then beginning to make the API calls you need to move data between systems, providing an artifact that defines consumption.
-- **Collections** - Provide forkable and executable collections that describe specific parts of an API that help consumers accomplish a specific digital capability, providing a buffet of capabilities for consumers to choose from when it comes to their API integration needs.
-- **Automation** - Opening up automation opportunities, allowing collections to be scheduled and baked into CI/CD pipelines so that common business capabilities can be executed, helping business and technical stakeholders do more with less through API automation.
-- **Workflows** - Provide ready-to-go low-code and no-code options for executing common business workflows, allowing multiple internal, partner, and public APIs to be daisy chained into valuable scenarios that will help business and technical stakeholders integrate better.
-- **Snippets** - Generating lightweight code snippets in a variety of programming languages, doing most of the heavy lifting for consumers when it comes to integrating with APIs in the language of their choice, automating the more repetitive aspects of API integration.
-- **SDKs** - Generating complete software development kits, abstracting away the authentication and other complex aspects of putting an API, or many APIs, to work, helping reduce the workload for consumers, helping them integrate APIs into their applications.
-
-Today’s API integrations come in many shapes and sizes, and require a mix of approaches to satisfy the needs of the widest possible consumer audience. Moving us into a more modular, automated, and low-code/no-code reality when it comes to stitching together the thousands of APIs we need to do business today.
diff --git a/docs/more/api/blueprints/consumer-lifecycle-observe.md b/docs/more/api/blueprints/consumer-lifecycle-observe.md
deleted file mode 100644
index 549c2e3..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-observe.md
+++ /dev/null
@@ -1,10 +0,0 @@
-# Observe
-API consumers need to be able to effectively “see” API operations and how their API consumption, or the consumption across a community, will influence their own applications and integrations. The observability of specific instances of APIs, entire suites of APIs, as well as the operations that surround them, is proving to be something that fast defines the difference between healthy API ecosystems and not-so-healthy API communities, and something that will define your organization.
-- **Watches** - Keeping track of the watches on workspaces, APIs, and collections to understand who is tuned into what is happening, using watches as a metric for the number of consumers, contributors, and internal and external stakeholders who are tuned in.
-- **Forks** - Tracking who is forking repositories and collections, using the fork count as a metric for engagement, knowing who your consumers are, and how they are putting APIs to work, by tracking engagement via workspaces, repositories, and collections.
-- **Feedback** - Being part of the feedback loop, engaging with API producers and consumers, understanding what the conversation is around each API, or group of APIs, observing the discussions that are going on around digital resources and capabilities.
-- **Notifications** - Using notifications to engage with a platform and keep consumers part of the forward motion of an API, using in-app, email, or even SMS notifications to handle the engagement between producer and consumers, using it as an output for observability.
-- **Usage** - Providing dashboards, reporting, and other visuals to help consumers understand what their platform usage is, using it as an opportunity to keep consumers engaged and playing an active part in the community, helping them observe the activity that matters.
-
-Using existing platform outputs to keep consumers informed, but also make their engagement more observable for producers and other consumers, helps contribute to the overall health and viability of the ecosystem that exists around each API internally within the enterprise or within the external community.
diff --git a/docs/more/api/blueprints/consumer-lifecycle-test.md b/docs/more/api/blueprints/consumer-lifecycle-test.md
deleted file mode 100644
index 0569601..0000000
--- a/docs/more/api/blueprints/consumer-lifecycle-test.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Test
-The testing of APIs is something that should be shared with consumers, opening up the possibilities for user acceptance testing, building greater trust and understanding of the overall reliability of a platform, and strengthening the relationship between API producer and consumer. The results of API testing, and even dashboards and visualizations, can be shared with consumers via workspaces, repositories, and alongside API documentation and portals, going the extra distance when it comes to understanding the role API testing plays for both API producer and consumer.
-
-- **Availability** - Sharing uptime and availability information with consumers as a dashboard, helping make sure there is transparency around the operation of the platform they are depending on for their applications, helping provide a historical accounting of availability.
-- **Contract** - Exposing contract tests and even the results of scheduled contract test runs, helping bring more awareness amongst consumers regarding the contract that exists for APIs, and how the validation of contracts can be used in their applications and integrations.
-- **Performance** - Sharing performance tests and even the results of scheduled performance test runs, helping be more transparent about the performance of APIs while demonstrating that the platform has considered performance, and is taking steps to improve upon it.
-- **Security** - Making the overall security policy, as well as the security tests that are executed as part of regular operations, available, offering a more transparent look at how security testing occurs, and sharing results, helping create more trust with consumers when it comes to security.
-- **Usage** - Ensuring that any type of testing for consumers is not considered as part of the usage they pay for, or shows up as part of rate limits, helping encourage consumers to help test an API, and play a role in the feedback loop around the overall availability of a platform.
-- **SLA** - Providing an honest service level agreement (SLA) for each API, providing a pragmatic look at what the expectations are for services, allowing for beta, experimental, and less mature APIs, while demonstrating the production-ready grade of other APIs.
-
-To strike a balance between producers and consumers there needs to be a shared understanding of the baseline testing occurring via a platform, bringing consumers into the conversation when it comes to API quality, while also holding producers accountable when it comes to talking about platform quality.
diff --git a/docs/more/api/blueprints/contract-testing.md b/docs/more/api/blueprints/contract-testing.md
deleted file mode 100644
index 18bb0d8..0000000
--- a/docs/more/api/blueprints/contract-testing.md
+++ /dev/null
@@ -1,21 +0,0 @@
-# Contract Testing
-Contract testing takes the machine-readable contract for an API and then tests each instance of that contract, ensuring the requests and responses, messages, and other details match the contract defined. Providing an executable set of tests that will validate the contract for each API manually as part of producing or consuming the API, enforced as part of the CI/CD pipeline, or scheduled to run from different regions via the monitor.
-
-## Contracts
-
-- OpenAPI
-- AsyncAPI
-- JSON Schema
-
-## Testing
-
-- Collections
-- Authentication
-- Scripting
-- Environments
-
-## Automation
-
-- Runner
-- Monitor
-- Pipeline
diff --git a/docs/more/api/blueprints/contracts.md b/docs/more/api/blueprints/contracts.md
deleted file mode 100644
index a17629f..0000000
--- a/docs/more/api/blueprints/contracts.md
+++ /dev/null
@@ -1,15 +0,0 @@
-# Contracts
-On top of the protocols and patterns employed by API producers, a variety of machine and human-readable contracts have emerged to help govern not just the technical, but also the business and even legal aspects of the API producer and consumer relationship. These contracts are being used to make sure producers and consumers are on the same page, providing a single source of truth for each version of an API being made available.
-
-## Specifications
-A handful of specifications have emerged that help us describe the surface area of our synchronous and asynchronous APIs, employing a mix of protocols, patterns, styles, and formats. Providing machine and human readable formats we can use to define the contracts available between producer and consumer.
-
-- **OpenAPI** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **AsyncAPI** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **JSON Schema** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Protocol Buffers** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Collections** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **GraphQL** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **WSDL** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-
-API contracts provide the technical, but also business, details of the relationship between API producer and consumer, moving everyone to a shared understanding of what can be expected with each version of an API, the quality of service, and the digital resources and capabilities being made available.
diff --git a/docs/more/api/blueprints/design-first.md b/docs/more/api/blueprints/design-first.md
deleted file mode 100644
index bca54c0..0000000
--- a/docs/more/api/blueprints/design-first.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Design First
-The practice of properly defining and designing your API before you begin writing any code, using an API specification to develop an API contract, mocking and documenting the contract, then iterating upon the design of the resources and capabilities with stakeholders. Once you are ready, and have reached an agreed upon state of what an API should do, you can then develop tests that help verify that what was agreed upon during the design-first process is actually what ends up being in production.
-
-- **Workspace** - Work on a new API always begins in a dedicated workspace, making sure that there is a single place to find artifacts, the team, and the work that exists behind each API.
-- **Contract** - Every API has a machine readable contract that describes the surface area of the API, providing an understanding between API producer and consumer to guide operations.
-- **Mocks** - The contract for an API is perpetually used to generate mock servers, helping make the design of the API as realistic as possible, matching specific use cases with examples.
-- **Document** - Generating human readable documentation from an API’s contract, ensuring there is accurate and up-to-date documentation for each API as it is being designed.
-- **Feedback** - Providing a feedback mechanism for all stakeholders to use when it comes to providing feedback on the current design of an API, helping guide producers forward.
-- **Iterate** - Aggregate feedback from consumers and other stakeholders, identify the sensible changes to the API, then iterate on the contract, updating mocks and the documentation.
-- **Test** - Once the contract for an API has been established, and there will be no more iterations to this version, then contract tests can be produced to validate it in production.
-- **Develop** - Hand off the API contract, providing teams with what they need to bring an API to life in development, moving to staging and running tests before it is put into production.
-
diff --git a/docs/more/api/blueprints/design-review.md b/docs/more/api/blueprints/design-review.md
deleted file mode 100644
index 6fac70f..0000000
--- a/docs/more/api/blueprints/design-review.md
+++ /dev/null
@@ -1,24 +0,0 @@
-# Design Review
-API design reviews give pause before any API goes into production to make sure every API meets the design standards of an organization. Designers and developers should have a wealth of resources to help them design the best possible API before it gets submitted.
-
-## Define
-Each API being submitted as part of an API design review process should possess the necessary artifacts and elements needed to properly evaluate the design of each API.
-
-- Team Workspace
-- Team Members
-- GitHub Repository
-- Design Rules
-- OpenAPI
-- Reference Documentation
-- Examples
-- Mock Server
-- Contract Testing
-
-## Process
-Once ready, with all the needed artifacts and elements, an API should be submitted to a well-defined process for reviewing the design of an API, then providing feedback on the state of an API, and whether it is ready for production.
-
-- Design Review
-- Design Review Timeline
-- Design Review Feedback
-- Design Review Outcomes
-
diff --git a/docs/more/api/blueprints/developer-experience.md b/docs/more/api/blueprints/developer-experience.md
deleted file mode 100644
index 52b1a30..0000000
--- a/docs/more/api/blueprints/developer-experience.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Consumer Experience
-
-The experience that API consumers face when putting APIs to work will forever define and shape the relationship between API producer and consumer. Onboarding is the first impression producers have to make on consumers, and every interaction from there forward is an opportunity to build or lose trust along the way. Establishing a lifelong relationship with consumers who bake your APIs into their applications and integrations, a relationship where there can be mutual value exchanged, if just the right experience exists.
-
-- **Discoverable** - It is easy for consumers to find what they need, with APIs distributed where consumers frequent, and part of the stories and information they consume on a regular basis, making it effortless to find and understand how an API will benefit them.
-- **Easy Onboarding** - Consumers should be able to go from discovery to making their first request in as little time as possible, with as few steps as possible, helping consumers reach the point where value is evident, and the overall experience has been a joy.
-- **Simple** - Keeping APIs, documentation, and other supporting elements as simple and intuitive as possible, helping reduce the cognitive load when it comes to understanding what is possible with an API, keeping the work consumers have to do to a minimum.
-- **Consistent** - Every API used has a consistent feel, and employs common patterns, making API integration a familiar experience for consumers, keeping API resources and capabilities aligned with a consumer's view of the world and experience at work.
-- **Reliable** - Every step of the onboarding and integration experience, as well as the ongoing support and service after integration should be as reliable, predictable, and dependable as possible, leaving the right impression on consumers in their journey.
-- **Documented** - Everything along the way for consumers is documented and explained, ensuring that everything is up-to-date and provides a complete picture for consumers, keeping things short, sweet, and to the point, helping consumers find what they need.
-- **Conversation** - Making the relationship between API producer and consumer a conversational one, making sure consumers feel confident in providing feedback, and trusting that they will be heard, helping provide the support consumers need regularly.
-- **Supported** - When challenges are faced by consumers, API producers are there to support, helping solve problems and empathizing with consumers regarding the pain they are experiencing, and working to recover and get things back to normal quickly.
-
-Treating your APIs as products is the best way to help improve the consumer experience, establishing the feedback loops required to understand what consumers are experiencing in near real-time, then working to iterate as fast as possible to respond to consumers, while keeping them in alignment with each release. Measuring everything along the way, relying on your relationship with consumers and the data you gather from across operations to iterate in just the right ways regarding the road map for APIs.
diff --git a/docs/more/api/blueprints/devices.md b/docs/more/api/blueprints/devices.md
deleted file mode 100644
index 1db6f36..0000000
--- a/docs/more/api/blueprints/devices.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# Devices in our Physical World
-Once it was discovered that simple low-cost web APIs would work for low-latency mobile applications, developers began using the same approaches to sending messages and data between common everyday objects. Connecting devices across our personal and professional lives together, and bringing the online world into the physical world, helping automate and optimize how we do business.
-
-## Internet of Things
-Cameras, sensors, drones, and other types of Internet-connected devices are being applied across a range of industries, introducing “smart” capabilities into existing real-world processes, and leveraging APIs to send and receive data from the cloud. Tapping into the same low-cost cloud infrastructure, and many of the same APIs, already being used by our iPhone and Android mobile applications, but now used to power everyday objects.
-
-- **Manufacturing** - In the manufacturing industry, the industrial Internet of things (IIoT) allows for improvements in efficiency, cost, and safety. Networked sensors can be attached to physical equipment and assets. These sensors gather data so that operators can perform condition monitoring and measure overall equipment effectiveness. With the information provided by the sensors, operators can monitor machine temperature, vibrations, and sound frequencies, and they can use this data to perform predictive maintenance on equipment that gives abnormal readings. Predictive maintenance gives operators the opportunity to detect and repair problems before the equipment fails. Operators can also monitor machine run time, operating speed, and product output, which allows them to measure and improve production efficiency. IIoT sensors can also be used for supply chain improvements, such as locating inventory and controlling environmental conditions during storage and transport. Workers can use wearable technology to monitor their heart rate, skin temperature, and location to prevent overexertion, falls, and other safety threats.
-- **Transportation** - The transportation industry uses Internet-connected devices to increase traveler safety, reduce traffic congestion, and improve performance and infrastructure. For example, to increase railroad passenger safety, cameras and sensors can monitor factors like road temperature, train speed, traffic, and mechanical parts to look for anomalies, and trains can use intelligent cruise control to adapt speeds to current conditions. Dynamic road signs can alert drivers to lane closures, upcoming toll charges, and accidents. Autonomous vehicles also contain networked sensors, cameras, and communication systems, which provide the driver with real-time data about everything else that's happening on the road.
These autonomous vehicles can also communicate with each other about road conditions, traffic, and closures, and they can use this data to provide dynamic navigation instructions. Similar to other IoT-connected machinery, networked sensors allow operators to perform predictive maintenance on vehicles, which reduces operating costs. IoT devices can monitor roadways to analyze traffic patterns, fuel usage, and accidents in order to reduce congestion, and this data can be used to make decisions about traffic signals and resource scaling. To improve mass transit performance, sensors can monitor and analyze physical infrastructure to identify areas that need improvement, which can reduce operating costs and increase rider capacity. -- **Energy** - The energy industry can use Internet-connected sensors, controllers, and meters to measure and analyze public utility consumption, and then use this data to make improvements to energy assets and processes. For example, billions of cubic meters of water are wasted each year due to leaks and mechanical failures. Using smart meters to track water pressure and flow lets utility workers find leaks and fix them immediately, and some of the fixes can be automated. Operators can also run analytics on electrical assets, such as cables, utility poles, and transformers, to track their past and present performance and then make predictions about future performance. Smart grids make it possible to track how much energy is being used in real time, which lets cities distribute power based on how much is actually needed. -- **Retail** - In the retail industry, Internet-connected devices can be used to manage inventory throughout the supply chain and inform store managers about customers' in-store experience. Drones and robots can monitor inventory both in storage warehouses as well as in-store shelves. GPS and RFID sensors can monitor assets, such as shopping carts, to reduce theft and lower costs. Smart shelving systems also use RFID tags. With smart shelving, an RFID tag sends data over antenna to an RFID reader, recording how many items are in stock and which ones people buy the most. Sensors can also generate heat maps within stores to identify where shoppers spend the most time, which helps store managers optimize the store's layout. Stores can also use beacons that communicate with shoppers' mobile phones. Beacons help stores identify patterns by gathering information about shoppers, such as the time of day they went to the store, the amount of time they spent in the checkout line, and which areas of the store they visited. Mobile phone apps also use geofencing to send alerts and personalized discounts to shoppers when they are near the store. -- **Cities** - Smart cities use Internet-connected sensors, lights, and meters to improve public transit, safety, waste management, and energy use. City buses can use GPS and GSM to share real-time location information, making it easier for passengers to plan their travel. Pavement sensors can tell city planners how busy the roads are, and then use this information to dynamically change the timing of traffic lights. Video surveillance cameras can use automatic license plate recognition to collect tolls and help law enforcement locate stolen vehicles. IoT sensors can also help waste management companies by tracking how full trucks are so that they can plan more efficient waste pickup routes through the city. When a city uses smart grids, people can use home energy storage units, such as batteries charged by solar panels. 
During peak energy consumption hours, people can use energy from the battery, which places less stress on the city's electrical grid. -- **Healthcare** - In the healthcare industry, Internet-connected devices offer improvements that benefit patients, healthcare providers, hospitals, and health insurance providers. IoT devices such as wearable fitness bands, blood pressure cuffs, heart rate monitors, glucometers, and inhalers make it easier for people to manage chronic health conditions like asthma or diabetes. Physicians can use the data that is gathered by patients' devices to monitor symptoms remotely and make proactive treatment plans. In an operating room, robots can assist with procedures that require extreme precision, and students can observe surgeries remotely using augmented reality. Hospitals can use sensors to track the location of medical equipment such as wheelchairs, nebulizers, or defibrillators within the facility. Health insurance companies can use data from wearable sensors to make underwriting decisions and detect fraudulent claims. -- **Supply Chain** - Internet-connected devices offer improvements to supply chain management. IoT sensors can monitor the temperature, humidity, and light exposure of raw materials and their storage containers to make sure they are being stored in the best conditions. Sensors can also track the location of goods, allowing them to be located at any time, whether in storage or transit. Location information can be used to detect delays in transit, giving suppliers the option to find alternate routes. Suppliers can also use tracking information to confirm when goods arrive at their destination, which they can then use to trigger payments and invoices. The information provided by IoT devices can help prevent bottlenecks for critical supply chains like pharmaceuticals, semiconductors, and essential minerals. -- **Agriculture** - In the agriculture industry, IoT devices can gather data that helps farmers work more efficiently by monitoring crops and livestock, automating labor, and reducing waste. Farmers can use sensors or drones to monitor crop fields, allowing them to gather data about conditions such as temperature, humidity, soil moisture, pests, and overall crop health. Sensors can also monitor livestock and track both their location and health. Farmers can use this data to make their farms more efficient, profitable, and sustainable. For example, when a sensor detects that soil moisture is low, it can trigger automatic irrigation. Drones can also perform tasks like planting crops and eliminating pests, which previously required human labor. To reduce waste, sensors can analyze metrics like the quantity of fertilizer being used, farm vehicle usage, and water and electricity consumption. Farmers can use this data to optimize their use of resources, preventing problems like overwatering. -- **Buildings** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod. - -APIs provide data to Internet-connected objects and gather data from those objects for use in desktop, web, and mobile applications. APIs are how every business sector is being reshaped as part of an ongoing digital transformation that every enterprise organization is facing today. Low-cost web infrastructure used to power more efficient and reliable APIs is how the Internet-connected landscape of the future will be defined. 
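As a rough sketch of the mechanics described in the paragraph above, the following Java example shows a device publishing a single reading to a low-cost web API using the standard `java.net.http` client. The `https://api.example.com/v1/readings` endpoint, the device identifier, and the payload shape are illustrative assumptions, not part of any specific platform.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SensorReporter {

    public static void main(String[] args) throws Exception {
        // Hypothetical ingestion endpoint; a real device would read this from configuration.
        String endpoint = "https://api.example.com/v1/readings";

        // A single temperature reading, serialized by hand to keep the sketch dependency-free.
        String reading = "{\"deviceId\":\"greenhouse-7\",\"metric\":\"temperature\",\"value\":21.4,\"unit\":\"celsius\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(reading))
                .build();

        // The same HTTP API that ingests device data can be queried by web and mobile applications.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("Ingestion API responded with status " + response.statusCode());
    }
}
```

The point is less the code than the pattern: the same plain HTTP request a mobile application makes is cheap enough to run on a sensor gateway, which is what allows one API to serve devices, dashboards, and downstream analytics alike.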
diff --git a/docs/more/api/blueprints/digital-capabilities.md b/docs/more/api/blueprints/digital-capabilities.md
deleted file mode 100644
index 7ac2dc5..0000000
--- a/docs/more/api/blueprints/digital-capabilities.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Digital Capabilities
-A capability is a simple representation of something that can be accomplished digitally, defining a process, workflow, model, or algorithm that is valuable to internal or external consumers. Often defined by one or many API requests or subscriptions that reflect a specific business capability, representing a single product offering, or bundling multiple capabilities together for a specific digital product experience.
-
-## Common Digital Capabilities
-APIs are rapidly evolving beyond traditional digital resources and data sources to transform business operations, providing the digital gears of a modern business factory floor and digital supply chain.
-
-- **Workflows** - A workflow consists of an orchestrated and repeatable pattern of activity, enabled by the systematic organization of resources into processes that transform materials, provide services, or process information using API defined resources and workflows.
-- **Actions** - Digital capabilities are designed for enabling action of some sort, allowing online or offline things to be triggered, setting additional workflows in motion, but also generating data along the way, producing intended outcomes across platforms or real world devices.
-- **Events** - Enterprise organizations are increasingly defined by meaningful events occurring across internal and external operations, delivering the intended outcomes business and technical leadership is looking for, allowing for the orchestration of the desired outcomes.
-- **Algorithms** - Artificial Intelligence and machine learning models are developed, iterated upon, and applied in increasingly modular ways using APIs, leveraging the data we have stored in databases and the exhaust from daily operations to iterate upon useful algorithms.
-
-Enterprise digital capabilities are defined by APIs, and business is set in motion through automation and orchestration across native, but also 3rd party digital capabilities each organization has access to.
diff --git a/docs/more/api/blueprints/digital-resources.md b/docs/more/api/blueprints/digital-resources.md
deleted file mode 100644
index 170e92f..0000000
--- a/docs/more/api/blueprints/digital-resources.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Digital Resources
-The modern enterprise is made up of hundreds or thousands of individual digital resources that are stored in databases and file systems, and put to work across a dizzying array of desktop, web, mobile, and other types of applications or integrations. You see the vast inventory of enterprise digital resources present in the browser URLs, and digital experiences we encounter in our personal and professional lives. Shaping our offline and online experiences each day as we engage at work, at home, and throughout the day on our mobile devices.
-
-## Common Digital Objects
-APIs are how you define and make digital resources available online for almost any use case.
-
-- **Users** - Every one of our user profiles is made accessible via APIs, so that we can log in, follow, and be followed, allowing thousands or millions of users to engage via a single platform, but more importantly via any platform or device that we choose to engage on.
-- **Messages** - We send messages back and forth via APIs, leveraging SMS and other common messaging formats, but also the proprietary platform messaging APIs that allow us to engage with our closest friends, family, and coworkers in real-time each day.
-- **Images** - Every one of those social media images, and photos on our mobile phone, is being published, stored, backed up, and shared using APIs, providing us access to visual representations of the world around us, but organized and made available digitally online.
-- **Videos** - Like images, videos are defined, published, and shared across the web and social media channels using APIs, allowing us to capture the world around us and then weave these moments into our digital presence, helping us blur the line between on and offline.
-- **Payments** - APIs provide us with the ability to send and receive money in small and large amounts to whoever we want around the world, providing us with the basic financial ingredient that makes our world go around, funding the lives we lead on and offline.
-- **Documents** - The web is made up of digital documents, HTML, PDF, and other digital files that allow us to do business around the globe, leveraging APIs to publish and share them, but also sign and send them between two parties, making our business more interactive.
-
-These are just some of the most common digital resources we all encounter each day, but there are millions of other digital resources available across the digital landscape we engage with.
diff --git a/docs/more/api/blueprints/discoverable-landscape.md b/docs/more/api/blueprints/discoverable-landscape.md
deleted file mode 100644
index d218cc2..0000000
--- a/docs/more/api/blueprints/discoverable-landscape.md
+++ /dev/null
@@ -1,23 +0,0 @@
-# Discoverable Landscape
-An API-first landscape is discoverable by default. API operations are known because the API, artifacts, and the operations around them are discoverable via known workspaces and repositories that are kept in sync. Providing an ongoing snapshot of the state of API operations that is always indexed and searchable via private, partner, and public networks and workspaces.
-
-## State
-
-- Latest
-- Activities
-- Search
-- Browse
-- Notifications
-- Suggested
-- Feedback
-
-## Elements
-
-- Domains
-- Teams
-- APIs
-- Documentation
-- Mock Servers
-- Tests
-- Environments
-- Monitors
diff --git a/docs/more/api/blueprints/documentation-collection-checklist.md b/docs/more/api/blueprints/documentation-collection-checklist.md
deleted file mode 100644
index 2e4d3e6..0000000
--- a/docs/more/api/blueprints/documentation-collection-checklist.md
+++ /dev/null
@@ -1,22 +0,0 @@
-# Documentation Collection Checklist
-When properly defined, Postman collections provide a rich way to define shareable, portable, and embeddable documentation for APIs, moving API documentation out of the portal and making it available anywhere, designed to keep up with the pace of change across enterprise operations.
-
-- Metadata
-- Descriptions
-- Folders
-- Getting Started
-- Authorization
-- Headers
-- Bodies
-- Variables
-- Examples
-- Errors
-- Scripts
-- Visualizations
-- Code Snippets
-- Workspace
-- Links
-- Run in Postman Button
-- Watches
-- Forks
-- Comments
\ No newline at end of file
diff --git a/docs/more/api/blueprints/education.md b/docs/more/api/blueprints/education.md
deleted file mode 100644
index 2fa6c9a..0000000
--- a/docs/more/api/blueprints/education.md
+++ /dev/null
@@ -1 +0,0 @@
-# Education
diff --git a/docs/more/api/blueprints/enablement.md b/docs/more/api/blueprints/enablement.md
deleted file mode 100644
index c794068..0000000
--- a/docs/more/api/blueprints/enablement.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Enablement
-An API-first enterprise understands that operations is defined by the enablement that exists for teams to do the right thing on the ground floor. Establishing a clear list of what tools are needed by developers when it comes to producing and consuming APIs, then making sure they have access to them as part of their regular work. Helping augment the variety of roles involved in bringing APIs to life, and then also putting them to work in applications and integrations that are powering business today.
-
-- **Platform** - Set up your API platform, beginning with your source control, CI/CD, gateways, and APM, but then evolving towards other stops along the API lifecycle, taking full advantage of the APIs behind your infrastructure to support your developers.
-- **Integrations** - Lean on industrial-grade integrations to bring together the services needed to make producing and consuming APIs as seamless and native as possible for all stakeholders, realizing that APIs aren't just for applications, but are also needed behind API operations.
-- **Tooling** - Empower developers with the tools they need, providing bedrock tooling like source control, CI/CD, and IDE, but then encourage and foster exploration of new commercial and open-source tooling, providing tracks to formalize team adoption.
-- **Collaboration** - Modern API development teams aren't locked away anymore, and producing and consuming APIs is a team effort across business and technical groups, providing a huge opportunity to enable team productivity through more collaboration.
-- **Automation** - Help teams automate at every turn, documenting and then automating common processes that do not need human execution and intervention, equipping teams to schedule and bake workflows into the CI/CD pipeline as part of their work.
-- **Observability** - Make observability available to teams, allowing them to understand the health of their APIs, and how their APIs compare to what other teams are building, helping teams learn from each other and pick up new skills.
-
-Remember, the success of your governance effort will be the direct reflection of the enablement provided to teams. Nobody cares about governance on the ground floor, and teams just need to be supported in doing the right thing, which is much easier to provide with an API-first platform approach. When teams are fully equipped to do the right thing in the moment, with the right training and tools, this is where you begin to see productivity, quality, and governance shift into a higher gear.
diff --git a/docs/more/api/blueprints/entry-points-proxy-first.md b/docs/more/api/blueprints/entry-points-proxy-first.md
deleted file mode 100644
index cfff6d6..0000000
--- a/docs/more/api/blueprints/entry-points-proxy-first.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Proxy-First
-The process of reverse engineering an existing API by proxying the requests and responses sent, or the published and subscribed messages delivered, generating a collection from actual behavior that occurs within a desktop, web, mobile, or device application. Producing a collection that describes the surface area of an API, then mocking and documenting this behavior, before you produce a machine readable contract for the API, and provide tests that validate behavior moving forward.
-
-- **Workspace** - Whether this is a new API, or work on an existing API, a workspace is always the place to begin work, ensuring there is a single location to find all of the work happening.
-- **Proxy** - Running the traffic for a desktop, mobile, or device application through a proxy to reverse engineer the traffic, mapping out the surface area of the APIs behind them.
-- **Collection** - We will be hand-crafting a collection to describe the surface area of the API, defining the requests, and example responses, making our API as real as possible.
-- **Mocks** - The contract for an API is perpetually used to generate mock servers helping make the design of the API as realistic as possible, matching specific use cases with examples.
-- **Document** - Generating human readable documentation from an API's contract, ensuring there is accurate and up-to-date documentation for each API as it is being designed.
-- **Test** - Once the contract for an API has been established, and there will be no more iterations to this version, then contract tests can be produced to validate in production.
-- **Contract** - Once we've effectively prototyped our API, and iterated upon the design using the prototype, we can then choose to generate a contract from the collection prototype.
-
diff --git a/docs/more/api/blueprints/federation.md b/docs/more/api/blueprints/federation.md
deleted file mode 100644
index 2495d69..0000000
--- a/docs/more/api/blueprints/federation.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Federation
-Successful API transformations often take advantage of federated approaches to governance and the resulting operation and evolution across the enterprise. Acknowledging that the enterprise will always be defined by many smaller groups, domains, or states of business operation, which will need a certain degree of autonomy, but also can leverage centralized structure, standards, and resources.
-
-- **Principles** - Strengthen the understanding and foundation that API operations is perpetually realized and executed in a federated way across different domains and teams responding to the changing needs of the enterprise and the markets it serves.
-- **Tooling** - Perpetually defining the tools that are being used within different domains and teams to help get the job done on the ground floor, allowing for exploration of new tools, but then a formal process for weaving them into the overall API platform and stabilizing use.
-- **Lifecycle** - Mapping out the lifecycle employed across teams, identifying the common areas of the lifecycle, but also the unique and specific variations that either should remain specialized, or be shared across other teams or centralized as part of governance.
-- **Bounded Context** - Defining the existing organic lines that exist between teams and groups, then getting to work understanding ways in which these can be reshaped and evolved over time to better optimize the vocabulary, standards, and lifecycle in use within teams.
-- **Policies** - Crafting policies centrally, then disseminating and applying them across federated teams, while also including teams in the process of versioning and evolving them, establishing a feedback loop between centralized and federated policies.
-- **Automation** - Equipping teams with the artifacts and tooling needed to automate governance, so that it can be centrally defined and consistent, but then enable teams to do the right thing when it comes to governance by automating the redundant aspects.
-- **Observability** - Providing the observability necessary to “see” what is happening within a domain, group, or team, but then also across them, helping ensure that APIs and the operations around them are as observable as possible, no matter who is behind it.
-- **Champions** - For federation to be a success, it requires champions embedded within teams, groups, and domains who are willing to do the hard work of sharing knowledge, policies, and practices centrally, but also across enterprise teams.
-
-Like centralization, there are upsides and downsides to federated governance, and there will need to be a constant evaluation and recalibration of what is done in a federated or centralized way across teams.
diff --git a/docs/more/api/blueprints/forward-motion.md b/docs/more/api/blueprints/forward-motion.md
deleted file mode 100644
index 4e9e29c..0000000
--- a/docs/more/api/blueprints/forward-motion.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Forward Motion
-There are only minor differences in efficiency and velocity between taking a design-first, code-first, proxy-first, or prototype-first approach to deliver a new API or iterating upon an existing API. The major gain in efficiency and velocity comes when all APIs possess contracts and up-to-date documentation, and are fully tested and able to be governed, bringing all APIs into a common, well-known API lifecycle, no matter how teams decide to actually begin their journey.
-
-- **Lifecycle** - A common, agreed upon API lifecycle is emerging across teams, and the various ways for entering it are well understood and accepted across API development.
-- **Contracts** - There are up to date and accurate machine readable contracts available for all APIs, no matter how the API lifecycle is entered, ensuring business value can be validated.
-- **Discoverable** - Every API is discoverable, including the metadata for an API, the contracts defining what is possible, as well as the operations surrounding the API in production.
-- **Productivity** - Teams can move forward at the desired velocity for the team, but also the company, while not compromising quality, keeping teams as productive as possible.
-- **Quality** - There is contract and performance test coverage across as close to 100% of the surface area of APIs as possible, ensuring a baseline of quality moving forward (a sketch of such a contract test follows this list).
-- **Observable** - Every API is observable, as well as the API lifecycle around them, providing awareness and control over all aspects of the API lifecycle, and how APIs are being used.
-- **Governance** - APIs are discoverable, reliable, consistent, and delivered in a standardized way, no matter which team is developing them–governing the forward motion of the enterprise.
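To make the Quality bullet above more concrete, here is a minimal sketch of one such contract test in Java. It assumes REST Assured with its JSON Schema validator module, JUnit 5, a hypothetical `https://api.example.com` host, and a `user-list-schema.json` file derived from the API's contract.

```java
import io.restassured.http.ContentType;
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;
import static io.restassured.module.jsv.JsonSchemaValidator.matchesJsonSchemaInClasspath;
import static org.hamcrest.Matchers.lessThan;

class UserContractTest {

    @Test
    void listUsersMatchesContract() {
        given()
            .baseUri("https://api.example.com") // hypothetical host, for illustration only
            .accept("application/json")
        .when()
            .get("/users")
        .then()
            .statusCode(200)                                              // status agreed upon in the contract
            .contentType(ContentType.JSON)                                // media type agreed upon in the contract
            .body(matchesJsonSchemaInClasspath("user-list-schema.json"))  // schema generated from the contract
            .time(lessThan(2000L));                                       // simple performance guardrail in ms
    }
}
```

The same test can be run locally by a developer, wired into a CI/CD pipeline, or scheduled as a monitor, which is what turns contract coverage from an aspiration into something observable.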
-
-The goal of the API lifecycle isn't to restrict teams to one way of delivering APIs, but to help establish a common vocabulary for how we talk about the API lifecycle, helping us get on the same page regarding what is important across the API lifecycle, and contributing to the forward motion of our API operations.
-
diff --git a/docs/more/api/blueprints/future-proof.md b/docs/more/api/blueprints/future-proof.md
deleted file mode 100644
index 943bafa..0000000
--- a/docs/more/api/blueprints/future-proof.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Future-Proof
-Minimizing risk and making the future as predictable as possible is the desire of any enterprise organization, and the best way to protect any company against the future is to be able to respond to change and the unknown unknowns in an API-first way. Reducing the legacy baggage you possess, ensuring your teams are always iterating and innovating, while embracing and defining the shape and velocity at which the future arrives. Using APIs to more rapidly iterate and map out change as it is happening, allowing teams to respond to whatever comes your way, and come out of challenges ahead of the competition.
-
-- **Innovation** - As an enterprise you are able to make innovation part of your operational DNA, with teams actively flexing their innovation muscle, experimenting, and engaging with consumers to find the business value that shapes what is next. Modernizing legacy infrastructure, delivering the services the business needs today, while also innovating out ahead to understand what is needed for tomorrow.
-- **Agility** - Teams are able to respond quickly to market shifts, take what has been learned from regular day-to-day operations, but also via labs, experimentation, and other innovation, and respond with new products that meet a changing world. Taking the well-oiled enterprise API operations and responding to whatever comes next in small, quick, and well-informed iterations to the digital resources and capabilities needed.
-- **Velocity** - Teams know what they are capable of, and are able to increase or decrease velocity as required by the business and consumers, providing just the right velocity needed to respond to shifts in the market, and move ahead of the competition. Velocity like this will take a lot of training and experience, but with the right investment across the API lifecycle, teams will have what they need to respond to whatever comes their way.
-- **Change** - Change is embraced in an API-first environment. It is expected, welcomed, and seen as the way. Teams are well-versed in defining, designing, and delivering API-driven change, and leveraging the feedback loop from these iterations to inform what is next. A Jobs to be Done attitude equips everyone with a focus on the value present in change, and their training and confidence has prepared them for the work required.
-- **Control** - Leadership has the control needed to not just understand the state of operations, but also the state of operations in relationship to external consumers. Providing the observability and control to make adjustments, and the agile and responsive teams to move in the right direction. Giving enterprise organizations what they need to not just move and evolve forward, but in the direction they need to lead the way.
-- **Adaptive** - Enterprise organizations are able to adapt to whatever is thrown at them, adapting and evolving to changing markets, and completely reinventing specific domains or the enterprise if required.
Perpetually finding the optimal state of doing business by employing an API-first approach, delivering the products markets are demanding today, while also being prepared and in-tune with what is needed to lead the way tomorrow.
-
-There is no way to predict the future, but APIs provide a proven way to respond to any possible future that may arrive tomorrow. APIs don't necessarily provide exactly what we need to do business tomorrow, but being in an API-first state means that we will be able to quickly respond, evolve, and step up to confidently respond to what the future holds.
diff --git a/docs/more/api/blueprints/gateways.md b/docs/more/api/blueprints/gateways.md
deleted file mode 100644
index 9b5d601..0000000
--- a/docs/more/api/blueprints/gateways.md
+++ /dev/null
@@ -1,24 +0,0 @@
-# Gateways
-The first wave of API Gateways, historically, was offered as a component within larger API Management platforms. As the industry evolved, a new wave of API Gateways was introduced to provide solutions to a changing landscape. Some of these gateway providers are now supplementing their standalone offering with additional tools, often centered around a management approach more conducive to internal API lifecycles.
-
-## Capabilities
-Depending on the gateway solution chosen, there is a mix of common capabilities offered at this essential, but also ubiquitous part of doing APIs today.
-
-- **Authentication** - Ensuring that consumers are authenticated before they access an API resource or capability, helping provide security across all APIs being used behind applications.
-- **Authorization** - Ensuring that once a consumer gains access to an API, they only have access to resources and capabilities that they are entitled to have access to.
-- **Plans** - Organizing APIs and their consumers into standardized, but also sometimes customized access plans that govern which APIs they can use and how much of a resource they can have.
-- **Routing** - A common capability for gateways is to route traffic to a specific backend service, or possibly an external service, playing traffic cop when it comes to all the API requests made.
-- **Policies** - Applying machine readable policies that define the configuration and constraints applied to APIs and their consumers as they access digital resources via each gateway.
-- **Rate Limiting** - Placing limits on the number of requests consumers can make in a second, minute, hour, week, or month, ensuring that the availability of services isn't abused.
-- **Transformation** - It is common for API gateways to transform requests being made, changing or adding headers, parameters, or the body of each request to support back-end needs.
-- **Contracts** - Gateways are increasingly using OpenAPI, AsyncAPI, JSON Schema, and other contracts to shape the deployment and management of APIs, leveraging the contract to operate.
-- **Versioning** - Helping API producers manage change across their APIs is increasingly supported at the gateway level, helping define access across many different versions of an API over time.
-
-## Characteristics
-The age of a single gateway from a single vendor controlling access to internal resources and capabilities is over, and today's multi-gateway landscape possesses a range of characteristics.
-
-- **Centralized** - In many enterprises there can be a single centralized gateway, oftentimes handling all of the traffic coming from outside the enterprise via a single entry point.
-- **Federated** - It is increasingly common for enterprise operations to support a federated gateway approach to making APIs available across domains, acquisitions, and potentially regions.
-- **Regional** - The deployment of regionally specific gateways has emerged to respond to the increased regulation and data sovereignty requirements that have emerged within specific countries.
-- **Vendors** - It is common for enterprise organizations to have API gateways from multiple vendors, providing a mix of gateway solutions for teams to use when securing APIs.
-- **Open** - The usage of openly licensed gateways, as well as openly licensed contracts, policies, and other artifacts is an increasingly common hallmark of the gateway in operation.
\ No newline at end of file
diff --git a/docs/more/api/blueprints/governance-design.md b/docs/more/api/blueprints/governance-design.md
deleted file mode 100644
index edc15d4..0000000
--- a/docs/more/api/blueprints/governance-design.md
+++ /dev/null
@@ -1 +0,0 @@
-# Design Governance
diff --git a/docs/more/api/blueprints/governance.md b/docs/more/api/blueprints/governance.md
deleted file mode 100644
index eb5f46c..0000000
--- a/docs/more/api/blueprints/governance.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# Governance
-A blueprint for approaching the governance of APIs from the top-down, establishing a higher-level strategy for defining what governance is and then helping spread guidance across teams that enables them to deliver more consistent APIs across a more consistent API lifecycle, no matter what type of API they are delivering.
-
-- **Organization** - Bolting governance onto the existing organizational apparatus in place for the enterprise, ensuring that API operations is always in alignment with business.
-- **Guidelines** - Formal documentation, wiki, or other document that defines what governance is, and how teams are enabled to do the right thing as part of their work.
-- **Maturity** - [Need Content]
-- **Standards** - Having a strong and ever-evolving awareness of what standards exist inside and outside the enterprise, and a strategy for how standards will be applied.
-- **Templates** - Providing as many ready-to-go and reusable templates as possible that help demonstrate and apply patterns, standards, and other elements to APIs and operations.
-- **Rules** - Establishing sets of linting rules that can be applied at design time to guide the creation of standardized APIs, but also applied across the entire API lifecycle (a sketch of one such rule follows this list).
-- **Policies** - Define standard source control, CI/CD, gateway, and other policies to help govern API operations, standardizing the configuration and velocity of API production.
-- **Centralization** - Considering which parts of governance should be centralized, developing a single body within the enterprise who can help guide governance efforts.
-- **Federation** - Considering which parts of governance should be federated, relying on teams to help define, shape, and lead when it comes to enablement across operations.
-- **Design Reviews** - Formal reviews that look at the design of APIs, providing self-service, but also peer reviews that help API producers consider the big picture when designing.
-- **Quality Reviews** - Formal reviews that look at the documentation and testing for each API, helping ensure that all APIs are fully documented and are properly tested.
-- **Security Reviews** - Formal reviews that look at the security of each API, ensuring there is encryption, authentication, and authorization in place, and that APIs are free from common vulnerabilities.
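As a rough illustration of the Rules bullet above, the following Java sketch applies two simple design-time rules to an OpenAPI contract. It assumes the swagger-parser library and a hypothetical `openapi.yaml` file, and the rules themselves (every operation has an operationId and a description) are examples rather than a recommended rule set.

```java
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.parser.OpenAPIV3Parser;

import java.util.ArrayList;
import java.util.List;

public class DesignRuleCheck {

    public static void main(String[] args) {
        // Hypothetical contract file; in practice this would come from the API's repository.
        OpenAPI contract = new OpenAPIV3Parser().read("openapi.yaml");

        List<String> violations = new ArrayList<>();
        contract.getPaths().forEach((path, item) ->
            item.readOperationsMap().forEach((method, operation) -> {
                // Rule 1: every operation needs an operationId so it can be referenced consistently.
                if (operation.getOperationId() == null || operation.getOperationId().isBlank()) {
                    violations.add(method + " " + path + " is missing an operationId");
                }
                // Rule 2: every operation needs a description so documentation can be generated.
                if (operation.getDescription() == null || operation.getDescription().isBlank()) {
                    violations.add(method + " " + path + " is missing a description");
                }
            }));

        violations.forEach(System.out::println);
        if (!violations.isEmpty()) {
            // A non-zero exit code lets the same check gate a CI/CD pipeline.
            System.exit(1);
        }
    }
}
```

Rule sets like this are usually maintained centrally and run automatically, so teams get feedback at design time rather than during a formal review.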
-
-It is important to remember that capital “G” governance is only a concept that lives higher up within the enterprise, and on the ground floor it is more lowercase “g” governance, or simply enablement, helping API teams do the right thing without having to think twice or work hard–governance is about the standards, process, and tooling leadership provides to teams.
diff --git a/docs/more/api/blueprints/graphql.md b/docs/more/api/blueprints/graphql.md
deleted file mode 100644
index 3a3f388..0000000
--- a/docs/more/api/blueprints/graphql.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# GraphQL
-GraphQL provides the ability to apply queries, mutations, and subscriptions defined within a query language to synchronous and asynchronous APIs. Adding another type of contract to our API toolbox, providing a very flexible query language for front-end developers to use when getting the data they need.
-
-- **Operations** - The document which contains the types of operations: query, mutation, and subscription, along with any fragments, describing what is possible with the API.
-- **Document** - GraphQL has two types of documents: execution documents and schema documents, shaping what types of queries are possible, and how consumers can use them.
-- **Selection Sets** - Within an execution document, a set of fields that make up the content contained within curly braces, describing the desired fields associated with a query.
-- **Fields** - An object-specific attribute that can be requested and returns a value, providing the atomic unit of any combination of data that is made available via API.
-- **Fragments** - There are three types of fragments: named fragments allow us to reuse fields, type conditions allow us to conditionally select fields, and inline fragments, which don't have a name and are defined inside the selection set.
-- **Directives** - There are four directives defined in the spec: @skip, @include, @specifiedBy and @deprecated, changing how a section of the document is executed.
-- **Mutations** - Different types of operations that can be made, enabling different state changes when it comes to data, providing the ability to create, update, delete, and otherwise change state.
-- **Subscriptions** - These are long-lived requests that allow the server to send the client events as they happen, allowing consumers to be subscribed to the data they need.
-
-GraphQL is an optimal solution when you have large volumes of data with a wide surface area, and you have a known set of developers who are familiar with GraphQL, the schema, and are building a variety of applications that have specialized needs around specific domains of data.
diff --git a/docs/more/api/blueprints/guidelines.md b/docs/more/api/blueprints/guidelines.md
deleted file mode 100644
index aef7807..0000000
--- a/docs/more/api/blueprints/guidelines.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Guidelines
-Guidelines dedicated to informing all stakeholders in the API lifecycle about the overall organization API strategy, how teams are pushing APIs across the API lifecycle, and how not just the design, but every other stop along the lifecycle is governed, are essential to getting all teams on the same page.
- -- Strategy -- Protocols -- Patterns -- Standards -- Templates -- Naming -- Errors -- Security -- Lifecycle -- Enablement diff --git a/docs/more/api/blueprints/history-of-apis.md b/docs/more/api/blueprints/history-of-apis.md deleted file mode 100644 index 01973fe..0000000 --- a/docs/more/api/blueprints/history-of-apis.md +++ /dev/null @@ -1,25 +0,0 @@ -# History of APIs -APIs are not new, and are something that emerged in the earliest days of digital computing back in the 1950s, but are something that has evolved to meet the needs of a variety of business sectors, and have seen a massive standardization and growth as a result of the web and emergence of mobile networks. - -## Planting Early Seeds -​​The compute and network origins of APIs begins in the early days of computer technology, realizing that this new technology has the greatest impact when used in a collaborative way. - -- **SAGE**: In 1954, the Semi-Automatic Ground Environment (SAGE) began its six-year development to be used as an early warning air defense system. SAGE was developed by Massachusetts Institute of Technology (MIT) with some funding from the United States Air Force. A SAGE center could track up to 400 airplanes, using flight plans to differentiate between friendly and enemy aircraft. There were 23 SAGE centers, and each required at least 100 operators. The centers were able to communicate over telephone lines using modems, which started to become commercially available as part of the SAGE project. The first SAGE center came online in 1959. -- **SABRE**: In 1964, IBM built upon SAGE to develop Semi-Automatic Business Research Environment (SABRE), an air travel reservation system for American Airlines. This system connected 2000 air terminals across 60 cities by telephone lines. SABRE is one of the earliest examples of a real-time operating system (RTOS). An RTOS is an application designed to process data in real time at a consistent pace. -- **ARPANET**: The early stages of the Advanced Research Projects Agency Network (ARPANET) began in 1966, developed by the United States Defense Advanced Research Projects Agency (DARPA). Larry Roberts and Thomas Marill created the first wide-area network connection, linking the TX-2 computer at MIT to the AN/FSQ-32 computer in Santa Monica over a telephone line. The slow speeds, cost, and inefficiency of sending data across telephone lines led Larry Roberts to incorporate packet switching, a concept introduced by Donald Davies at the 1967 Symposium on Operating Systems Principles, into ARPANET. ARPANET was the first large-scale network to use packet switching, which groups data into packets and sends them across a digital network. A packet includes a header, which the networking hardware uses to direct the packet, and a payload, which contains the packet's main message. - -## Laying the Foundation -By the 1970s the earliest networks were being established, resulting in the need to share files, communicate via messages, and better explore how early APIs can be used for business. - -- **FTP**: In 1971, an MIT student named Abhay Bhushan published RFC 114 with the original specification for the File Transfer Protocol (FTP). FTP built upon earlier protocols such as Telnet. While Telnet allowed operators to transfer documents between machines, it did not account for differences in architecture and operating systems, so it was difficult to use. FTP introduced a standardized way to send files and messages between computers, acting as an early form of email. 
In 1980, a revision to FTP that allowed operators to send and receive files using a more secure TCP/IP connection was published in RFC 765. FTP was revised again in 1985 in RFC 959, adding support for several new commands. This version of FTP is still in use today, but in 2021, several web browsers removed support for FTP in favor of more secure file transfer standards. -- **EDI**: In the 1960s, electronic data interchange (EDI) was introduced. While managing shipment supply chains in the United States Army, Ed Guilbert developed the EDI process by digitizing shipping manifests. Using EDI, a business can exchange structured data in batches with external partners. These batches of bundled information conform to industry-specific messaging standards, which makes it possible to efficiently exchange large amounts of information, including purchase orders, invoices, reservations, or shipment statuses. The first EDI messages were sent in 1965, and in 1968 the Transportation Data Coordinating Committee (TDCC) was formed to develop EDI standards. With the help of Ed Guilbert, the TDCC published the first set of EDI standards in 1975. EDI continues to be used in industries such as automotive, manufacturing, logistics, and utilities. -- **Email**: In 1961, MIT introduced the Compatible Time-Sharing System (CTSS), which was the first system that allowed multiple users to remotely access a centralized server and share files. Operators could send messages to one another by sending text files over FTP, and by 1965 they implemented a `MAIL` command that saved the messages to a user's mailbox. Other groups of operators started to develop similar messaging systems around the same time, including ARPANET. In 1971, Ray Tomlinson introduced the idea of using the `@` symbol to direct a message to a user on a specific ARPANET system. In 1982, the Simple Mail Transfer Protocol (SMTP), based on Ray Tomlinson's work, was created to standardize the way servers send and receive electronic mail. SMTP replaced FTP as the protocol for sending and receiving mail. In 1988, Microsoft Mail was released for Mac OS, allowing users to send messages on the first commercial email application. In 1992, Multipurpose Internet Mail Extension (MIME) was introduced, which allowed users to send images, audio files, and videos as email attachments. In the mid-1990s, Internet service providers (ISPs) like America Online (AOL) began bundling webmail into their service, and several free webmail services such as Hotmail and Yahoo! Mail appeared during this time as well. Gmail was launched in 2001 as an internal tool for Google employees, and it was made available as an invite-only beta service in 2004 until its public release in 2007. - -## Conducting Business -The business value of early APIs was clear, there was just much more work to be done when it comes to standardizing how we would be able to communicate via the new networks we had developed. - -- **CORBA**: In the early 1990s, developers found it difficult to manage communication between applications with different operating systems, hardware, and programming languages. In 1991, the Object Management Group (OMG) introduced the Common Object Request Broker Architecture (CORBA), a middleware specification that allowed applications programmed in C to communicate with each other using object request brokers (ORBs) and application programming interfaces (APIs). 
An ORB exists between the server and the client, where it listens for requests, locates objects, and returns results for systems regardless of hardware or location. CORBA 2.0 was released in 1996, adding support for C++ and defining a standard protocol that improved interoperability between applications. Java language mapping was added two years later. As the Internet grew larger and more complex in the late 1990s, many developers found that it was difficult to build large-scale applications with CORBA, and its performance was unable to keep up with higher-speed networks.
-- **XML**: In the 1970s, the Standard Generalized Markup Language (SGML) was developed at IBM as a standard for structured markup languages. It was designed to efficiently handle text files that were thousands of pages long, and included various markup tags that specified the document type, sections, and formatting.
SGML was very powerful, but difficult to use. When Dan Connolly joined the World Wide Web Consortium (W3C) in 1995, he added SGML to the list of activities that W3C should work on. In 1996, an 11-person working group began meeting remotely to develop a lighter version of SGML, which became the Extensible Markup Language (XML) in 1998. XML is designed to be both human-readable and machine-readable, using many SGML features in a more streamlined way. XML was quickly adopted by web developers, who were able to create websites that were more interactive than those programmed in HTML. XML also allows developers to define information about the document, which can make it easier for search engines to return useful results.
-- **SOA**: In the 1990s and earlier, client-server was the dominant architecture for applications. Developers had to configure their applications to communicate directly with any other systems. Around 1998, the idea of service-oriented architecture (SOA) started to emerge. This model separated the concept of services from other applications. A service represents a self-contained, repeatable task that is typically a specific business function. For example, processing a loan application, pulling a consumer's credit score, or checking the weather are tasks that might be handled by a service. Services can communicate with applications, servers, and other services over a messaging system called the enterprise service bus (ESB).
-
-As the web emerged, many industry forces came together to formalize XML specifications in service of a SOA vision for industry, but the simpler, more low-cost, and increasingly ubiquitous web would provide a much more powerful approach to delivering the digital resources and capabilities we would need.
diff --git a/docs/more/api/blueprints/history-of-web-apis.md b/docs/more/api/blueprints/history-of-web-apis.md
deleted file mode 100644
index efe2e49..0000000
--- a/docs/more/api/blueprints/history-of-web-apis.md
+++ /dev/null
@@ -1,46 +0,0 @@
-# History of Web APIs
-While taking shape for several decades, modern web APIs began to take their current form during the e-commerce and social networking evolution of the early 21st century. Pushing what was possible when it comes to buying and selling in a digital world, but then also realizing that consumers are very social creatures, and we'd want to bring our friends, family, and followers along for the ride.
-
-## Commerce
-
-In the early days of the Internet, many companies used the service-oriented architecture (SOA) model to deploy their systems.
-This model includes components and services that are self-contained and reusable across a deployment.
-
-Early web APIs escaped from the controlled SOA experiment that was occurring within the enterprise and began to be applied to sales, products, affiliates, auctions, and the other expanding areas of the e-commerce shift that was occurring online.
-
-- **Salesforce**: In the late 1990s, Salesforce developed a web-based sales automation tool, which was formatted in Extensible Markup Language (XML) and used a remote procedure call (RPC) to communicate over HTTP. This tool is considered to be the first software as a service (SaaS).
-- Amazon -- eBay - -## Social -The digital experience was rapidly becoming a social affair, with images, links, and other digital resources become more sharable via APIs, but also by expanding the use of APIs to define our profiles, connections, and networks where we are sharing these images and links. - -- Flickr -- Facebook -- Twitter - -## Web 1.0 - -The first implementation of the World Wide Web, between the years 1989 and 2005, is known as Web 1.0. -During this era, the web was made up of static HTML and XML sites. - -- **XML-RPC**: Early web APIs, including the tool that Salesforce built, communicated by sending XML-formatted documents over HTTP using an RPC, known as the XML-RPC protocol. The XML-RPC protocol evolved into the Simple Object Access Protocol (SOAP). -- **SOAP**: Like the earlier XML-RPC protocol, SOAP sends XML data over HTTP, and it adds standards and specifications for message formats, encoding rules for API requests and responses, as well as structured data in XML. SOAP supports sending either `GET` or `POST` requests, but it does not support other HTTP methods like `PUT`, `DELETE`, `HEAD`, or `OPTIONS`. In Web 1.0, SOAP requests were primarily used for machine-to-machine interaction, but occasionally used for web-server and web-client interactions as well. SOAP was the foundation of the broader concept of web services. - -## Web 2.0 - -The Web 2.0 era began around 2005. - -- **REST**: In 2000, Roy T. Fielding introduced the Representational State Transfer (REST) framework. This architectural style rejected the RPC-style approach and proposed guidelines for defining API interfaces. REST introduced new ways for web application APIs to exchange information, making use of resource identifiers, standard media types, and standard HTTP status codes. Unlike SOAP, the REST style supports the use of all HTTP methods. The concept of open, publicly-hosted APIs is based on the REST framework, and many current web APIs are considered RESTful. -- **AJAX**: In 2005, the Asynchronous JavaScript and XML (AJAX) technique was introduced. Using the lightweight AJAX approach, developers created websites that were more complex and dynamic. AJAX allows a web client to request content from a server asynchronously. Over time, developers using AJAX started moving away from XML to JavaScript Object Notation (JSON). -- **jQuery**: jQuery, along with other JavaScript libraries, grew in popularity as developers adopted the AJAX technique. Developers using AJAX and jQuery began to use web application APIs that communicated between the server and the browser, rather than between the server and client. - -## Web 3.0 - -Web 3.0 has been talked about since 2014, but it is still under development. -This era of the web involves a broader range of internet-connected machines, distributed architecture, and a faster and more interactive experience. The technologies that make these things possible include IoT, AI, ML, and Blockchain. - -- **IoT**: The Internet of things (IoT) includes physical devices and sensors that communicate with other devices over the Internet. For example, home automation devices such as smart thermostats, remote monitoring systems, and app-controlled lights are IoT devices. -- **AI**: Artificial intelligence (AI) applications are programmed to perceive their environment and take actions based on those perceptions. AI applications are designed to mimic human intelligence and simulate the way humans learn and solve problems. 
For example, video streaming sites can use AI to observe what content the viewer is watching and make recommendations for similar content that they might want to watch next. -- **ML**: Machine learning (ML) applications are designed to collect and learn from a set of data. While other types of AI applications must be explicitly programmed to make certain decisions, ML applications gain the ability over time to make predictions based on the data they've analyzed, and they can answer questions that are not specifically defined in their programming. ML applications are often used to classify information based on large amounts of historical data, such as speech or handwriting recognition, credit card fraud detection, and natural language processing. -- **Blockchain**: A blockchain is a linked list of transaction records, where each record block contains a cryptographic hash of the previous block. The blocks are linked together, forming a chain. If a block is altered, the rest of the chain is altered as well. Blockchain databases are often public and hosted on a peer-to-peer network, which makes their architecture highly distributed. diff --git a/docs/more/api/blueprints/how-do-you-do-api-first.md b/docs/more/api/blueprints/how-do-you-do-api-first.md deleted file mode 100644 index e364729..0000000 --- a/docs/more/api/blueprints/how-do-you-do-api-first.md +++ /dev/null @@ -1,18 +0,0 @@ -# How do you do API-first? -Becoming API-first begins with planning and taking small incremental steps to define, standardize, and perpetually optimize how you deliver and operate your applications and integrations using APIs. It’s critical to have a strategy for how you quantify APIs, but also move them forward consistently and reliably across the API lifecycle, redefining your teams and operations with a commitment to evolving your enterprise operations in an API-first world. - -- **Always work on your strategy** - To be API-first you need to have a plan. You need to draft an overview of what you are looking to achieve with your API operations and begin recording more of the details regarding how you are currently doing this and some of the things you’d like to see improve. Document what your current API strategy looks like, but then begin to develop a road map for your formal API strategy, defining what some of your next steps will be. This allows you to get more organized in how you approach doing APIs and do it in a way that can be easily communicated to others. -- **Make sure all APIs are discoverable** - To be API-first you should have a complete inventory of artifacts used across APIs being designed, developed, and operated in production while working to create artifacts for APIs where they don’t exist. There are four types of artifacts being used to describe the surface area of APIs, which when published to API workspaces and repositories can contribute to the discovery of APIs and microservices across operations. -- **Prioritize APIs over applications** - To be API-first you should always be prioritizing the planning and development of consistent and reusable APIs before you begin writing any code to deliver web, mobile, and device applications. The prioritization of APIs over applications allows for the digital resources and capabilities being used in applications to be defined early on, making sure you don’t duplicate API resources that already exist, and making sure APIs are designed and delivered in alignment with a wider platform API strategy. 
An organization will then have greater efficiency, reusability, and quality across all of the APIs behind the applications it depends on. -- **Be confident with your visibility** - To be API-first you will need to have the ability to effectively manage API authorization and access consistently across internal APIs so that they can be quickly made available to partners, or even third-party developers via publicly available APIs. Ensure you have the API gateway, authentication, rate limiting, logging, and other essential capabilities for managing APIs confidently in a zero-trust environment. Your organization’s digital resources and capabilities will always available wherever they are needed, whenever they are needed by your teams. -- **Realize quality across operations** - To be API-first you must have contract and performance tests available for every API and microservice, with tests available for developers to manually run locally and on the web as needed, baked into CI/CD pipelines, or scheduled from multiple regions via monitors. APIs must meet a minimum level of quality no matter what team is developing and supporting them, with machine-readable and verifiable tests across 100% of operations, and results made available via reporting systems and piped into existing API solutions. An API-first company makes API quality a priority across teams and consistent across private, partner, and public APIs. -- **Consistently apply security** - To be API-first you must have security tests in place for every API and microservices. Provide executable security tests that can be manually run by developers locally or on the web, enforced via CI/CD pipelines, and scheduled to run across multiple cloud regions via monitors. Push API security further left in the API lifecycle by equipping developers with standardized ways to make sure their APIs are secure throughout the API lifecycle. Make API security a default part of API operations without requiring that all developers become API security experts. -- **Increase developer productivity** - To be API-first you need to establish well-defined workspaces with the proper visibility for teams to design, develop, and manage APIs within. Each API workspace is in sync with repositories used to deploy and integrate with APIs, going to where developers are already working to deliver and iterate upon APIs. This helps organize API operations across teams into consistent workspaces where team members know they can find the artifacts, documentation, environments, tests, and history for every API across an organization. Team members have what they need, when they need it, across any API being delivered across business domains. -- **Reach your maximum velocity** - The velocity across teams developing APIs will be directly related to how consistent and well-known the API lifecycle is, and how comfortable each team is with moving APIs from design to deployment. To achieve maximum velocity across teams you will need teams to possess the skills and awareness required to deliver high-quality APIs using an agreed-upon lifecycle in a collaborative environment that is equipped with an asynchronous feedback loop. Establish proven and repeatable processes that teams are comfortable with to deliver APIs as part of regular operations. 
-- **Increase your API observability** - Being API-first requires there to be 100% observability across all APIs, tapping the outputs from across contract, performance, and other types of tests being applied to APIs via CI/CD pipelines and scheduled via monitors, with results available via reporting and existing APM solutions. API-first companies leverage the outputs of Postman Collections used to test APIs to make sure that the health and activity across all APIs are viewable via dashboards and reports. This provides leadership the awareness needed to understand the state of operations and make informed decisions around what to do next. -- **Platform-led governance across teams** - To be API-first you must invest in API design, documentation, testing, and monitoring governance. Allow for more consistency across the design of APIs, but also when it comes to documentation, testing, and how you monitor APIs. Establish a formal design style guide to communicate API governance across teams, but then also enable the automation of governance during design, development, and build time via CI/CD pipelines. This enables a platform-led approach to define, apply, and evolve API governance across API operations, helping make sure APIs—and the operations and teams around them—are more consistent. -- **Standardize all of your APIs across teams** - To be API-first you need to consistently apply common patterns and standards across APIs and teams. Web, industry, and organizational standards should be well-defined and made available to teams for use across their work. It should be easy for teams to learn about and apply common standards when designing, developing, deploying, and managing APIs, beginning with the design of APIs, then also standardizing processes and policies that are used across operations. Educate, but also make examples of standards being applied to APIs so that best practices are demonstrated to teams—setting the bar for how standardized APIs are across a platform. -- **Be proactive when it comes to regulations** - To be API-first you must have discoverability and observability across operations to be able to effectively and efficiently respond to regulatory inquiries, and be proactive in addressing regulatory compliance. All of your data should be available as simple, discoverable, and observable APIs, leaving satisfying regulatory requirements confidently and quickly. Establish a platform-led approach to managing operations in a way that will always be responsive to the regulatory environment that exists within an industry, while also being capable of automating the publishing of regulatory reporting using API provided by regulators. -- **Always incentivize innovation across your teams** - To truly be API-first, you must invest in the optimization and streamlining of your API operations until teams have the freedom to invest in the work that matters. Actively incentivize teams to innovate by optimizing operations around them and carving out a percentage of time that is dedicated to new and interesting products and capabilities. By reducing friction for developers across their work, they’ll have more time to develop creative solutions to problems within a specific domain. - -You should be able to begin working on almost all of these areas as part of your regular operations through the prioritization, development, and iteration of a formal API platform and lifecycle strategy. 
By optimizing and improving how your teams are working today, you will allow them to take the first steps towards moving your operations into an API-first world where you will be able to meet the future needs of your business. \ No newline at end of file diff --git a/docs/more/api/blueprints/innovation.md b/docs/more/api/blueprints/innovation.md deleted file mode 100644 index 8defd57..0000000 --- a/docs/more/api/blueprints/innovation.md +++ /dev/null @@ -1,12 +0,0 @@ -# Innovation -Allowing for, and incentivizing, innovation to flourish across all teams is important to provide the room for experimentation and engagement with API consumers in new and interesting ways. Without explicitly investing in innovation it can often go overlooked, whereas in reality it should be baked into the regular process and made a default part of the API lifecycle and how teams work together. Making innovation the default way that teams produce and consume APIs also makes the API lifecycle richer, more consistent, and better aligned with business needs. - -- **New APIs** - Encourage the development of new APIs, and a generally accepted space for APIs that aren't mature or accessible beyond the group that is developing them, allowing teams to quickly respond to potential demands, then validate with time. -- **Beta APIs** - Have a structured approach for defining and communicating around the maturity of APIs, making it acceptable to have unstable beta APIs, as long as there is a line of communication with all consumers regarding the state and evolution of APIs. -- **Experiments** - Encourage ephemeral experiments with delivering APIs as well as the lifecycle and operations behind them, signaling to teams that they can play around with new ways of getting their work done, and that it is acceptable for experiments to fail. -- **Labs** - Establish a formal labs environment for graduating experiments to a more mature state with access to more resources, helping incubate APIs, stabilizing them for wider consumption, and further validating that they are something consumers will actually need. -- **Collaboration** - Open up as many avenues for collaboration between teams as possible, opening up new lines of communication, sharing, and experimentation between different teams, or even domains, helping get teams talking and working together to deliver new APIs. -- **Feedback** - Gather feedback from teams while they are innovating, working to understand what has been accomplished, and what the vision and motivation behind the innovation is, helping look for other ways to apply ideas beyond their original intent. -- **Change** - Innovation leads to higher levels of comfort with the unexpected and the change that is required for forward motion across the enterprise, helping teams build the skills and muscle to adapt, innovate, and find solutions to the challenges they face. - -Innovation can be difficult in reactive and high-pressure environments, and will be something that has to be invested in over time. Take a certain portion of the gains realized from API-first optimization and turn it into free time for experimentation and trying out new ideas, so that over time a significant portion of each team's time is spent innovating out ahead of the regularly planned road map.
diff --git a/docs/more/api/blueprints/integrations.md b/docs/more/api/blueprints/integrations.md deleted file mode 100644 index 3e31fcf..0000000 --- a/docs/more/api/blueprints/integrations.md +++ /dev/null @@ -1,15 +0,0 @@ -# Integrations -Our digital business landscape is increasingly made up of many interconnected systems, internally and externally available to our operations, and modern web APIs are increasingly how interoperability between our systems is defined, automated, and evolved. - -## Types of Integrations -APIs are being used to connect all parts of enterprise operations, leveraging internal and external APIs to stitch together the expanding digital landscape required to do business today. APIs are essential to the interoperability of a distributed enterprise, but also the industries it operates across. - -- **SaaS** - APIs are how the growing number of software-as-a-service (SaaS) solutions we depend on to run our business are made seamless with existing enterprise operations, allowing aspects of business to be outsourced while still maintaining control over usage. -- **Infrastructure** - The infrastructure already in use across enterprise operations likely already possesses APIs, providing a huge opportunity for more automation and orchestration across the software already in place within an organization, representing low-hanging fruit. -- **Interoperability** - APIs are how distributed and federated parts of enterprise operations are made more interoperable, and how acquisitions, partners, and industry-level interoperability are set into motion, reducing the challenges involved with working across business divides. -- **Syncing** - The syncing of data across operations and with external partners and services is a regular part of business operations, and APIs are how this is done as efficiently and cost-effectively as possible, allowing for more alignment across business domains. -- **Migrations** - APIs are how data and objects are migrated between servers, clouds, partners, and other constructs of our regular operations, ensuring small or large amounts of data are made accessible so they can be migrated to a new location for more optimal usage. -- **Automation** - The only way businesses can remain competitive in this digital landscape is through the automation of common business processes, and APIs are how those processes and the digital resources they depend on are defined, providing what we need to automate it all. -- **Orchestration** - APIs provide the knobs and levers that can be scheduled, triggered, and pulled, responding to common events and setting in motion the processes we need to keep business moving forward, producing just the right performance we desire for our operations. - -The average enterprise landscape has thousands of connected servers and platforms to do business across, and APIs are how that reality is made whole, or left operating in business isolation. diff --git a/docs/more/api/blueprints/interoperable.md b/docs/more/api/blueprints/interoperable.md deleted file mode 100644 index b79988c..0000000 --- a/docs/more/api/blueprints/interoperable.md +++ /dev/null @@ -1,9 +0,0 @@ -# Interoperable -Ensuring that distributed teams, federated groups and lines of business, as well as partners, and any other business construct that exists can be as interoperable as possible by default is the desired state of existence for any enterprise organization.
While the desire to operate as a single entity may still prevail in some industries and organizations, the reality on the ground is more often distributed in both physical and virtual ways. This leaves interoperability as one of the top reasons for enterprise organizations to invest more in their API operations, ensuring their business can connect in many different ways whenever it is needed. - -- **Distributed** - The more distributed the enterprise is, the more demand for interoperability there will be, and the more teams will have the ability to flex their muscle in operating APIs that are interoperable. As a business we are depending on more distributed services, which is something we should be reflecting in how we operate our businesses, giving as much freedom and agency to teams as possible using APIs. -- **Standardized** - Always leaning on existing Internet and industry standards, adopting, iterating, and contributing to standards before we ever develop our own. However, in the absence of existing standards, enterprises should be in the business of evolving their own common patterns to the status of a standard, first across the enterprise, and then at some point possibly even within an industry they operate in. -- **Seamless** - Seeing all of our infrastructure and applications as part of a seamless web of business interoperability. Producing and consuming APIs as rapidly as business dictates, trying out new services, then working to integrate them with our API platform using APIs, and iterating and evolving beyond them whenever possible. Seeing all digital resources and capabilities as seamless and interoperable building blocks. -- **Event-Driven** - Designing enterprise systems to respond to the most meaningful events occurring across our platforms, but also the platforms we depend on, allowing operations to respond in real time to what is happening across the market. Allowing not just applications, but our enterprise infrastructure to identify, subscribe, and respond to change as it is happening across the seamless platform we use to do business. - -Interoperability in today's digital landscape isn't just a nice-to-have, it is an essential part of doing business and keeping up with the change happening across the global marketplace. Interoperability doesn't put you at a disadvantage with your competition; it is the opposite, it is how you develop the muscles you need to outmaneuver and develop entirely new categories of doing business today. diff --git a/docs/more/api/blueprints/jobs-to-be-done.md b/docs/more/api/blueprints/jobs-to-be-done.md deleted file mode 100644 index e91ab3a..0000000 --- a/docs/more/api/blueprints/jobs-to-be-done.md +++ /dev/null @@ -1,15 +0,0 @@ -# Jobs to be Done - -Jobs Theory provides a framework for categorizing, defining, capturing, and organizing the inputs that are required to make innovation predictable, providing a useful framework to think about when designing, delivering, operating, and deprecating APIs. Complementing an agile approach to delivering APIs, the Jobs to Be Done framework helps define the needs of consumers in the shortest possible timeframe, and helps teams iterate and respond more quickly than the competition. - -- **Get Something Done** - People buy a product or a service to get something done; they don't care what a company's incentives are, they just want to accomplish their job.
-- **Jobs are Functional** - APIs must have a purpose that speaks to the needs of consumers, making an emotional connection, and remaining focused on those relationships. -- **Reaching Maturity** - A Job-to-be-Done is stable over time, and APIs should reach a level of maturity after enough iterations, stabilizing into a reliable product for consumers. -- **Agnostic Interfaces** - A Job-to-be-Done is solution agnostic, and an API doesn't know or care about other APIs; it just does one thing, and does it well, focused on the job at hand. -- **Measuring Right Thing** - Success comes from making the job the unit of analysis, rather than the product or the consumer, focusing on the value being delivered by the API. -- **Marketing Has Impact** - A deep understanding of the customer's job makes marketing more effective, and innovation far more predictable when iterating upon your APIs. -- **Get It Done on the Cheap** - People want products and services that will help them get a job done better and/or more cheaply, opening up a perpetual opportunity for APIs. -- **Bring Value to Consumers** - People seek out products and services that enable them to get the entire job done on a single platform, making API integrations what they need. -- **Delivering Success to Consumers** - Innovation becomes predictable when "needs" are defined as the metrics customers use to measure success when getting a job done. - -A focus on the Jobs to be Done is about an unrelenting focus on the value being delivered by an API, providing what consumers need, then relying on your feedback loop to help guide the rapid iteration of your API products, while ensuring you are able to keep everything in alignment with consumers. This provides the fuel for your API to move forward, and the prioritization to help move your enterprise forward using APIs. diff --git a/docs/more/api/blueprints/json-schema.md b/docs/more/api/blueprints/json-schema.md deleted file mode 100644 index 4bbb540..0000000 --- a/docs/more/api/blueprints/json-schema.md +++ /dev/null @@ -1,12 +0,0 @@ -# JSON Schema -The JSON Schema specification provides a machine- and human-readable way of describing the digital objects that are used as part of API requests and responses, as well as the messages we publish and subscribe to via our more asynchronous APIs. It provides us with a way to describe the structure of our digital resources and capabilities in a way that we can validate during design, development, or while in production. - -- **Objects** - Objects are a way to define digital structures, providing a machine-readable way to describe meaningful concepts that are exchanged online using different types of APIs, passing data around in a way that makes it easy to have a shared understanding. -- **Properties** - The individual characteristics of an object, providing the details that give an object meaning and value, describing the name and email for a person object, or the name and description for a product object, providing a logical set of properties that applications can understand. -- **Property Names** - Each property has a name, allowing each individual characteristic of an object to be described in a way that makes sense to consumers, providing a shared meaning of a specific aspect of a digital object being made available via an API. -- **Property Description** - Each property can also have a description, providing much more detail about what the object property will contain, helping convey as much meaning and purpose as possible behind why the property exists, and how API consumers can use it in applications.
-- **Property Type** - Allowing each property to be defined as a string, number, object, or other common or custom types, formally articulating what a property can be expected to contain, making it possible to be very strict or even loose when it comes to the data available in objects. -- **Property Patterns** - Patterns allow regular expressions to be used to articulate in a very precise way what a property should contain, providing a universal way of precisely describing the contents, ordering, and structure of the data that is available via each object property. -- **Required** - Defining the list of properties that are required to be present for each object whenever you are moving objects around synchronously or asynchronously via APIs, helping define the minimum amount of information needed to define an object being made available. - -JSON Schema is how the digital bits we are passing around the web each day get validated behind the scenes, making sure the requests we make and the responses we receive are as correct and usable as possible. diff --git a/docs/more/api/blueprints/landscape-mapping.md b/docs/more/api/blueprints/landscape-mapping.md deleted file mode 100644 index 2fc08ca..0000000 --- a/docs/more/api/blueprints/landscape-mapping.md +++ /dev/null @@ -1,35 +0,0 @@ -# Landscape Mapping - -A blueprint for helping jumpstart a mapping of the internal API landscape, allowing an organization to begin getting a handle on what is happening across operations. Beginning by profiling a specific group or domain, then expanding it across operations once a portion of the landscape has been mapped and understood. - -## Team - -- **Team Groups** - Teams can be organized by groups that reflect lines of business and business domains, establishing a bounded context to assign team members to, and allowing API operations, workspaces, artifacts, and other resources to be organized, accessed, and reported upon via logical groups. -- **Team Members** - Formally defining who will be on the team moving an API forward through all stages of its lifecycle, providing a clear definition of who is responsible for each part of producing an API. -- **Team Roles** - API teams can be assigned a specific role that is in alignment with their involvement in the API lifecycle, allowing them to have access to APIs, artifacts, and resources based upon their role in the process. Allowing teams to play the part they have been assigned, while still ensuring the integrity of operations, helps ensure more reliability, but also productivity, in how teams work. -- **Members** - The individual and collective members of a specific team are involved in producing and consuming APIs; identifying the human beings behind API operations, organizing them into logical groups, and applying designated roles and access permissions helps teams be more organized about how API operations are defined. -- **Groups** - Establishing a logical separation of teams, grouping by domain, line of business, project, or another bounded context that makes sense to the human part of operations, but that will ultimately shape the workspaces, APIs, documentation, and other elements of API operations, helping shape the API factory floor.
-- **Role Based Access Control (RBAC)** - The ability to assign roles to individual team members and then shape access to workspaces, APIs, collections, and environments based upon their role in producing or consuming APIs, allowing for the protection of artifacts and other elements from unwanted changes, while still making them available to as many team members as possible for use, striking the right balance across domains, groups, and workspaces. - -## APIs - -- **Documentation** - Documentation published as human-consumable HTML pages helps potential API consumers learn about what an API does by describing the paths, channels, parameters, headers, schema, messages, and other building blocks of APIs, showing examples of what is possible, or providing an API client to make calls to each API as part of the documentation. -- **Swagger** - Swagger 2.0 is a common specification used by teams to describe what an API does. It is common for Swagger files to be manually or automatically created as part of API operations. Swagger artifacts provide what is needed to power documentation, mock servers, and testing, and are what make it so that teams can browse and search for APIs via catalogs and networks. -- **OpenAPI** - The OpenAPI specification provides a common vocabulary for describing the surface area of request and response APIs, as well as webhooks, ensuring that API producers and consumers are on the same page when it comes to integration, but also helping stabilize multiple areas of the API lifecycle by providing a contract that can be used for delivering documentation, mocks, testing, and more. -- **Watch** - Watching some element of API operations, allowing team members, partners, or public users to signal that they want to receive notifications of any change to an API and its supporting elements, making API operations more observable and something that all stakeholders are able to stay in tune with as they evolve and change. -- **Changes** - Dealing with the inevitable change that is happening within any industry, and across enterprise operations in response to a changing world, taking the time to regularly assess what change is occurring, and establishing common practices for managing and communicating around change across teams and with consumers. -- **Comments** - Comments on APIs, collections, and other elements of API operations allow for more tightly coupled and inline conversations to occur around entire elements or specific parts and pieces of elements, allowing teams to collaborate and communicate across the API lifecycle. -- **API RBAC** - Providing guidance for how RBAC should be applied to APIs. - -## Workspace - -- **Team Workspace** - Establishing and properly setting up a dedicated workspace for each API helps ensure there is always a single place to go to find everything that is happening with an API across its entire lifecycle. -- **Github Repository** - Having a dedicated Github repository for an API provides a single place where code and other artifacts can be managed, with a pipeline, issues, and other supporting elements an API will need to operate. -- **Watch** - Watching some element of API operations, allowing team members, partners, or public users to signal that they want to receive notifications of any change to an API and its supporting elements, making API operations more observable and something that all stakeholders are able to stay in tune with as they evolve and change.
-- **History** - Having access to the history of requests and other activity associated with API operations, providing an accounting of what is happening from both a producer and a consumer point of view, relying on logging, but making it much more usable as part of a team or consumer working within any API workspace. -- **Activity** - The changes made to any aspect of operations by team members, providing observability into when APIs, mock servers, documentation, testing, monitors, and other critical elements of API operations are changed or configured, helping give a log of everything that happens at the operational level. -- **Workspace Name** - Providing guidance for how workspaces should be named. -- **Workspace Overview** - Providing guidance for how workspace overviews should be created. -- **Workspace RBAC** - Providing guidance for how workspace RBAC should be applied. - -There is so much more that you can invest in as you progress in your API journey, but mapping out the human, business, and technological layers of your operations is important to the stabilization of your platform. With a clear map of what exists today, you will be able to better prepare for what is next. \ No newline at end of file diff --git a/docs/more/api/blueprints/legacy.md b/docs/more/api/blueprints/legacy.md deleted file mode 100644 index fb09400..0000000 --- a/docs/more/api/blueprints/legacy.md +++ /dev/null @@ -1,10 +0,0 @@ -# Legacy -APIs are widely being used to address challenges with legacy infrastructure, and teams are finding success using more modular approaches to refactoring older systems to work with modern applications and integrations, and in some cases doing away with legacy infrastructure altogether. This helps enterprise organizations deal with technical debt, but also future-proof operations against future tech debt by ensuring that systems are smaller, more modular, and able to be evolved and deprecated without the overhead associated with systems developed in the past. - -- **Monolith** - Teams are using APIs to decouple and redefine monolithic legacy systems, reverse engineering them and implementing them as microservices and APIs to modernize systems. -- **Microservices** - Modular, single-use, synchronous and asynchronous microservices are being used to redefine legacy systems as more distributed and easier to evolve and reuse. -- **Facades** - APIs are used to create facades that provide modern interfaces for applications, while teams work to evolve and deprecate legacy backends (a minimal sketch follows below). -- **Gateways** - Gateways provide an industrial-grade approach to standing up a modernized stack in front of legacy systems without exposing the backend infrastructure being modernized. -- **Proxies** - Proxies provide the ability to intercept and map out the traffic in legacy applications, which can be applied to defining and delivering facades and microservices. - -APIs are essential to modernizing our legacy infrastructure, providing interfaces that can be used in applications while abstracting away legacy solutions as they are being modernized, without disrupting applications and integrations that depend on the digital resources and capabilities delivered by legacy systems.
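To make the facade approach described above more concrete, here is a minimal sketch in Java. It assumes a hypothetical legacy endpoint at `/customerLookup` that returns a flat, pipe-delimited record; the class name, endpoint, and field layout are illustrative assumptions, not part of any blueprint or specific product.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Minimal facade sketch: exposes a modern, resource-shaped view of a
 * hypothetical legacy customer lookup so that new applications never
 * touch the legacy format directly while it is being modernized.
 */
public class CustomerFacade {

    /** Modern representation returned to API consumers. */
    public record Customer(String id, String name, String email) {}

    private final HttpClient http = HttpClient.newHttpClient();
    private final String legacyBaseUrl;

    public CustomerFacade(String legacyBaseUrl) {
        this.legacyBaseUrl = legacyBaseUrl;
    }

    /** Calls the legacy system and translates its pipe-delimited payload. */
    public Customer getCustomer(String id) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(legacyBaseUrl + "/customerLookup?id=" + id))
                .GET()
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());

        // Assumed legacy format: "id|name|email"
        String[] parts = response.body().trim().split("\\|");
        return new Customer(parts[0], parts[1], parts[2]);
    }
}
```

A gateway route or controller can then serve `Customer` as JSON to applications, and the legacy call inside `getCustomer` can later be swapped for a modern backend without changing anything consumers depend on.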
diff --git a/docs/more/api/blueprints/lifecycle.md b/docs/more/api/blueprints/lifecycle.md deleted file mode 100644 index 0b7b63d..0000000 --- a/docs/more/api/blueprints/lifecycle.md +++ /dev/null @@ -1,11 +0,0 @@ -# Lifecycle -A well-known, common, and repeatable API lifecycle is essential to achieving success across API operations. Teams should be given the agency to leverage the tools that help them be successful, but a shared understanding of what the API lifecycle is will be required to achieve the desired productivity and velocity, while also being able to deliver APIs that possess high levels of quality. This provides a proven lifecycle for producers and consumers to engage with when it comes to delivering and evolving APIs in a way that strikes the balance needed for forward motion. - -- **Existing** - Map out the API lifecycle as it exists today, documenting how people perceive the lifecycle they employ to produce and consume APIs. Document the differences, as well as the common aspects of how APIs are brought to life but also put to work. -- **Tooling** - Document what tools are in use by teams today, mapping out the commercial and open source solutions teams have adopted to help accomplish their work, assembling the toolbox that is required to move your operations forward right now. -- **Evangelize** - Spend every free moment evangelizing across teams about what the API lifecycle is, using the same words and phrasing to describe different stops along the API lifecycle, then producing visual aids that help teams share API lifecycle knowledge. -- **Educate** - Invest in educational resources, workshops, and other approaches to helping teams understand what the enterprise lifecycle looks like and how it reflects the practices of other leading API providers, providing ongoing education across all teams. -- **Awareness** - Foster awareness at each stop along the API lifecycle, demonstrating the value of possessing a common understanding and approach to defining, designing, and delivering APIs, showing how teams will benefit from mapping out the API lifecycle. -- **Observability** - Build observability into the API lifecycle as soon as possible, routing data from testing, security, and governance efforts into existing APM solutions, then working to add more outputs across the lifecycle, measuring as much as possible. - -Without a known lifecycle you will never be able to stabilize your API operations enough to realize the productivity, quality, and governance you need. You can't shift left if there is not a common understanding of which direction is left, and of what each stop along the API lifecycle is. This makes the API lifecycle one of the first places you need to invest as you look to increase velocity in your API-first transformation. diff --git a/docs/more/api/blueprints/low-code-no-code.md b/docs/more/api/blueprints/low-code-no-code.md deleted file mode 100644 index c3b04f8..0000000 --- a/docs/more/api/blueprints/low-code-no-code.md +++ /dev/null @@ -1,13 +0,0 @@ -# Low-Code/No-Code -Low-code/no-code refers to an ever-growing opportunity to deliver applications without writing code, or at least while minimizing the amount of code that needs to be written. This makes producing and consuming APIs much more inclusive for those who are not fluent in writing code, while also making those who are even more productive in their work. It provides business and technical stakeholders with a variety of solutions for engaging with API operations, and the business that it supports.
-- **Inventory** - Provide business and technical stakeholders with a robust inventory of internal, partner, and third-party resources to use when creating low-code/no-code integrations and applications, enabling them to do whatever they need across services. -- **Workflows** - Offering the ability to create and evolve common business workflows using many different APIs, allowing APIs to be used by technical and business stakeholders to perform common business jobs they need done without getting their hands dirty in the code. -- **Authentication** - The complexities of authentication and authorization should be abstracted away, making it easy for anyone to navigate the secrets needed to authenticate with one or many APIs, and to solve authentication problems along the way. -- **Visual** - Making it a visual experience to navigate the API inventory, build workflows, and configure authentication, so that anyone can stitch many different APIs together, allowing business or technical stakeholders to take advantage of using APIs. -- **Runtime** - Providing a simple runtime that any user can employ to run their workflows, providing the compute needed to make each individual request, handle responses, and execute workflows in the order intended by the user, powering the jobs they need done. -- **Automation** - Offering the ability to schedule workflows or trigger them when specific events occur, setting the desired workflow in motion and allowing anyone to automate common business tasks throughout the day, helping teams do more with less by taking advantage of APIs. -- **Collaboration** - Allowing teams to collaborate and share workflows, provide feedback, and iterate upon the functionality provided by each workflow, making business integrations and orchestration a team affair, and something everyone learns from. -- **Observability** - Providing users with the ability to see how their workflows and automations are running, the results of their work, and how it fits into the overall context of what each user is looking to accomplish, helping cultivate more awareness. - -Low-Code/No-Code is how enterprise organizations will deal with the demands of their own digital future, where there is perpetually a shortage of developers, opening up more of the pipes behind the applications and integrations we depend on to business stakeholders. It provides business stakeholders with the resources and capabilities they need to be successful in their regular operations. diff --git a/docs/more/api/blueprints/microservices.md b/docs/more/api/blueprints/microservices.md deleted file mode 100644 index 36ac26a..0000000 --- a/docs/more/api/blueprints/microservices.md +++ /dev/null @@ -1,15 +0,0 @@ -# Microservices -The benefits realized from doing APIs have manifested themselves in many different ways, but none rise to the value introduced over the last five years of redefining the enterprise using microservices. The microservices journey has helped us understand who we are as an organization, giving us the vocabulary for defining the technical, business, and human elements of doing business in the Internet age. It has helped us not just decompose the monolithic enterprise infrastructure that has accumulated over the last few decades, but also reconfigure our operations to better serve the future we envision.
## Enterprise Building Blocks -Microservices are the Lego building blocks we need to understand what digital resources and capabilities compose our enterprise operations, providing us with the pieces to build any configuration of application, integration, and orchestration we will need to operate. - -- **Composability** - Microservices are designed to be stitched together and consumed in a variety of ways that meet a mix of business needs, providing the digital building blocks enterprise organizations need to operate, grow, and adapt to what they need to do business. -- **Templates** - Having standardized templates for designing and deploying microservices helps ensure that microservices are consistent, intuitive, and easy for consumers to put to use, no matter which team was responsible for bringing them to life. -- **Synchronous** - Synchronous microservices provide a simple and intuitive way to deliver solutions for a single digital resource or capability, providing a very modular and composable way for developers to implement the digital services needed. -- **Asynchronous** - Asynchronous microservices provide a more event-driven approach to delivering the resources and capabilities needed across the enterprise, allowing developers to publish and subscribe to a variety of channels that help define business operations. -- **Risk** - There is risk involved with breaking down digital resources and capabilities into very modular, and potentially distributed, microservices, and teams should be made aware of what is needed when managing a potentially sprawling landscape of digital services. -- **Reward** - With the right enablement, there is a big payoff when it comes to the flexibility and agility a robust microservices approach offers, making enterprise operations more nimble when it comes to the roadmap, but also when responding to any changes that may come its way. -- **Lifecycle** - Delivering hundreds or thousands of APIs consistently across an organization requires a common and well-known lifecycle to be employed across teams, helping ensure everyone is on the same page when it comes to delivering and iterating upon APIs. - -Microservices are not the silver bullet that solves all internal enterprise challenges, but they do provide a proven approach to how we break up and organize our digital resources and capabilities, providing teams with a clear definition of a class of API that is only available internally, behind the enterprise facade. diff --git a/docs/more/api/blueprints/mobile.md b/docs/more/api/blueprints/mobile.md deleted file mode 100644 index a3a4575..0000000 --- a/docs/more/api/blueprints/mobile.md +++ /dev/null @@ -1,19 +0,0 @@ -# Our Lives Going Mobile -Mobile phones have been around since the 20th century, but in 2007 everything would change with the introduction of the iPhone from Apple. These new mobile devices would popularize a new category of Internet-connected applications that we'd carry around with us in our pockets, requiring an entirely new set of digital resources, and unlocking an unlimited realm of possibilities when it came to new types of Internet-enabled digital capabilities. - -## Devices -There have been many different mobile devices and operating systems to emerge over the last couple of decades, but the iPhone from Apple and the Android OS, backed by a variety of mobile devices, continue to dominate the mobile landscape. - -- **iPhone** - In 2004, a team of a thousand Apple employees began work on a secret project.
Steve Jobs had noted that with the rising popularity of personal digital assistants (PDAs), many people were carrying around multiple devices: a mobile phone, an iPod, and a PDA. Apple collaborated with wireless carrier Cingular (now AT&T) for 30 months to develop the first iPhone, which aimed to combine these three devices into one. In January 2007, Apple first announced the iPhone, and it was released the following June. Unlike other mobile phones at the time, the iPhone's interface was designed around its touch screen and singular button, and its music player resembled the iPod that was already familiar to many people. The iPhone provided screen space that made web browsing, sending and reading emails, and viewing maps easier than they had ever been on a mobile phone. A year after the first iPhone launch, the App Store opened, allowing third party developers to make their applications available to any iPhone user. -- **Android** - In 2003, a small startup called Android Inc was formed in Palo Alto, California. Android Inc began working on a Linux-based operating system that would connect digital cameras to the cloud, but they soon found that the market was too small and shifted their focus to mobile phones. Google purchased Android Inc in 2005 and continued development, marketing the open source operating system to handset makers and wireless carriers. In 2006, early Android phone prototypes had full QWERTY keyboards and resembled existing smartphones that ran Windows Mobile or BlackBerry OS. After the iPhone released in 2007, Google began working on support for touchscreen controls in Android. The first Android phone, the HTC Dream (known as T-Mobile G1 in the United States), released in September 2008, a few months after the second version of the iPhone launched. Unlike the iPhone, this Android phone included an app store at launch, which was then called the Android Market. - -## Resources -Early mobile application developers realize that they could deliver the digital resources they needed on their mobile applications, and waves of API providers emerged to provide the essential digital resources we would be needing to power the growing spectrum of mobile applications being installed to the desktop of these powerful mobile computing devices. - -- **Twilio (SMS)** - Twilio was founded in November 2008 with the goal of integrating phone services into software. When Twilio Voice launched, it included an API that enabled developers to make and receive phone calls that were hosted in the cloud. This API could be used to make phone calls and play recordings, so it made automated phone calls possible. In February 2010, the Twilio text messaging API launched, which allowed customers to send a text query to a business and receive an automatic SMS reply. Developers can use the Twilio Programmable Messaging API to send and receive SMS messages to app users, and they can also track message delivery. -- **SendGrid (Email)** - In the early 2000s, Isaac Saldana began working on a solution to the email deliverability problems that many startups seemed to have in common. For example, email providers often marked legitimate messages from businesses as spam, and it was difficult for a business to gather any analytics about the emails they sent. These types of email included shipping notifications, sign-up confirmations, password reset requests, and other important automated messages. 
Saldana wanted to make it easier for developers to integrate email into their apps in a way that made sure customers received messages that were important and relevant. He started to develop an email management platform at smtpapi.com, offering it to Internet service providers (ISPs) for free in exchange for the computing resources that were required to keep it running. When it became clear that developers found value in this service, Saldana and recruited two other co-founders to officially launch SendGrid in 2009. The SendGrid platform manages email servers and infrastructure, as well as ISP monitoring, which includes monitoring email sender reputations and managing allowlists. Developers can also use the SendGrid platform to track what happens to emails after they reach a customer, such as unsubscribes, bounces, and spam reports. SendGrid makes APIs available to developers, who can then configure their apps to communicate with SendGrid. -- **Stripe (Payments)** - Brothers Patrick and John Collison launched several apps and startups as teenagers, and they noticed that with each Internet business they started, the most challenging part was setting up a way for customers to pay. At the time, e-commerce companies had to choose between setting up a gateway between their app and a legacy banking system directly, or using PayPal, whose regulations could cause revenue to be held up in reserves for months, in some cases. In 2008, the Collison brothers began to try solving this problem using a simple, consistent API. They wrote seven lines of code that allowed an app to communicate with a payment system. After iterating on this code, they founded Stripe in 2010. While Stripe originally partnered with a company that handled payments, they then chose to move payment processing to their internal servers so that they could control the process from beginning to end. Stripe processes credit card payments, checks for fraudulent transactions, and charges a percentage for each transaction. Developers can integrate the Stripe API into their app, which allows the app to securely accept credit cards from customers. -- **Google Maps (Maps)** - In 2003, an Australian mapping startup named Where 2 Technologies developed a desktop application in C++, which they called Expedition. In 2004, Google acquired Where 2 Technologies and released Expedition as the Google Maps web application. In 2005, Google launched the Google Maps API, which allowed developers to use their maps on their own websites. The Google Maps API made it possible for developers to embed maps into a website and include their own overlay content, retrieve static images of maps, generate navigation instructions, and retrieve other map information such as elevation and landmarks. The Google Maps API is currently the most-used web application API. In 2008, Google launched the Google Maps app for Android, around the same time the HTC Dream was released. -- **YouTube (Videos)** - In 2004, three PayPal employees, Chad Hurley, Steve Chen, and Jawed Karim, were discussing the difficulty of sharing videos online. Karim cites the Super Bowl XXXVIII controversy as the event that inspired their conversation. Hurley, Chen, and Karim wanted to create an online platform that allowed anyone to upload and share whatever video content they wanted, with whomever they wanted. In 2005, they created the platform and called it YouTube, with the intent to focus on enabling individual users to share their content. 
The beta site launched in April and the official YouTube site launched in December. In 2006, YouTube added features such as user profiles, video comments, and mobile phone upload support. Later that year, Google acquired YouTube. Developers can use one of the YouTube Player APIs to embed the YouTube player into an application, and they can use the YouTube Data APIs and YouTube Analytics APIs to retrieve and manage information about users, playlists, channels, subscriptions, ratings, and videos. -- **Instagram (Images)** - In 2009, Kevin Systrom started developing an app called Burbn, which allowed users to check in to a location, share their plans, and post photos from their phone. At the time, mobile check-in apps like Foursquare were starting to become popular, and Hipstamatic was gaining popularity as an iPhone photography app that applies stylized filters. Systrom observed that there was no app that combined location check-ins with photo sharing, which inspired him to start developing Burbn. After bringing in Mike Krieger as a developer and user experience designer, they decided to streamline the app to focus primarily on photo sharing and renamed it to Instagram. In October 2010, Instagram was released on the App Store for iPhones only. At first, Instagram did not make its API public, which motivated a developer named Mislav Marohnić to reverse engineer the app and code an unofficial API. In 2011, Instagram made its official API public. Facebook acquired Instagram in April 2012, and at around the same time, Instagram for Android was released. -- **WhatsApp (Messaging)** - In February 2009, Jan Koum and Brian Acton started developing an app they called WhatsApp, where users could share their current status and view the statuses of their friends. When they showed the app to their friends, there was little interest. In June 2009, Apple launched the push notification feature for the iPhone. With push notifications enabled, WhatsApp beta app users received notifications when their friends changed their statuses, and many began to use these statuses to send messages to one another. Koum and Acton realized that they could pivot WhatsApp to become an instant messaging app. In August 2009, WhatsApp 2.0 released, allowing users to log in to the app using their phone number, and displaying a double checkmark next to each sent message to confirm its delivery. In November 2009, WhatsApp went out of beta and was officially launched on the App Store. In 2014, Facebook acquired WhatsApp. In 2016, WhatsApp for Business launched, which allowed a business to create a free profile with links and enabled them to send messages to customers. In 2018, the WhatsApp for Business API was launched, which allows large companies to send out automated messages. diff --git a/docs/more/api/blueprints/observable-landscape.md b/docs/more/api/blueprints/observable-landscape.md deleted file mode 100644 index 5575ef0..0000000 --- a/docs/more/api/blueprints/observable-landscape.md +++ /dev/null @@ -1,8 +0,0 @@ -# Observable Landscape -An API-first landscape can be turned into an observable landscape. Taking every output available across all APIs, and the infrastructure behind them, as well as the governance overlaid on top of, and make sure it is considered as part of the overall feedback loop within each domain. 
Leveraging collections defined to test each instance of an API, the surface area of an API, and the infrastructure used to deliver an API as the universal observability connector allows us to see across API operations. - -- Domains -- Teams -- APIs -- Lifecycle -- Governance diff --git a/docs/more/api/blueprints/open-tech-standards-support.md b/docs/more/api/blueprints/open-tech-standards-support.md deleted file mode 100644 index 4be110f..0000000 --- a/docs/more/api/blueprints/open-tech-standards-support.md +++ /dev/null @@ -1,27 +0,0 @@ -# Open Technologies Standard Support -Postman Open Technologies works with standards bodies to help move their specifications forward using open source specifications like OpenAPI and AsyncAPI, helping define common objects across those contracts using JSON Schema, and documenting and certifying them via a publicly available workspace that the community can engage and work with. - -We help our open standards and specification partners with the following: - -- **Team Profile** - We want your organization to have a robust team profile in our network of 20M API producers and consumers. -- **Public Workspace** - We can help you set up, configure, and drive attention to your public API standards workspace, and make it available via your public team profile and embedded in your existing documentation and portal. -- **OpenAPI Source of Truth** - We can provide feedback and guidance on managing OpenAPI for your standards. -- **OpenAPI Versioning** - We want to help offer guidance when it comes to versioning your API standards over time. -- **Reference Documentation** - We can help you automatically generate reference documentation for your standards. -- **Use Case Documentation** - We can help you develop specific business use case documentation for your standards. -- **Github Syncing** - We can help you sync your source-of-truth OpenAPI and documentation to a public Github repository. -- **Mock Servers** - We can help you develop mock servers for an entire standard or for specific business use cases. -- **Certification** - We can help you develop modular, sharable, executable tests to certify implementations of your standard. -- **Contributor Engagement** - We can help speed up onboarding, forking, watching, and engagement with your community stakeholders. -- **Feedback Loop** - We can help you strengthen the feedback loop around your standards, and the specific properties of your standard, helping gather more precise feedback. - -With a team profile for your organization, and a public workspace for your standards, complete with OpenAPI contracts, reference and use case docs, as well as mock servers and certification tests, we can then help with some go-to-market activities on a regular basis that would help bring attention to your standard. - -- **Blog Posts** - We are happy to write regular stories about your standards, as long as we can map them to the real-world business concerns of the industry you serve. -- **Live Stream** - We have a weekly live stream where we like to showcase real-time implementations of industry standards, helping our audience of 20M developers make sense of standards. -- **Breaking Changes** - We have a podcast focused on business and technology leadership to which we'd like to invite your leadership, as well as that of your members, to have a conversation. -- **Social Amplification** - As a standards partner with the Postman Open Technologies team, we are happy to help amplify the stories we produce together, as well as the news from your community.
- -Postman is committed to helping increase our investment, but also help scale the communities investment in open standards, so we are happy to dedicate time each month from the team to brainstorm ideas with your organization, and find the best way to help add value to the great work you are already doing. It is important for us to have your standards as part of the Postman API Network, and after we get a workspace up and some storytelling under our belt, we’d love to explore other possibilities when it comes to baking your API standard into the API lifecycle and governance practices of your target audience—making it easy for developer within these industries to do the right thing when delivering new APIs. - -Let’s get a regular time in the calendar to discuss some ideas!! \ No newline at end of file diff --git a/docs/more/api/blueprints/open-technologies.md b/docs/more/api/blueprints/open-technologies.md deleted file mode 100644 index bea29ff..0000000 --- a/docs/more/api/blueprints/open-technologies.md +++ /dev/null @@ -1,33 +0,0 @@ -# Postman Open Technologies - -Postman Open Technologies is a program dedicated to educating the world when it comes to everything API, helping companies, organizations, and government agencies optimize their approach to the API lifecycle and how they approach API governance, while also investing in, adopting, and producing the open source specifications, standards, and tooling that are continuing to shape the modern API-first enterprise operation. - -## Engage (Learn) -Within Open Technologies, our first team, dubbed “Engage”, also known as Developer Relations (Devrel), is focused on advocating the impact that APIs are making across the globe, helping educate developers and non-developers about the important role that APIs play in our personal and professional lives, reaching across communities that are being impacted by the API economy to cultivate awareness of a modern approach to producing and consuming APIs. - -* Advocacy - Dedicated to telling stories that advocate for the important role that APIs play in our personal and professional lives, writing blog posts, producing videos, and meeting API builders where they are at. -* Education - Providing self-service and in-person learning opportunities when it comes to producing and consuming APIs, helping onboard the masses to the critical role that APIs are playing across all of our applications. -* Community - Engaging within existing developer and non-developer communities, working to better understand how APIs are shifting how technology is being developed and evolved, building relationships along the way. - -Engagement with both technical and non-technical business stakeholders across startups and enterprise organizations is an essential part of the Postman platform feedback loop, and engaging with API producers and consumers, understanding more about where they are at in their API journey, and finding ways we can help educate and advocate is what powers the Postman API platform. - -## Platform (Master) -The second team within Open Technologies is dedicated to helping API producers and consumers better master how they are putting APIs to work, providing blueprints for common concepts in play across enterprise and startup API operations, providing reusable processes, integrations and practices that help strength the API lifecycle across an organization while also effectively applying API governance across domains. 
- -* Blueprints - Simple blueprints that outline common concepts that are commonly applied across API operations, then augmenting with blog posts, videos, and other resources that help provide reusable components of a modern API lifecycle. -* Integrations - Showcase and develop native and collection-defined integrations that embrace the reality that API operations is defined by not just a handful of solutions, but many different on-premise and cloud services stitched together. -* Governance - Provider platform governance rules, guidelines, and other resources that help enable developers across steams while bringing alignment when it comes to API discovery, consistency, reliability, and the lifecycle. - -The Postman Open Technologies platform teams works internally across Postman, but also externally with customers and the community to help stabilize how we all see the API lifecycle, striking a balance between both API producer and consumer to establish healthier and more well-known API lifecycle, which can be governed centrally across the enterprise, but also effectively federated across many domains, teams, and external business and technical stakeholders. - -## Engineer (Build) -The third and final team within Postman Open Technologies is the “Engineer” team who represent a cast of characters assembled from across the most important open-source communities where API specifications like OpenAPI, AsyncAPI, and JSON are being shaped, API standards like FHIR and PSD2 are forged, and the tooling that put all of these critical specifications and standards to work, helping shape the future of the Postman API platform, but also the API platform of every company, organization, and government agency struggling with their digital transformation. - -* Specifications - We have team members dedicated to helping move forward OpenAPI, AsyncAPI, JSON Schema, GraphQL, SOAP, and other specifications that are shaping the modern API toolbox, helping inform all of our journey. -* Standards - The team is regularly staying in tune with industry standards that are shaping not just how we do APIs, but also entire industries, understanding the impact that GDPR, PSD2, FHIR, and other standards are having. -* Tooling - Specifications and standards provide the backbone of any API platform, but it is the tooling that makes everything move forward, and regular investment in open-source tooling is part of what the engineer team delivers. - -Our open source specifications, standards, and tooling teams are given the agency to immerse themselves full time into the open communities of their choice, helping contribute code, content, and other value with their respective community, while also informing the Postman API platform road map regarding the capabilities that truly matter, providing the building blocks we all need to ensure our API operations are as standardized and interoperable as they possibly can. - -## Postman Open Technologies -The Postman Open Technologies team is available around the clock to work with customers, partners, and within the community. We exist to help everyone learn, master, and build in an API-first world. You can tune into our storytelling via the Postman Blog (https://blog.postman.com/) and Youtube channel (https://www.youtube.com/c/postman). 
You can engage with and follow the Open Technologies team on Twitter (https://twitter.com/i/lists/1461421560028299266), and see what we are building via our GitHub organization (https://github.com/postman-open-technologies) and centralized API workspace (https://www.postman.com/postman/workspace/postman-open-technologies) that connects together all of the areas we are investing in. The Postman Open Technologies team is here to help guide Postman, but also the community, on this API journey we all find ourselves on, helping us understand how we got here, how far along in our journey we are, and confidently pointing us in the direction we need to be heading to make our way through this API-first transformation. diff --git a/docs/more/api/blueprints/openapi.md deleted file mode 100644 index 0dadb34..0000000 --- a/docs/more/api/blueprints/openapi.md +++ /dev/null @@ -1,12 +0,0 @@ -# OpenAPI -The OpenAPI specification, formerly known as Swagger, provides the ability to describe the surface area of your HTTP 1.1 APIs using JSON or YAML. It provides a robust way to describe what is possible with each API, defining the surface area of each request and response, which can then be used as the source of truth for what is possible with each API. - -- **Info** - Provides a place to define common metadata for an API like a name, description, licensing, terms of service, and contact information, helping ensure all APIs have enough metadata available so that their purpose can be articulated across the API lifecycle. -- **Servers** - Includes a list of servers for an instance of an API, providing possibly multiple regions, or different stages of the development of an API, allowing consumers to quickly find an instance of an API that they can use to meet their needs, and properly apply each resource. -- **Paths** - One or many paths that can be taken by API consumers to access different resources and capabilities, similar to browsing the web, but navigating the API landscape, looking for the resources and capabilities you need to power your application or integration. -- **Operations** - Defining the specific operations that can be taken using a specific path, providing the ability to read, write, update, delete, and perform other operations on API resources, and setting in motion different capabilities that are defined as part of each API. -- **Parameters** - Providing a defined set of parameters that can be used to change the state of API responses, providing key/value pairs that provide common things like pagination or search, but getting specific depending on the objects being returned with API responses. -- **Responses** - Describing the HTTP status codes, headers, and media types being returned with each response, helping the consumer understand the structure and state of the response, helping provide consumers with as much information as possible about each response. -- **Schema** - Providing JSON Schema descriptions of request and response bodies, allowing the responses to be validated, helping automate validation at the gateway, helping ensure the highest quality possible when it comes to consuming APIs within any application. -- **Security** - Describing the type of authentication required for accessing an API, providing a machine-readable description of the API keys required, OAuth, JWT, and other types of security protocols in place, helping automate the authentication layer of API usage within clients.
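To make the OpenAPI structure above more concrete, here is a minimal Java sketch of reading such a contract. It assumes the `io.swagger.parser.v3:swagger-parser` library is on the classpath and that a local `petstore.yaml` contract exists; both are illustrative assumptions rather than part of the original blueprint.

```java
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.parser.OpenAPIV3Parser;

public class OpenApiInspector {
    public static void main(String[] args) {
        // Parse an OpenAPI (formerly Swagger) document from a local file or URL.
        OpenAPI api = new OpenAPIV3Parser().read("petstore.yaml"); // hypothetical contract file

        // Info: common metadata such as title and version.
        System.out.println(api.getInfo().getTitle() + " " + api.getInfo().getVersion());

        // Servers: the base URLs where instances of the API are available.
        api.getServers().forEach(server -> System.out.println("server: " + server.getUrl()));

        // Paths and operations: the surface area consumers can call.
        api.getPaths().forEach((path, item) ->
                item.readOperationsMap().forEach((method, operation) ->
                        System.out.println(method + " " + path + " -> " + operation.getOperationId())));
    }
}
```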
- diff --git a/docs/more/api/blueprints/organizational-standards.md deleted file mode 100644 index 0ad5ba7..0000000 --- a/docs/more/api/blueprints/organizational-standards.md +++ /dev/null @@ -1,9 +0,0 @@ -# Organizational Standards -Every enterprise organization should have a set of standards they have adopted as part of API operations. There are plenty of redundant aspects of doing APIs that are easy to make consistent across APIs, things like pagination, sorting, filtering, usage of request bodies, HTTP methods, and error handling. - -- **Headers** - Key/value pairs of data that can be passed back and forth as part of API requests, conforming to the HTTP standard, and relying on the IANA registry of headers to define and shape the routing and prioritization of requests being made to APIs when using HTTP as the transport protocol between client and server. -- **Pagination** - Providing a standardized way of navigating through large sets of data and content via an API, limiting the results that are returned with each request, but providing consumers with visibility into how to navigate results and shape their API requests to achieve optimal outcomes for both API producer and consumer. -- **Schema** - Establishing common schema for each domain, using Internet and industry standards when possible, but then standardizing your own schema, stabilizing common objects, versioning and evolving them for reuse, then referencing them in contracts and using them for validation, documentation, testing, and other parts of the API lifecycle to help stabilize how data moves inside and outside the enterprise. -- **Variables** - Defining a consistent set of variables that can be used to abstract away common properties that need to be used across different APIs, defining things like base URL, headers, secrets, and environmental, collection, or global values that are needed across many different APIs. - -Internet, industry, and organizational standards provide the base for the consistency and interoperability enterprise organizations need across their operations, helping make APIs more intuitive and speak a common language, reducing friction for consumers putting them to work in applications and integrations. diff --git a/docs/more/api/blueprints/partnerships.md deleted file mode 100644 index ca9cc18..0000000 --- a/docs/more/api/blueprints/partnerships.md +++ /dev/null @@ -1,13 +0,0 @@ -# Strengthening Partnerships -As business increasingly gets conducted online via partners, SaaS, and other external services, the urgency to make internal APIs available to partners has increased. The line between inside the firewall and outside the firewall is getting more blurred, making always-available digital resources that external trusted partners can tap into a priority. - -- **Purpose** - Providing a much clearer definition for approaching partnerships, offering a clear menu of enterprise resources and capabilities being made available for partners, helping make partnerships more self-service, streamlined, and repeatable across relationships. -- **Onboarding** - Reducing the onboarding of partners from weeks or months down to hours and days, making the process as self-service and automated as possible, reducing friction when it comes to qualifying partners, but then quickly getting them access.
-- **Access** - It is common for there to be early and exclusive access to API resources and capabilities, offering preferred access to a specific group, or groups of partners, offering the more valuable and interesting APIs to trusted partners before the general public. -- **Innovation** - Incentivizing innovation amongst partners, leveraging preferred access to fuel the development of new and interesting products and services, leaning on partners to deliver applications, integrations, plugins, and other interesting business use cases. -- **Exposure** - APIs increase your exposure in partnership scenarios, increasing the chance you will be able to quickly respond to ephemeral situations that expose your organization to new opportunities by being able to immediately respond and be ready for any situation. -- **Marketing** - The digital resources and capabilities made available via APIs provide what you need for marketing campaigns, and being API-first allows your teams to reach entirely new levels of marketing automation to be able to do more with less, and have a bigger impact. -- **Branding** - There is an opportunity to extend the brand of a company across external communities, and via existing social networks, leveraging APIs to publish and aggregate content and media that can push a brand out via partner platforms. -- **Communications** - Feedback loops are a natural part of modern API operations, and offer an opportunity to gather and use feedback from partners to inform the API roadmap, tapping into a network of trusted partners to help drive business. - -Self-service APIs equipped with a well-defined onboarding process provide a proven way to securely connect businesses together, establishing a digital contract between two companies, which can then be fulfilled via synchronous and asynchronous digital transactions via APIs. diff --git a/docs/more/api/blueprints/patterns.md deleted file mode 100644 index 68398f7..0000000 --- a/docs/more/api/blueprints/patterns.md +++ /dev/null @@ -1,10 +0,0 @@ -# Patterns -There are many common patterns in use across the API sector, but the foundation of the modern API toolbox continues to be REST, with GraphQL, Websockets, and increasingly gRPC. The line between pattern and protocol is often a blurry one, but there are a handful of well-known patterns that act as the cornerstone for API applications and integrations, depending on which industry or layer of the enterprise an API is operating in. - -- **REST** - Representational state transfer is a software architectural style that was created to guide the design and development of the architecture for the World Wide Web. REST defines a set of constraints for how the architecture of a distributed system should behave. -- **GraphQL** - GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of the data in your API, and gives clients the power to ask for exactly what they need and nothing more. -- **Websockets** - WebSocket is a computer communications protocol, providing full-duplex communication channels over a single TCP connection. The WebSocket protocol was standardized by the IETF as RFC 6455 in 2011, and is used heavily for financial APIs.
-- **gRPC** - gRPC also known as Google Remote Procedure Call is an open source remote procedure call system initially developed at Google in 2015 as the next generation of the RPC infrastructure, but then has become a desired patent used with HTTP/2 internally. -- **Microservices** - A microservice architecture – a variant of the service-oriented architecture structural style – arranges an application as a collection of loosely-coupled services. In a microservices architecture, services are fine-grained and the protocols are lightweight. - -Some of these merely provide a style to follow, where others are standardized formats and protocols, but provide a set of agreed upon constraints that can be applied using a specific, or mix of transport protocols to consumers, brokers, and other stakeholders. diff --git a/docs/more/api/blueprints/performance-testing.md b/docs/more/api/blueprints/performance-testing.md deleted file mode 100644 index 9fa1fa1..0000000 --- a/docs/more/api/blueprints/performance-testing.md +++ /dev/null @@ -1,13 +0,0 @@ -# Performance Testing -Performance testing checks the availability and response time of each individual API, often selecting specific paths of an API, and testing how long it takes to make a request and receive a response, or publish and subscribe to messages. Considering the responsiveness of the API, but also potentially the network in between consumers and the API, gateway, and other influencing aspects of how APIs operate. Helping understand over time how performant each API is, keeping a record of the uptime and availability, but also making sure the API meets the expected SLA from the regions that matter most to consumers. - -- Testing -- Collections -- Authentication -- Scripting -- Environments -- Regions -- Automation -- Runner -- Monitor -- Pipeline diff --git a/docs/more/api/blueprints/platform-governance.md b/docs/more/api/blueprints/platform-governance.md deleted file mode 100644 index 4f629ab..0000000 --- a/docs/more/api/blueprints/platform-governance.md +++ /dev/null @@ -1,65 +0,0 @@ -# Platform Governance - -## API definition (discovery) -One reason we see enterprise organizations struggle with their API governance is the lack of discovery that exists across not just APIs, but also artifacts, documentation, and all the other work occurring around each API. You simply cannot govern what you can’t find, and the more visible APIs and the operations are behind them are, the more likely you will be able to make API governance take root. - -- **Workspace** - Ensure that every API has a private, partner, or public workspace to access the operations surrounding each API. -- **OpenAPI** - Ensure there is always a human and machine-readable API artifact available as the source of truth for each API. -- **Repository** - Establish a Git repository for each API, syncing OpenAPI and collections to the repository as part of the lifecycle. -- **Environment** - Provide development, staging production, and other environments available to apply manually or automatically. -- **Documentation** - Requiring that all APIs have complete and up-to-date documentation available to demonstrate what is possible. -- **Team** - Having the team behind each API available as a list, providing name, and contact information for consumers to use. - -This is the foundation for your API governance. Every API should possess these elements, and there should be a blueprint for teams to follow when it comes to setting up new APIs and bringing existing APIs up to current standards. 
Investment in these areas will make API governance possible in the area of design and other aspects of API operations. Once you get a handle in these areas and realize that each has APIs of its own, you’ll see entirely new ways you can elevate API governance efforts across your teams. - -## API instance (reliability) -The next area we are looking to include as part of API governance is the overall reliability of each instance of an API. We want to provide a baseline set of tests across all APIs. This ensures that the business purpose for each API is being realized, but also that it is done in a way that meets service level agreements (SLA) and doesn’t introduce any vulnerabilities or security issues into operations. It’s key to establish common ways in which teams can confidently deliver and operate reliable API instructors behind our applications and integrations: - -- **Contract Testing Collection** - Produce a single collection that pulls the JSON schema for each API operation and validates the request and response. -- **Performance Testing Collection** - Produce a single collection that tests one or more API operations, ensuring that it meets a minimum time threshold. -- **Security Testing Collection** - Produce a single collection that applies a common set of security vulnerability tests and 3rd party API security services. - -This dimension of API governance historically lives under quality, testing, and other areas of operations, but it makes a lot of sense to consider this as part of the API governance stack. This places reliability as part of governance and allows us to use the same artifacts and tooling we are using for testing to govern our APIs and the API operations around them. We can test the instances of our APIs, the surface area of those APIs, and the operations and infrastructure that move those APIs forward across a well-known API lifecycle. - -## API design (consistency) -Now we get to the portion of platform governance that people usually talk about when it comes to API governance—governing the design of your API. API design governance is about making sure the technical details of the surface area of your API are as consistent as possible, no matter which team designed and developed the API. This is an area of API governance that we feel is important. Still, it is also one that becomes more difficult without a single source of truth (OpenAPI) in a known workspace. It can quickly become a rabbit hole when it comes to thinking about all the different ways you can lint the OpenAPI for your API—so we recommend starting small. - -### Design Governance Collections -Establish a standalone collection that pulls the OpenAPI for each API using the Postman API and linting using Spectral. - -- **Info** - Ensure that there are title, description, and other essential information properties. -- **Versioning** - Require a standard semantic or date-based versioning applied to each API. -- **Operations** - Make sure each individual operation has a summary, description, and id. -- **Parameters** - Standardize the format of parameter names, and all have descriptions. -- **Responses** - Push for a common set of status codes, media types, and schema responses. -- **Schema** - Standardize all request and response schema using JSON Schema components. - -There are plenty of other governance rules that can be applied at this level, but this provides an introductory set of concerns that should be addressed early on as part of API governance efforts. 
Teams will learn a lot by ensuring that these simple rules are consistently applied across API operations, enabling all teams to apply governance manually using collection runners, scheduling using monitors, or baked into CI/CD pipelines. We’re able to use the same infrastructure we are using to test each API instance to test the surface area of the API for consistent design across any team. - -## API operations (delivery) -API governance isn’t something you do once and step away from. It’s something that is ongoing and should be wired up to your existing software development lifecycle and monitored in real time. Operational governance allows us to automate the reliability and consistency portions of our governance, but through additional PlatformOps collection integrations, we can configure, optimize, and automate our gateway, portals, documentation, and other building blocks of API operations. We then leverage the same infrastructure we are using to test individual APIs to also “test” the surface area of our APIs and the operations that surround them. - -- **Monitors** - Ensure you are monitoring operations and paying attention even when the team is off doing other things. -- **Contract** - There is a monitor scheduled to test the contract every 24 hours. -- **Performance** - There is a monitor to test the performance of the API every hour. -- **Security** - There is a monitor to test the security of the API every 24 hours. -- **Governance** - There is a monitor to test the governance of the API every 24 hours. -- **Pipeline** - Ensure you are applying governance at the CI/CD pipeline layer, running the contract, security, and governance tests with each build. -- **Contract** - Run the contract testing collection in the pipeline. -- **Security** - Run the contract security collection in the pipeline. -- **Governance** - Run the governance testing collection in the pipeline. -- **Gateway** - Govern the deployment of APIs via gateways, allowing for manual and automated configuration and observability to occur. -- **Authentication** - Ensure that a gateway authentication is properly configured. -- **Usage Plans** - Require that each API operates within a specific usage plan. -- **Usage History** - Check usage logs for an API for common patterns. - -The governance of operations doesn’t stop at just monitoring our testing, security, and design governance. Baking it into our CI/CD pipeline, we can extend this to other operational areas. The same collection-based approach we are using to test each instance of API and to govern API design can be used to validate that documentation is complete and updated, possesses examples, and always has SDKs and code snippets automatically generated. When you realize that your API operations themselves have APIs, your approach to API governance becomes much broader than just the design of your APIs. Your teams can become more productive and realize higher levels of quality—all while fulfilling governance at scale. - -## API observability (awareness) -The final and essential area of API governance needed to achieve the best results at scale across business domains and teams** - make sure everything is as observable as possible. Provide real-time reporting and activity visibility for each API, but also take more advantage of platform integrations to feed everything into existing API infrastructure. Operations around APIs then have the necessary observability, and governance itself becomes as observable as possible in real time. 
- -- **Reporting** - Leverage native reporting to understand what is happening with each API in use as part of API operations. -- **Application Performance Management (APM)** - Use monitors to pipe the results of collection runs into Datadog to make the API observable. -- **Activity** - Understand how the workspace, APIs, collections, monitors, and other elements are being configured as part of your work. - -Collections provide connections to APIs and can be monitored on a schedule, integrated into CI/CD pipelines, and can make the API lifecycle and governance more observable. Collections are the gears of our API operations; use them to define, automate, and make everything observable via native integrations and PlatformOps collections in service of platform API governance. diff --git a/docs/more/api/blueprints/platform.md deleted file mode 100644 index 45a42e1..0000000 --- a/docs/more/api/blueprints/platform.md +++ /dev/null @@ -1,12 +0,0 @@ -# Platform -API Platforms are software systems with integrated tools and processes that allow producers and consumers to effectively build, manage, publish, and consume APIs. Leveraging existing infrastructure, as well as an ever-expanding and changing mix of API-driven services to define, shape, and operate a platform to sustain business in a digital marketplace. - -- **Integrated** - API platforms are seamlessly integrated with your existing operations, taking the source control and CI/CD used as part of the existing software development lifecycle, marrying them to your gateway and APM solutions–equipping your teams with a modern API lifecycle that is bolted to your existing investment in on-premise and cloud infrastructure over the past couple of decades. -- **Discoverable** - Search and discovery of APIs and the operations around them is the natural state. Teams are relied on to always publish their artifacts and supporting metadata, with workspaces and repositories indexed and made available as a default part of regular operations. Enabling infrastructure to keep pace with business, and teams to just focus on the work that moves business forward, not just busy work. -- **Collaborative** - An API platform brings teams and APIs out of the shadows, lowering the walls between business and IT groups, while making both the producer and consumer side of the API lifecycle a collaborative affair. Making it so that every aspect of a modern API lifecycle is modular, portable, sharable, but also executable, helping support more stakeholders across API operations, attracting much-needed business expertise to help bring APIs into alignment with meaningful business outcomes. -- **Observable** - API platforms allow for any output across API operations to be gathered, measured, and made available via visual dashboards and reporting, helping make very abstract APIs more tangible and associated with business outcomes. Bringing the awareness necessary to understand the overall state of enterprise operations, and begin making the changes needed to actually steer the direction a business is going, which eventually begins to translate into a more agile and nimble enterprise operation. -- **Automated** - API platforms deliver APIs, but they are also defined by APIs, making every part of the API lifecycle across all of API operations automatable.
Allowing APIs and the infrastructure behind them to be automated using workflows that are scheduled and executed from one or more cloud regions, via CI/CD pipelines, and responding to the most critical or even mundane events that happen daily at scale across operations. Providing enterprise teams with the resources they need to meet the future demand of their industries with smaller teams. -- **Governed** - Platforms bolt API operations onto our existing organizational infrastructure via standards like SSO and SCIM, but allow us to "see" all of our APIs so that we can ensure they are reliable, secure, and consistent in how they power our digital resources and capabilities. An API platform grounds not just the design of our APIs, but also how we document, test, deploy, distribute, and observe our APIs, elevating what we know as governance beyond just the naming and ordering of our APIs, to platform-level control over the entire life of hundreds or thousands of APIs. - -In the last decade we've moved from using tens or hundreds of APIs, to using thousands of APIs to make our business operate. In the next decade this will move from thousands to hundreds of thousands, which is something that can only be achieved with an industrial-grade API platform to carry the load. - diff --git a/docs/more/api/blueprints/portal-and-network.md deleted file mode 100644 index b3b28d2..0000000 --- a/docs/more/api/blueprints/portal-and-network.md +++ /dev/null @@ -1,13 +0,0 @@ -# Portal and Network -The relationship between an existing developer portal possessing API documentation and the Postman API network, workspaces, and collections helps bridge the developer experience across an API provider's community and the wider Postman developer community. - -- Get Started - Create a dedicated guide like Datadog and Twilio with step-by-step instructions and helpful screenshots on how to get started with their collections in Postman. -- API resources - Some publishers, like EasyPost, catalog their Postman references under API libraries and standards. Some publishers, like Plaid, point to their collection in the README of a repository. -- Multiple collections - If you have lots of APIs, incorporate Postman Collections throughout your API reference, like Twitter and Zoho, for each of your documented APIs. -- Optimize search in Postman - In Postman, optimize your discoverability by including collection descriptions and keywords, and choosing an eponymous team domain, like Stripe. -- Optimize search in your docs - In the same way that you optimize for search on Postman, consider the search experience within your own docs and ensure collections, tutorials, and references are indexed properly, like Cisco and Microsoft. -- Run in Postman - Use Run in Postman buttons, like Brex and Okta, so users can directly fork the collection (and optional environment) from within your developer docs. You will know it is a live version of the button instead of the deprecated, static version because clicking through prompts you to "fork" instead of "import" the collection. -- Bundle collection and environment - If you provide a corresponding environment for your collection, bundle them together in the button so users can fork both at the same time, instead of separately. -- Style the button - The bright orange Run in Postman button is recognizable to most developers, but you can also style the appearance to match your documentation's theme and aesthetics, like Google does in their docs.
-- Pre-fill the environment - Save your developers some copying and pasting, and pre-fill the environment with your users’ own credentials via the Run in Postman API if you have access to client-side JavaScript for your documentation. If not, be sure to include a link in your Postman collection to show users where to get an API key. -- Watch - Encourage your users to “watch” the workspace in addition to forking like Belvo does so users are notified about new updates and can pull those updates to their own collection. \ No newline at end of file diff --git a/docs/more/api/blueprints/privacy-regulations.md b/docs/more/api/blueprints/privacy-regulations.md deleted file mode 100644 index 44767ff..0000000 --- a/docs/more/api/blueprints/privacy-regulations.md +++ /dev/null @@ -1,7 +0,0 @@ -# Privacy Regulations -When your business runs on APIs, and your APIs are defined as contracts, you know where all of your PII, PCI, and PHI is located. Modern privacy regulation focuses on giving consumers access and control over their personal information, and APIs are how this privacy is defined and fulfilled. - -- GDPR -- CCPA - -The precedent has been set for end users rights to access their data via any platform, with APIs how users will access their data and allow 3rd party developers to access as well. APIs are essential to privacy on the Internet, and will continue to play a role in privacy regulation around the globe. diff --git a/docs/more/api/blueprints/producer-lifecycle-define.md b/docs/more/api/blueprints/producer-lifecycle-define.md deleted file mode 100644 index 62dd7ac..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-define.md +++ /dev/null @@ -1,14 +0,0 @@ -# Define -APIs are a very abstract digital concept, loosely wrapping a variety of text, documentation, artifacts, and code that define what an API is capable of doing. A thoughtful API lifecycle begins with sitting down with all stakeholders and finding a common way of defining each API, but also how it will be moved forward over time. - -- **Goals** - What are the goals for the API, defining the business value that the API will bring, helping provide a list that can help guide development and operation of each API. -- **Stakeholders** - Identifying who the business and technical stakeholders are, and who might need to be involved from external partners, and consumers, helping complete the picture. -- **Domain** - What domain will an API be operating within, defining the vocabulary, standards, and other patterns that are made available at design and development time for developers. -- **Regions** - Identifying which region(s) an API will operate in, helping comply with regulation and other business requirements, ensuring that APIs are as close to consumers as possible. -- **Teams** - Lining up who will be working on an API, bringing together designers, developers, technical writers, QA, and other roles who will be involved in moving APIs forward. -- **Roles** - Defining who on the team will have access to what, defining the roles access to editing, viewing, and working with APIs, and the operations that are moving them forward. -- **Workspaces** - Creating and setting up the workspaces where teams will be designing, developing, and managing APIs, iterating upon them and managing multiple versions. -- **Change** - Establish the underlying approach for managing change with an API, keeping the versioning, communication, and other elements in alignment with centralized governance. 
-- **Road Map** - Creating a road map for an API, beginning on day one with the planning for what the future holds for an API, ensuring there is always a plan guiding what is next. - -A well-executed define stage of the API lifecycle takes a moment to lay the important groundwork that will contribute to the overall usability of an API, providing the nutrients needed to operate. diff --git a/docs/more/api/blueprints/producer-lifecycle-deploy.md deleted file mode 100644 index 91126a7..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-deploy.md +++ /dev/null @@ -1,12 +0,0 @@ -## Deploy -Once an API is ready for deployment to a staging or production environment, there should be a repeatable set of elements at work to move enterprise operations forward at scale. The deployment orchestration of APIs across teams helps optimize the API factory floor across enterprise domains, making every step forward more deliberate and repeatable. - -- **Source Control** - Using source control to manage code and artifacts used to deploy an API, providing a single location to find everything behind each version of an API. -- **CI/CD Pipeline** - The pipeline ensures that the deployment of an API to each stage is as repeatable as possible, with tests and other essential needs of the API build process. -- **Gateway** - Publishing contracts, policies, and other configurations to the API gateway, deploying an API into a staging or production environment if all tests pass in the pipeline. -- **Releases** - Establishing a formal release for this version of an API, documenting the changes being deployed, and communication around it, keeping consumers informed. -- **Stages** - Allowing for multiple stages to be deployed, providing development, staging, production, and potentially other environments for deploying and testing APIs within. -- **Environments** - Applying commonly managed environments, with a coordinated variable strategy, applied when testing and automating configuration as part of the pipeline. -- **Plans** - Requiring that all APIs be deployed into a standardized set of plans, with consistent policies applied, limiting access to resources, and applying proper security. - -Deployment will mean different things to different organizations; what is important is that there is always a source of truth, a repeatable build process, and a standard set of releases, stages, environments, and plans that are leveraged to deploy APIs consistently across teams and domains. diff --git a/docs/more/api/blueprints/producer-lifecycle-design.md deleted file mode 100644 index bfa645f..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-design.md +++ /dev/null @@ -1,14 +0,0 @@ -# Design -Investing in informed design of the surface area of each API, shaping how it works, what protocols and standards it uses, and how each API will conform to wider rules for how APIs should be designed across an organization. The investment in the design of each API, as well as the overall practice of design across operations, helps fine-tune and bring operations into alignment. - -- **Patterns** - Selecting which patterns like REST, GraphQL, and other common patterns are to be used, to help standardize how APIs work, while helping choose the right tool for the job.
-- **Protocol** - Making a sensible decision regarding what protocol will be used when it comes to designing an API, picking the right solution for the project and the consumers using it. -- **Standards** - Understanding what Internet, industry, and organizational standards will be used when designing the API, making sure that each API is as consistent as possible. -- **Schema** - Establishing schema for all of the objects that are in use as part of requests, responses, and what is published and subscribed to when integrating an API into applications. -- **Contracts** - Putting contracts like OpenAPI and AsyncAPI to work when it comes to defining the surface area of each API, providing a machine-readable contract that guides work. -- **Versioning** - Change is inevitable, and there will need to be a clear plan for how an API will be versioned, leveraging a common pattern when it comes to managing change. -- **Rules** - Identifying which linting rules will be needed to help ensure the API is following central governance, and keeping APIs as consistent as they can be across all teams. -- **Editor** - Establishing a common and consistent way of directly or visually editing the artifacts that are used as part of the API design process, helping carry the load of design. -- **Examples** - Pushing for there to be examples of each part of an API, providing actual examples as part of each contract which can be used for documentation and mocking. -- **Mocks** - Generating a mock representation of an API using its contract, providing an example of what the API will do in production, helping in the design process, but also onboarding. - diff --git a/docs/more/api/blueprints/producer-lifecycle-develop.md deleted file mode 100644 index 7a0452e..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-develop.md +++ /dev/null @@ -1,14 +0,0 @@ -# Develop -Getting to work bringing an API to life. Assembling all the infrastructure that is needed, generating any skeletons, using frameworks, and beginning to harden an API's functionality. The development might take a design-first approach or it might rely on a code-first approach; whichever approach is taken, as long as it follows a common lifecycle, it can be seen as benefiting operations. - -- **Compute** - Establishing a baseline for what the underlying compute will be for an API, choosing from virtual servers, containers, and serverless to actually power each API. -- **Database** - Providing the data storage and querying needs for an API, leveraging a centralized database, or establishing a database just for use by this single API resource. -- **Storage** - Defining what the centralized storage will be for an API, having a plan for where objects, images, and other files will be stored, retrieved, and managed as part of API usage. -- **DNS** - Applying a consistent approach to using DNS for each API, following a larger domain strategy, but allowing for flexibility and redundancy when it comes to accessing each API. -- **Encryption** - Ensuring that encryption is the default in transport and storage, and each individual API has this aspect of security applied from the beginning, not as an afterthought. -- **Frameworks** - Leveraging open source frameworks as the scaffolding for your API, applying consistent approaches, and avoiding much of the redundant work that would otherwise be necessary.
-- **Skeletons** - Generating code stubs or skeletons using the contract for an API, leveraging automation to handle much of the repetitive work of developing an API, helping teams be more efficient. -- **Annotations** - Utilizing code annotations to auto-generate the contracts necessary to document, test, secure, and govern an API, leaning on a more code-first approach to APIs. -- **Integrated Development Environment (IDE)** - Maximizing productivity across teams by providing further enablement via their trusted IDE, helping increase developer productivity. -- **Source Control** - Establishing a Git repository for managing all of the code and the artifacts for an API, helping establish source control for each API early on in the API lifecycle. - diff --git a/docs/more/api/blueprints/producer-lifecycle-distribute.md deleted file mode 100644 index 2b8f2ea..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-distribute.md +++ /dev/null @@ -1,16 +0,0 @@ -# Distribute -An API does little good if it can't be found and put to use. Distributing an API, as well as the supporting operations around it, helps ensure that consumers can find it when building applications and integrations, but also that teams who are developing potentially duplicate APIs are able to find it before creating another one. - -## Destinations - -- **Portal** - APIs deployed into production should be published to the central portal, providing centralized access to an API, making it available to internal or external consumers. -- **Catalog** - The metadata for each API should be updated and kept in sync with each API, ensuring that all the relevant information consumers will need is available in the catalog. -- **Network** - APIs should be made available to private, partner, and public consumers, giving them the visibility they need to be successful, connecting with the intended consumers. -- **Workspace** - The workspaces around an API should also be made available via the network, providing the teams behind an API, and its consumers, access to the operations behind it. - -## Accessibility - -- **Buttons** - The documentation, tests, and even the workspace behind an API should be made available via blog posts, videos, wikis, and other resources available to support APIs. -- **Visibility** - The visibility of each API, and whether it is private, partner, or public, should be leveraged deliberately and confidently to make sure each API is available to consumers. -- **Search** - APIs, and the operations around them, should be indexed and available via search, providing teams, business stakeholders, and consumers with the ability to search and put them to use. - diff --git a/docs/more/api/blueprints/producer-lifecycle-observe.md deleted file mode 100644 index 55f4c5a..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-observe.md +++ /dev/null @@ -1,11 +0,0 @@ -# Observe -Ensuring that there is observability via existing outputs for the API lifecycle, scheduling the results of monitors to be centrally stored, piping them into existing APM solutions, and taking advantage of the platform to understand what is happening. Providing teams with the ability to "see" APIs and the operations around them using a set of common metrics, helping provide the data needed to operate and move forward each API independently of others.
- -- **Activity** - Using the activity across an API platform to understand how APIs are being moved forward, configured, and evolved over time by tapping infrastructure outputs. -- **Logs** - Actively using the logs for source control, CI/CD, and the gateway to provide the outputs needed to understand the velocity of individual APIs, and across domains. -- **Traces** - Leveraging traces added to clients, SDKs, gateways, and other ways to make sense of the API landscape, and how APIs are putting backend infrastructure to work. -- **Monitors** - Establishing monitors for all contract, performance, security, and governance tests, providing the results needed to understand the state of APIs and all of operations. -- **APM** - Routing all outputs from across API operations into APM solutions, tapping every output across the API lifecycle to be able to understand the health and state of the platform. -- **Reports** - Providing team, API, documentation, testing, and other reporting, showing what teams are doing across API operations, and how the lifecycle is unfolding across teams. - -Observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs, tapping into all of the existing outputs available across the API platform and the infrastructure used to move APIs forward across the API lifecycle into production. diff --git a/docs/more/api/blueprints/producer-lifecycle-secure.md deleted file mode 100644 index f7eeb61..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-secure.md +++ /dev/null @@ -1,13 +0,0 @@ -# Secure -Properly securing the access and operations surrounding each API, ensuring only those who should have access are able to make requests and publish messages. Establishing an organization-wide approach to how API authentication works, how encryption is applied, how secrets and roles are managed, and how APIs are fuzzed and scanned for vulnerabilities. Providing teams with everything they need to secure each API and the operations around them, consistently securing the expanding API landscape. - -- **Authentication** - Helping ensure APIs are accessed only by those who should have access, helping API producers apply it consistently, and consumers easily apply it when integrating. -- **Authorization** - Once a user is authenticated, the authorization layer will make sure they only have access to the resources they are approved for, helping navigate resources. -- **RBAC** - Role-based access control should be applied at the authorization layer of an API, but also to the API operations around it, helping govern who has access to operations. -- **Encryption** - Ensuring that all API requests are encrypted, making sure that setting up proper encryption, as well as reading encrypted messages, is as easy as possible. -- **Environments** - Having a solid mapping of the development, staging, and production environments available across all APIs in operation, helping manage essential details. -- **Variables** - Providing a well-defined vocabulary of variables that abstract away the common aspects of authentication and authorization, helping standardize the way we engage with APIs. -- **Secrets** - Adding a layer on top of environmental variables specifically for managing secrets, making sure you have clear visibility and control of secrets and tokens being applied. -- **OWASP Top 10** - The OWASP Top 10 is a standard awareness document for API producers, covering a broad consensus about the most critical security risks for web APIs.
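As a small illustration of the authentication and secrets points above, the following Java sketch uses only the JDK's built-in HTTP client; the endpoint URL and the `API_TOKEN` environment variable are hypothetical placeholders, not values from the original blueprint.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SecureApiCall {
    public static void main(String[] args) throws Exception {
        // Keep secrets out of source control: read the token from an environment variable.
        String token = System.getenv("API_TOKEN");

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/v1/orders")) // hypothetical endpoint
                .header("Authorization", "Bearer " + token)           // authentication layer
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A 401 or 403 here usually means the authentication or authorization layer rejected the call.
        System.out.println(response.statusCode());
    }
}
```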
- -There are many layers to security when it comes to producing and consuming APIs, making it a consideration that is increasingly moving earlier on in the API lifecycle, and not after an API goes live. diff --git a/docs/more/api/blueprints/producer-lifecycle-test.md deleted file mode 100644 index ed89a2d..0000000 --- a/docs/more/api/blueprints/producer-lifecycle-test.md +++ /dev/null @@ -1,23 +0,0 @@ -# Test -Defining, documenting, and automating the testing of the underlying contract for each API, understanding the overall performance of the API, and running integration and other types of tests. Providing supporting automation to properly test the entire surface area of each API, as well as the operations around it. - -## Artifacts - -- **Collections** - Using collections to define one or a sequence of many API requests, establishing a modular, collaborative, and executable artifact used across API testing. -- **Scripts** - Defining folder-level, pre-request, or post-request scripts that run to configure requests, validate responses, and automate testing to iterate and behave as consumers. - -## Types - -- **Contract** - Using OpenAPI and AsyncAPI contracts to ensure 100% of the surface area of an API is being tested and behavior reflects the contract between producer and consumer. -- **Performance** - Testing specific paths for each API to understand the performance of an API, gateway, and network from multiple regions to ensure the desired performance exists. - -## Use Cases - -- **Mocking** - Leveraging mock servers generated from the API contract, but then augmenting them with examples for specific use cases, testing for specific outcomes. -- **Data** - Injecting CSV or JSON data as part of the testing process, making sure API requests reflect specific business workflows and outcomes, testing real-world API-driven scenarios. - -## Automation - -- **Monitor** - Scheduling monitors of tests to run on a schedule that reflects the business needs of the API, but also the type of test being run, allowing testing to be automated by teams. -- **CI/CD Pipeline** - Baking tests into CI/CD pipelines, ensuring that all tests run when APIs are being built, ensuring that no API goes into production without the tests being executed. - diff --git a/docs/more/api/blueprints/product-management.md deleted file mode 100644 index c4a4eb8..0000000 --- a/docs/more/api/blueprints/product-management.md +++ /dev/null @@ -1,14 +0,0 @@ -# Product Management -API product management is the business and human bridge between API producers and consumers, providing a scaffolding to hang the APIs you are treating like products on, but also making sure you are approaching the API lifecycle with empathy for your API consumers, and delivering an experience that speaks to their worlds, while also being in alignment with your overall business goals. Creating and shaping the market for APIs, having a plan for how you are moving forward, and making sure you are measuring and reporting on the right things as you engage with consumers on this ongoing API journey you are both on. - -- **Customers** - Putting the consumer first when it comes to the design and evolution of APIs, prioritizing what they need, and what motivates them, using a feedback loop. -- **Empathy** - Put a heavy focus on developing empathy with your consumers, exploring ways you can put yourself in their shoes, and develop a road map that speaks to this.
-- **Experience** - Make sure each API you produce delivers an experience for consumers, and the overall experience onboarding, using, and providing feedback is always smooth. -- **Storytelling** - Develop stories to express what each API does by encouraging teams to write a narrative for each API, helping producers think through the story they wish to tell. -- **Marketing** - Have a go-to-market (GTM) strategy for each API, and a scaffolding to help teams think through and execute against their GTM strategy with each version release. -- **Road Map** - Publish a road map for each API, include other stakeholders, and consumers in the planning of this road map, then keep up to date and in sync with APIs. -- **Analytics** - Measure as much of the activity around the usage of your API, but also the operations around your API, providing the data you need to understand consumption. -- **Feedback Loop** - Make it easy for your consumers to provide feedback, aggregate and organize it for use as part of your road map planning, and iteration of your APIs. - -Moving APIs from being just implementation feature for your digital products, to APIs being the digital product will help produce the productivity and quality you are looking for, while delivering digital products that consumers want. Introducing the business nutrients needed in your very IT led API strategy today, bringing more alignment between business and technical groups across the enterprise. - diff --git a/docs/more/api/blueprints/productivity.md b/docs/more/api/blueprints/productivity.md deleted file mode 100644 index cb94f59..0000000 --- a/docs/more/api/blueprints/productivity.md +++ /dev/null @@ -1,10 +0,0 @@ -# Productivity -Productivity is achieved in an API-first world through well-defined workspaces, that possess everything you need to engage with an API throughout its well–defined lifecycle. Organizing the enterprise API factory floor into productive workspaces that possess the artifacts, documentation, mock servers, environments, monitors, and other building blocks of API operations. - -- **Workspaces** - Everything you need to engage with APsI is available via collaborative workspaces that posses the access controls needed to prevent undesired outcomes, ensuring not just APIs are discoverable, but everything needed to sustain and evolve APIs, allowing for turnover of teams without any disruption in how work occurs. -- **Collections** - The collection is a machine-readable, executable, and documented unit of work, providing everything you need to define a unit of business value in an API-first world, providing what is needed to define documentation, mock servers, testing, and the automation needed to validate, scale, and empower teams to do more with less. -- **Lifecycle** - There is a known lifecycle that allows all contributors to deliver the best possible API in a short amount of time, standardizing how APIs are delivered to optimize productivity, and developers have received the education and training they need on how to navigate the lifecycle, getting teams on the same page across enterprise domains. -- **Documentation** - Everything across the lifecycle is documented, not just the reference documentation for your APIs, but onboarding docs, workflow doc, and your mocks, tests, and other automation are also documented, turning workspaces into the institutional memory for what is happening across the enterprise. 
-- **Discovery** - All of API operations are discoverable, making teams, APIs, and operations available via search and discovery, ready for use across any stage of the API lifecycle, allowing teams to find what they need when they need it, helping make API operations more discoverable but also self-service, increasing the productivity of teams. - -Teams aren’t productive in a chaotic environment where they can’t find what they need, and there is no common vocabulary for how things work, and a lack of documentation. Workspaces help ground the lifecycle, and collections help provide the atomic units of the enterprise memory over time, helping teams move forward and iterate and evolve the value an organization produces regularly. diff --git a/docs/more/api/blueprints/protocol-buffer.md b/docs/more/api/blueprints/protocol-buffer.md deleted file mode 100644 index 485ffa6..0000000 --- a/docs/more/api/blueprints/protocol-buffer.md +++ /dev/null @@ -1,10 +0,0 @@ -# Protocol Buffers -Protocol Buffers (Protobuf) is a free and open-source cross-platform data format used to serialize structured data. It is useful in developing programs to communicate with each other over a network or for storing data. The method involves an interface description language that describes the structure of some data and a program that generates source code from that description for generating or parsing a stream of bytes that represents the structured data. - -- **Solutions** - Protocol buffers provide a serialization format for packets of typed, structured data that are up to a few megabytes in size, suitable for both ephemeral network traffic and long-term data storage, and extending with new information. -- **Services** - A definition for an individual system that supplies a digital resource or capability, providing a granular unit of business value that can be used internally within the enterprise, but also sometimes made available to partners in secure, but highly performant manner. -- **Messages** - A digital communication for sending serialized and structured, record-like, typed data in a language-neutral, platform-neutral, extensible manner, providing a highly efficient way of communicating between systems within the enterprise, and with partners. -- **Message Types** - Protocol Buffers allows you to define any type of message you will need to make your digital resources, and capabilities available, giving you full control over how data will be structured, interpreted, and consumed by internal and partner developers. -- **Language Compatibility** - Messages can be read by code written in any programming language, providing a high performant way to make data available across many different platforms, with a robust add-on ecosystem, allowing for a long tail of integrations. - -Protocol Buffers provides a solid contract for defining the relationship between your internal services, allowing for large volumes of high quality, and well defined data to be exchanged across the enterprise, and in some cases externally with partners, providing high volume business to business services. 
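To make the ideas above concrete, here is a minimal `.proto` sketch of the kind of contract Protocol Buffers enables; the `TestRun` message and `TestRunService` names are illustrative only, not part of any existing system:

```protobuf
syntax = "proto3";

package qa.reporting;

// Illustrative message type: a typed, structured record that can be
// serialized efficiently and evolved later by adding new numbered fields.
message TestRun {
  string run_id = 1;
  string suite_name = 2;
  bool passed = 3;
  int64 duration_ms = 4;
  repeated string tags = 5;
}

message ReportAck {
  string run_id = 1;
}

// Illustrative service: a granular unit of business value that internal
// consumers (or trusted partners) could call over gRPC.
service TestRunService {
  rpc ReportRun (TestRun) returns (ReportAck);
}
```

From a definition like this, code generators can produce language-neutral client and server stubs, which is where the language compatibility and long tail of integrations described above come from.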
diff --git a/docs/more/api/blueprints/protocols.md b/docs/more/api/blueprints/protocols.md
deleted file mode 100644
index 8ed1e4b..0000000
--- a/docs/more/api/blueprints/protocols.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Protocols
-There are a variety of Internet protocols in use across the World Wide Web, regional, and local area networks, providing well-known protocols for transporting digital resources and capabilities around the globe in service of digital commerce. The most well-known protocol is the Hypertext Transfer Protocol, or HTTP, which is the backbone of the web we know today, but HTTP is getting a makeover, and other leading protocols see regular use across the API landscape.
-
-## API Protocols
-
-- **HTTP 1.1** - The Hypertext Transfer Protocol, or HTTP, is a protocol for distributed, collaborative, hypermedia information systems, providing a generic, stateless protocol which can be used for many tasks involved in distributed object management systems.
-- **HTTP/2** - HTTP/2 is a major revision of the HTTP network protocol used by the World Wide Web. It was derived from the earlier experimental SPDY protocol, originally developed by Google, then adopted by the HTTP Working Group of the Internet Engineering Task Force.
-- **HTTP/3** - HTTP/3 is the third major version of the Hypertext Transfer Protocol used to exchange information on the World Wide Web, alongside HTTP/1.1 and HTTP/2. HTTP/3 always runs over QUIC, which provides a next generation approach to transport on the Internet.
-- **TCP** - The Transmission Control Protocol is one of the main protocols of the Internet protocol suite. It originated in the initial network implementation in which it complemented the Internet Protocol. Therefore, the entire suite is commonly referred to as TCP/IP.
-- **MQTT** - MQTT is a lightweight, publish-subscribe network protocol that transports messages between devices. The protocol usually runs over TCP/IP, however, any network protocol that provides ordered, lossless, bi-directional connections can support MQTT.
-
-Which protocol is used when delivering an API will set the tone with consumers regarding what is possible, and will provide constraints regarding which patterns can be employed, and what is possible across the API lifecycle, shaping your journey, and the outcomes of your API operations.
diff --git a/docs/more/api/blueprints/prototype-first.md b/docs/more/api/blueprints/prototype-first.md
deleted file mode 100644
index 707445a..0000000
--- a/docs/more/api/blueprints/prototype-first.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Prototype-First
-An approach to delivering APIs that involves building a prototype of an API, opting not to build a contract first, but to mock and document the desired functionality using a collection. Producing as much of the functionality as you can, iterating upon the design of the API amongst stakeholders, before producing a machine-readable contract, as well as the tests that will be needed to verify an API behaves as expected when it is in production, arriving at the same place you would through design-first, but beginning with the prototype instead of the actual contract.
-
-- **Workspace** - Whether this is a new API, or work on an existing API, a workspace is always the place to begin work, ensuring there is a single location to find all of the work happening.
-- **Collection** - We will be hand-crafting a collection to describe the surface area of the API, defining the requests, and example responses, making our API as real as possible.
-- **Mocks** - The contract for an API is perpetually used to generate mock servers helping make the design of the API as realistic as possible, matching specific use cases with examples. -- **Document** - Generating human readable documentation from an APIs contract, ensuring there is accurate and up-to-date documentation for each API as it is being designed. -- **Feedback** - Providing a feedback mechanism for all stakeholders to use when it comes to providing feedback on the current design of an API, helping guide producers froward. -- **Iterate** - Aggregate feedback from consumers and other stakeholders, identify the sensible changes to the API, then iterate on the contract, updating mocks and the documentation. -- **Test** - Once the contract for an API has been established, and there will be no more iterations to this version, then contract tests can be produced to validate in production. -- **Contract** - Once we’ve effectively prototyped our API, iterate upon the design usign the prototype, then we can choose to generate a contract from our collection prototype used. - diff --git a/docs/more/api/blueprints/public-workspace-adoption.md b/docs/more/api/blueprints/public-workspace-adoption.md deleted file mode 100644 index 0dd8b64..0000000 --- a/docs/more/api/blueprints/public-workspace-adoption.md +++ /dev/null @@ -1,21 +0,0 @@ -# Public Workspace Adoption - -- Incentives - - Time to First Call - Enhance developer onboarding with a faster “Time to First API Call” - - Use Cases - Inspire existing users with new use cases and resources - - Collaboration - Collaborate publicly with partners in joint workspaces - - Contributions - Solicit community contributions through comments and pull requests - - Feedback Loops - Gather product feedback through usage and comments - - Discovery - Increase discovery in organic search results and Postman search results - - Build - Empower community to build integrations and applications -- Metrics - - Watches - Count of people watching your API or collection for updates - - Forks - Count of active forks of a collection or environment - - Activity - Recent workspace activity to demonstrate resources are actively updated - - History - History of mock server call logs - - Tracing - Percent of API calls hitting your server that includes a Postman (or custom) User-Agent header -- Attention - - Profile Page - - - Documentation - - Blog Posts - - Videos \ No newline at end of file diff --git a/docs/more/api/blueprints/quality.md b/docs/more/api/blueprints/quality.md deleted file mode 100644 index f560889..0000000 --- a/docs/more/api/blueprints/quality.md +++ /dev/null @@ -1 +0,0 @@ -# Quality diff --git a/docs/more/api/blueprints/regulation-automation.md b/docs/more/api/blueprints/regulation-automation.md deleted file mode 100644 index 2461fae..0000000 --- a/docs/more/api/blueprints/regulation-automation.md +++ /dev/null @@ -1,6 +0,0 @@ -# Regulation Automation -As regulation increases, the need to automate the notification, reporting, and even evolution of policies and rules will increasingly be done via APIs. We see regulators not just requiring APIs and API standards as part of regulation, they are making APIs available as part of the regulatory process. 
-
-- Regulation
-- Reporting
-- De-Regulation
diff --git a/docs/more/api/blueprints/regulations.md b/docs/more/api/blueprints/regulations.md
deleted file mode 100644
index 05508fe..0000000
--- a/docs/more/api/blueprints/regulations.md
+++ /dev/null
@@ -1 +0,0 @@
-# Regulations
diff --git a/docs/more/api/blueprints/resources.md b/docs/more/api/blueprints/resources.md
deleted file mode 100644
index e9df58d..0000000
--- a/docs/more/api/blueprints/resources.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# Resources
-The modern enterprise is made up of hundreds or thousands of individual digital resources that are stored in databases and file systems, and put to work across a dizzying array of desktop, web, mobile, and other types of applications or integrations. You see the vast inventory of enterprise digital resources present in the browser URLs and digital experiences we encounter in our personal and professional lives, shaping our offline and online experiences each day as we engage at work, at home, and throughout the day on our mobile devices.
-
-## Common Digital Objects
-
-- **Users** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Messages** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Images** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Videos** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Payments** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
-- **Documents** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod.
diff --git a/docs/more/api/blueprints/rules.md b/docs/more/api/blueprints/rules.md
deleted file mode 100644
index 15cc41b..0000000
--- a/docs/more/api/blueprints/rules.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Rules
-Machine-readable rules in YAML or JSON that can be used to lint any other YAML or JSON artifact, allowing for common or specialized rules to be established to help check for consistency in the design, development, deployment, and management of APIs, helping codify standards and healthy practices for delivering APIs throughout the API lifecycle across domains and teams.
-
-- **Name** - The name of the linting rule, describing what it applies to and how it benefits in making the design of an API more consistent, or also possibly the operations around it.
-- **Description** - A verbose description of what a rule does, providing as much detail about how the rule helps standardize one small part of operations, helping stabilize teams.
-- **Given** - The property being targeted for linting, identifying a specific aspect of a contract needing attention, focusing the attention of linters on this particular part of each API.
-- **Then** - The criteria for evaluating the contract being linted, providing the logic for the rule being applied, being precise or more loose in how each contract is being scrutinized.
-- **Formats** - Identifying the different types of contract formats that rules are designed to be applied to, organizing Swagger, OpenAPI, AsyncAPI, and other formats together.
-- **Documentation URL** - An external URL for documentation and educational resources associated with a specific rule, turning every rule into a potentially teachable moment.
-- **Rulesets** - Organizing multiple rules into sets so that they can all be applied at once, and organized by domain, or other bounded context, keeping rules easier to apply.
-- **JSONPath** - The specification used as part of the “given” and “then” properties to target specific sections of a JSON contract, providing any level of scope when linting contracts.
-
-Rules help us distill all the things we need for usability, consistency, and stability down into modular rules that can be applied as sets or individually during design time, or more importantly throughout the API lifecycle, and in some cases gated at the pipeline level, ensuring we are always producing the best API we possibly can, no matter which team produces it.
diff --git a/docs/more/api/blueprints/schema-registry.md b/docs/more/api/blueprints/schema-registry.md
deleted file mode 100644
index 1db695f..0000000
--- a/docs/more/api/blueprints/schema-registry.md
+++ /dev/null
@@ -1,53 +0,0 @@
-# Schema Registry
-
-A schema defines the structure and format of a data record.
-When different software or services use the same schema, they're able to communicate more effectively.
-API schemas like JSON Schema and OpenAPI are both human-readable and machine-readable, which makes them friendly to both developers and software.
-A schema registry serves as a central repository for storing and retrieving schemas.
-
-## Why you need a schema registry
-
-In order for applications to communicate with one another across your platform, you need a schema registry.
-Having a schema registry is especially important in event-driven architecture and streaming applications.
-
-+ **Data sharing** - Multiple applications or services often use some of the same data.
-If you want applications to share data, they need to use the same schema.
-In an event-driven system, event publishers and consumers need to use the same schema.
-Every consumer of the shared data needs to understand the metadata, including field names and types.
-+ **Data governance and validation** - The data format is a contract between API producers and API consumers, and using a schema registry helps maintain this contract.
-If a producer sends data that does not conform to the contract, consumers cannot interpret the event correctly.
-This can have major consequences if a producer pushes a breaking change that consumers are not prepared to handle.
-With a schema registry in place, changes that do not conform to the contract are not published.
-+ **Backward and forward compatibility** - A schema registry maintains a versioned history, which makes schema evolution safer and allows producers and consumers to evolve at different times.
-When a schema evolves, there is a chance of breaking compatibility for consumers.
-A schema registry can allow older versions of the schema to be consumed, which maintains backward compatibility.
-+ **Efficient stream processing** - When you configure stream processing, stream messages are not sent in containers.
-The consumer needs to know which schema the incoming data uses, but including the schema with every message adds overhead.
-By using a schema registry, messages can use a standardized schema and send only the relevant data in a smaller payload.
-+ **Security** - You can use a schema registry to secure data by setting up access control.
-You can control access to the event producers as well as the schema registry itself.
-
-## How a schema registry works
-
-The schema registry defines the object schema for events, property names, and values.
-
-+ **Event-driven architecture** - In event-driven architecture, events are a data representation of actions taken within an application or by an external producer.
-Producers generate events, and consumers subscribe to these events so that they can receive updates and take action based on the new data. -An event bus can be used to connect the events to the consumers. -+ **Schema registration** - A producer can register schemas using the schema registry's API. -Every registered schema gets a unique ID, and the schema registry maintains version history. -When a new version of the same schema is registered, the original schema can still be used by any consumers that need it. -+ **Serialization and deserialization** - When a producer sends a message, it includes the schema ID of the schema that was used to serialize the message. -When the consumer receives a message, it uses the schema registry to deserialize the message based on the schema ID it contains. - -## How to implement a schema registry - -There are a few different data sharing models you can use when implementing a schema registry. - -+ **Hub to spoke** - In the hub-to-spoke model, a single schema registry is shared across an organization. -Users throughout the organization can write to the main schema registry hub, and the hub replicates data to read-only spokes. -Each different business units can use a spoke that serves only the schemas that they need. -This model works well if the entire organization needs to reference shared data as it offers a single source of truth and a global data contract. -+ **Spoke to hub** - In the spoke-to-hub model, each business unit can read and write to their own schema registry spoke. -The spokes replicate to a read-only schema registry hub. -This allows each business unit to create and enforce their own data contracts and choose which schemas are shared with the rest of the organization. \ No newline at end of file diff --git a/docs/more/api/blueprints/security-review.md b/docs/more/api/blueprints/security-review.md deleted file mode 100644 index b7bb9a9..0000000 --- a/docs/more/api/blueprints/security-review.md +++ /dev/null @@ -1,33 +0,0 @@ -# Security Review -Injecting a required security review of an API, considering the OWASP Top 10 as the starting point, running the following checks against every API before certifying it ready for production. - -## Security Fundamentals - -There are many unique security considerations teams should be thinking about early on in the defining and designing of APIs, but also as teams are developing them, leaving teams with a more secure API. - -- **Encryption** - Making encryption the default for all APIs, covering the transport layer, but also storage and database behind APIs, having a solid encryption plan from the start. -- **Authentication** - Leveraging common standards when it comes to authenticating API consumers for using any API, and reducing the complexity for consumers at this layer. -- **Authorization** - Considering an added authorization layer that defines what API-driven resources and capabilities each consumer has access to once they gain access to APIs. -- **Role-Based Access Control** - Applying RBAC to all of the elements of API operations, defining who can edit or just read artifacts, documentation, testing, and other elements. -- **Contracts** - Each API possess a complete contract, including full details of the authentication and authorization, providing the menu of security in place for each API. -- **Environments** - Evaluate the development, staging, sandbox, and production environments that teams have available and what they secret strategy is for their team. 
-- **Documentation** - Including security fundamentals as part of the documentation for each API, making sure that consumers are always fully aware of the security that is in use. -- **Tests** - Provide collection security tests, providing modular, reusable, executable, and fully documented security tests for all of the most common vulnerabilities teams face. - -There is plenty more your security team will be considering when it comes to API security, but this should be the baseline for security when it comes to your operations, providing the fundamentals. - -## OWASP Top 10 -Injecting a required security review of an API, considering the OWASP Top 10 as the starting point, running the following checks against every API before certifying it ready for production. - -- **Broken Object Level Authorization** - APIs tend to expose endpoints that handle object identifiers, creating a wide attack surface Level Access Control issue. -- **Broken User Authentication** - Authentication mechanisms are often implemented incorrectly, allowing attackers to compromise authentication tokens or to exploit implementation flaws to assume other user’s identities temporarily or permanently. -- **Excessive Data Exposure** - Looking forward to generic implementations, developers tend to expose all object properties without considering their individual sensitivity, relying on clients to perform the data filtering before displaying it to the user. -- **Lack of Resources & Rate Limiting** - Quite often, APIs do not impose any restrictions on the size or number of resources that can be requested by the client/user. Not only can this impact the API server performance, leading to Denial of Service (DoS), but also leaves the door open to authentication flaws such as brute force. -- **Broken Function Level Authorization** - Complex access control policies with different hierarchies, groups, and roles, and an unclear separation between administrative and regular functions, tend to lead to authorization flaws. By exploiting these issues, attackers gain access to other users’ resources and/or administrative functions. -- **Mass Assignment** - Binding client provided data (e.g., JSON) to data models, without proper properties filtering based on an allowlist, usually leads to Mass Assignment. Either guessing objects properties, exploring other API endpoints, reading the documentation, or providing additional object properties in request payloads, allows attackers to modify object properties they are not supposed to. -- **Security Misconfiguration** - Security misconfiguration is commonly a result of unsecure default configurations, incomplete or ad-hoc configurations, open cloud storage, misconfigured HTTP headers, unnecessary HTTP methods, permissive Cross-Origin resource sharing (CORS), and verbose error messages containing sensitive information. -- **Injection** - Injection flaws, such as SQL, NoSQL, Command Injection, etc., occur when untrusted data is sent to an interpreter as part of a command or query. The attacker’s malicious data can trick the interpreter into executing unintended commands or accessing data without proper authorization. -- **Improper Assets Management** - APIs tend to expose more endpoints than traditional web applications, making proper and updated documentation highly important. Proper hosts and deployed API versions inventory also play an important role to mitigate issues such as deprecated API versions and exposed debug endpoints. 
-- **Insufficient Logging & Monitoring** - Insufficient logging and monitoring, coupled with missing or ineffective integration with incident response, allows attackers to further attack systems, maintain persistence, pivot to more systems to tamper with, extract, or destroy data. Most breach studies demonstrate the time to detect a breach is over 200 days, typically detected by external parties rather than internal processes or monitoring. - -The OWASP Top 10 API vulnerabilities provide us with the baseline that should exist across 100% of the APIs in production, no matter whether they are for internal, partner, or public consumers. \ No newline at end of file diff --git a/docs/more/api/blueprints/shift-left.md b/docs/more/api/blueprints/shift-left.md deleted file mode 100644 index 3abc15b..0000000 --- a/docs/more/api/blueprints/shift-left.md +++ /dev/null @@ -1,48 +0,0 @@ -# Shift Left - -In the early years of software development, applications were typically programmed and tested by the same people. Over the years, organizations started to add more planning and structure to the way they managed the software development lifecycle, breaking it down into concrete phases so that different steps could be handled by different teams. After following this model for a decade or two, many companies began to realize the cost of having the testing phase so late in the lifecycle, and started to shift testing to the left by integrating it into the earlier phases of development. - -## Software Development Lifecycle Evolution - -- **Waterfall** - The waterfall model comes from industries such as construction and manufacturing, where the cost of materials is a major consideration. In the 1970s, many organizations started to adopt the waterfall model for their software development lifecycle. In this model, a project is broken down into several phases, and progress flows in one direction. Each phase must be complete before the project can move on to the next phase. The waterfall model phases include defining requirements, analysis, design, coding, testing, and deployment. The different phases can be handled by various teams or roles, including product managers, designers, engineering, quality assurance (QA) testing, and user experience (UX). The waterfall model allows for a small amount of overlap between phases. For example, when QA testers find a defect, the developers go back to the engineering phase to fix the defect before moving on to the next phase. Waterfall became the dominant model for software companies during the 1970s and 1980s, and is still used by many organizations. -- **Shorter cycles** - In the mid-1980s, other software development models began to emerge that organized the work into short cycles. For example, the iterative and incremental development model begins with early planning, and the next phase is a repeatable cycle that includes more detailed planning, defining requirements, analysis, design, development, testing, and evaluation. This iterative cycle continues until the application is ready for deployment. In contrast to the waterfall model, the iterative and incremental model allows testing and evaluation to lead to additional planning. Because this model leaves room for major changes to design and implementation while development work is well underway, it lends itself to working on small, incremental parts of an application, and also allows multiple development teams to work on different parts of the application at the same time. 
-- **Shifting left** - The software development lifecycle continued to evolve through the 1980s, 1990s, and early 2000s, each time placing more emphasis on testing and user feedback. This trend became known as "shifting left" because it moves testing out of its fixed position near the right side of the process diagram and into the earlier phases on the left side. Using the shift-left testing model, tests are run early and often so that bugs and vulnerabilities are found and resolved faster. The shift-left model is more proactive than earlier models because testing focuses on problem prevention throughout the development process rather than problem detection at the end. - -## Shift Left Models - -- **Traditional** - In the traditional shift-left testing model, also known as the V-model, testing is added to earlier phases of the development lifecycle. For example, user acceptance tests are developed during the initial requirements analysis phase. These tests simulate the way that customers will use the software in production using realistic data, and they are designed to validate that the final product can meet the user's needs. During the system design phase, system test plans are developed to validate that the requirements are met. System testing can include load testing, performance testing, and regression testing. When more detailed design work is done, integration test plans are developed that will verify that the various units of code that make up the software can run together. Finally, during the coding phase, testers develop and run unit tests on the smallest possible modules of the code. Then, testers run the integration tests, system tests, and user acceptance tests that were developed and planned during earlier phases. -- **Incremental Shift Left** - The incremental shift-left testing model takes the traditional V-model and breaks it down into smaller segments of work that can be completed in shorter stretches of time. When developing a large or complex system, an incremental model can be applied to different parts or components of the system. Using an incremental shift-left model allows development teams to code and test different system components at the same time, which means they can verify that the components work together. The incremental model also tends to split phases into smaller steps, such as high-level design and low-level design. -- **Agile Shift Left** - The Agile shift-left model is another evolution of the traditional shift-left model, where the work is divided into sprints. These sprints represent shorter stretches of time than in the incremental model and they run continuously throughout the development lifecycle. Using the Agile model, cross-functional teams stay in close communication with each other while they develop and test smaller components of a single software product. Like earlier shift-left models, Agile is focused on testing early and often, and collaborating with users to get regular feedback. Many teams that use the Agile model also use continuous integration and continuous delivery (CI/CD) tools to automate testing, integrating, and deploying code. -- **Model-based Shift Left** - With model-based shift-left testing, QA testers are involved in the earliest stages of the development lifecycle. Unlike the other shift-left models, testing can begin before any code is written. 
QA works with product, design, and engineering teams to develop and run tests on the requirements, architecture, and design of the software to identify and prevent potential problems before they occur. - -## Why Shift Left? - -- **Cost** - Bugs and security vulnerabilities can be expensive to fix. By shifting testing earlier in the development process, testers have more opportunities to identify these issues so that they can be fixed before an application is deployed. Based on a [2020 case study performed by the Ponemon Institute](https://www.ibm.com/account/reg/us-en/signup?formid=urx-46992), a vulnerability costs about $80 to fix during development, while that same vulnerability would cost about $7,600 to fix in production. -- **Speed** - When you include testing in the earlier phases of development, you can identify and resolve issues when they are still small and isolated. By running integration tests, functional tests, and unit tests while development is still in progress, issues can be fixed right away. Developers can use the early feedback from these test results to adapt to changes in requirements or expectations, which saves a significant amount of time compared to making the same changes to a more mature product. Fixing bugs early in the process can also prevent the need for developers to resolve issues in a rush before a product deadline. -- **Automation** - Shifting left can create more opportunities to automate testing. Static code analysis tools, including linters, can check code for programming and style errors without running it, which results in more consistent code. Automated testing leaves less room for human error because the tests are consistent. By integrating automated testing earlier in the CI/CD pipeline, failures appear earlier and can be fixed earlier. When testing is automated, multiple tests can run on the same code at the same time. Automation also requires less time from manual QA testers, which frees them up to work on more valuable tasks. -- **Security** - When testing shifts to the left and the development phase moves more efficiently, code reaches the security analysis phase faster. By the time the code reaches this phase, it has already gone through multiple rounds of testing and many issues have already been resolved. Like other defects, fixing security vulnerabilities as soon as possible saves the most time, effort, and money. - -## What Can Shift Left - -- **API contracts** -- **Design** -- **Testing** -- **Deployment** -- **Security** -- **IT services** - -## How to Start Shifting Left - -- **High-level testing strategy** - Define an end-to-end testing strategy that covers the software development lifecycle. As part of this process, evaluate the risk and impact of failure for each test scenario, and decide which team or individual is responsible for each failure that can occur. Developers can begin shifting left by writing a unit test for every new feature they develop. -- **Collaboration across teams** - Development teams should work together to define code standards. Using consistent conventions paves the way for configuring automation to enforce the standards, and it also makes it easier for developers to review each others' code. -- **Scripted configurations** - Configure static testing and linting tools to automatically identify issues early in development. Developers and QA testers configure automated integration tests that run every time new code is merged to the main development branch. 
-- **Monitoring** - Configure dashboards and tooling that allows developers to see where failures happen at every stage of the lifecycle. When they have insight into failures that occur in production, developers are better equipped to resolve the issues. - -## What is Shifting Right? - -In addition to shifting more testing to the left in the software development lifecycle, many organizations are also shifting some types of testing to the right. While shift left is about testing early and often, shift right is about testing in production to measure performance. - -- **Performance testing** - Operators can monitor how the application behaves in production by looking at logs and metrics. To test performance, they can send additional traffic to the application to see how it behaves under heavier load. -- **Chaos testing** - With chaos testing, testers intentionally introduce issues like errors, network delays, server outages, or missing data. Testers observe how the system attempts to recover from these errors. -- **User experience testing** - Performing usability testing helps give insight about the users' experience with the software. For example, with A/B testing, half of the users see one version of an application (A), and the other half see a different version (B). Testers analyze the users' activity on both versions of the application, and sometimes request direct feedback from the users. -- **Security testing** - Security tests in production can check for vulnerabilities or out-of-date libraries and dependencies. Testers can also run penetration tests to scan an application for security weaknesses. diff --git a/docs/more/api/blueprints/source-control.md b/docs/more/api/blueprints/source-control.md deleted file mode 100644 index d8bb00f..0000000 --- a/docs/more/api/blueprints/source-control.md +++ /dev/null @@ -1,14 +0,0 @@ -# Source Control -Leveraging existing source control used for tracking and managing changes to code and using it to also include the machine readable artifacts produced across a modern API lifecycle. Using source control to manage the collaboration necessary between multiple API stakeholders, consumers, and across the multiple versions of an API that might be in production at any given moment. Extending the existing software development lifecycle (SDLC) to support what is needed to deliver, but also govern each API across a well-known API lifecycle. - -- **Organizations** - Establishing organizations for source control that are in alignment with the wider API strategy for the enterprise. -- **Repositories** - Ensuring there is a clear mono or distributed repository strategy for the API lifecycle providing the source of truth for the API lifecycle. -- **Folders** - Including in the API strategy guidance for how folders within source control should be structured and used to consistently organize code and artifacts. -- **Artifacts** - Layering on machine readable artifacts produce across the API lifecycle into our existing software development lifecycle source control process. -- **Variables** - Include source control variables as part of an overall API lifecycle variable strategy that is applied across source control, but also documentation, and other stops. -- **Contributors** - Maintain a regular awareness of the contributors associated with any repository that is used as part of the API lifecycle, keeping the team and community well-defined. 
-- **Feedback** - Leverage source control feedback loops as part of the wider API lifecycle feedback loop, gathering feedback from stakeholders throughout the evolution of code and artifacts. -- **Integration** - Seamlessly integrate your source control into all aspects of the API lifecycle, leveraging Git or APIs to make your source control central to where the API work is happening. -- **Automation** - Put automation to work, ensuring that the evolution of code and artifacts in service of producing or consuming APIs is as standardized and repeatable as it possibly can be. - -Seamlessly weaving source control into your overall API platform and operations strategy is essential for managing change across not just your code base behind your APIs, but also the artifacts that may be generating, deploying, managing, and driving every down stream change that occurs across your API operations. \ No newline at end of file diff --git a/docs/more/api/blueprints/standards.md b/docs/more/api/blueprints/standards.md deleted file mode 100644 index 9f30851..0000000 --- a/docs/more/api/blueprints/standards.md +++ /dev/null @@ -1,15 +0,0 @@ -# Standards -Standards help us not reinvent the wheel when it comes to the digital resources and capabilities we are providing, but also help ensure our APIs are as interoperable with other systems as possible. There are a wide variety of more general or very precise API standards to apply as part of API operations, providing us with solutions ranging from the backbone of the Internet level considerations, as well as common healthy patterns for organizations to use when making APIs intuitive and easy to use. - -# Internet Standards -APIs are just the next evolution of the web, and there are a number of existing Internet standards that should be applied regularly as part of API operations. - -- **Internet Assigned Numbers Authority (IANA)** - The Internet Assigned Numbers Authority (IANA) is a standards organization that oversees global IP address allocation, autonomous system number allocation, root zone management in the Domain Name System (DNS), media types, and other Internet Protocol-related symbols and Internet numbers. -- **Request for Comments (RFCs)** - A Request for Comments is a publication in a series, from the principal technical development and standards-setting bodies for the Internet, most prominently the Internet Engineering Task Force, authored by individuals or groups of engineers and computer scientists in the form of a memorandum describing methods, behaviors, research, or innovations applicable to the working of the Internet and Internet-connected systems. - -## Industry Standards -Data, and other interoperability and communication standards have existed at the industry level since the birth of compute and the Internet, but more recently are being evolved to support modern approaches to delivering APIs behind web, mobile, device, and other types of applications. - -- **PSD2** - A European Union (EU) Directive, administered by the European Commission to regulate payment services and payment service providers throughout the European Union (EU) and European Economic Area (EEA), providing a common set of industry API standards for financial enterprises, service, and tooling providers to follow when delivering API infrastructure. 
-- **Fast Healthcare Interoperability Resources (FHIR)** - A standard describing data formats and APIs for exchanging electronic health records (EHR), providing a common set of digital objects and API paths for accessing digital healthcare records using modern API infrastructure.
-
diff --git a/docs/more/api/blueprints/strategy.md b/docs/more/api/blueprints/strategy.md
deleted file mode 100644
index 5428e76..0000000
--- a/docs/more/api/blueprints/strategy.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Strategy
-To realize the results you are looking for at scale across an organization, you are going to need a strategy. You are going to need a living, evolving, and adaptive approach to set in motion the action you will need to shift existing behavior and begin moving teams, groups, domains, and the entire organization in the same intentional direction.
-
-- **Goals** - Have clear goals for why you are doing APIs, then ask yourself regularly if you are serving these goals across your API operations, using your enterprise goals as the North Star for every stop along the API lifecycle and governing your relationship with consumers.
-- **Organization** - Defining the various business domains that exist within the enterprise, then establishing clear groups and teams within these domains, then governing and enabling team members across the API lifecycle using common domain standards and vocabulary.
-- **Landscape** - Bring the enterprise API landscape into focus, making sure you know where your APIs are at all times, that you have your finger on the pulse across the feedback loops for these APIs, and that you are able to clearly see which direction you are heading.
-- **Lifecycle** - Begin locking down a common definition of the API lifecycle already in existence across teams, mapping out how work is occurring now, then begin identifying where more enablement is needed, and how more alignment can be established across teams.
-- **Governance** - Gather together the people in your organization who care about the big picture and get to defining guidelines and rules that can be applied across the API lifecycle to help ensure more consistent and observable APIs across all teams.
-- **Observability** - Tap every output that exists across your existing platform and infrastructure to understand the state of not just your APIs, but also the lifecycle around them, helping you better understand the state of API operations and the best way to influence change.
-- **Change** - Establish a plan for how you will handle change across your API operations, addressing everything from the versioning of your APIs to the turnover on teams, ensuring that as much forward momentum is maintained while also being able to change at any time.
-
-Your API strategy should be a living document, beginning as a simple set of goals, but then eventually evolving into the guidance and enablement all stakeholders across the organization will need to successfully participate across the API lifecycle as both producers and consumers.
\ No newline at end of file
diff --git a/docs/more/api/blueprints/synchronous.md b/docs/more/api/blueprints/synchronous.md
deleted file mode 100644
index a368aca..0000000
--- a/docs/more/api/blueprints/synchronous.md
+++ /dev/null
@@ -1,16 +0,0 @@
-# Synchronous
-The web is built on the ability to request HTML pages via URLs and receive a response that hopefully contains what we are looking for.
It was a low-cost approach that has been applied to the ability to synchronously GET, POST, PUT, or DELETE digital resources and capabilities via the Internet.
-
-## Request
-
-- **Authentication** - Requiring everyone making a request to provide authentication, ensuring that everyone who is using an API is supposed to be using it, keeping everything secure.
-- **Parameters** - Allowing each request to be customized by providing a set of parameters along with each request, transforming the response the API consumer is looking to get.
-- **Headers** - Defining and shaping the transport of the request by setting the headers of the request, which are used by the network and the server to handle the request and response.
-- **Body** - The machine-readable XML or JSON request body being sent as part of the request, providing the information being submitted by the consumer as part of each request.
-
-## Response
-
-- **Status Code** - Providing standard HTTP status codes that articulate the success or failure of a request in a way that any machine can be programmed to understand when handling.
-- **Headers** - The headers of the response, defining how the transport and shaping of the response was handled, allowing for the receiving system to understand how it arrived.
-- **Body** - The response data or message being returned as part of the API request, returning JSON or XML that can be used in any application or integration that is using the API.
-
diff --git a/docs/more/api/blueprints/team-performance.md b/docs/more/api/blueprints/team-performance.md
deleted file mode 100644
index e90fafb..0000000
--- a/docs/more/api/blueprints/team-performance.md
+++ /dev/null
@@ -1,39 +0,0 @@
-# Team Performance
-The performance of teams can be best understood using a platform approach, mixing native platform reporting and capabilities alongside platform-level testing that is observed through a balance of producer and consumer perspectives, helping us establish an awareness of how teams operate.
-
-- API Producers - DORA metrics lay the foundation, with two additional human considerations added in to provide a balanced look at how team performance can be measured as part of platform operations.
-- Deployment Frequency - How often a version of an API is released.
-- Lead Time for Changes - How long until a new version is in production.
-- Time to Restore Service - How long it takes to recover from a problem.
-- Change Failure Rate - How often an API deployment fails.
-- Satisfaction & Well-Being - How are your actual team members doing.
-- Communication & Collaboration - Are teams working together on APIs.
-
-API Consumers - Onboarding and moving consumers forward with each version in unison is an extremely important aspect of understanding not just performance but the value of forward motion to the business.
-
-- Time to First Call - How long for a new consumer to make an API request.
-- Time to Value - Time to generate value for producer and consumer.
-- Time to Upgrade - How long does it take for consumers to upgrade.
-- Satisfaction & Awareness - What is the overall satisfaction of consumers.
-
-Platform Tests - Leveraging collections to test the platform for specific states, using the same infrastructure we use to test each instance of our APIs, but here we are testing team performance.
-
-- Builder - Test for new API contract version using Postman API.
-- CI/CD - Test for successful CI/CD builds using CI/CD API.
-- Gateway - Test for new user, API, and versions using gateway API.
-- Survey - Test for team satisfaction and well-being using survey API.
-- Message - Test for new lifecycle activity via team message API.
-
-Aggregate Data - Publishing the results of platform tests to aggregate locations for reporting, understanding the state of team performance across domains, APIs, and all development teams.
-
-- Source Control - Publish platform test results to source control.
-- Database - Publish platform test results to a database.
-- Data Lake - Publish platform test results to the desired data lake.
-- APM - Publish platform test results to an existing APM solution.
-
-Observability - Understanding the state of team performance using a mix of approaches that include native enterprise platform reporting, visualizer reports, and using the existing enterprise APM solution.
-
-- Platform Reporting - Team, workspace, API, and security reporting.
-- APM - Dashboards via existing APM solutions for observability.
-
-Like API performance, understanding team performance as part of the API lifecycle requires a mix of metrics, automation, and observability, combined with the ability to run the tests we need and aggregate the data needed to understand exactly what team performance means.
diff --git a/docs/more/api/blueprints/team-profile.md b/docs/more/api/blueprints/team-profile.md
deleted file mode 100644
index 5fd4823..0000000
--- a/docs/more/api/blueprints/team-profile.md
+++ /dev/null
@@ -1 +0,0 @@
-# Team Profile
diff --git a/docs/more/api/blueprints/tempates.md b/docs/more/api/blueprints/tempates.md
deleted file mode 100644
index cfb7d1c..0000000
--- a/docs/more/api/blueprints/tempates.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Templates
-Common, reusable, and standardized templates help enable teams to move faster while delivering more consistent and reusable APIs, reducing time, money, and friction downstream when it comes to using digital resources and capabilities in applications. Reusable templates often begin with design patterns that can be applied during the define and design stages of the API lifecycle, but then expand rapidly to include almost every other stop along a modern API lifecycle.
-
-- **Simple** - Provide simple templates that reduce the cognitive load when it comes to learning about new standards and patterns needed as part of producing APIs.
-- **Modular** - Keep templates modular and reusable, daisy-chaining different concepts together into larger patterns, workflows, and processes for moving APIs forward.
-- **Starter** - Offer entire starter kits for teams to use when starting a new API, providing a complete example of the preferred way for designing, developing, and operating APIs.
-- **Contracts** - Maintain a catalog of complete contracts that show implementations of different types of contracts, or allow for easy editing, and setting new APIs in motion.
-- **Components** - Leverage the components object for OpenAPI and AsyncAPI contracts, providing a rich set of templates that can be used to rapidly design new API contracts, as shown in the sketch below.
-- **Extensions** - Using extensions for OpenAPI and AsyncAPI, going beyond what each specification can do, providing templates teams can use across the API lifecycle.
-- **Rules** - Having template rules for linting different artifacts, helping jumpstart the usage of common rules, but also the development of new rules that help govern operations.
-- **Policies** - Provide standardized policies and starter templates for applying across the API lifecycle, helping centralize policy management, but federate their usage.
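As a rough illustration of the Components bullet above, a starter template might ship a reusable `components` block that teams copy into new OpenAPI contracts; the schema, parameter, and response names here are hypothetical examples, not an established enterprise standard:

```yaml
# Hypothetical starter template: shared OpenAPI components that new
# contracts can reference instead of redefining common structures.
components:
  schemas:
    Error:
      type: object
      required: [code, message]
      properties:
        code:
          type: string
        message:
          type: string
  parameters:
    PageSize:
      name: pageSize
      in: query
      description: Number of items to return per page.
      schema:
        type: integer
        default: 25
  responses:
    NotFound:
      description: The requested resource was not found.
      content:
        application/json:
          schema:
            $ref: '#/components/schemas/Error'
```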
-
-Templates help enable teams to do the right things across the API lifecycle, helping provide the common parts and pieces of delivering APIs, leaving teams to focus on the bits that are unique and specific to the value your enterprise provides to consumers via internal, partner, and public APIs.
diff --git a/docs/more/api/blueprints/time-to-first-call.md b/docs/more/api/blueprints/time-to-first-call.md
deleted file mode 100644
index e47d838..0000000
--- a/docs/more/api/blueprints/time-to-first-call.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Time to First Call
-
-- Investment
-  - Urgency - Is the developer actively searching for a solution to an existing problem? Or did they hear about your technology in passing and have a mild curiosity?
-  - Constraints - Is the developer trying to meet a deadline? Or do they have unlimited time and budget to explore the possibilities?
-  - Alternative - Is the developer required by their organization to use this solution? Or are they choosing from many providers and considering other ways to solve their problem?
-- Journey
-  - Browse
-  - Signup
-  - First API Call
-  - Implementation
-  - Usage
diff --git a/docs/more/api/blueprints/versioning-governance.md b/docs/more/api/blueprints/versioning-governance.md
deleted file mode 100644
index cd24336..0000000
--- a/docs/more/api/blueprints/versioning-governance.md
+++ /dev/null
@@ -1,28 +0,0 @@
-# Versioning Governance
-Versioning of APIs is a good place to begin with governance efforts, ensuring that all APIs are applying a common approach when it comes to versioning each API. There are many ways in which you can govern an API and the operations around it, but there are few areas that will have a greater impact than standardizing how change is managed. Beginning with the governing of versioning allows teams to get a handle on what it takes to change behavior across teams, and to make an impact on operations before expanding API governance to other areas of operations.
-
-Having a blueprint for standardizing versioning begins with having an OpenAPI contract for each API in operation, then establishing what the versioning strategy will be. Once a pattern has been chosen, linting in service of governance can be set in motion, and automated across operations in a handful of ways. Defining what versioning governance will look like across teams, but then bringing it to developers using the tooling they are already using to deliver APIs. Governing how change will be managed across all APIs, ensuring each API has a version, and that change can be effectively communicated to development teams.
-
-You cannot govern APIs that do not have a definition, and OpenAPI is the preferred specification for defining the surface area of APIs. Any API being developed must have an OpenAPI available, either hand-crafted as part of a design-first approach, or generated from a gateway or code-first approach. With this definition, you will then be able to begin applying a consistent versioning approach, and then begin assessing whether or not versioning is being consistently applied across teams.
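As a small, hypothetical illustration, the chosen versioning pattern ultimately shows up in the `info.version` field of each OpenAPI contract, which is also the field the linting rules described below would target; the API name and version shown here are examples only:

```yaml
# Hypothetical OpenAPI fragment: info.version carries the semantic version
# that versioning governance rules can check for on every contract.
openapi: 3.0.3
info:
  title: Orders API
  version: 2.1.0   # major.minor.patch - a minor, non-breaking release
  description: Example contract fragment used to illustrate semantic versioning.
paths: {}
```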
- -Versioning -Before you can govern the versioning of each API you have to have a formal approach for how you will apply versioning across APIs. Picking one of the most common approaches to version each API, working out the implementation details of the chosen pattern, and providing guidance for teams around why versioning is important, and how to implement it. There are two primary approaches to versioning that are adopted across leading APIs, making it an easy choice to find the patterns you need. - -- Semantic Versioning - Applying a dotted notion for defining major, minor, and patch versions of an API, articulating the different between a breaking change, and non-breaking or fixes to an API for errors that may have be introduced via previous versions. Providing a very structured way for defining, but also communicating change occurring with each individual API. -- Date-Based Versioning - Applying a date to each version of an API, marking each change of an API with a date, allowing consumers to adopt an API on a specific date, integrate with it, and choose if they want to upgrade to the next version based upon the release date. Providing a simple way for defining change, and then communicating it with consumers. - -Linting -With an OpenAPI for each API, and a versioning pattern adopted, you can lint for this pattern across each API using a rules-based approach to governance. Leveraging the open-source tool Spectral to define a machine-readable rule that matches the versioning pattern adopted, and then ensuring the OpenAPI for each API possesses the pattern during design, development, and build time. Codifying governance rules, and then automating how they are applied to ensure that versioning is consistently applied across operations. - -- Versioning Governance Rules - Rules can be defined to govern the information provided for each API, leveraging the OpenAPI or AsynCaPI contracts, but then apply specific ruling looking for common versioning patterns in the path, parameters, headers, and other details of an API, meeting specific guidelines regarding what information is needed. - -Automation -With API versioning governance in place, it can now be automated as part of operations, ensuring the coverage of governance across teams. Providing the ability to lint for version governance rules, but eventually many other rules at the various stages of the lifecycle. Defining API governance in a centralized way, but then equipping development teams to apply it as part of their regular workflows, automating how governance is implemented, and pushing for 100% coverage of API versioning governance across operations. - -- Design Time Governance - Governance can be applied and automated at design time, providing real-time or manually triggered application of governance rules, contracts, and scripts, providing a tighter feedback loop with API designers in regards to the guidance around what is expected of the design of each API. -- Collection Governance - A Postman collection for applying governance provides a flexible way to apply rules, schema, or script-based governance, allowing governance to be defined as modular collections which can be manually run by developers, scheduled using a monitor, or easily dropped into a CI/CD pipeline. 
-- CLI Governance - Governance can be applied at the command line interface (CLI) level, enforcing API governance locally during development, ensuring that APIs are 100% compliant with rules, contract, and script-based API governance established centrally as part of broader governance efforts. -- IDE Governance - Governance can be applied at the integrated development environment (IDE) level, enforcing API governance locally during development, ensuring that APIs are 100% compliant with rules, contract, and script-based API governance established centrally as part of broader governance efforts. -- Pipeline Governance - Governance can be applied at the pipeline level, enforcing API governance at build time, ensuring that APIs are 100% compliant with rules, contract, and script-based API governance established centrally as part of broader governance efforts, working to automate the testing of the surface area of APIs, and other parts of operations in service of platform-wide API governance efforts. diff --git a/docs/more/api/blueprints/visibility.md b/docs/more/api/blueprints/visibility.md deleted file mode 100644 index 4493706..0000000 --- a/docs/more/api/blueprints/visibility.md +++ /dev/null @@ -1,26 +0,0 @@ -# Visibility -APIs are abstract and difficult to see, leaving a lot of anxiety when it comes to who has access and is able to see and put digital resources and capabilities to use. While a significant amount of the discussion around APIs over the last decade has been about public APIs, the majority of APIs that exist are only available privately, or exist in the shadows behind desktop, web, and mobile applications. This leaves enterprise organizations who are further along in their API journey with a significant amount of confidence when it comes to understanding and having control over not just access to APIs, but also the operations that surround them. - -## Access -Having the ability to effectively control the visibility of your APIs, being able to quickly and confidently move an API from private or team, to being available to partners or publicly to 3rd party developers, will play an outsized role in the overall velocity of an enterprise organization doing business today. - -- **Private** - Keeping APIs and the operations around them private and available in an invite-only state, keeping artifacts and documentation in the hands of a select audience. -- **Team** - Limiting the access to APIs, workspaces, documentation, and the other elements of the API lifecycle to only the team who will be producing or consuming an API internally. -- **Partner** - Exposing APIs, documentation, mock servers, environments, and testing to partners, allowing them to view or even contribute to producing or consuming APIs. -- **Public** - Making workspaces, APIs, and other elements available to the public, allowing anyone to watch, fork, and learn from APIs, as well as the operations and work around them. - - -## Operations -Visibility of API operations begins with a portal, and has also been rapidly expanding via Git repositories and API workspaces, revealing more than just API documentation and pulling back more of the operational curtain around mock servers, environments, testing, and the monitoring and observability of operations. - -- **Portal** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Workspaces** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor.
-- **Repos** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **APIs** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Docs** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Tests** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Mocks** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Environments** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. -- **Monitors** - Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor. - -The visibility of your APIs and your API operations will be defined by the overall quality and security of your APIs and the API lifecycle. With more experience comes greater confidence, and API teams who have been through more iterations, and are working across a shared API lifecycle, will find they struggle less when it comes to navigating the visibility of the enterprise API landscape. diff --git a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineer.md b/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineer.md deleted file mode 100644 index 769604a..0000000 --- a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineer.md +++ /dev/null @@ -1,16 +0,0 @@ -# What Can You Do to Be API-First? (Engineer) -These are a few suggestions for what engineers can consider when it comes to becoming API-first. - -- **Strategy** - Spend time learning about and contributing to any centralized effort to improve API operations. -- **Discovery** - Always make time to ensure your API artifacts are up to date and published to a known location. -- **Applications** - Work to reuse APIs across the applications you are developing, optimizing the API surface area. -- **Visibility** - Make sure you are following a common plan when it comes to publishing internal and external APIs. -- **Quality** - Make sure there are contract, uptime, and other tests available for 100% of the APIs you are building. -- **Security** - Encourage your centralized security teams to equip you with more tooling for applying API security. -- **Productivity** - Spend time learning about and practicing each step of the API lifecycle used by your team. -- **Velocity** - Define every step in the process, and the tooling used to produce and consume APIs on your team. -- **Observability** - Pipe all the results of your contract, performance, security, and other tests into APM solutions. -- **Governance** - Communicate with leadership where the friction with producing or consuming APIs exists. -- **Standards** - Leverage Internet, industry, and enterprise standards as part of the API design process. -- **Regulations** - Ensure your APIs are well documented, and you know where all of your data is available. -- **Innovations** - Request the time to play and experiment with producing, but also consuming different APIs. \ No newline at end of file diff --git a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineering-leadership.md b/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineering-leadership.md deleted file mode 100644 index 5abee98..0000000 --- a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-engineering-leadership.md +++ /dev/null @@ -1,16 +0,0 @@ -# What Can You Do to Be API-First?
(Engineering Leadership) -These are a few suggestions for what engineering leadership can consider when it comes to becoming API-first. - -- **Strategy** - Document how you operate your team and see the API lifecycle, and share it with other teams. -- **Discovery** - Encourage teams to publish API artifacts to source control and sync them via API workspaces. -- **Applications** - Prioritize the design, delivery, and operation of APIs over their individual applications. -- **Visibility** - Establish a strategy for delivering APIs internally and externally, and how you can move between the two. -- **Quality** - Make contract and uptime testing the default across every API being produced across teams. -- **Security** - Shift security left in the API lifecycle, equipping developers with the security tests they need. -- **Productivity** - Define, optimize, and automate the API lifecycle across teams leveraging existing tooling. -- **Velocity** - Define every step in the process, and the tooling used to produce and consume APIs on your team. -- **Observability** - Pipe all the results of your contract, performance, security, and other tests into APM solutions. -- **Governance** - Help define what the guardrails are for teams when it comes to designing and delivering APIs. -- **Standards** - Leverage Internet, industry, and enterprise standards as part of the API design process. -- **Regulations** - Ensure your APIs are well documented, and the infrastructure behind your APIs also has APIs. -- **Innovations** - Give your team the space to innovate when it comes to how they produce or consume APIs. \ No newline at end of file diff --git a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-executive.md b/docs/more/api/blueprints/what-can-you-do-to-be-api-first-executive.md deleted file mode 100644 index 8960a5b..0000000 --- a/docs/more/api/blueprints/what-can-you-do-to-be-api-first-executive.md +++ /dev/null @@ -1,18 +0,0 @@ -# What Can You Do to Be API-First? (Executive) -These are a few suggestions for what executives can consider when it comes to becoming API-first. - -- **Strategy** ~ Invest in the people and process needed to move forward a living API strategy for your organization. -- **Discovery** ~ Make APIs and the operations around them discoverable by default across all teams. -- **Applications** ~ Prioritize the design, delivery, and operation of APIs over their individual applications. -- **Visibility** ~ Strengthen your organizational confidence when it comes to making APIs available externally. -- **Quality** ~ Automate contracts, performance, and other types of testing across 100% of API operations. -- **Security** ~ Shift security left in the API lifecycle, equipping developers with the security tests they need. -- **Productivity** ~ Define, optimize, and automate the API lifecycle across teams leveraging existing tooling. -- **Velocity** ~ Increase the velocity of teams, and the APIs they manage, with a well-known API lifecycle. -- **Observability** ~ Power observability across the API factory floor by routing data into existing APM solutions. -- **Governance** ~ Establish centralized governance and guidelines that are applied in a federated way. -- **Standards** ~ Leverage Internet, industry, and enterprise standards as part of the API design process. -- **Regulations** ~ Dedicate resources to strengthen regulatory awareness and response using APIs. -- **Innovations** ~ Allow developers to translate productivity gains into innovation time to remain competitive.
- - diff --git a/docs/more/api/blueprints/what-is-an-api.md b/docs/more/api/blueprints/what-is-an-api.md deleted file mode 100644 index a5fc643..0000000 --- a/docs/more/api/blueprints/what-is-an-api.md +++ /dev/null @@ -1,19 +0,0 @@ -# What is an API? -In an internet-connected world, desktop, web, mobile, and device applications are designed for humans to use, while APIs are designed for systems, applications, and integrations to be put to work. Websites and APIs both do the same things, like return data, content, images, videos, and put algorithms to work. The only difference is that APIs don’t return all the details that are needed to make things look pretty for the human eye—you only get the raw data and other machine-readable information needed behind the scenes to put the resources being delivered to work. - -## Unpacking the Acronym -Let’s break down what an API is a little before we explore what APIs are capable of, understanding a bit more about how we interface to programmatically apply a variety of digital resources and capabilities: - -- **(A)pplication** - You are “applying” something digital, oftentimes on the desktop, web, or via a mobile device, but today it really can be anywhere you want by connecting it to the Internet, bringing the web into our homes, workplaces, cars, and the public sphere. -- **(P)rogramming** - Allowing for something to be programmed, executed, and automated, making something that is repeatable and easy to use in software, allowing our physical and online world to be automated to do the things we can’t or don’t want to do. -- **(I)nterface** - The point where two entities meet and interact, enabling communication between two systems, and applying digital resources and capabilities, networking servers and common everyday objects, allowing the world around us to interact. - -Application Programming Interfaces, or simply APIs, are how you standardize, automate, and apply the digital resources and capabilities that define how our digital world goes around in the 21st century. - -## Useful Analogies -When learning about abstract digital concepts, it helps to lean on real-world analogies to help us make sense of the rapidly expanding digital landscape we are putting to work, or that is putting us to work: - -- **Restaurant Menu** - APIs have been compared to a restaurant menu, providing you with a list of what is available, and providing a set of instructions for how to order the food you would like. -- **Utilities** - APIs are often compared to your electricity, telephone, and plumbing, but applied to an endless mix of digital resources and capabilities which you access via your digital interfaces. - -APIs come in many shapes and sizes and can be difficult to see, but you are already using APIs many times each day to interact with friends, family, co-workers, customers, and many other people you encounter as part of your work. diff --git a/docs/more/api/blueprints/what-is-api-aware.md b/docs/more/api/blueprints/what-is-api-aware.md deleted file mode 100644 index a536d31..0000000 --- a/docs/more/api/blueprints/what-is-api-aware.md +++ /dev/null @@ -1,16 +0,0 @@ -# API-Aware -Once an enterprise organization begins to wake up to the potential of being API-first, they begin investing in a strategy for producing and consuming APIs that is in alignment with business objectives. Developing more awareness around what APIs already exist, and the lifecycle that is moving them forward or possibly holding them back.
- -- Strategy -- Discovery -- Applications -- Visibility -- Quality -- Security -- Productivity -- Velocity -- Observability -- Governance -- Standards -- Regulations -- Innovations diff --git a/docs/more/api/blueprints/what-is-api-early.md b/docs/more/api/blueprints/what-is-api-early.md deleted file mode 100644 index f66d8d4..0000000 --- a/docs/more/api/blueprints/what-is-api-early.md +++ /dev/null @@ -1,16 +0,0 @@ -# API-Early -Enterprise organizations who are early on in their API journey are doing APIs, but just not doing APIs in any strategic, observable, or governable way. This leaves a pretty chaotic and unknown landscape of digital services that are powering a mix of web, mobile, and device applications, while cobbling together integrations across internal and external systems. - -- Strategy -- Discovery -- Applications -- Visibility -- Quality -- Security -- Productivity -- Velocity -- Observability -- Governance -- Standards -- Regulations -- Innovations diff --git a/docs/more/api/blueprints/what-is-api-first.md b/docs/more/api/blueprints/what-is-api-first.md deleted file mode 100644 index ac48dce..0000000 --- a/docs/more/api/blueprints/what-is-api-first.md +++ /dev/null @@ -1,18 +0,0 @@ -# What is API-first? -API-first is the new model of software development in which applications are conceptualized and built as an interconnection of internal and external services through APIs. An API-first company is an organization that has adopted the API-first development model. API-first companies understand that learning to deliver applications using a mix of internal and external services via APIs is an essential part of their digital transformation. Having a solid API platform for connecting many internal, partner, and public APIs while also leveraging a well-known API lifecycle sets API-first companies apart from those that are API-last. - -- **There is a living API strategy** ~ API-first companies have a well-planned, documented, and shared API strategy defining their operations. They prioritize APIs with a central definition of why they are doing APIs and have a shared understanding of what the API lifecycle is across teams. This means that companies are getting more organized about how they think about APIs in order to understand the sprawling API and microservices landscape: they establish centralized groups to define the API lifecycle and governance while leveraging federated approaches to keep teams in alignment with central enterprise API strategies. -- **APIs are prioritized over applications** ~ API-first companies are prioritizing the development and management of APIs over the development and operation of any single web, mobile, or device application. They make sure that every API is designed, developed, and supported as part of a larger enterprise API ecosystem beyond any application that uses it, ensuring that APIs always have documentation and are tested and secured according to organization-wide API governance guidelines. This approach provides much higher-quality, more reliable, and more secure infrastructure behind the applications and integrations in use across the enterprise. -- **API discovery is the default** ~ API-first companies are able to quickly find APIs and microservices via private, partner, and public API catalogs, searching and browsing across the digital resources and capabilities in use across the enterprise. Teams know they can find not just the APIs, but also all of the supporting artifacts and resources around APIs across team workspaces and repositories.
They are mapping out the entire API landscape that exists across API-first organizations, and doing the manual and automated work to keep API catalogs always up to date. This lays a more solid foundation when it comes to API discovery across an organization, ensuring that APIs can be found before new APIs are developed and that APIs are easily found for use in an application or integration. -- **Visibility is clear inside and outside** ~ API-first companies confidently operate APIs internally within the enterprise and externally with trusted partners, or even publicly via third-party developers. API-first companies are not concerned about the boundaries between private and public APIs because they have a handle on their identity and access management layer, the security across APIs, and the observability needed to understand how APIs are being put to use. With visibility across APIs, who has access to them, and what they are doing with the digital resources and capabilities, teams confidently deliver and operate APIs privately or publicly. They operate APIs within clearly defined business domains where the balance between security, privacy, and accessibility is always at optimal levels. -- **Quality is consistent across teams** ~ API-first companies experience higher levels of quality across the APIs behind their web, mobile, and device applications. 100% of the APIs in production have contract and performance tests available, ensuring that APIs are always doing what they are designed to do and meet the SLA on their API contract. API testing is centrally defined but then locally implemented as individual executable collections that developers can manually run during development, but then integrated into the CI/CD pipeline and scheduled via monitors. This standardizes testing across the organization while ensuring quality is part of each team’s toolbox for designing, developing, deploying, and managing APIs. -- **Security is shifting to the left** ~ Following up after quality, API-first companies also require that every API being put in production has a security collection present, allowing the surface area of each API to be scanned and evaluated for common vulnerabilities. This requires APIs to be in alignment with central security practices and also an executable security collection that developers can use during development, bake into the CI/CD pipeline, and schedule as a monitor. Companies are shifting API security left in the API lifecycle while keeping it working in concert across an organization so that all APIs are consistently secured against the most common vulnerabilities and the latest threats. -- **Productivity is optimized across teams** ~ Within API-first companies, teams are always working across well-defined workspaces using common standards, artifacts, and patterns while following an agreed-upon API lifecycle that is well-defined and automated whenever possible. Teams have the training they need to design, develop, deploy, manage, and iterate upon APIs in a collaborative and discoverable way, allowing teams to work in concert across multiple workspaces that are kept in sync with repositories and CI/CD workflows. How API-first organizations do APIs is much more proven, shared, and observable; the approach gives teams a better understanding of what success looks like, and new stakeholders are able to get up to speed and find what they need faster.
This results in much more productivity across teams and domains, with higher quality and more reliability existing with consumers. -- **Velocity is maximized across teams** ~ Teams within an API-first company work across a well-known API lifecycle that is defined and enabled by an API platform that empowers them to move faster and deliver higher quality APIs across operations. APIs in an API-first company are right-sized to focus on a specific problem within a domain while being developed by a known team that is working in a well-defined workspace where you have all the artifacts, mock servers, documentation, testing, history, and other details available at your fingertips. APIs are much more precise in their implementation and are much easier and lightweight to move along a well-defined API lifecycle that possesses a feedback loop across the team but also with consumers. A cycle is set in motion that can effectively deliver and then iterate upon APIs faster while ensuring they are also better meeting the needs of consumers. -- **There is observability across all operations** ~ API-first companies enjoy more observability into the health and activity across 100% of APIs. They pipe the outputs from across all APIs into existing reporting and APM solutions to establish awareness regarding each API instance and the governance that exists across APIs. They also leverage collections that are defined for testing, security, governance, and other areas of the lifecycle and monitors that are scheduled across different regions to provide the outputs needed to achieve observability at scale. With 100% of the APIs having testing, security, and governance applied through modular executable collections, teams can better make sense of the state of the complex enterprise system and make more informed decisions. -- **Governance becomes much easier** ~ API governance becomes much more doable in an API-first company. With APIs possessing discoverable and machine-readable artifacts, you are able to better govern the design of each API being delivered. With well-defined workspaces containing artifacts, documentation, testing, and monitoring, critical aspects of operating our APIs are always in place. With executable collections present for testing and governance, teams can realize the observability needed across all APIs—which allows better understanding of the state of enterprise operations at scale. The core elements of API-first, like discoverability, quality, security, and observability, all contribute to making API governance possible across teams. -- **Standards are always baked in** ~ Common web, industry, and organizational standards are much more ubiquitous across API-first companies than those who are earlier on in their API journey. API-first companies understand how API standards help reduce the cognitive load necessary to make sense of what APIs do while also reducing friction when it comes to documenting, testing, and integrating with APIs. They leverage standards to make APIs more intuitive, consistent, and speak to specific domains using a common vocabulary that makes sense to the widest possible audience. API-first companies understand how API standards and common patterns help contribute to almost every aspect of operations, helping them produce APIs in a way that makes them more reliable for consumers and lightening their load. -- **Regulations are just part of doing business** ~ API-first companies see regulations as a normal constraint to doing business in any sector. 
With discoverability the default in an API-first company, and all data defined as simple, reliable, and observable APIs, planning for regulation and responding to any inquiries from regulators becomes much easier. Regulation compliance is then a less daunting task for teams, and acknowledging regulatory constraints is just a natural part of doing business in a digital world. This even allows API-first companies to reduce the overhead of regulatory reporting by utilizing the APIs provided by regulatory agencies as part of their own API journeys. API-first companies understand the important role APIs are playing in transforming the relationship between the public and private sector. -- **Innovation is a priority for teams** ~ Teams within API-first companies have more time for innovation. With a more streamlined lifecycle around APIs driving productivity and quality, teams have much more breathing room when it comes to thinking about what really matters and what the next killer products and features might be. If teams aren’t just responding to problems with existing operations and are able to more confidently move forward with new products and features, they will have an increased likelihood of investing in the innovation that matters to consumers. We all want to think of our organizations as being more innovative, and API-first allows us to handle the current state of our API operations so that we can push for new work that has a much greater impact on the future of organizations and the industries we operate in. - -These are just a few of the key characteristics of companies who have realized that APIs are behind every major technological shift in our online world, from mobile to the cloud. API-first companies understand that APIs are not just isolated technical concepts; they define how your business operates online today and are what will define your digital transformation for many years to come. API-first is much more than just the technical details of APIs defined by developers; it is about establishing a collective mindset across the enterprise to have a strategy for how APIs will be done, and leveraging APIs to realize more productivity, velocity, and quality across teams. \ No newline at end of file diff --git a/docs/more/api/blueprints/why-do-you-become-api-first.md b/docs/more/api/blueprints/why-do-you-become-api-first.md deleted file mode 100644 index ca5b9d4..0000000 --- a/docs/more/api/blueprints/why-do-you-become-api-first.md +++ /dev/null @@ -1,19 +0,0 @@ -# Why should you become API-first? -Enterprise organizations that are further along in their API journey are now finding themselves asking more questions about why they should become an API-first company. Choosing to be API-first has the potential to dramatically increase the productivity of your teams, ensure high levels of quality across all of the APIs you produce, and build your operations on a more solid API platform. When you shift towards an API-first approach in your business, you establish the understanding and control you need across your operations to more effectively drive your enterprise forward. - - - **Is it possible to do this without a strategy?** - Each one of us makes thousands of API calls a day in the course of our personal and professional lives. APIs are what have defined every major technological shift in the last 20 years. Even with this reality, many enterprise organizations still lack a formal API strategy for defining how APIs will be designed, developed, delivered, and operated.
But, these are just for the APIs we build; there are hundreds of other partner and third-party APIs we depend on that we don’t have a strategy for either. There is no way for enterprise leadership to remain competitive within their industries if they don’t have a formal strategy for how APIs work across an organization, putting an API plan at the top of our list of arguments for why you should be API-first. - - **So much API redundancy behind applications** - For the last 20 years, enterprise organizations have been participating in a non-stop race to stitch together a dizzying mix of applications and systems for their operations, but now they’re increasingly developing their own web, mobile, and device applications on top of it all. All of these applications are reliant on APIs to move the enterprise business forward, resulting in an often redundant, chaotic mix of APIs that possess low levels of visibility and quality. Being API-first means that you prioritize all of the APIs behind your applications and consider them as a whole, taking into account how they will be used across all applications, rather than just scrambling to deliver a single application that often leaves vulnerable and redundant APIs in its shadow. - - **You must be able to discover all of your APIs** - Very few enterprise organizations know where all of their APIs are, let alone have them properly published to an internal, partner, or public catalog. API discovery is a top concern when it comes to finding APIs before any new APIs are started (to reduce redundancy), and while you are looking to build an application or integration. Being API-first means that all APIs are prioritized, and APIs and microservices are discoverable by default; the right API platform makes them available via search and browsing of catalogs, and also provides names, descriptions, tags, and other metadata essential to robust and accurate API discovery. This saves time and resources across operations by making sure there isn’t redundancy and waste across the endless APIs enterprise organizations are putting to work today. - - **You must have visibility inside and outside** - Being API-first involves having well-defined boundaries, security, authentication, and other practices across the API lifecycle. With those in place, you have confidence when it comes to opening up access to digital resources and capabilities to partners and third-party developers. Without proven API lifecycle practices, and the visibility and observability you need regarding how APIs are being put to use, you just won’t have the confidence needed to share your APIs with outside sources. This will prevent you from rapidly responding to partner needs. Why? Access to your internal APIs should only be provided with a secure gateway and management layer, which is a critical dimension of how API-first companies are able to remain so competitive, agile, and adaptable in today’s business landscape. - - **Elevating quality benefits your bottom line** - Lack of quality across APIs hits an organization in the pocketbook when it comes to the loss of consumer business, but also through the time spent by teams troubleshooting, fixing, and responding to preventable incidents. Quality shapes and defines the enterprise, and sets the tone for how teams think, act, and behave. Being API-first elevates the testing of API contracts, performance, integrations, and other dimensions, helping enable teams to write and generate better tests and making sure tests are present across 100% of APIs.
Teams are equipped with the training and tooling they need to develop and apply modular testing across every API they produce or consume, enforce tests via CI/CD pipelines, or schedule via monitors. - - **Providing a solid API security perimeter** - An API-first security perimeter is much more effective than firewalls and existing application security practices alone. Being API-first means that every API and microservice has a security collection that is centrally defined by security experts, but then also applied as part of the regular API development lifecycle by developers. API-first means that the shadow APIs that exist behind mobile applications—supporting system-to-system integrations and powering ephemeral applications—are elevated alongside every other API. Even the simplest of APIs are forced through the minimum security scanning and evaluation as they are being deployed or changed with each version. Security is consistently applied across all APIs used by teams, no matter what the application is or how long the API will be used by consumers. - - **Putting APIs first increases productivity** - An API-first approach means establishing well-known workspaces where API work is centralized, ensuring they possess artifacts, documentation, mock servers, environments, tests, monitors, history, and everything else new and existing team members need to get to work. Repeatable processes are established in order to optimize the design, development, deployment, and operation of APIs and microservices. Rough edges of the API lifecycle are smoothed out across teams, which are equipped with the knowledge and tooling they need to consistently deliver and iterate upon APIs. API-first organizations perpetually redefine themselves and move forward to meet the evolving demands of their consumers and the industries they operate in. - - **Meeting demands by realizing higher velocity** - Putting APIs first establishes a proven process for teams to develop, deploy, and iterate upon APIs much more efficiently. Then, when you combine this process with a consumer feedback loop, you end up with a virtuous cycle of development that can help an organization realize its maximum velocity. APIs designed as products are built for the future, and they leverage real-time feedback loops with consumers to rapidly design, iterate, test, and deploy new features. This increases the velocity with which the enterprise can respond to market needs. Teams more quickly deliver new digital resources and capabilities to help power business across applications and integrations. - - **Achieving the operational observability needed** - Without API-first you will never have the visibility you need to understand what is happening across the enterprise. Applying API-first processes ensures that each individual API has collections defined to test contract and performance, but also to realize security, governance, and other expectations of your operations. Modern approaches to API observability are all about tapping the existing outputs of your system to understand its state at any given time using your APM infrastructure. API-first ensures that every API and the lifecycle around it has outputs and is connected to reporting and existing APM solutions—this gives you 100% observability across all your APIs. Without this level of observability, there is no way for teams and business leaders to know what is truly happening across their operations, leaving them in the dark when it comes to making decisions that impact business.
- - **Governing the state of your complex system** - In order to govern APIs at scale across teams, an enterprise needs API-first discoverability, quality, and observability at its foundation. This existing testing infrastructure allows you to not just test each individual API, but also the surface area and lifecycle for that API. Without being API-first, you just won’t have the visibility across your operations to understand where consistency exists or doesn’t exist in the design of an API, but also the supporting documentation, testing, security, and the monitoring of APIs. API governance is about being able to understand the state of your complex enterprise system and having the control and influence to make updates, guide, and realize the change you need to move in the right direction. Without an API-first mindset across all of the areas being discussed here, you will never be able to fully realize API governance at scale across teams, which means you’ll never be able to head in the direction you need to compete in today’s market. - - **Being stronger by using common standards** - Complex and non-standardized APIs require more resources to onboard and integrate with, resulting in friction throughout the life of an API. APIs that employ common patterns and adopt web, industry and organizational standards as part of a wider API-first approach ensure that APIs speak to the widest possible audience, and enable interoperability across an industry. Standards help reduce the cognitive load required to understand what an API does, and they increase the likelihood that there will be common open source libraries and tooling available when producing or consuming APIs. Utilizing standards as part of an API-first organization makes good business sense: it increases the productivity, quality, and velocity that exists across any API, leaving an organization stronger while also speaking to the widest possible number of consumers within each industry. - - **Seeing regulations as less threatening to business** - API-first makes it easier for enterprise organizations to find data across the enterprise. You’ll have the visibility needed across operations to effectively prepare for regulation, but also respond in the moment to them. In an API-first world, you have the discoverability and observability present as a default part of your operations, reducing the friction associated with responding to regulatory requirements. Without an API-first strategy, you are left with teams carrying the burden of responding to each individual inquiry; they will struggle to meet the minimum requirements associated with industry regulations, rather than working on the next digital resource or capability your business needs. API-first helps organizations see regulations as more of something to build on and around, rather than something that is perpetually disrupting business and getting in their way. This removes yet another barrier for an organization that is looking to operate, evolve, and move forward as a business. - - **Making space for team Innovation to occur** - Teams in an API-first organization have more freedom to innovate. With higher quality and reliability across operations, teams have the space to innovate around new processes, products, and features. Without API-first, you are perpetually stuck in a mode where you are responding to the past, left unable to properly plan for and build for the future. 
API-first prioritizes the unwinding of the complex spaghetti mess of infrastructure that has emerged behind the web, mobile, and device applications we’ve built and used over the last two decades—so that it runs more efficiently and can be more effectively iterated upon. API-first gives teams the space and peace of mind they need to think about what matters most, and ultimately what will benefit the business when it comes to innovating out ahead of regular business operations. - - **Being API-first means being an industry leader** - The enterprise digital transformation is dependent on being API-first and is essential for remaining agile, adaptable, efficient, and competitive in today’s digital marketplace. Enterprise organizations across every business sector are waking up to the importance of APIs; however, it is only the ones who have embraced API-first that are leading and shaping business around the globe today. API-first focuses on prioritizing your infrastructure investments in a more logical way. It isn’t about less priority for applications; it’s about more visibility and planning around the pipes that power them, allowing for higher quality and more reliable infrastructure behind the applications and integrations our businesses depend upon. - -There are many reasons why your enterprise organization should become an API-first company. We hope this blog post provides you with a list of some of the most common reasons why leading technology companies and other mainstream businesses are opting to be API-first into the future. Enterprise leaders are considering these areas as they strategize how they can make the changes they need across their teams. At this point in time, it isn’t about whether you want to do APIs or not; it comes down to whether or not you are API-first or API-last. \ No newline at end of file diff --git a/docs/more/api/blueprints/workspace-checklist.md b/docs/more/api/blueprints/workspace-checklist.md deleted file mode 100644 index 0f45737..0000000 --- a/docs/more/api/blueprints/workspace-checklist.md +++ /dev/null @@ -1,25 +0,0 @@ -# Workspace Checklist -API workspaces provide a platform-driven approach to defining API operations, with a variety of building blocks that allow an organization to establish workspaces for each API, or suite of APIs, making artifacts, the lifecycle, and other elements of API operations more tangible. Providing a rich list of elements that can be applied in many different ways to provide private, team, partner, and public workspaces for producing and consuming APIs. - -- Workspace Name -- Workspace Summary -- Workspace Overview -- Workspace Visibility -- Workspace OpenAPI API -- Workspace GraphQL API -- Workspace SOAP API -- Onboarding Collection -- Documentation Collection -- Mock Collection -- Test Collection -- Workflow Collection -- Mock Server -- Test Monitor -- Sandbox Environment -- Production Environment -- Workspace Watches -- API Watches -- Collection Watches -- Collection Forks -- History -- Activity \ No newline at end of file diff --git a/docs/more/api/blueprints/workspaces.md b/docs/more/api/blueprints/workspaces.md deleted file mode 100644 index f71bcb7..0000000 --- a/docs/more/api/blueprints/workspaces.md +++ /dev/null @@ -1,23 +0,0 @@ -# Workspaces - -API workspaces are to the API lifecycle as GitHub repositories are to the software development lifecycle, and while they are separate, there is actually a symbiotic relationship between repositories and workspaces that helps move each API forward along a well-known lifecycle.
Workspaces provide a single place to design, develop, and manage APIs using OpenAPI, GraphQL, or WSDL, and manage documentation, mock servers, testing, and security using collections--providing a collaborative location for teams to move APIs forward. - -## Contents - -* [**Collection**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/12959542-5468c6eb-c110-4216-b65b-df0415513b6b) - A machine-readable artifact that acts as a container for storing and organizing multiple API requests, providing an executable, self-documented reference for a complete API, a subset of an API, as well as workflows containing multiple requests from across many different APIs in a specific order, with a precise business function. -* [**APIs**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - Providing programmatic interfaces for a variety of desktop, web, mobile, device, and other applications using HTTP, HTTP/2, HTTP/3, TCP, MQTT, and other protocols, while supporting a mix of patterns like REST, RPC, GraphQL, and event-driven to meet the growing demands of increasingly digital organizations. -* [**Environments**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/12959542-811d5926-7928-4762-926b-e4dd61954963) - Machine-readable environments for APIs allow for abstracting away common elements of an API environment from the definition of each API, allowing different environments to be paired with OpenAPI and collections for each API at design, development, and build time. -* [**Monitor**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/12959542-c6ce1a5d-5732-4ad5-9ac0-34bd7b895c01) - Monitoring any process across API operations defined as a collection, then bundled with any environment, setting a schedule for the execution of the collection, and viewing or publishing of the results to any other location, providing a very wide definition of what monitoring can mean across API operations. - -## Visibility - -* [**Private**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - Keeping something only accessible internally by a designated team of stakeholders, keeping workspaces, API, collections, environments, and monitors only accessible to team members who have been invited to access, then further delegating access control based upon their role, ensuring they can only view, or possibly edit all the artifacts and operations around any API with a private visibility. -* [**Partner**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - An established relationship between two organizations around a common activity, leveraging APIs as the way in which data, content, media, and other digital resources and capabilities are made available, helping ensure the partnership benefits both parties involved, but leverages APIs to ensure interoperability and access to partner benefits. 
-* [**Public**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - Designating a workspace, APIs, collections, environments, and monitors as having a public visibility that makes them discoverable via the public API network and search engines, increasing the audience for an API by making it available publicly, while still limiting who can actually edit artifacts and change configuration, limiting engagement with public consumers to watches, forks, and commenting on APIs and collections. - -## Engagement - -* [**Contributors**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - The individual contributors to the APIs, collections, and other artifacts and activity occurring within workspaces, tracking and acknowledging the contribution of team members and 3rd party contributors who are helping make small or large changes to any aspect of what is happening in a workspace. -* [**Team**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/) - The team of people who are behind API operations, providing name, role, and other relevant details about who they are and what they will be contributing to API operations, helping manage the human side of operations, allowing hundreds or thousands of team members to be organized and managed for optimal delivery and consumption of APIs across operations. -* [**Watch**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/12959542-e933db77-d17f-4ba8-94cd-30de8ea19a35) - Watching of some element of API operations, allowing team members, partners, or public users to signal they want to receive notifications of any change to an API and its supporting elements, making API operations more observable and something that all stakeholders are able to stay in tune with as they evolve and change. -* [**Activity**](https://www.postman.com/postman/workspace/postman-open-technologies-lifecycle/documentation/12959542-43ab9dd3-a7d9-484d-b52d-67ade5820dcf) - The changes made to any aspect of operations by team members, providing observability into when APIs, mock servers, documentation, testing, monitors, and other critical elements of API operations are changed or configured, helping give a log of everything that happens at the operational level. \ No newline at end of file diff --git a/docs/more/api/blueprints/world-building.md b/docs/more/api/blueprints/world-building.md deleted file mode 100644 index 1b97426..0000000 --- a/docs/more/api/blueprints/world-building.md +++ /dev/null @@ -1,11 +0,0 @@ -# World Building -APIs are providing the building blocks of modern digital transformation. Providing the reusable, composable, and scalable units of value that are defining digital resources across every industry. Providing the ability to deploy and operate essential services across multiple clouds and multiple regions, allowing enterprise organizations to redefine and rebuild their business anew in a much more resilient manner. Providing the digital capabilities needed to sustain existing businesses but also define entirely new sectors we haven’t imagined yet. - -- **Resources** - The enterprise has the required digital resources at its disposal for assembling the applications, integrations, and automation needed, but it also has the API factory floor with the capacity to deliver the next generation of resources needed.
Providing the building blocks of any future world we want to build, allowing us to manifest the products we know markets will need. -- **Capabilities** - There are thousands of business workflows in motion across the enterprise at any given second, each one discoverable, well-defined, and executable by business and technical stakeholders, providing the fuel for business operations today. The operations that make this possible allow for the rapid creation, maturing, and scaling of known and unknown workflows the enterprise needs to do business tomorrow. -- **Scale** - Combined with the elasticity of the cloud, an API-first digital platform provides the scale required to build entirely new worlds, providing an unlimited supply of digital resources and capabilities, as well as the teams to deliver what isn’t currently in inventory. Allowing new products to be rapidly created, iterated upon, and then scaled to meet the demand of markets within days and weeks, and not years, providing the just-in-time scale that is needed to compete in a digital world. -- **Multi-Cloud** - The next era of cloud computing will not be isolated to a single cloud, and the enterprises who have achieved a world-building level of operations are finding success operating across all of the cloud platforms. Investing in the talent, processes, and infrastructure needed to wrestle with the nuance of each cloud platform, but leveraging APIs to abstract away the differences, and successfully operate in all of the top cloud providers whenever business demands it. -- **Multi-Region** - API-first enterprises are applying their newfound API-first agility and velocity to standing up and operating API infrastructure in regions that are closest to their consumers. Responding to regulation and data nationalism, but also reducing latency by operating closer to the edge of their business networks. Helping organizations define their operations by domain, but also by geographic region, optimizing their platform experience by leveraging APIs and the cloud to provide a better consumer experience. -- **Regulated** - An API platform lends itself to highly regulated enterprise operations, responding to internal as well as external influences in a balanced way. Layering government regulation into a mix of governance that has given leadership more control over how the enterprise responds to markets and regulators, while staying focused on the needs of consumers. Pushing regulation to a state where the enterprise has control to shape whether it is a positive or negative force on how a company does business. - -APIs are the digital building blocks of the online global marketplace. APIs are how the last twenty years have been built, and they are how tomorrow’s digital landscape will be built, providing the building blocks of each new layer of the API economy that will be perpetually changing and iterating upon yesterday’s economy. diff --git a/docs/more/api/blueprints/wsdl.md b/docs/more/api/blueprints/wsdl.md deleted file mode 100644 index b7f0c60..0000000 --- a/docs/more/api/blueprints/wsdl.md +++ /dev/null @@ -1,17 +0,0 @@ -# WSDL -WSDL is an XML format for describing network services as a set of endpoints operating on messages containing either document-oriented or procedure-oriented information. The operations and messages are described abstractly, and then bound to a concrete network protocol and message format to define an endpoint. While few new APIs choose to use WSDL, it remains ubiquitous for web services across common enterprise systems.
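Before walking through the building blocks below, here is a minimal sketch of inspecting a WSDL with nothing more than the JDK's built-in DOM parser; the local file name `service.wsdl` is a hypothetical placeholder, and the sketch simply lists each port type and its operations, the elements the following sections describe:

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class WsdlPortTypes {

    // WSDL 1.1 namespace used for portType, operation, and other elements
    private static final String WSDL_NS = "http://schemas.xmlsoap.org/wsdl/";

    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);

        // Parse a local WSDL document (hypothetical file name)
        Document doc = factory.newDocumentBuilder().parse(new File("service.wsdl"));

        // Each portType groups the operations a web service exposes
        NodeList portTypes = doc.getElementsByTagNameNS(WSDL_NS, "portType");
        for (int i = 0; i < portTypes.getLength(); i++) {
            Element portType = (Element) portTypes.item(i);
            System.out.println("Port type: " + portType.getAttribute("name"));

            NodeList operations = portType.getElementsByTagNameNS(WSDL_NS, "operation");
            for (int j = 0; j < operations.getLength(); j++) {
                Element operation = (Element) operations.item(j);
                System.out.println("  Operation: " + operation.getAttribute("name"));
            }
        }
    }
}
```

The same approach can be pointed at any WSDL 1.1 document, since the `portType` and `operation` elements shown here live in the standard `http://schemas.xmlsoap.org/wsdl/` namespace.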
- -## Documents - -- **Types** - Defines the (XML Schema) data types used by the web service -- **Message** - Defines the data elements for each operation -- **Port Type** - Describes the operations that can be performed and the messages involved. -- **Binding** - Defines the protocol and data format for each port type - -## Port Type -The port type element defines a web service, the operations that can be performed, and the messages that are involved. - -- **One-Way** - The operation can receive a message but will not return a response -- **Request-Response** - The operation can receive a request and will return a response -- **Solicit-Response** - The operation can send a request and will wait for a response -- **Notification** - The operation can send a message but will not wait for a response diff --git a/docs/more/api/images/aws-api-gateway-icon.png b/docs/more/api/images/aws-api-gateway-icon.png deleted file mode 100755 index 7d3838b..0000000 Binary files a/docs/more/api/images/aws-api-gateway-icon.png and /dev/null differ diff --git a/docs/more/api/images/aws-lambda-icon.jpg b/docs/more/api/images/aws-lambda-icon.jpg deleted file mode 100755 index 7d0207d..0000000 Binary files a/docs/more/api/images/aws-lambda-icon.jpg and /dev/null differ diff --git a/docs/more/api/images/aws-lambda-icon.png b/docs/more/api/images/aws-lambda-icon.png deleted file mode 100644 index 9b4b5e7..0000000 Binary files a/docs/more/api/images/aws-lambda-icon.png and /dev/null differ diff --git a/docs/more/api/images/blueprint.png b/docs/more/api/images/blueprint.png deleted file mode 100644 index d3cc85b..0000000 Binary files a/docs/more/api/images/blueprint.png and /dev/null differ diff --git a/docs/more/api/images/blueprints/always-investing-in-standards.jpeg b/docs/more/api/images/blueprints/always-investing-in-standards.jpeg deleted file mode 100644 index f6bf5c4..0000000 Binary files a/docs/more/api/images/blueprints/always-investing-in-standards.jpeg and /dev/null differ diff --git a/docs/more/api/images/blueprints/an-api-design-review.jpeg b/docs/more/api/images/blueprints/an-api-design-review.jpeg deleted file mode 100644 index 5beb950..0000000 Binary files a/docs/more/api/images/blueprints/an-api-design-review.jpeg and /dev/null differ diff --git a/docs/more/api/images/blueprints/anatomy-of-individual-api-requests.jpeg b/docs/more/api/images/blueprints/anatomy-of-individual-api-requests.jpeg deleted file mode 100644 index 36ad4aa..0000000 Binary files a/docs/more/api/images/blueprints/anatomy-of-individual-api-requests.jpeg and /dev/null differ diff --git a/docs/more/api/images/blueprints/api-design-first.jpeg b/docs/more/api/images/blueprints/api-design-first.jpeg deleted file mode 100644 index 95bbc14..0000000 Binary files a/docs/more/api/images/blueprints/api-design-first.jpeg and /dev/null differ diff --git a/docs/more/api/images/blueprints/api-design-first.jpg b/docs/more/api/images/blueprints/api-design-first.jpg deleted file mode 100644 index cdf1fa9..0000000 Binary files a/docs/more/api/images/blueprints/api-design-first.jpg and /dev/null differ diff --git a/docs/more/api/images/blueprints/api-documentation-is-fundamental.jpeg b/docs/more/api/images/blueprints/api-documentation-is-fundamental.jpeg deleted file mode 100644 index 4f81943..0000000 Binary files a/docs/more/api/images/blueprints/api-documentation-is-fundamental.jpeg and /dev/null differ diff --git a/docs/more/api/images/blueprints/api-regulation-is-here.jpeg b/docs/more/api/images/blueprints/api-regulation-is-here.jpeg deleted 
diff --git a/docs/more/arch/references.yaml b/docs/more/arch/references.yaml deleted file mode 100644 index ff5217f..0000000 --- a/docs/more/arch/references.yaml +++ /dev/null @@ -1,8 +0,0 @@

low-code:
  backend:
    - https://www.jhipster.tech/tech-board/
  frontend:
    - https://github.com/ReactBricks
code-composition:
  - https://developer.entando.com/v7.0/docs/

diff --git a/docs/more/backup/cicd/quality-setting.md b/docs/more/backup/cicd/quality-setting.md deleted file mode 100644 index c5c167c..0000000 --- a/docs/more/backup/cicd/quality-setting.md +++ /dev/null @@ -1,112 +0,0 @@

# README

A Java template project supporting:

- [X] MAVEN Java Lib Template
- [ ] MAVEN JAVA UI Testing Template
- [ ] MAVEN Springboot Template
- [ ] Github Action
- [ ] Code Coverage
- [ ] Statistics Analysis
- [ ] CI/CD Pipeline Support

## Java Project

In real development work, the daily workflow typically includes:

1. Unit Testing
2. Code Coverage
3. Test Report
4. Code statistics
5. Version Checker
6. Jenkins pipeline
7. Docker files
8. K8S support
9. ......

This template project aims to make project setup easier.

## CheckStyle

- [spotless]
- [checkstyle](https://github.com/checkstyle/checkstyle)
- [checkstyle-github](https://github.com/checkstyle)
- [google java format](https://github.com/google/google-java-format)
- [google-style-precommit-check](https://github.com/maltzj/google-style-precommit-hook)
- [google style format maven plugin](https://github.com/Cosium/git-code-format-maven-plugin)

With the Google Checkstyle settings, the Maven command is ***mvn checkstyle:check***:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <version>3.1.1</version>
  <configuration>
    <configLocation>google_checks.xml</configLocation>
    <encoding>UTF-8</encoding>
    <consoleOutput>true</consoleOutput>
    <failsOnError>true</failsOnError>
    <linkXRef>false</linkXRef>
  </configuration>
  <executions>
    <execution>
      <id>validate</id>
      <phase>validate</phase>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

## spotless check

- [spotless](https://github.com/diffplug/spotless/)

Commands for running Spotless checks:

```shell
mvn spotless:check
mvn spotless:apply
```

## JUNIT setup

## maven surefire plugin setup

- [maven-surefire](https://maven.apache.org/surefire/maven-surefire-plugin/index.html)

## Sonar: TODO

- [sonarqube](https://www.sonarqube.org/)
- [sonar source](https://www.sonarsource.com/)
- [sonar plugins marketplace](https://www.sonarplugins.com/)

## Unit Testing coverage

- [junit]()
- [testng]()

## Github CICD: TODO

- [Action](../github/workflows/build.yml)
- [dependabot](../github/dependabot.yml)

## Gitlab-cicd

- [gitlab-cicd](https://docs.gitlab.com/ee/ci/yaml/README.html)

## Application Security Scanner

Please refer to [gitlab-security scan](https://docs.gitlab.com/ee/user/application_security/security_dashboard/index.html).

## Gradle To Maven: TODO

Reference: [gradle to maven](https://www.baeldung.com/gradle-build-to-maven-pom)

## Examples

- [Security & QA, by L1NNA Lab](https://github.com/CISC-CMPE-327)

diff --git a/docs/more/backup/cicd/scripts.sh b/docs/more/backup/cicd/scripts.sh deleted file mode 100644 index 0f460a6..0000000 --- a/docs/more/backup/cicd/scripts.sh +++ /dev/null @@ -1,4 +0,0 @@

#!/usr/bin/env bash

mvn clean checkstyle:check
mvn clean checkstyle:checkstyle-aggregate

diff --git a/docs/more/backup/help/maven-wrapper.md b/docs/more/backup/help/maven-wrapper.md deleted file mode 100644 index cb6db43..0000000 --- a/docs/more/backup/help/maven-wrapper.md +++ /dev/null @@ -1,23 +0,0 @@

# MAVEN Wrapper setup

```shell
mvn -N wrapper:wrapper
```

This generates:

- .mvnw
- .mvn
- mvnw.cmd

```shell
README.md fluentqa-parent mvnw pom.xml
docs fluentqa-thirdparty mvnw.cmd
```

## mvnw commands

```shell
./mvnw clean all
```

The commands are basically the same as the regular Maven commands.
\ No newline at end of file diff --git a/docs/more/backup/help/spring-gradle-list.md b/docs/more/backup/help/spring-gradle-list.md deleted file mode 100644 index 585efe2..0000000 --- a/docs/more/backup/help/spring-gradle-list.md +++ /dev/null @@ -1,93 +0,0 @@ -# Getting Started - -### Reference Documentation -For further reference, please consider the following sections: - -* [Official Gradle documentation](https://docs.gradle.org) -* [Spring Boot Gradle Plugin Reference Guide](https://docs.spring.io/spring-boot/docs/2.6.1/gradle-plugin/reference/html/) -* [Create an OCI image](https://docs.spring.io/spring-boot/docs/2.6.1/gradle-plugin/reference/html/#build-image) -* [Spring Integration AMQP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/amqp.html) -* [Spring Integration JDBC Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jdbc.html) -* [Spring Integration JPA Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jpa.html) -* [Spring Integration Redis Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/redis.html) -* [Spring Integration Test Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/testing.html) -* [Spring Integration Apache Kafka Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/kafka.html) -* [Spring Integration Security Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/security.html) -* [Spring Integration HTTP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/http.html) -* [Spring Integration STOMP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/stomp.html) -* [Spring Integration WebSocket Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/websocket.html) -* [Spring Web](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-developing-web-applications) -* [Rest Repositories](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-use-exposing-spring-data-repositories-rest-endpoint) -* [Spring Boot DevTools](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#using-boot-devtools) -* [Spring Configuration Processor](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#configuration-metadata-annotation-processor) -* [Vaadin](https://vaadin.com/spring) -* [Apache Freemarker](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Groovy Templates](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Spring Security](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-security) -* [Spring LDAP](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-ldap) -* [JDBC API](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-sql) -* [Spring Data JPA](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jpa-and-spring-data) -* [Spring Data JDBC](https://docs.spring.io/spring-data/jdbc/docs/current/reference/html/) -* [MyBatis Framework](https://mybatis.org/spring-boot-starter/mybatis-spring-boot-autoconfigure/) -* [Flyway Migration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-execute-flyway-database-migrations-on-startup) -* [JOOQ Access 
Layer](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jooq) -* [Spring Data Redis (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-redis) -* [Resilience4J](https://cloud.spring.io/spring-cloud-static/spring-cloud-circuitbreaker/current/reference/html) -* [Spring Data Elasticsearch (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-elasticsearch) -* [Spring Boot Actuator](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#production-ready) -* [Spring cache abstraction](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-caching) -* [Wavefront for Spring Boot documentation](https://docs.wavefront.com/wavefront_springboot.html) -* [Wavefront for Spring Boot repository](https://github.com/wavefrontHQ/wavefront-spring-boot) -* [Function](https://cloud.spring.io/spring-cloud-function/) -* [Vault Client Quick Start](https://docs.spring.io/spring-cloud-vault/docs/current/reference/html/#client-side-usage) -* [Spring for RabbitMQ](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-amqp) -* [Spring Integration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-integration) -* [Apache Kafka Streams Support](https://docs.spring.io/spring-kafka/docs/current/reference/html/_reference.html#kafka-streams) -* [Apache Kafka Streams Binding Capabilities of Spring Cloud Stream](https://docs.spring.io/spring-cloud-stream/docs/current/reference/htmlsingle/#_kafka_streams_binding_capabilities_of_spring_cloud_stream) -* [Spring for Apache Kafka](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-kafka) -* [WebSocket](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-websockets) -* [Mustache](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Thymeleaf](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) - -### Guides -The following guides illustrate how to use some features concretely: - -* [Building a RESTful Web Service](https://spring.io/guides/gs/rest-service/) -* [Serving Web Content with Spring MVC](https://spring.io/guides/gs/serving-web-content/) -* [Building REST services with Spring](https://spring.io/guides/tutorials/bookmarks/) -* [Accessing JPA Data with REST](https://spring.io/guides/gs/accessing-data-rest/) -* [Accessing Neo4j Data with REST](https://spring.io/guides/gs/accessing-neo4j-data-rest/) -* [Accessing MongoDB Data with REST](https://spring.io/guides/gs/accessing-mongodb-data-rest/) -* [Creating CRUD UI with Vaadin](https://spring.io/guides/gs/crud-with-vaadin/) -* [Securing a Web Application](https://spring.io/guides/gs/securing-web/) -* [Spring Boot and OAuth2](https://spring.io/guides/tutorials/spring-boot-oauth2/) -* [Authenticating a User with LDAP](https://spring.io/guides/gs/authenticating-ldap/) -* [Accessing Relational Data using JDBC with Spring](https://spring.io/guides/gs/relational-data-access/) -* [Managing Transactions](https://spring.io/guides/gs/managing-transactions/) -* [Accessing Data with JPA](https://spring.io/guides/gs/accessing-data-jpa/) -* [Using Spring Data JDBC](https://github.com/spring-projects/spring-data-examples/tree/master/jdbc/basics) -* [MyBatis Quick Start](https://github.com/mybatis/spring-boot-starter/wiki/Quick-Start) -* 
[Accessing data with MySQL](https://spring.io/guides/gs/accessing-data-mysql/) -* [Messaging with Redis](https://spring.io/guides/gs/messaging-redis/) -* [Building a RESTful Web Service with Spring Boot Actuator](https://spring.io/guides/gs/actuator-service/) -* [Caching Data with Spring](https://spring.io/guides/gs/caching/) -* [Messaging with RabbitMQ](https://spring.io/guides/gs/messaging-rabbitmq/) -* [Integrating Data](https://spring.io/guides/gs/integration/) -* [Samples for using Apache Kafka Streams with Spring Cloud stream](https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/kafka-streams-samples) -* [Using WebSocket to build an interactive web application](https://spring.io/guides/gs/messaging-stomp-websocket/) -* [Handling Form Submission](https://spring.io/guides/gs/handling-form-submission/) - -### Additional Links -These additional references should also help you: - -* [Gradle Build Scans – insights for your project's build](https://scans.gradle.com#gradle) -* [Various sample apps using Spring Cloud Function](https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples) - -## Observability with Wavefront - -If you don't have a Wavefront account, the starter will create a freemium account for you. -The URL to access the Wavefront Service dashboard is logged on startup. - -You can also access your dashboard using the `/actuator/wavefront` endpoint. - -Finally, you can opt-in for distributed tracing by adding the Spring Cloud Sleuth starter. diff --git a/docs/more/backup/help/spring-maven-list.md b/docs/more/backup/help/spring-maven-list.md deleted file mode 100644 index eb850df..0000000 --- a/docs/more/backup/help/spring-maven-list.md +++ /dev/null @@ -1,92 +0,0 @@ -# Getting Started - -### Reference Documentation -For further reference, please consider the following sections: - -* [Official Apache Maven documentation](https://maven.apache.org/guides/index.html) -* [Spring Boot Maven Plugin Reference Guide](https://docs.spring.io/spring-boot/docs/2.6.1/maven-plugin/reference/html/) -* [Create an OCI image](https://docs.spring.io/spring-boot/docs/2.6.1/maven-plugin/reference/html/#build-image) -* [Spring Integration AMQP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/amqp.html) -* [Spring Integration JDBC Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jdbc.html) -* [Spring Integration JPA Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jpa.html) -* [Spring Integration Redis Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/redis.html) -* [Spring Integration Test Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/testing.html) -* [Spring Integration Apache Kafka Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/kafka.html) -* [Spring Integration Security Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/security.html) -* [Spring Integration HTTP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/http.html) -* [Spring Integration STOMP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/stomp.html) -* [Spring Integration WebSocket Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/websocket.html) -* [Spring 
Web](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-developing-web-applications) -* [Rest Repositories](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-use-exposing-spring-data-repositories-rest-endpoint) -* [Spring Boot DevTools](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#using-boot-devtools) -* [Spring Configuration Processor](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#configuration-metadata-annotation-processor) -* [Vaadin](https://vaadin.com/spring) -* [Apache Freemarker](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Groovy Templates](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Spring Security](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-security) -* [Spring LDAP](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-ldap) -* [JDBC API](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-sql) -* [Spring Data JPA](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jpa-and-spring-data) -* [Spring Data JDBC](https://docs.spring.io/spring-data/jdbc/docs/current/reference/html/) -* [MyBatis Framework](https://mybatis.org/spring-boot-starter/mybatis-spring-boot-autoconfigure/) -* [Flyway Migration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-execute-flyway-database-migrations-on-startup) -* [JOOQ Access Layer](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jooq) -* [Spring Data Redis (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-redis) -* [Resilience4J](https://cloud.spring.io/spring-cloud-static/spring-cloud-circuitbreaker/current/reference/html) -* [Spring Data Elasticsearch (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-elasticsearch) -* [Spring Boot Actuator](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#production-ready) -* [Spring cache abstraction](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-caching) -* [Wavefront for Spring Boot documentation](https://docs.wavefront.com/wavefront_springboot.html) -* [Wavefront for Spring Boot repository](https://github.com/wavefrontHQ/wavefront-spring-boot) -* [Function](https://cloud.spring.io/spring-cloud-function/) -* [Vault Client Quick Start](https://docs.spring.io/spring-cloud-vault/docs/current/reference/html/#client-side-usage) -* [Spring for RabbitMQ](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-amqp) -* [Spring Integration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-integration) -* [Apache Kafka Streams Support](https://docs.spring.io/spring-kafka/docs/current/reference/html/_reference.html#kafka-streams) -* [Apache Kafka Streams Binding Capabilities of Spring Cloud Stream](https://docs.spring.io/spring-cloud-stream/docs/current/reference/htmlsingle/#_kafka_streams_binding_capabilities_of_spring_cloud_stream) -* [Spring for Apache Kafka](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-kafka) -* [WebSocket](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-websockets) -* 
[Mustache](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Thymeleaf](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) - -### Guides -The following guides illustrate how to use some features concretely: - -* [Building a RESTful Web Service](https://spring.io/guides/gs/rest-service/) -* [Serving Web Content with Spring MVC](https://spring.io/guides/gs/serving-web-content/) -* [Building REST services with Spring](https://spring.io/guides/tutorials/bookmarks/) -* [Accessing JPA Data with REST](https://spring.io/guides/gs/accessing-data-rest/) -* [Accessing Neo4j Data with REST](https://spring.io/guides/gs/accessing-neo4j-data-rest/) -* [Accessing MongoDB Data with REST](https://spring.io/guides/gs/accessing-mongodb-data-rest/) -* [Creating CRUD UI with Vaadin](https://spring.io/guides/gs/crud-with-vaadin/) -* [Securing a Web Application](https://spring.io/guides/gs/securing-web/) -* [Spring Boot and OAuth2](https://spring.io/guides/tutorials/spring-boot-oauth2/) -* [Authenticating a User with LDAP](https://spring.io/guides/gs/authenticating-ldap/) -* [Accessing Relational Data using JDBC with Spring](https://spring.io/guides/gs/relational-data-access/) -* [Managing Transactions](https://spring.io/guides/gs/managing-transactions/) -* [Accessing Data with JPA](https://spring.io/guides/gs/accessing-data-jpa/) -* [Using Spring Data JDBC](https://github.com/spring-projects/spring-data-examples/tree/master/jdbc/basics) -* [MyBatis Quick Start](https://github.com/mybatis/spring-boot-starter/wiki/Quick-Start) -* [Accessing data with MySQL](https://spring.io/guides/gs/accessing-data-mysql/) -* [Messaging with Redis](https://spring.io/guides/gs/messaging-redis/) -* [Building a RESTful Web Service with Spring Boot Actuator](https://spring.io/guides/gs/actuator-service/) -* [Caching Data with Spring](https://spring.io/guides/gs/caching/) -* [Messaging with RabbitMQ](https://spring.io/guides/gs/messaging-rabbitmq/) -* [Integrating Data](https://spring.io/guides/gs/integration/) -* [Samples for using Apache Kafka Streams with Spring Cloud stream](https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/kafka-streams-samples) -* [Using WebSocket to build an interactive web application](https://spring.io/guides/gs/messaging-stomp-websocket/) -* [Handling Form Submission](https://spring.io/guides/gs/handling-form-submission/) - -### Additional Links -These additional references should also help you: - -* [Various sample apps using Spring Cloud Function](https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples) - -## Observability with Wavefront - -If you don't have a Wavefront account, the starter will create a freemium account for you. -The URL to access the Wavefront Service dashboard is logged on startup. - -You can also access your dashboard using the `/actuator/wavefront` endpoint. - -Finally, you can opt-in for distributed tracing by adding the Spring Cloud Sleuth starter. 
diff --git a/docs/more/backup/openapi/1-openapi-diff.md b/docs/more/backup/openapi/1-openapi-diff.md deleted file mode 100644 index c497d54..0000000 --- a/docs/more/backup/openapi/1-openapi-diff.md +++ /dev/null @@ -1,37 +0,0 @@

# OpenAPI Diff

- Maven POM setting:

```xml
<dependency>
  <groupId>org.openapitools.openapidiff</groupId>
  <artifactId>openapi-diff-core</artifactId>
  <version>${openapi-diff-version}</version>
</dependency>
```

- How to get OpenAPI differences:
  * API difference
  * API difference in HTML render
  * API difference in JSON render
  * API difference in Markdown render

```java
// Requires openapi-diff-core and commons-io (FileUtils) on the classpath.
String originPetStore = "./petstore_v3.yml";
String newPetStore = "./petstore_v2.json";

ChangedOpenApi diff = OpenApiCompare.fromLocations(originPetStore, newPetStore);
System.out.println(diff);

// write to html
String html = new HtmlRender("Changelog",
        "http://deepoove.com/swagger-diff/stylesheets/demo.css")
        .render(diff);
FileUtils.writeStringToFile(new File("apiDifference.html"), html,
        Charset.defaultCharset());

String markdownRender = new MarkdownRender().render(diff);
FileUtils.writeStringToFile(new File("apiDifference.md"), markdownRender,
        Charset.defaultCharset());

String jsonDiff = new JsonRender().render(diff);
FileUtils.writeStringToFile(new File("apiDifference.json"), jsonDiff,
        Charset.defaultCharset());
```

diff --git a/docs/more/backup/openapi/10-references.md b/docs/more/backup/openapi/10-references.md deleted file mode 100644 index 64a3290..0000000 --- a/docs/more/backup/openapi/10-references.md +++ /dev/null @@ -1,7 +0,0 @@

# References

## OpenAPI Resource

- [openapi github](https://github.com/OAI)
- [openapi-diff](https://github.com/OpenAPITools/openapi-diff) - an OpenAPI diff tool for comparing different API versions

diff --git a/docs/more/backup/thirdparty/smook.md b/docs/more/backup/thirdparty/smook.md deleted file mode 100644 index dfca14e..0000000 --- a/docs/more/backup/thirdparty/smook.md +++ /dev/null @@ -1,19 +0,0 @@

# Smooks intro

While Smooks can be used as a lightweight platform on which to build your own custom processing logic (for a wide range of data formats out of the box), it comes with some very useful features that can be used individually or seamlessly combined together.

[docs](https://www.smooks.org/v2/documentation/)

## Java Binding

![img](https://www.smooks.org/v2/assets/images/Binding.png)

## Transformation

![img](https://www.smooks.org/v2/assets/images/Transform.png)

## Huge Message Processing

![img](https://www.smooks.org/v2/assets/images/Hugetrans.png)

## Message Enrichment

![img](https://www.smooks.org/v2/assets/images/Enrich.png)

diff --git a/docs/more/basic/JAVA_BASIC.png b/docs/more/basic/JAVA_BASIC.png deleted file mode 100644 index e494321..0000000 Binary files a/docs/more/basic/JAVA_BASIC.png and /dev/null differ

diff --git a/docs/more/ddd/README.md b/docs/more/ddd/README.md deleted file mode 100644 index 7173631..0000000 --- a/docs/more/ddd/README.md +++ /dev/null @@ -1,1299 +0,0 @@

# Domain-Driven Hexagon

**Check out my other repositories**:

- [Backend best practices](https://github.com/Sairyss/backend-best-practices) - Best practices, tools and guidelines for backend development.
- [System Design Patterns](https://github.com/Sairyss/system-design-patterns) - list of topics and resources related to distributed systems, system design, microservices, scalability and performance, etc.
-- [Full Stack starter template](https://github.com/Sairyss/fullstack-starter-template) - template for full stack applications based on TypeScript, React, Vite, ChakraUI, tRPC, Fastify, Prisma, zod, etc. - ---- - -The main emphasis of this project is to provide recommendations on how to design software applications. This readme includes techniques, tools, best practices, architectural patterns and guidelines gathered from different sources. - -Code examples are written using [NodeJS](https://nodejs.org/en/), [TypeScript](https://www.typescriptlang.org/), [NestJS](https://docs.nestjs.com/) framework and [Slonik](https://github.com/gajus/slonik) for the database access. - -Patterns and principles presented here are **framework/language agnostic**. Therefore, the above technologies can be easily replaced with any alternative. No matter what language or framework is used, any application can benefit from principles described below. - -**Note**: code examples are adapted to TypeScript and frameworks mentioned above.
-(Implementations in other languages will look differently) - -**Everything below is provided as a recommendation, not a rule**. Different projects have different requirements, so any pattern mentioned in this readme should be adjusted to project needs or even skipped entirely if it doesn't fit. In real world production applications, you will most likely only need a fraction of those patterns depending on your use cases. More info in [this](#general-recommendations-on-architectures-best-practices-design-patterns-and-principles) section. - ---- - -- [Domain-Driven Hexagon](#domain-driven-hexagon) -- [Architecture](#architecture) - - [Pros](#pros) - - [Cons](#cons) -- [Diagram](#diagram) -- [Modules](#modules) -- [Application Core](#application-core) -- [Application layer](#application-layer) - - [Application Services](#application-services) - - [Commands and Queries](#commands-and-queries) - - [Commands](#commands) - - [Queries](#queries) - - [Ports](#ports) -- [Domain Layer](#domain-layer) - - [Entities](#entities) - - [Aggregates](#aggregates) - - [Domain Events](#domain-events) - - [Integration Events](#integration-events) - - [Domain Services](#domain-services) - - [Value objects](#value-objects) - - [Domain Invariants](#domain-invariants) - - [Replacing primitives with Value Objects](#replacing-primitives-with-value-objects) - - [Make illegal states unrepresentable](#make-illegal-states-unrepresentable) - - [Validation at compile time](#validation-at-compile-time) - - [Validation at runtime](#validation-at-runtime) - - [Guarding vs validating](#guarding-vs-validating) - - [Domain Errors](#domain-errors) - - [Using libraries inside Application's core](#using-libraries-inside-applications-core) -- [Interface Adapters](#interface-adapters) - - [Controllers](#controllers) - - [Resolvers](#resolvers) - - [DTOs](#dtos) - - [Request DTOs](#request-dtos) - - [Response DTOs](#response-dtos) - - [Additional recommendations](#additional-recommendations) - - [Local DTOs](#local-dtos) -- [Infrastructure layer](#infrastructure-layer) - - [Adapters](#adapters) - - [Repositories](#repositories) - - [Persistence models](#persistence-models) - - [Other things that can be a part of Infrastructure layer](#other-things-that-can-be-a-part-of-infrastructure-layer) -- [Other recommendations](#other-recommendations) - - [General recommendations on architectures, best practices, design patterns and principles](#general-recommendations-on-architectures-best-practices-design-patterns-and-principles) - - [Recommendations for smaller APIs](#recommendations-for-smaller-apis) - - [Behavioral Testing](#behavioral-testing) - - [Folder and File Structure](#folder-and-file-structure) - - [File names](#file-names) - - [Enforcing architecture](#enforcing-architecture) - - [Prevent massive inheritance chains](#prevent-massive-inheritance-chains) -- [Additional resources](#additional-resources) - - [Articles](#articles) - - [Websites](#websites) - - [Blogs](#blogs) - - [Videos](#videos) - - [Books](#books) - -# Architecture - -This is an attempt to combine multiple architectural patterns and styles together, such as: - -- [Domain-Driven Design (DDD)](https://en.wikipedia.org/wiki/Domain-driven_design) -- [Hexagonal (Ports and Adapters) Architecture](https://blog.octo.com/en/hexagonal-architecture-three-principles-and-an-implementation-example/) -- [Secure by Design](https://www.manning.com/books/secure-by-design) -- [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) -- [Onion 
Architecture](https://herbertograca.com/2017/09/21/onion-architecture/) -- [SOLID Principles](https://en.wikipedia.org/wiki/SOLID) -- [Software Design Patterns](https://refactoring.guru/design-patterns/what-is-pattern) - -And many others (more links below in every chapter). - -Before we begin, here are the PROS and CONS of using a complete architecture like this: - -#### Pros - -- Independent of external frameworks, technologies, databases, etc. Frameworks and external resources can be plugged/unplugged with much less effort. -- Easily testable and scalable. -- More secure. Some security principles are baked in design itself. -- The solution can be worked on and maintained by different teams, without stepping on each other's toes. -- Easier to add new features. As the system grows over time, the difficulty in adding new features remains constant and relatively small. -- If the solution is properly broken apart along [bounded context](https://martinfowler.com/bliki/BoundedContext.html) lines, it becomes easy to convert pieces of it into microservices if needed. - -#### Cons - -- This is a sophisticated architecture which requires a firm understanding of quality software principles, such as SOLID, Clean/Hexagonal Architecture, Domain-Driven Design, etc. Any team implementing such a solution will almost certainly require an expert to drive the solution and keep it from evolving the wrong way and accumulating technical debt. - -- Some practices presented here are not recommended for small-medium sized applications with not a lot of business logic. There is added up-front complexity to support all those building blocks and layers, boilerplate code, abstractions, data mapping etc. Thus, implementing a complete architecture like this is generally ill-suited to simple [CRUD](https://en.wikipedia.org/wiki/Create,_read,_update_and_delete) applications and could over-complicate such solutions. Some principles which are described below can be used in smaller sized applications, but must be implemented only after analyzing and understanding all pros and cons. - -# Diagram - -![Domain-Driven Hexagon](assets/images/DomainDrivenHexagon.png) -Diagram is mostly based on [this one](https://github.com/hgraca/explicit-architecture-php#explicit-architecture-1) + others found online - -In short, data flow looks like this (from left to right): - -- Request/CLI command/event is sent to the controller using plain DTO; -- Controller parses this DTO, maps it to a Command/Query object format and passes it to an Application service; -- Application service handles this Command/Query; it executes business logic using domain services and entities/aggregates and uses the infrastructure layer through ports(interfaces); -- Infrastructure layer maps data to a format that it needs, retrieves/persists data from/to a database, uses adapters for other I/O communications (like sending an event to an external broker or calling external APIs), maps data back to domain format and returns it back to Application service; -- After the Application service finishes doing its job, it returns data/confirmation back to Controllers; -- Controllers return data back to the user (if application has presenters/views, those are returned instead). 
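To make the flow above concrete, here is a minimal, standalone Java sketch (the DDD examples in this readme are written in TypeScript/NestJS, so the class and method names here are hypothetical and only illustrate the hops between layers):

```java
// Plain DTO from the outside world (HTTP body, CLI arguments, message payload).
record CreateUserRequestDto(String email, String country) {}

// Application-layer command: expresses user intent in a transport-agnostic form.
record CreateUserCommand(String email, String country) {}

// Application service (use case) seen from the controller's side.
interface CreateUserUseCase {
    String handle(CreateUserCommand command); // returns the new user's id
}

// Controller: parses the DTO, maps it to a command and delegates; no business logic here.
final class UserHttpController {
    private final CreateUserUseCase createUser;

    UserHttpController(CreateUserUseCase createUser) {
        this.createUser = createUser;
    }

    String post(CreateUserRequestDto dto) {
        return createUser.handle(new CreateUserCommand(dto.email(), dto.country()));
    }
}
```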
- -Each layer is in charge of its own logic and has building blocks that usually should follow a [Single-responsibility principle](https://en.wikipedia.org/wiki/Single-responsibility_principle) when possible and when it makes sense (for example, using `Repositories` only for database access, using `Entities` for business logic, etc.). - -**Keep in mind** that different projects can have more or less steps/layers/building blocks than described here. Add more if the application requires it, and skip some if the application is not that complex and doesn't need all that abstraction. - -General recommendation for any project: analyze how big/complex the application will be, find a compromise and use as many layers/building blocks as needed for the project and skip ones that may over-complicate things. - -More in details on each step below. - -# Modules - -This project's code examples use separation by modules (also called components). Each module's name should reflect an important concept from the Domain and have its own folder with a dedicated codebase. Each business use case inside that module gets its own folder to store most of the things it needs (this is also called _Vertical Slicing_). It's easier to work on things that change together if those things are gathered relatively close to each other. Think of a module as a "box" that groups together related business logic. - -Using modules is a great way to [encapsulate]() parts of highly [cohesive]() business domain rules. - -Try to make every module independent and keep interactions between modules minimal. Think of each module as a mini application bounded by a single context. Consider module internals private and try to avoid direct imports between modules (like importing a class `import SomeClass from '../SomeOtherModule'`) since this creates [tight coupling]() and can turn your code into a [spaghetti](https://en.wikipedia.org/wiki/Spaghetti_code) and application into a [big ball of mud](https://en.wikipedia.org/wiki/Big_ball_of_mud). - -Few advices to avoid coupling: - -- Try not to create dependencies between modules or use cases. Instead, move shared logic into a separate files and make both depend on that instead of depending on each other. -- Modules can cooperate through a [mediator](https://en.wikipedia.org/wiki/Mediator_pattern#:~:text=In%20software%20engineering%2C%20the%20mediator,often%20consist%20of%20many%20classes.) or a public [facade](https://en.wikipedia.org/wiki/Facade_pattern), hiding all private internals of the module to avoid its misuse, and giving public access only to certain pieces of functionality that meant to be public. -- Alternatively modules can communicate with each other by using messages. For example, you can send commands using a commands bus or subscribe to events that other modules emit (more info on events and commands bus below). - -This ensures [loose coupling](https://en.wikipedia.org/wiki/Loose_coupling), refactoring of a module internals can be done easier because outside world only depends on module's public interface, and if bounded contexts are defined and designed properly each module can be easily separated into a microservice if needed without touching any domain logic or major refactoring. - -Keep your modules small. You should be able to rewrite a module in a relatively short period of time. This applies not only to modules pattern, but to software development in general: objects, functions, microservices, processes, etc. Keep them small and composable. 
This is incredibly powerful in a constantly changing environments of software development, since when your requirements change, changing small modules is much easier than changing a big program. You can just delete a module and rewrite it from scratch in a matter of days. This idea is further described in this talk: [Greg Young - The art of destroying software](https://youtu.be/Ed94CfxgsCA). - -Code Examples: - -- Check [src/modules](src/modules) directory structure. -- [src/modules/user/commands](src/modules/user/commands) - "commands" directory in a user module includes business use cases (commands) that a module can execute, each with its own Vertical Slice. - -Read more: - -- [Modular programming: Beyond the spaghetti mess](https://www.tiny.cloud/blog/modular-programming-principle/). -- [What are Modules in Domain Driven Design?](https://www.culttt.com/2014/12/10/modules-domain-driven-design/) -- [How to Implement Vertical Slice Architecture](https://garywoodfine.com/implementing-vertical-slice-architecture/) - -Each module consists of layers described below. - -# Application Core - -This is the core of the system which is built using [DDD building blocks](https://dzone.com/articles/ddd-part-ii-ddd-building-blocks): - -**Domain layer**: - -- Entities -- Aggregates -- Domain Services -- Value Objects -- Domain Errors - -**Application layer**: - -- Application Services -- Commands and Queries -- Ports - -**Note**: different implementations may have slightly different layer structures depending on applications needs. Also, more layers and building blocks may be added if needed. - ---- - -# Application layer - -## Application Services - -Application Services (also called "Workflow Services", "Use Cases", "Interactors", etc.) are used to orchestrate the steps required to fulfill the commands imposed by the client. - -Application services: - -- Typically used to orchestrate how the outside world interacts with your application and performs tasks required by the end users; -- Contain no domain-specific business logic; -- Operate on scalar types, transforming them into Domain types. A scalar type can be considered any type that's unknown to the Domain Model. This includes primitive types and types that don't belong to the Domain; -- Uses ports to declare dependencies on infrastructural services/adapters required to execute domain logic (ports are just interfaces, we will discuss this topic in details below); -- Fetch domain `Entities`/`Aggregates` (or anything else) from database/external APIs (through ports/interfaces, with concrete implementations injected by the [DI](https://en.wikipedia.org/wiki/Dependency_injection) library); -- Execute domain logic on those `Entities`/`Aggregates` (by invoking their methods); -- In case of working with multiple `Entities`/`Aggregates`, use a `Domain Service` to orchestrate them; -- Execute other out-of-process communications through Ports (like event emits, sending emails, etc.); -- Services can be used as a `Command`/`Query` handlers; -- Should not depend on other application services since it may cause problems (like cyclic dependencies); - -One service per use case is considered a good practice. - -
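As a rough Java illustration of the points above (hypothetical names; the repository's own example below is TypeScript), an application service receives a command, talks to the outside world only through ports, and returns minimal metadata such as the created id:

```java
import java.util.UUID;

// Command handled by exactly one application service (one service per use case).
record CreateUserCommand(String email, String country) {}

// Port: an interface owned by the application core; the adapter lives in infrastructure.
interface UserRepositoryPort {
    boolean existsByEmail(String email);
    void save(String id, String email, String country);
}

final class CreateUserService {
    private final UserRepositoryPort userRepository; // concrete adapter injected from outside

    CreateUserService(UserRepositoryPort userRepository) {
        this.userRepository = userRepository;
    }

    String handle(CreateUserCommand command) {
        // Orchestrate the steps of the use case; real business rules belong to entities/domain services.
        if (userRepository.existsByEmail(command.email())) {
            throw new IllegalStateException("User already exists: " + command.email());
        }
        String id = UUID.randomUUID().toString();
        userRepository.save(id, command.email(), command.country());
        return id; // a command handler returns only minimal metadata
    }
}
```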
-What are "Use Cases"? - -[wiki](https://en.wikipedia.org/wiki/Use_case): - -> In software and systems engineering, a use case is a list of actions or event steps typically defining the interactions between a role (known in the Unified Modeling Language as an actor) and a system to achieve a goal. - -Use cases are, simply said, list of actions required from an application. - ---- - -
- -Example file: [create-user.service.ts](src/modules/user/commands/create-user/create-user.service.ts) - -More about services: - -- [Domain-Application-Infrastructure Services pattern](https://badia-kharroubi.gitbooks.io/microservices-architecture/content/patterns/tactical-patterns/domain-application-infrastructure-services-pattern.html) -- [Services in DDD finally explained](https://developer20.com/services-in-ddd-finally-explained/) - -## Commands and Queries - -This principle is called [Command–Query Separation(CQS)](https://en.wikipedia.org/wiki/Command%E2%80%93query_separation). When possible, methods should be separated into `Commands` (state-changing operations) and `Queries` (data-retrieval operations). To make a clear distinction between those two types of operations, input objects can be represented as `Commands` and `Queries`. Before DTO reaches the domain, it's converted into a `Command`/`Query` object. - -### Commands - -`Command` is an object that signals user intent, for example `CreateUserCommand`. It describes a single action (but does not perform it). - -`Commands` are used for state-changing actions, like creating new user and saving it to the database. Create, Update and Delete operations are considered as state-changing. - -Data retrieval is responsibility of `Queries`, so `Command` methods should not return business data. - -Some CQS purists may say that a `Command` shouldn't return anything at all. But you will need at least an ID of a created item to access it later. To achieve that you can let clients generate a [UUID](https://en.wikipedia.org/wiki/Universally_unique_identifier) (more info here: [CQS versus server generated IDs](https://blog.ploeh.dk/2014/08/11/cqs-versus-server-generated-ids/)). - -Though, violating this rule and returning some metadata, like `ID` of a created item, redirect link, confirmation message, status, or other metadata is a more practical approach than following dogmas. - -**Note**: `Command` is similar but not the same as described here: [Command Pattern](https://refactoring.guru/design-patterns/command). There are multiple definitions across the internet with similar but slightly different implementations. - -To execute a command you can use a `Command Bus` instead of importing a service directly. This will decouple a command Invoker from a Receiver, so you can send your commands from anywhere without creating coupling. - -Avoid command handlers executing other commands in this fashion: Command → Command. Instead, use events for that purpose, and execute next commands in a chain in an Event handler: Command → Event → Command. - -Example files: - -- [create-user.command.ts](src/modules/user/commands/create-user/create-user.command.ts) - a command Object -- [create-user.message.controller.ts](src/modules/user/commands/create-user/create-user.message.controller.ts) - controller executes a command using a command bus. This decouples it from a command handler. -- [create-user.service.ts](src/modules/user/commands/create-user/create-user.service.ts) - a command handler. - -Read more: - -- [What is a command bus and why should you use it?](https://barryvanveen.nl/blog/49-what-is-a-command-bus-and-why-should-you-use-it) -- [Why You Should Avoid Command Handlers Calling Other Commands?](https://www.rahulpnath.com/blog/avoid-commands-calling-commands/) - -### Queries - -`Query` is similar to a `Command`. It belongs to a read model and signals user intent to find something and describes how to do it. 
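A standalone Java sketch of the Command/Query distinction just described (hypothetical names; the repository's examples use TypeScript and the NestJS CQRS bus):

```java
// Command: immutable object expressing intent to change state (write model).
record CreateUserCommand(String email, String country) {}

// Query: immutable object expressing intent to read data (read model), no side effects.
record FindUsersQuery(String country, int limit) {}

// Minimal command bus: callers dispatch commands without importing the handler directly.
interface CommandBus {
    String dispatch(Object command); // returns only minimal metadata, e.g. the created id
}
```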
- -`Query` is just a data retrieval operation and should not make any state changes (like writes to the database, files, third party APIs, etc.). For this reason, in read model we can bypass a domain and repository layers completely and query database directly from a query handler. - -Similarly to Commands, Queries can use a `Query Bus` if needed. This way you can query anything from anywhere without importing classes directly and avoid coupling. - -Example files: - -- [find-users.query-handler.ts](src/modules/user/queries/find-users/find-users.query-handler.ts) - a query handler. Notice how we query the database directly, without using domain objects or repositories (more info [here](https://codeopinion.com/should-you-use-the-repository-pattern-with-cqrs-yes-and-no/)). - ---- - -By enforcing `Command` and `Query` separation, the code becomes simpler to understand. One changes something, another just retrieves data. - -Also, following CQS from the start will facilitate separating write and read models into different databases if someday in the future the need for it arises. - -**Note**: this repo uses [NestJS CQRS](https://docs.nestjs.com/recipes/cqrs) package that provides a command/query bus. - -Read more about CQS and CQRS: - -- [Command Query Segregation](https://khalilstemmler.com/articles/oop-design-principles/command-query-segregation/). -- [Exposing CQRS Through a RESTful API](https://www.infoq.com/articles/rest-api-on-cqrs/) -- [What is the CQRS pattern?](https://docs.microsoft.com/en-us/azure/architecture/patterns/cqrs) -- [CQRS and REST: the perfect match](https://lostechies.com/jimmybogard/2016/06/01/cqrs-and-rest-the-perfect-match/) - ---- - -## Ports - -Ports are interfaces that define contracts that should be implemented by adapters. For example, a port can abstract technology details (like what type of database is used to retrieve some data), and infrastructure layer can implement an adapter in order to execute some action more related to technology details rather than business logic. Ports act like [abstractions]() for technology details that business logic does not care about. Name "port" most actively is used in [Hexagonal Architecture](). - -In Application Core **dependencies point inwards**. Outer layers can depend on inner layers, but inner layers never depend on outer layers. Application Core shouldn't depend on frameworks or access external resources directly. Any external calls to out-of-process resources/retrieval of data from remote processes should be done through `ports` (interfaces), with class implementations created somewhere in infrastructure layer and injected into application's core ([Dependency Injection](https://en.wikipedia.org/wiki/Dependency_injection) and [Dependency Inversion](https://en.wikipedia.org/wiki/Dependency_inversion_principle)). This makes business logic independent of technology, facilitates testing, allows to plug/unplug/swap any external resources easily making application modular and [loosely coupled](https://en.wikipedia.org/wiki/Loose_coupling). - -- Ports are basically just interfaces that define what has to be done and don't care about how it's done. -- Ports can be created to abstract side effects like I/O operations and database access, technology details, invasive libraries, legacy code etc. from the Domain. -- By abstracting side effects, you can test your application logic in isolation by [mocking](https://en.wikipedia.org/wiki/Mock_object) the implementation. 
This can be useful for [unit testing](https://en.wikipedia.org/wiki/Unit_testing). -- Ports should be created to fit the Domain needs, not simply mimic the tools APIs. -- Mock implementations can be passed to ports while testing. Mocking makes your tests faster and independent of the environment. -- Abstraction provided by ports can be used to inject different implementations to a port if needed ([polymorphism]()). -- When designing ports, remember the [Interface segregation principle](https://en.wikipedia.org/wiki/Interface_segregation_principle). Split large interfaces into smaller ones when it makes sense, but also keep in mind to not overdo it when not necessary. -- Ports can also help to delay decisions. The Domain layer can be implemented even before deciding what technologies (frameworks, databases etc.) will be used. - -**Note**: since most ports implementations are injected and executed in application service, Application Layer can be a good place to keep those ports. But there are times when the Domain Layer's business logic depends on executing some external resource, in such cases those ports can be put in a Domain Layer. - -**Note**: abusing ports/interfaces may lead to [unnecessary abstractions](https://mortoray.com/2014/08/01/the-false-abstraction-antipattern/) and overcomplicate your application. In a lot of cases it's totally fine to depend on a concrete implementation instead of abstracting it with an interface. Think carefully if you really need an abstraction before using it. - -Example files: - -- [repository.port.ts](src/libs/ddd/repository.port.ts) - generic port for repositories -- [user.repository.port.ts](src/modules/user/database/user.repository.port.ts) - a port for user repository -- [find-users.query-handler.ts](src/modules/user/queries/find-users/find-users.query-handler.ts) - notice how query handler depends on a port instead of concrete repository implementation, and an implementation is injected -- [logger.port.ts](src/libs/ports/logger.port.ts) - another example of a port for application logger - -Read more: - -- [A Color Coded Guide to Ports and Adapters](https://8thlight.com/blog/damon-kelley/2021/05/18/a-color-coded-guide-to-ports-and-adapters.html) - ---- - -# Domain Layer - -This layer contains the application's business rules. - -Domain should operate using domain objects described by [ubiquitous language](https://martinfowler.com/bliki/UbiquitousLanguage.html). Most important domain building blocks are described below. - -- [Developing the ubiquitous language](https://medium.com/@felipefreitasbatista/developing-the-ubiquitous-language-1382b720bb8c) - -## Entities - -Entities are the core of the domain. They encapsulate Enterprise-wide business rules and attributes. An entity can be an object with properties and methods, or it can be a set of data structures and functions. - -Entities represent business models and express what properties a particular model has, what it can do, when and at what conditions it can do it. An example of business model can be a User, Product, Booking, Ticket, Wallet etc. - -Entities must always protect their [invariant](https://en.wikipedia.org/wiki/Class_invariant): - -> Domain entities should always be valid entities. There are a certain number of invariants for an object that should always be true. For example, an order item object always has to have a quantity that must be a positive integer, plus an article name and price. 
Therefore, invariants enforcement is the responsibility of the domain entities (especially of the aggregate root), and an entity object should not be able to exist without being valid.
-
-Entities:
-
-- Contain Domain business logic. Avoid having business logic in your services when possible, as this leads to an [Anemic Domain Model](https://martinfowler.com/bliki/AnemicDomainModel.html) (Domain Services are an exception for business logic that can't be put in a single entity).
-- Have an identity that defines them and makes them distinguishable from others. An entity's identity is consistent during its life cycle.
-- Equality between two entities is determined by comparing their identifiers (usually their `id` field).
-- Can contain other objects, such as other entities or value objects.
-- Are responsible for collecting all the understanding of state and how it changes in the same place.
-- Are responsible for the coordination of operations on the objects they own.
-- Know nothing about upper layers (services, controllers etc.).
-- Domain entity data should be modelled to accommodate business logic, not some database schema.
-- Entities must protect their invariants; try to avoid public setters - update state using methods and execute invariant validation on each update if needed (this can be a simple `validate()` method that checks if business rules are not violated by the update).
-- Must be consistent on creation. Validate Entities and other domain objects on creation and throw an error on the first failure. [Fail Fast](https://en.wikipedia.org/wiki/Fail-fast).
-- Avoid no-arg (empty) constructors; accept and validate all required properties in a constructor (or in a [factory method](https://en.wikipedia.org/wiki/Factory_method_pattern) like `create()`).
-- For optional properties that require some complex setting up, [Fluent interface](https://en.wikipedia.org/wiki/Fluent_interface) and [Builder Pattern](https://refactoring.guru/design-patterns/builder) can be used.
-- Make Entities partially immutable. Identify what properties shouldn't change after creation and make them `readonly` (for example `id` or `createdAt`).
-
-**Note**: A lot of people tend to create one module per entity, but this approach is not very good. Each module may have multiple entities. Keep in mind, though, that putting entities in a single module requires those entities to have related business logic; don't group unrelated entities in one module.
-
-Example files:
-
-- [user.entity.ts](src/modules/user/domain/user.entity.ts)
-- [wallet.entity.ts](src/modules/wallet/domain/wallet.entity.ts)
-
-Read more:
-
-- [Domain Entity pattern](https://badia-kharroubi.gitbooks.io/microservices-architecture/content/patterns/tactical-patterns/domain-entity-pattern.html)
-- [Secure by design: Chapter 6 Ensuring integrity of state](https://livebook.manning.com/book/secure-by-design/chapter-6/)
-
----
-
-## Aggregates
-
-[Aggregate](https://martinfowler.com/bliki/DDD_Aggregate.html) is a cluster of domain objects that can be treated as a single unit. It encapsulates entities and value objects which conceptually belong together. It also contains a set of operations that can be performed on those domain objects.
-
-- Aggregates help to simplify the domain model by gathering multiple domain objects under a single abstraction.
-- Aggregates should not be influenced by the data model. Associations between domain objects are not the same as database relationships.
-- Aggregate root is an entity that contains other entities/value objects and all logic to operate them. -- Aggregate root has global identity ([UUID / GUID](https://en.wikipedia.org/wiki/Universally_unique_identifier) / primary key). Entities inside the aggregate boundary have local identities, unique only within the Aggregate. -- Aggregate root is a gateway to entire aggregate. Any references from outside the aggregate should **only** go to the aggregate root. -- Any operations on an aggregate must be [transactional operations](https://en.wikipedia.org/wiki/Database_transaction). Either everything gets saved/updated/deleted or nothing. -- Only Aggregate Roots can be obtained directly with database queries. Everything else must be done through traversal. -- Similar to `Entities`, aggregates must protect their invariants through entire lifecycle. When a change to any object within the Aggregate boundary is committed, all invariants of the whole Aggregate must be satisfied. Simply said, all objects in an aggregate must be consistent, meaning that if one object inside an aggregate changes state, this shouldn't conflict with other domain objects inside this aggregate (this is called _Consistency Boundary_). -- Objects within the Aggregate can reference other Aggregate roots via their globally unique identifier (id). Avoid holding a direct object reference. -- Try to avoid aggregates that are too big, this can lead to performance and maintaining problems. -- Aggregates can publish `Domain Events` (more on that below). - -All of these rules just come from the idea of creating a boundary around Aggregates. The boundary simplifies business model, as it forces us to consider each relationship very carefully, and within a well-defined set of rules. - -In summary, if you combine multiple related entities and value objects inside one root `Entity`, this root `Entity` becomes an `Aggregate Root`, and this cluster of related entities and value objects becomes an `Aggregate`. - -Example files: - -- [aggregate-root.base.ts](src/libs/ddd/aggregate-root.base.ts) - abstract base class. -- [user.entity.ts](src/modules/user/domain/user.entity.ts) - aggregates are just entities that have to follow a set of specific rules described above. - -Read more: - -- [Understanding Aggregates in Domain-Driven Design](https://dzone.com/articles/domain-driven-design-aggregate) -- [What Are Aggregates In Domain-Driven Design?](https://www.jamesmichaelhickey.com/domain-driven-design-aggregates/) <- this is a series of multiple articles, don't forget to click "Next article" at the end. -- [Effective Aggregate Design Part I: Modeling a Single Aggregate](https://www.dddcommunity.org/wp-content/uploads/files/pdf_articles/Vernon_2011_1.pdf) -- [Effective Aggregate Design Part II: Making Aggregates Work Together](https://www.dddcommunity.org/wp-content/uploads/files/pdf_articles/Vernon_2011_2.pdf) - ---- - -## Domain Events - -Domain Event indicates that something happened in a domain that you want other parts of the same domain (in-process) to be aware of. Domain events are just messages pushed to an in-memory Domain Event dispatcher. - -For example, if a user buys something, you may want to: - -- Update his shopping cart; -- Withdraw money from his wallet; -- Create a new shipping order; -- Perform other domain operations that are not a concern of an aggregate that executes a "buy" command. - -The typical approach involves executing all this logic in a service that performs a "buy" operation. 
However, this creates coupling between different subdomains. - -An alternative approach would be publishing a `Domain Event`. If executing a command related to one aggregate instance requires additional domain rules to be run on one or more additional aggregates, you can design and implement those side effects to be triggered by Domain Events. Propagation of state changes across multiple aggregates within the same domain model can be performed by subscribing to a concrete `Domain Event` and creating as many event handlers as needed. This prevents coupling between aggregates. - -Domain Events may be useful for creating an [audit log](https://en.wikipedia.org/wiki/Audit_trail) to track all changes to important entities by saving each event to the database. Read more on why audit logs may be useful: [Why soft deletes are evil and what to do instead](https://jameshalsall.co.uk/posts/why-soft-deletes-are-evil-and-what-to-do-instead). - -All changes caused by Domain Events across multiple aggregates in a single process can be saved in a single database [transaction](https://en.wikipedia.org/wiki/Database_transaction). This approach ensures consistency and integrity of your data. Wrapping an entire flow in a transaction or using patterns like [Unit of Work](https://java-design-patterns.com/patterns/unit-of-work/) or similar can help with that. -**Keep in mind** that abusing transactions can create bottlenecks when multiple users try to modify single record concurrently. Use it only when you can afford it, otherwise go for other approaches (like [eventual consistency](https://en.wikipedia.org/wiki/Eventual_consistency)). - -There are multiple ways on implementing an event bus for Domain Events, for example by using ideas from patterns like [Mediator](https://refactoring.guru/design-patterns/mediator) or [Observer](https://refactoring.guru/design-patterns/observer). - -Examples: - -- [user-created.domain-event.ts](src/modules/user/domain/events/user-created.domain-event.ts) - simple object that holds data related to published event. -- [create-wallet-when-user-is-created.domain-event-handler.ts](src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts) - this is an example of Domain Event Handler that executes some actions when a domain event is raised (in this case, when user is created it also creates a wallet for that user). -- [sql-repository.base.ts](src/libs/db/sql-repository.base.ts) - repository publishes all domain events for execution when it persists changes to an aggregate. -- [create-user.service.ts](src/modules/user/commands/create-user/create-user.service.ts) - in a service we execute a global transaction to make sure all the changes done by Domain Events across the application are stored atomically (all or nothing). - -To have a better understanding on domain events and implementation read this: - -- [Domain Event pattern](https://badia-kharroubi.gitbooks.io/microservices-architecture/content/patterns/tactical-patterns/domain-event-pattern.html) -- [Domain events: design and implementation](https://docs.microsoft.com/en-us/dotnet/architecture/microservices/microservice-ddd-cqrs-patterns/domain-events-design-implementation) - -**Additional notes**: - -- When using only events for complex workflows with a lot of steps, it will be hard to track everything that is happening across the application. One event may trigger another one, then another one, and so on. 
To track the entire workflow you'll have to go multiple places and search for an event handler for each step, which is hard to maintain. In this case, using a service/orchestrator/mediator might be a preferred approach compared to only using events since you will have an entire workflow in one place. This might create some coupling, but is easier to maintain. Don't rely on events only, pick the right tool for the job. - -- In some cases you will not be able to save all changes done by your events to multiple aggregates in a single transaction. For example, if you are using microservices that span transaction between multiple services, or [Event Sourcing pattern](https://docs.microsoft.com/en-us/azure/architecture/patterns/event-sourcing) that has a single stream per aggregate. In this case saving events across multiple aggregates can be eventually consistent (for example by using [Sagas](https://microservices.io/patterns/data/saga.html) with compensating events or a [Process Manager](https://www.enterpriseintegrationpatterns.com/patterns/messaging/ProcessManager.html) or something similar). - -## Integration Events - -Out-of-process communications (calling microservices, external APIs) are called `Integration Events`. If sending a Domain Event to external process is needed then domain event handler should send an `Integration Event`. - -Integration Events usually should be published only after all Domain Events finished executing and saving all changes to the database. - -To handle integration events in microservices you may need an external message broker / event bus like [RabbitMQ](https://www.rabbitmq.com/) or [Kafka](https://kafka.apache.org/) together with patterns like [Transactional outbox](https://microservices.io/patterns/data/transactional-outbox.html), [Change Data Capture](https://en.wikipedia.org/wiki/Change_data_capture), [Sagas](https://microservices.io/patterns/data/saga.html) or a [Process Manager](https://www.enterpriseintegrationpatterns.com/patterns/messaging/ProcessManager.html) to maintain [eventual consistency](https://en.wikipedia.org/wiki/Eventual_consistency). - -Read more: - -- [Domain Events vs. Integration Events in Domain-Driven Design and microservices architectures](https://devblogs.microsoft.com/cesardelatorre/domain-events-vs-integration-events-in-domain-driven-design-and-microservices-architectures/) - -For integration events in distributed systems here are some patterns that may be useful: - -- [Saga distributed transactions](https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/saga/saga) -- [Saga vs. Process Manager](https://blog.devarchive.net/2015/11/saga-vs-process-manager.html) -- [The Outbox Pattern](https://www.kamilgrzybek.com/design/the-outbox-pattern/) -- [Event Sourcing pattern](https://docs.microsoft.com/en-us/azure/architecture/patterns/event-sourcing) - ---- - -## Domain Services - -Eric Evans, Domain-Driven Design: - -> Domain services are used for "a significant process or transformation in the domain that is not a natural responsibility of an ENTITY or VALUE OBJECT" - -- Domain Service is a specific type of domain layer class that is used to execute domain logic that relies on two or more `Entities`. -- Domain Services are used when putting the logic on a particular `Entity` would break encapsulation and require the `Entity` to know about things it really shouldn't be concerned with. -- Domain services are very granular, while application services are a facade purposed with providing an API. 
-- Domain services operate only on types belonging to the Domain. They contain meaningful concepts that can be found within the Ubiquitous Language. They hold operations that don't fit well into Value Objects or Entities. - ---- - -## Value objects - -Some Attributes and behaviors can be moved out of the entity itself and put into `Value Objects`. - -Value Objects: - -- Have no identity. Equality is determined through structural property. -- Are immutable. -- Can be used as an attribute of `entities` and other `value objects`. -- Explicitly defines and enforces important constraints (invariants). - -Value object shouldn’t be just a convenient grouping of attributes but should form a well-defined concept in the domain model. This is true even if it contains only one attribute. When modeled as a conceptual whole, it carries meaning when passed around, and it can uphold its constraints. - -Imagine you have a `User` entity which needs to have an `address` of a user. Usually an address is simply a complex value that has no identity in the domain and is composed of multiple other values, like `country`, `street`, `postalCode` etc., so it can be modeled and treated as a `Value Object` with its own business logic. - -`Value object` isn’t just a data structure that holds values. It can also encapsulate logic associated with the concept it represents. - -Example files: - -- [address.value-object.ts](src/modules/user/domain/value-objects/address.value-object.ts) - -Read more about Value Objects: - -- [Martin Fowler blog](https://martinfowler.com/bliki/ValueObject.html) -- [Value Objects to the rescue](https://medium.com/swlh/value-objects-to-the-rescue-28c563ad97c6). -- [Value Object pattern](https://badia-kharroubi.gitbooks.io/microservices-architecture/content/patterns/tactical-patterns/value-object-pattern.html) - -## Domain Invariants - -Domain [invariants]() are the policies and conditions that are always met for the Domain in particular context. Invariants determine what is possible or what is prohibited in the context. - -Invariants enforcement is the responsibility of domain objects (especially of the entities and aggregate roots). - -There are a certain number of invariants for an object that should always be true. For example: - -- When sending money, amount must always be a positive integer, and there always must be a receiver credit card number in a correct format; -- Client cannot purchase a product that is out of stock; -- Client's wallet cannot have less than 0 balance; -- etc. - -If the business has some rules similar to described above, the domain object should not be able to exist without following those rules. - -Below we will discuss some validation techniques for your domain objects. - -Example files: - -- [wallet.entity.ts](src/modules/wallet/domain/wallet.entity.ts) - notice `validate` method. This is a simplified example of enforcing a domain invariant. - -Read more: - -- [Design validations in the domain model layer](https://docs.microsoft.com/en-us/dotnet/architecture/microservices/microservice-ddd-cqrs-patterns/domain-model-layer-validations) -- [Why Domain Invariants are critical to build good software?](https://no-kill-switch.ghost.io/why-domain-invariants-are-critical-to-build-good-software/) - -### Replacing primitives with Value Objects - -Most of the code bases operate on primitive types – `strings`, `numbers` etc. In the Domain Model, this level of abstraction may be too low. - -Significant business concepts can be expressed using specific types and classes. 
`Value Objects` can be used instead of primitives to avoid [primitive obsession](https://refactoring.guru/smells/primitive-obsession).
-So, for example, `email` of type `string`:
-
-```typescript
-const email: string = 'john@gmail.com';
-```
-
-could be represented as a `Value Object` instead:
-
-```typescript
-export class Email extends ValueObject<string> {
-  constructor(value: string) {
-    super({ value });
-  }
-
-  get value(): string {
-    return this.props.value;
-  }
-}
-```
-
-```typescript
-const email: Email = new Email('john@gmail.com');
-```
-
-Now the only way to make an `email` is to create a new instance of the `Email` class first, which ensures it will be validated on creation and that a wrong value won't get into `Entities`.
-
-Also, an important behavior of the domain primitive is encapsulated in one place. By having the domain primitive own and control domain operations, you reduce the risk of bugs caused by lack of detailed domain knowledge of the concepts involved in the operation.
-
-Creating an object for primitive values may be cumbersome, but it forces a developer to study the domain in more detail instead of just throwing a primitive type around without thinking about what that value represents in the domain.
-
-Using `Value Objects` for primitive types is also called a `domain primitive`. The concept and naming are proposed in the book ["Secure by Design"](https://www.manning.com/books/secure-by-design).
-
-Using `Value Objects` instead of primitives:
-
-- Makes code easier to understand by using ubiquitous language instead of just `string`.
-- Improves security by ensuring invariants of every property.
-- Encapsulates specific business rules associated with a value.
-
-A `Value Object` can represent a typed value in the domain (a _domain primitive_). The goal here is to encapsulate validations and business logic related only to the represented fields and make it impossible to pass around raw values by forcing the creation of valid `Value Objects` first. This object only accepts values which make sense in its context.
-
-If every argument and return value of a method is valid by definition, you’ll have input and output validation in every single method in your codebase without any extra effort. This will make the application more resilient to errors and will protect it from a whole class of bugs and security vulnerabilities caused by invalid input data.
-
-> Without domain primitives, the remaining code needs to take care of validation, formatting, comparing, and lots of other details. Entities represent long-lived objects with a distinguished identity, such as articles in a news feed, rooms in a hotel, and shopping carts in online sales. The functionality in a system often centers around changing the state of these objects: hotel rooms are booked, shopping cart contents are
-> paid for, and so on. Sooner or later the flow of control will be guided to some code representing these entities. And if all the data is transmitted as generic types such as int or String, responsibilities fall on the entity code to validate, compare, and format the data, among other tasks. The entity code will be burdened with a lot of
-> tasks, rather than focusing on the central business flow-of-state changes that it models. Using domain primitives can counteract the tendency for entities to grow overly complex.
- -Quote from: [Secure by design: Chapter 5.3 Standing on the shoulders of domain primitives](https://livebook.manning.com/book/secure-by-design/chapter-5/96) - -Also, an alternative for creating an object may be a [type alias](https://www.typescriptlang.org/docs/handbook/advanced-types.html#type-aliases) (ideally using [nominal types](https://betterprogramming.pub/nominal-typescript-eee36e9432d2)) just to give this primitive a semantic meaning. - -**Warning**: Don't include Value Objects in objects that can be sent to other processes, like dtos, events, database models etc. Serialize them to primitive types first. - -**Note**: In languages like TypeScript, creating value objects for single values/primitives adds some extra complexity and boilerplate code, since you need to access an underlying value by doing something like `email.value`. Also, it can have performance penalties due to creation of so many objects. This technique works best in languages like [Scala](https://www.scala-lang.org/) with its [value classes](https://docs.scala-lang.org/overviews/core/value-classes.html) that represents such classes as primitives at runtime, meaning that object `Email` will be represented as `String` at runtime. - -**Note**: if you are using nodejs, [Runtypes](https://www.npmjs.com/package/runtypes) is a nice library that you can use instead of creating your own value objects for primitives. - -**Note**: Some people say that _primitive obsession_ is a code smell, some people consider making a class/object for every primitive may be overengineering (unless you are using Scala with its value classes). For less complex and smaller projects it's definitely an overkill. For bigger projects, there are people who advocate for and against this approach. If you notice that creating a class for every primitive doesn't give you much benefit, create classes just for those primitives that have specific rules or behavior, or just validate only outside of domain using some validation framework. Here are some thoughts on this topic: [From Primitive Obsession to Domain Modelling - Over-engineering?](https://blog.ploeh.dk/2015/01/19/from-primitive-obsession-to-domain-modelling/#7172fd9ca69c467e8123a20f43ea76c2). - -Recommended reading: - -- [Primitive Obsession — A Code Smell that Hurts People the Most](https://medium.com/the-sixt-india-blog/primitive-obsession-code-smell-that-hurt-people-the-most-5cbdd70496e9) -- [Domain Primitives: what they are and how you can use them to make more secure software](https://freecontent.manning.com/domain-primitives-what-they-are-and-how-you-can-use-them-to-make-more-secure-software/) -- [Value Objects Like a Pro](https://medium.com/@nicolopigna/value-objects-like-a-pro-f1bfc1548c72) -- ["Secure by Design" Chapter 5: Domain Primitives](https://livebook.manning.com/book/secure-by-design/chapter-5/) (a full chapter of the article above) - -### Make illegal states unrepresentable - -Use Value Objects/Domain Primitives and Types ([Algebraic Data Types (ADT)](https://en.wikipedia.org/wiki/Algebraic_data_type)) to make illegal states impossible to represent in your program. - -Some people recommend using objects for every value: - -Quote from [John A De Goes](https://twitter.com/jdegoes): - -> Making illegal states unrepresentable is all about statically proving that all runtime values (without exception) correspond to valid objects in the business domain. The effect of this technique on eliminating meaningless runtime states is astounding and cannot be overstated. 
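
One lightweight way to do this in TypeScript is a "branded" (nominal) type alias, as mentioned earlier in the domain primitives discussion. A rough, illustrative sketch (not code from this repository):

```typescript
// A branded type alias: the brand exists only at compile time,
// at runtime the value is still a plain string.
type Email = string & { readonly __brand: 'Email' };

function emailFromString(raw: string): Email {
  // Hypothetical minimal check before granting the brand.
  if (!raw.includes('@')) {
    throw new Error('Invalid email');
  }
  return raw as Email;
}

function sendWelcome(to: Email): void {
  console.log(`Sending welcome message to ${to}`);
}

sendWelcome(emailFromString('john@gmail.com')); // ok
// sendWelcome('raw string'); // <- compile-time error: not branded as Email
```

This gives a compile-time distinction without the runtime overhead of a full Value Object class.
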
- -Let's distinguish two types of protection from illegal states: at **compile time** and at **runtime**. - -#### Validation at compile time - -Types give useful semantic information to a developer. Good code should be easy to use correctly, and hard to use incorrectly. Types system can be a good help for that. It can prevent some nasty errors at compile time, so IDE will show type errors right away. - -The simplest example may be using enums instead of constants, and use those enums as input type for something. When passing anything that is not intended the IDE will show a type error: - -```typescript -export enum UserRoles { - admin = 'admin', - moderator = 'moderator', - guest = 'guest', -} - -const userRole: UserRoles = 'some string'; // <-- error -``` - -Or, for example, imagine that business logic requires to have contact info of a person by either having `email`, or `phone`, or both. Both `email` and `phone` could be represented as optional, for example: - -```typescript -interface ContactInfo { - email?: Email; - phone?: Phone; -} -``` - -But what happens if both are not provided by a programmer? Business rule violated. Illegal state allowed. - -Solution: this could be presented as a [union type](https://www.typescriptlang.org/docs/handbook/unions-and-intersections.html#union-types) - -```typescript -type ContactInfo = Email | Phone | [Email, Phone]; -``` - -Now only either `Email`, or `Phone`, or both must be provided. If nothing is provided, the IDE will show a type error right away. Now business rule validation is moved from runtime to **compile time**, which makes the application more secure and gives a faster feedback when something is not used as intended. - -This is called a _typestate pattern_. - -> The typestate pattern is an API design pattern that encodes information about an object’s run-time state in its compile-time type. - -Read more: - -- [Making illegal states unrepresentable](https://v5.chriskrycho.com/journal/making-illegal-states-unrepresentable-in-ts/) -- [Typestates Would Have Saved the Roman Republic](https://blog.yoavlavi.com/state-machines-would-have-saved-the-roman-republic/) -- [The Typestate Pattern](https://cliffle.com/blog/rust-typestate/) -- [Make illegal states unrepresentable — but how? The Typestate Pattern in Erlang](https://erszcz.medium.com/make-illegal-states-unrepresentable-but-how-the-typestate-pattern-in-erlang-16b37b090d9d) - -#### Validation at runtime - -Data should not be trusted. There are a lot of cases when invalid data may end up in a domain. For example, if data comes from external API, database, or if it's just a programmer error. - -Things that can't be validated at compile time (like user input) are validated at runtime. - -First line of defense is validation of user input DTOs. - -Second line of defense are Domain Objects. Entities and value objects have to protect their invariants. Having some validation rules here will protect their state from corruption. You can use techniques like [Design by contract](https://en.wikipedia.org/wiki/Design_by_contract) by defining preconditions in object constructors and checking postconditions and invariants before saving an object to the database. - -Enforcing self-validation of your domain objects will inform immediately when data is corrupted. Not validating domain objects allows them to be in an incorrect state, this leads to problems. 
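
As a rough illustration of such runtime self-validation (a hypothetical class, not taken from this repository):

```typescript
// Illustrative runtime self-validation in a domain object (hypothetical names).
class ArgumentInvalidException extends Error {}

export class Quantity {
  private readonly _value: number;

  constructor(value: number) {
    // Precondition (design by contract): fail fast if the invariant is violated.
    if (!Number.isInteger(value) || value <= 0) {
      throw new ArgumentInvalidException('Quantity must be a positive integer');
    }
    this._value = value;
  }

  get value(): number {
    return this._value;
  }
}
```

Once an instance exists, the rest of the code can rely on the invariant instead of re-checking it everywhere.
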
- -By combining compile and runtime validations, using objects instead of primitives, enforcing self-validation and invariants of your domain objects, using Design by contract, [Algebraic Data Types (ADT)](https://en.wikipedia.org/wiki/Algebraic_data_type) and typestate pattern, and other similar techniques, you can achieve an architecture where it's hard, or even impossible, to end up in illegal states, thus improving security and robustness of your application dramatically (at a cost of extra boilerplate code). - -**Recommended to read**: - -- [Backend Best Practices: Data Validation](https://github.com/Sairyss/backend-best-practices#data-validation) - -### Guarding vs validating - -You may have noticed that we do validation in multiple places: - -1. First when user input is sent to our application. In our example we use DTO decorators: [create-user.request-dto.ts](src/modules/user/commands/create-user/create-user.request.dto.ts). -2. Second time in domain objects, for example: [address.value-object.ts](src/modules/user/domain/value-objects/address.value-object.ts). - -So, why are we validating things twice? Let's call a second validation "_guarding_", and distinguish between guarding and validating: - -- Guarding is a failsafe mechanism. Domain layer views it as invariants to comply with always-valid domain model. -- Validation is a filtration mechanism. Outside layers view them as input validation rules. - -> This difference leads to different treatment of violations of these business rules. An invariant violation in the domain model is an exceptional situation and should be met with throwing an exception. On the other hand, there’s nothing exceptional in external input being incorrect. - -The input coming from the outside world should be filtered out before passing it further to the domain model. It’s the first line of defense against data inconsistency. At this stage, any incorrect data is denied with corresponding error messages. -Once the filtration has confirmed that the incoming data is valid it's passed to a domain. When the data enters the always-valid domain boundary, it's assumed to be valid and any violation of this assumption means that you’ve introduced a bug. -Guards help to reveal those bugs. They are the failsafe mechanism, the last line of defense that ensures data in the always-valid boundary is indeed valid. Guards comply with the [Fail Fast principle](https://enterprisecraftsmanship.com/posts/fail-fast-principle) by throwing runtime exceptions. - -Domain classes should always guard themselves against becoming invalid. - -For preventing null/undefined values, empty objects and arrays, incorrect input length etc. a library of [guards]() can be created. - -Example file: [guard.ts](src/libs/guard.ts) - -**Keep in mind** that not all validations/guarding can be done in a single domain object, it should validate only rules shared by all contexts. There are cases when validation may be different depending on a context, or one field may involve another field, or even a different entity. Handle those cases accordingly. - -Read more: - -- [Refactoring: Guard Clauses](https://medium.com/better-programming/refactoring-guard-clauses-2ceeaa1a9da) -- [Always-Valid Domain Model](https://enterprisecraftsmanship.com/posts/always-valid-domain-model/) - -
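
To give a feel for what such a guard library can look like, here is a minimal, hypothetical sketch (not the actual [guard.ts](src/libs/guard.ts) implementation):

```typescript
// Minimal, illustrative guard helpers (hypothetical; see src/libs/guard.ts for the real one).
export class Guard {
  /** True if the value is null, undefined, an empty string or an empty array. */
  static isEmpty(value: unknown): boolean {
    if (value === null || value === undefined) return true;
    if (typeof value === 'string' || Array.isArray(value)) return value.length === 0;
    return false;
  }

  /** True if a string or array length falls within the inclusive range. */
  static lengthIsBetween(
    value: string | unknown[],
    min: number,
    max: number,
  ): boolean {
    return value.length >= min && value.length <= max;
  }
}
```

Domain objects can call such helpers in constructors and update methods and throw when a check fails, which keeps the always-valid boundary intact.
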
-Note: Using validation library instead of custom guards - -Instead of using custom _guards_ you could use an external validation library, but it's not a good practice to tie domain to external libraries and is not usually recommended. - -Although exceptions can be made if needed, especially for very specific validation libraries that validate only one thing (like specific IDs, for example bitcoin wallet address). Tying only one or just few `Value Objects` to such a specific library won't cause any harm. Unlike general purpose validation libraries which will be tied to domain everywhere, and it will be troublesome to change it in every `Value Object` in case when old library is no longer maintained, contains critical bugs or is compromised by hackers etc. - -Though, it's fine to do full sanity checks using validation framework or library **outside** the domain (for example [class-validator](https://www.npmjs.com/package/class-validator) decorators in `DTOs`), and do only some basic checks (guarding) inside of domain objects (besides business rules), like checking for `null` or `undefined`, checking length, matching against simple regexp etc. to check if value makes sense and for extra security. - -
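
For example, boundary validation with [class-validator](https://www.npmjs.com/package/class-validator) decorators on a request DTO might look roughly like this (the DTO shape is illustrative; only the decorators come from the library):

```typescript
import { IsEmail, IsString, MaxLength, MinLength } from 'class-validator';

// Hypothetical request DTO validated at the application boundary,
// while the domain objects behind it keep only basic guards.
export class CreateUserRequestDto {
  @IsEmail()
  @MaxLength(320)
  readonly email: string;

  @IsString()
  @MinLength(2)
  @MaxLength(50)
  readonly country: string;
}
```
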
-Note about using regexp - -Be careful with custom regexp validations for things like validating `email`, only use custom regexp for some very simple rules and, if possible, let validation library do its job on more difficult ones to avoid problems in case your regexp is not good enough. - -Also, keep in mind that custom regexp that does same type of validation that is already done by validation library outside of domain may create conflicts between your regexp and the one used by a validation library. - -For example, value can be accepted as valid by a validation library, but `Value Object` may throw an error because custom regexp is not good enough (validating `email` is more complex than just copy - pasting a regular expression found in google. Though, it can be validated by a simple rule that is true all the time and won't cause any conflicts, like every `email` must contain an `@`). Try finding and validating only patterns that won't cause conflicts. - ---- - -
- -Although there are other strategies on how to do validation inside domain, like passing validation schema as a dependency when creating new `Value Object`, but this creates extra complexity. - -Either to use external library/framework for validation inside domain or not is a tradeoff, analyze all the pros and cons and choose what is more appropriate for current application. - -For some projects, especially smaller ones, it might be easier and more appropriate to just use validation library/framework. - -
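
For completeness, the "validation schema as a dependency" alternative mentioned above could look roughly like this (purely illustrative; it mainly shows the extra ceremony involved):

```typescript
// Illustrative only: passing a validation function/schema into a Value Object.
type Schema<T> = (value: T) => boolean;

class PhoneNumber {
  constructor(public readonly value: string, isValid: Schema<string>) {
    if (!isValid(value)) {
      throw new Error('Phone number is invalid');
    }
  }
}

// The caller has to supply the schema every time, which is the extra complexity
// mentioned above. A real project might inject a zod or joi schema instead.
const e164: Schema<string> = (v) => /^\+[1-9]\d{1,14}$/.test(v);
const phone = new PhoneNumber('+14155552671', e164);
```
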
-
-## Domain Errors
-
-The application's core and domain layers shouldn't throw HTTP exceptions or statuses, because they shouldn't know in what context they are used: the same logic can be invoked by an HTTP controller, a microservice event handler, a command line interface etc. A better approach is to create custom error classes with appropriate error codes.
-
-Exceptions are for exceptional situations. Complex domains usually have a lot of errors that are not exceptional, but part of the business logic (like "seat already booked, choose another one"). Those errors may need special handling. In those cases returning explicit error types can be a better approach than throwing.
-
-Returning an error instead of throwing makes the type of every error a method can return explicit, so you can handle each of them accordingly. It can make error handling and tracing easier.
-
-To help with that you can create an [Algebraic Data Type (ADT)](https://en.wikipedia.org/wiki/Algebraic_data_type) for your errors and use some kind of Result object type with a Success or a Failure condition (a [monad]() like [Either](https://typelevel.org/cats/datatypes/either.html) from functional languages like Haskell or Scala). Unlike throwing exceptions, this approach allows defining types (ADTs) for every error, lets you see and handle them explicitly instead of using `try/catch`, and avoids throwing exceptions that are invisible at compile time. For example:
-
-```typescript
-// User errors:
-class UserError extends Error {
-  /* ... */
-}
-
-class UserAlreadyExistsError extends UserError {
-  /* ... */
-}
-
-class IncorrectUserAddressError extends UserError {
-  /* ... */
-}
-
-// ... other user errors
-```
-
-```typescript
-// Sum type for user errors
-type CreateUserError = UserAlreadyExistsError | IncorrectUserAddressError;
-
-async function createUser(
-  command: CreateUserCommand,
-): Promise<Result<UserEntity, CreateUserError>> {
-  // ^ explicitly showing what the function returns
-  if (await userRepo.exists(command.email)) {
-    return Err(new UserAlreadyExistsError()); // <- returning an Error
-  }
-  if (!validate(command.address)) {
-    return Err(new IncorrectUserAddressError());
-  }
-  // else
-  const user = UserEntity.create(command);
-  await userRepo.save(user);
-  return Ok(user);
-}
-```
-
-This approach gives us a fixed set of expected error types, so we can decide what to do with each:
-
-```typescript
-/* in HTTP context we want to convert each error to an
-error with a corresponding HTTP status code: 409, 400 or 500 */
-const result = await this.commandBus.execute(command);
-return match(result, {
-  Ok: (id: string) => new IdResponse(id),
-  Err: (error: Error) => {
-    if (error instanceof UserAlreadyExistsError)
-      throw new ConflictHttpException(error.message);
-    if (error instanceof IncorrectUserAddressError)
-      throw new BadRequestException(error.message);
-    throw error;
-  },
-});
-```
-
-Throwing makes errors invisible to the consumer of your functions/methods (until those errors happen at runtime, or until you dig deeply into the source code and find them). This means those errors are less likely to be handled properly.
-
-Returning errors instead of throwing them adds some extra boilerplate code, but can make your application robust and secure since errors are now explicitly documented and visible as return types. You decide what to do with each error: propagate it further, transform it, add extra metadata, or try to recover from it (for example, by retrying the operation).
-
-**Note**: Distinguish between Domain Errors and Exceptions.
Exceptions are usually thrown and not returned. If you return technical Exceptions (like connection failed, process out of memory, etc.), It may cause some security issues and goes against [Fail-fast](https://en.wikipedia.org/wiki/Fail-fast) principle. Instead of terminating a program flow, returning an exception continues program execution and allows it to run in an incorrect state, which may lead to more unexpected errors, so it's generally better to throw an Exception in those cases rather than returning it. - -Libraries you can use: - -- [oxide.ts](https://www.npmjs.com/package/oxide.ts) - this is a nice npm package if you want to use a Result object -- [@badrap/result](https://www.npmjs.com/package/@badrap/result) - alternative - -Example files: - -- [user.errors.ts](src/modules/user/domain/user.errors.ts) - user errors -- [create-user.service.ts](src/modules/user/commands/create-user/create-user.service.ts) - notice how `Err(new UserAlreadyExistsError())` is returned instead of throwing it. -- [create-user.http.controller.ts](src/modules/user/commands/create-user/create-user.http.controller.ts) - in a user http controller we match an error and decide what to do with it. If an error is `UserAlreadyExistsError` we throw a `Conflict Exception` which a user will receive as `409 - Conflict`. If an error is unknown we just throw it and our framework will return it to the user as `500 - Internal Server Error`. -- [create-user.cli.controller.ts](src/modules/user/commands/create-user/create-user.cli.controller.ts) - in a CLI controller we don't care about returning a correct status code so we just `.unwrap()` a result, which will just throw in case of an error. -- [exceptions](src/libs/exceptions) folder contains some generic app exceptions (not domain specific) - -Read more: - -- [Flexible Error Handling w/ the Result Class](https://khalilstemmler.com/articles/enterprise-typescript-nodejs/handling-errors-result-class/) -- [Advanced error handling techniques](https://enterprisecraftsmanship.com/posts/advanced-error-handling-techniques/) -- ["Secure by Design" Chapter 9.2: Handling failures without exceptions](https://livebook.manning.com/book/secure-by-design/chapter-9/51) -- ["Functional Programming in Scala" Chapter 4. Handling errors without exceptions](https://livebook.manning.com/book/functional-programming-in-scala/chapter-4/) - -## Using libraries inside Application's core - -Whether to use libraries in application core and especially domain layer is a subject of a lot of debates. In real world, injecting every library instead of importing it directly is not always practical, so exceptions can be made for some single responsibility libraries that help to implement domain logic (like working with numbers). - -Main recommendations to keep in mind is that libraries imported in application's core **shouldn't** expose: - -- Functionality to access any out-of-process resources (http calls, database access etc); -- Functionality not relevant to domain (frameworks, technology details like ORMs, Logger etc.). -- Functionality that brings randomness (generating random IDs, timestamps etc.) since this makes tests unpredictable (though in TypeScript world it's not that big of a deal since this can be mocked by a test library without using DI); -- If a library changes often or has a lot of dependencies of its own it most likely shouldn't be used in domain layer. 
- -To use such libraries consider creating an `anti-corruption` layer by using [adapter](https://refactoring.guru/design-patterns/adapter) or [facade](https://refactoring.guru/design-patterns/facade) patterns. - -We sometimes tolerate libraries in the center, but be careful with general purpose libraries that may scatter across many domain objects. It will be hard to replace those libraries if needed. Tying only one or just a few domain objects to some single-responsibility library should be fine. It's way easier to replace a specific library that is tied to one or few objects than a general purpose library that is everywhere. - -In addition to different libraries there are Frameworks. Frameworks can be a real nuisance, because by definition they want to be in control, and it's hard to replace a Framework later when your entire application is glued to it. It's fine to use Frameworks in outside layers (like infrastructure), but keep your domain clean of them when possible. You should be able to extract your domain layer and build a new infrastructure around it using any other framework without breaking your business logic. - -NestJS does a good job, as it uses decorators which are not very intrusive, so you could use decorators like `@Inject()` without affecting your business logic at all, and it's relatively easy to remove or replace it when needed. Don't give up on frameworks completely, but keep them in boundaries and don't let them affect your business logic. - -Offload as much of irrelevant responsibilities as possible from the core, especially from domain layer. In addition, try to minimize usage of dependencies in general. More dependencies your software has means more potential errors and security holes. One technique for making software more robust is to minimize what your software depends on - the less that can go wrong, the less will go wrong. On the other hand, removing all dependencies would be counterproductive as replicating that functionality would require huge amount of work and would be less reliable than just using a popular, battle-tested library. Finding a good balance is important, this skill requires experience. - -Read more: - -- [Referencing external libs](https://khorikov.org/posts/2019-08-07-referencing-external-libs/). -- [Anti-corruption Layer — An effective Shield](https://medium.com/@malotor/anticorruption-layer-a-effective-shield-caa4d5ba548c) - ---- - -# Interface Adapters - -Interface adapters (also called driving/primary adapters) are user-facing interfaces that take input data from the user and repackage it in a form that is convenient for the use cases(services/command handlers) and entities. Then they take the output from those use cases and entities and repackage it in a form that is convenient for displaying it back for the user. User can be either a person using an application or another server. - -Contains `Controllers` and `Request`/`Response` DTOs (can also contain `Views`, like backend-generated HTML templates, if required). - -## Controllers - -- Controller is a user-facing API that is used for parsing requests, triggering business logic and presenting the result back to the client. -- One controller per use case is considered a good practice. -- In [NestJS](https://docs.nestjs.com/) world controllers may be a good place to use [OpenAPI/Swagger decorators](https://docs.nestjs.com/openapi/operations) for documentation. - -One controller per trigger type can be used to have a clearer separation. 
For example: - -- [create-user.http.controller.ts](src/modules/user/commands/create-user/create-user.http.controller.ts) for http requests ([NestJS Controllers](https://docs.nestjs.com/controllers)), -- [create-user.cli.controller.ts](src/modules/user/commands/create-user/create-user.cli.controller.ts) for command line interface access ([NestJS Console](https://www.npmjs.com/package/nestjs-console)) -- [create-user.message.controller.ts](src/modules/user/commands/create-user/create-user.message.controller.ts) for external messages ([NestJS Microservices](https://docs.nestjs.com/microservices/basics)). -- etc. - -### Resolvers - -If you are using [GraphQL](https://graphql.org/) instead of controllers, you will use [Resolvers](https://docs.nestjs.com/graphql/resolvers). - -One of the main benefits of a layered architecture is separation of concerns. As you can see, it doesn't matter if you use [REST](https://en.wikipedia.org/wiki/Representational_state_transfer) or GraphQL, the only thing that changes is user-facing API layer (interface-adapters). All the application Core stays the same since it doesn't depend on technology you are using. - -Example files: - -- [create-user.graphql-resolver.ts](src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts) - ---- - -## DTOs - -Data that comes from external applications should be represented by a special type of classes - Data Transfer Objects ([DTO](https://en.wikipedia.org/wiki/Data_transfer_object) for short). -Data Transfer Object is an object that carries data between processes. It defines a contract between your API and clients. - -### Request DTOs - -Input data sent by a user. - -- Using Request DTOs gives a contract that a client of your API has to follow to make a correct request. - -Examples: - -- [create-user.request.dto.ts](src/modules/user/commands/create-user/create-user.request.dto.ts) - -### Response DTOs - -Output data returned to a user. - -- Using Response DTOs ensures clients only receive data described in DTOs contract, not everything that your model/entity owns (which may result in data leaks). - -Examples: - -- [user.response.dto.ts](src/modules/user/dtos/user.response.dto.ts) - ---- - -DTO contracts protect your clients from internal data structure changes that may happen in your API. When internal data models change (like renaming variables or splitting tables), they can still be mapped to match a corresponding DTO to maintain compatibility for anyone using your API. - -When updating DTO interfaces, a new version of API can be created by prefixing an endpoint with a version number, for example: `v2/users`. This will make transition painless by preventing breaking compatibility for users that are slow to update their apps that uses your API. - -You may have noticed that our [create-user.command.ts](src/modules/user/commands/create-user/create-user.command.ts) contains the same properties as [create-user.request.dto.ts](src/modules/user/commands/create-user/create-user.request.dto.ts). -So why do we need DTOs if we already have Command objects that carry properties? Shouldn't we just have one class to avoid duplication? - -> Because commands and DTOs are different things, they tackle different problems. Commands are serializable method calls - calls of the methods in the domain model. Whereas DTOs are the data contracts. The main reason to introduce this separate layer with data contracts is to provide backward compatibility for the clients of your API. 
Without the DTOs, the API will have breaking changes with every modification of the domain model. - -More info on this subject here: [Are CQRS commands part of the domain model?](https://enterprisecraftsmanship.com/posts/cqrs-commands-part-domain-model/) (read "_Commands vs DTOs_" section). - -### Additional recommendations - -- DTOs should be data-oriented, not object-oriented. Its properties should be mostly primitives. We are not modeling anything here, just sending flat data around. -- When returning a `Response` prefer _whitelisting_ properties over _blacklisting_. This ensures that no sensitive data will leak in case if programmer forgets to blacklist newly added properties that shouldn't be returned to the user. -- If you use the same DTOs in multiple apps (frontend and backend, or between microservices), you can keep them somewhere in a shared directory instead of module directory and create a git submodule or a separate package for sharing them. -- `Request`/`Response` DTO classes may be a good place to use validation and sanitization decorators like [class-validator](https://www.npmjs.com/package/class-validator) and [class-sanitizer](https://www.npmjs.com/package/class-sanitizer) (make sure that all validation errors are gathered first and only then return them to the user, this is called [Notification pattern](https://martinfowler.com/eaaDev/Notification.html). Class-validator does this by default). -- `Request`/`Response` DTO classes may also be a good place to use Swagger/OpenAPI library decorators that [NestJS provides](https://docs.nestjs.com/openapi/types-and-parameters). -- If DTO decorators for validation/documentation are not used, DTO can be just an interface instead of a class. -- Data can be transformed to DTO format using a separate mapper or right in the constructor of a DTO class. - -### Local DTOs - -Another thing that can be seen in some projects is local DTOs. Some people prefer to never use domain objects (like entities) outside its domain (in `controllers`, for example) and return a plain DTO object instead. This project doesn't use this technique, to avoid extra complexity and boilerplate code like interfaces and data mapping. - -[Here](https://martinfowler.com/bliki/LocalDTO.html) are Martin Fowler's thoughts on local DTOs, in short (quote): - -> Some people argue for them (DTOs) as part of a Service Layer API because they ensure that service layer clients aren't dependent upon an underlying Domain Model. While that may be handy, I don't think it's worth the cost of all of that data mapping. - -Though you may want to introduce Local DTOs when you need to decouple modules properly. For example, when querying from one module to another you don't want to leak your entities between modules. In that case using a Local DTO may be justified. - ---- - -# Infrastructure layer - -The Infrastructure layer is responsible for encapsulating technology. You can find there the implementations of database repositories for storing/retrieving business entities, message brokers to emit messages/events, I/O services to access external resources, framework related code and any other code that represents a replaceable detail for the architecture. - -It's the most volatile layer. Since the things in this layer are so likely to change, they are kept as far away as possible from the more stable domain layers. Because they are kept separate, it's relatively easy to make changes or swap one component for another. 
- -Infrastructure layer can contain `Adapters`, database related files like `Repositories`, `ORM entities`/`Schemas`, framework related files etc. - -## Adapters - -- Infrastructure adapters (also called driven/secondary adapters) enable a software system to interact with external systems by receiving, storing and providing data when requested (like persistence, message brokers, sending emails or messages, requesting 3rd party APIs etc). -- Adapters also can be used to interact with different domains inside single process to avoid coupling between those domains. -- Adapters are essentially an implementation of ports. They are not supposed to be called directly in any point in code, only through ports(interfaces). -- Adapters can be used as Anti-Corruption Layer (ACL) for legacy code. - -Read more on ACL: [Anti-Corruption Layer: How to Keep Legacy Support from Breaking New Systems](https://www.cloudbees.com/blog/anti-corruption-layer-how-keep-legacy-support-breaking-new-systems) - -Adapters should have: - -- a `port` somewhere in application/domain layer that it implements; -- a mapper that maps data **from** and **to** domain (if it's needed); -- a DTO/interface for received data; -- a validator to make sure incoming data is not corrupted (validation can reside in DTO class using decorators, or it can be validated by `Value Objects`). - -## Repositories - -Repositories are abstractions over collections of entities that are living in a database. -They centralize common data access functionality and encapsulate the logic required to access that data. Entities/aggregates can be put into a repository and then retrieved at a later time without domain even knowing where data is saved: in a database, in a file, or some other source. - -We use repositories to decouple the infrastructure or technology used to access databases from the domain model layer. - -Martin Fowler describes a repository as follows: - -> A repository performs the tasks of an intermediary between the domain model layers and data mapping, acting similarly to a set of domain objects in memory. Client objects declaratively build queries and send them to the repositories for answers. Conceptually, a repository encapsulates a set of objects stored in the database and operations that can be performed on them, providing a way that is closer to the persistence layer. Repositories, also, support the purpose of separating, clearly and in one direction, the dependency between the work domain and the data allocation or mapping. - -The data flow here looks something like this: repository receives a domain `Entity` from application service, maps it to database schema/ORM format, does required operations (saving/updating/retrieving etc), then maps it back to domain `Entity` format and returns it back to service. - -Application's core usually is not allowed to depend on repositories directly, instead it depends on abstractions (ports/interfaces). This makes data retrieval technology-agnostic. - -**Note**: in theory, most publications out there recommend abstracting a database with interfaces. In practice, it's not always useful. Most of the projects out there never change database technology (or rewrite most of the code anyway if they do). Another downside is that if you abstract a database you are more likely not using its full potential. This project abstracts repositories with a generic port to make a practical example [repository.port.ts](src/libs/ddd/repository.port.ts), but this doesn't mean you should do that too. 
Think carefully before using abstractions. More info on this topic: [Should you Abstract the Database?](https://enterprisecraftsmanship.com/posts/should-you-abstract-database/)

Example files:

This project contains an abstract repository class that provides basic CRUD operations: [sql-repository.base.ts](src/libs/db/sql-repository.base.ts). This base class is then extended by a specific repository, and all specific operations that an entity may need are implemented in that specific repo: [user.repository.ts](src/modules/user/database/user.repository.ts).

Read more:

- [Design the infrastructure persistence layer](https://docs.microsoft.com/en-us/dotnet/architecture/microservices/microservice-ddd-cqrs-patterns/infrastructure-persistence-layer-design)
- [Should you use the Repository Pattern? With CQRS, Yes and No!](https://codeopinion.com/should-you-use-the-repository-pattern-with-cqrs-yes-and-no/) - in a read model / query handlers it is not required to use a repository pattern.

## Persistence models

Using a single entity for both domain logic and database concerns leads to a database-centric architecture. In the DDD world the domain model and the persistence model should be separated.

Since domain `Entities` have their data modeled so that it best accommodates domain logic, it may not be in the best shape for saving in a database. For that purpose `Persistence models` can be created that have a shape better suited to the particular database in use. The domain layer should not know anything about persistence models, and it should not care.

There can be multiple models optimized for different purposes, for example:

- Domain with its own models - `Entities`, `Aggregates` and `Value Objects`.
- Persistence layer with its own models - ORM ([Object–relational mapping](https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping)), schemas, read/write models if databases are separated into a read and a write db ([CQRS](https://en.wikipedia.org/wiki/Command%E2%80%93query_separation)), etc.

Over time, as the amount of data grows, there may be a need to make changes to the database, like re-designing some tables to improve performance or data integrity, or even changing the database entirely. Without an explicit separation between `Domain` and `Persistence` models, any change to the database will lead to changes in your domain `Entities` or `Aggregates`. For example, when performing database [normalization](https://en.wikipedia.org/wiki/Database_normalization), data can spread across multiple tables rather than being in one table, or vice-versa for [denormalization](https://en.wikipedia.org/wiki/Denormalization). This may force a team to do a complete refactoring of the domain layer, which may cause unexpected bugs and challenges. Separating Domain and Persistence models prevents that.

**Note**: separating domain and persistence models may be overkill for smaller applications. It requires a lot of effort creating and maintaining boilerplate code like mappers and abstractions. Consider all pros and cons before making this decision.

Example files:

- [user.repository.ts](src/modules/user/database/user.repository.ts) <- notice the `userSchema` and `UserModel` type that describe how a user looks in the database
- [user.mapper.ts](src/modules/user/user.mapper.ts) <- persistence models should also have a corresponding mapper to map from domain to persistence and back.
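As a rough sketch of what such a separation can look like (the types and fields here are simplified illustrations, not the actual `userSchema`/`UserModel` from this project), a mapper translates between the domain entity and a flat persistence model:

```typescript
// Flat persistence model: mirrors the database table, carries no behavior.
interface UserModel {
  id: string;
  email: string;
  country: string;
  street: string;
}

// Simplified stand-in for a domain entity with a nested value object.
class UserEntity {
  constructor(
    readonly id: string,
    readonly email: string,
    readonly address: { country: string; street: string },
  ) {}
}

// The mapper keeps the translation in one place, so re-designing tables
// does not force changes in the domain model.
class UserMapper {
  toPersistence(entity: UserEntity): UserModel {
    return {
      id: entity.id,
      email: entity.email,
      country: entity.address.country,
      street: entity.address.street,
    };
  }

  toDomain(record: UserModel): UserEntity {
    return new UserEntity(record.id, record.email, {
      country: record.country,
      street: record.street,
    });
  }
}
```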
- -For smaller projects you could use [ORM](https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping) libraries like [Typeorm](https://typeorm.io/) for simplicity. But for projects with more complexity ORMs are not flexible and performant enough. For this reason, this project uses raw queries with a [Slonik](https://github.com/gajus/slonik) client library. - -Read more: - -- [Stack Overflow question: DDD - Persistence Model and Domain Model](https://stackoverflow.com/questions/14024912/ddd-persistence-model-and-domain-model) -- [Just Stop It! The Domain Model Is Not The Persistence Model](https://blog.sapiensworks.com/post/2012/04/07/Just-Stop-It!-The-Domain-Model-Is-Not-The-Persistence-Model.aspx) -- [Comparing SQL, query builders, and ORMs](https://www.prisma.io/dataguide/types/relational/comparing-sql-query-builders-and-orms) -- [Secure by Design: Chapter 6.2.2 ORM frameworks and no-arg constructors](https://livebook.manning.com/book/secure-by-design/chapter-6/40) - -## Other things that can be a part of Infrastructure layer - -- Framework related files; -- Application logger implementation; -- Infrastructure related events ([Nest-event](https://www.npmjs.com/package/nest-event)) -- Periodic cron jobs or tasks launchers ([NestJS Schedule](https://docs.nestjs.com/techniques/task-scheduling)); -- Other technology related files. - ---- - -# Other recommendations - -## General recommendations on architectures, best practices, design patterns and principles - -Different projects most likely will have different requirements. Some principles/patterns in such projects can be implemented in a simplified form, some can be skipped. Follow [YAGNI](https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) principle and don't overengineer. - -Sometimes complex architecture and principles like [SOLID](https://en.wikipedia.org/wiki/SOLID) can be incompatible with [YAGNI](https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) and [KISS](https://en.wikipedia.org/wiki/KISS_principle). A good programmer should be pragmatic and has to be able to combine his skills and knowledge with a common sense to choose the best solution for the problem. - -> You need some experience with object-oriented software development in real world projects before they are of any use to you. Furthermore, they don’t tell you when you have found a good solution and when you went too far. Going too far means that you are outside the “scope” of a principle and the expected advantages don’t appear. -> Principles, Heuristics, ‘laws of engineering’ are like hint signs, they are helpful when you know where they are pointing to and you know when you have gone too far. Applying them requires experience, that is trying things out, failing, analyzing, talking to people, failing again, fixing, learning and failing some more. There is no shortcut as far as I know. - -**Before implementing any pattern always analyze if benefit given by using it worth extra code complexity**. - -> Effective design argues that we need to know the price of a pattern is worth paying - that's its own skill. - -Don't blindly follow practices, patterns and architectures just because books and articles say so. Sometimes rewriting a software from scratch is the best solution, and all your efforts to fit in all the patterns and architectural styles you know into the project will be a waste of time. Try to evaluate the cost and benefit of every pattern you implement and avoid overengineering. 
Remember that architectures, patterns and principles are tools that may be useful in certain situations, not dogmas that you have to follow blindly.

However, remember:

> It's easier to refactor over-design than it is to refactor no design.

Read more:

- [Which Software Architecture should you pick?](https://youtu.be/8B445kqSKwg)
- [SOLID Principles and the Arts of Finding the Beach](https://sebastiankuebeck.wordpress.com/2017/09/17/solid-principles-and-the-arts-of-finding-the-beach/)
- [Martin Fowler blog: Yagni](https://martinfowler.com/bliki/Yagni.html)
- [7 Software Development Principles That Should Be Embraced Daily](https://betterprogramming.pub/7-software-development-principles-that-should-be-embraced-daily-c26a94ec4ecc?gi=3b5b298ddc23)

## Recommendations for smaller APIs

Be careful when implementing any complex architecture in small to medium-sized projects without a lot of business logic. Some building blocks/patterns/principles may fit well, but others may be overengineering.

For example:

- Separating code into modules/layers/use-cases, using building blocks like controllers/services/entities, respecting boundaries, dependency injection, etc. may be a good idea for any project.
- But practices like creating an object for every primitive, using `Value Objects` to separate business logic into smaller classes, separating `Domain Models` from `Persistence Models`, etc. in projects that are more data-centric and have little or no business logic may only complicate such solutions and add extra boilerplate code, data mapping, and maintenance overhead without adding much benefit.

[DDD](https://en.wikipedia.org/wiki/Domain-driven_design) and the other practices described here are mostly about creating software with complex business logic. But what would be a better approach for simpler applications?

For applications without a lot of business logic, where code mostly exists as glue between the database and a client, consider other architectures. The most popular is probably [MVC](https://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller). _Model-View-Controller_ is better suited for [CRUD](https://en.wikipedia.org/wiki/Create,_read,_update_and_delete) applications with little business logic, since it tends to favor designs where the software is mostly a view of the database.

[Full-stack application example](https://github.com/Sairyss/full-stack-application-example) - an example of a simple CRUD full-stack application.

Additional resources:

- [Do you have enough Complexity for a Domain Model (Domain Driven Design)?](https://youtu.be/L1foFiqopIc)

## Behavioral Testing

Behavioral Testing (and also [BDD](https://en.wikipedia.org/wiki/Behavior-driven_development)) is testing of the external behavior of a program, also known as black-box testing.

Domain-Driven Design, with its ubiquitous language, plays nicely with behavioral tests.

For BDD tests, [Cucumber](https://cucumber.io/) with [Gherkin](https://cucumber.io/docs/gherkin/reference/) syntax can give structure and meaning to your tests. This way even people not involved in development can define the steps needed for testing. In the Node.js world, [cucumber](https://www.npmjs.com/package/@cucumber/cucumber) or [jest-cucumber](https://www.npmjs.com/package/jest-cucumber) are nice packages to achieve that.
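For example, with jest-cucumber a behavioral test binds Gherkin steps from a `.feature` file to test code roughly like this (the scenario and step texts are illustrative and must match the feature file; see the example files below):

```typescript
import { defineFeature, loadFeature } from 'jest-cucumber';

// Path is illustrative; it must point to a real .feature file.
const feature = loadFeature('tests/user/create-user/create-user.feature');

defineFeature(feature, (test) => {
  test('Creating a user', ({ given, when, then }) => {
    let status: number;

    given('user data is valid', () => {
      // prepare the request payload
    });

    when('I send a request to create a user', async () => {
      // call the API under test (e.g. via supertest); placeholder result here
      status = 201;
    });

    then('the user is created successfully', () => {
      expect(status).toBe(201);
    });
  });
});
```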
- -Example files: - -- [create-user.feature](tests/user/create-user/create-user.feature) - feature file that contains human-readable Gherkin steps -- [create-user.e2e-spec.ts](tests/user/create-user/create-user.e2e-spec.ts) - e2e / behavioral test - -Read more: - -- [Backend best practices - Testing](https://github.com/Sairyss/backend-best-practices#testing) - -## Folder and File Structure - -Some typical approaches are: - -- **Layered architecture**: split an entire application into directories divided by functionality, like `controllers`, `services`, `repositories`, etc. For example: - -```text -- Controllers - - UserController - - WalletController - - OtherControllers... -- Services - - UserService - - WalletService - - OtherServices... -- Repositories - - ... -``` - -This approach makes navigation harder. Every time you need to change some feature, instead of having all related files in the same place (in a module), you have to jump multiple directories to find all related files. This approach usually leads to tight coupling and spaghetti code. - -- **Divide application by modules** and split each module by some business domain: - -```text -- User - - UserController - - UserService - - UserRepository -- Wallet - - WalletController - - WalletService - - WalletRepository - ... -``` - -This looks better. With this approach each module is encapsulated and only contains its own business logic. The only downside is: over time those controllers and services can end up hundreds of lines long, making it difficult to navigate and merge conflicts harder to manage. - -- **Divide a module by subcomponents:** use modular approach discussed above and divide each module by slices and use cases. We divide a module further into smaller components: - -```text -- User - - CreateUser - - CreateUserController - - CreateUserService - - CreateUserDTO - - UpdateUser - - UpdateUserController - - UpdateUserService - - UpdateUserDTO - - UserRepository - - UserEntity -- Wallet - - CreateWallet - - CreateWalletController - - CreateWalletService - - CreateWalletDto - ... -``` - -This way each module is further split into highly cohesive subcomponents (by feature). Now when you open the project, instead of just seeing directories like `controllers`, `services`, `repositories`, etc. you can see right away what features application has from just reading directory names. - -This approach makes navigation and maintaining easier since all related files are close to each other. It also makes every feature properly encapsulated and gives you an ability to make localized decisions per component, based on each particular feature's needs. - -Shared files like domain objects (entities/aggregates), repositories, shared DTOs, interfaces, etc. can be stored outside of feature directory since they are usually reused by multiple subcomponents. - -This is called [The Common Closure Principle (CCP)](https://ericbackhage.net/clean-code/the-common-closure-principle/). Folder/file structure in this project uses this principle. Related files that usually change together (and are not used by anything else outside that component) are stored close together. - -> The aim here should be to be strategic and place classes that we, from experience, know often changes together into the same component. - -Keep in mind that this project's folder/file structure is an example and might not work for everyone. 
The main recommendations here are:

- Separate your application into modules;
- Keep files that change together close to each other (_Common Closure Principle_ and _Vertical Slicing_);
- Group files by the behavior that changes together, not by the type of functionality a file provides;
- Keep files that are reused by multiple components apart from feature directories (for example, in a shared directory);
- Respect boundaries in your code: keeping files together doesn't mean inner layers can import outer layers;
- Try to avoid deeply nested folders;
- [Move files around until it feels right](https://dev.to/dance2die/move-files-around-until-it-feels-right-2lek).

There are different approaches to file/folder structuring; choose whatever suits the project and personal preference best.

Examples:

- [user](src/modules/user) module.
- [create-user](src/modules/user/commands/create-user) subcomponent.

- [Commands](src/modules/user/commands) directory contains all state-changing use cases, and each use case inside it contains most of the things that it needs: controller, service, DTOs, command, etc.
- [Queries](src/modules/user/queries) directory is structured in the same way as commands but contains data retrieval use cases.

Read more:

- [Out with the Onion, in with Vertical Slices](https://medium.com/@jacobcunningham/out-with-the-onion-in-with-vertical-slices-c3edfdafe118)
- [[YouTube] Tired of Layers? Vertical Slice Architecture to the rescue!](https://youtu.be/lsddiYwWaOQ)
- [Vertical Slice Architecture](https://jimmybogard.com/vertical-slice-architecture/)
- [Why I don’t like layered architecture for microservices](https://garywoodfine.com/why-i-dont-like-layered-architecture-for-microservices/) - explains in more detail what disadvantages a typical horizontal layered architecture has compared to modular / vertical slice architectures.

### File names

Consider giving files a descriptive type name after a dot "`.`", like `*.service.ts` or `*.entity.ts`. This makes it easier to tell what each file does and to find files using [fuzzy search](https://en.wikipedia.org/wiki/Approximate_string_matching) (`CTRL+P` for Windows/Linux and `⌘+P` for macOS in VSCode to try it out).

Alternatively, you could use class names as file names, but consider adding descriptive suffixes like `Service` or `Controller`, etc.

Read more:

- [Angular Style Guides: Separate file names with dots and dashes](https://angular.io/guide/styleguide#separate-file-names-with-dots-and-dashes).

## Enforcing architecture

To make sure everyone on the team adheres to the defined architectural practices, use tools and libraries that can analyze and validate dependencies between files and layers.

For example:

```typescript
  // Dependency cruiser example
  {
    name: 'no-domain-deps',
    comment: 'Domain layer cannot depend on api or database layers',
    severity: 'error',
    from: { path: ['domain', 'entity', 'aggregate', 'value-object'] },
    to: { path: ['api', 'controller', 'dtos', 'database', 'repository'] },
  },
```

The snippet above prevents the domain layer from depending on the API or database layers. Example config: [.dependency-cruiser.js](.dependency-cruiser.js)

You can also generate graphs like this:
Dependency graph: ![Dependency graph](assets/dependency-graph.svg)
- -Example tools: - -- [Dependency cruiser](https://github.com/sverweij/dependency-cruiser) - Validate and visualize dependencies for JavaScript / TypeScript. -- [ArchUnit](https://www.archunit.org/) - library for checking the architecture of Java applications - -Read more: - -- [Validate Dependencies According to Clean Architecture](https://betterprogramming.pub/validate-dependencies-according-to-clean-architecture-743077ea084c) -- [Clean Architecture Boundaries with Spring Boot and ArchUnit](https://reflectoring.io/java-components-clean-boundaries/) - -## Prevent massive inheritance chains - -Classes that can be extended should be designed for extensibility and usually should be `abstract`. If class is not designed to be extended, prevent extending it by making class `final`. Don't create inheritance more than 1-2 levels deep since this makes refactoring harder and leads to a bad design. You can use [composition](https://en.wikipedia.org/wiki/Composition_over_inheritance) instead. - -**Note**: in TypeScript, unlike other languages, there is no default way to make class `final`. But there is a way around it using a custom decorator. - -Example file: [final.decorator.ts](src/libs/decorators/final.decorator.ts) - -Read more: - -- [When to declare classes final](https://ocramius.github.io/blog/when-to-declare-classes-final/) -- [Final classes by default, why?](https://matthiasnoback.nl/2018/09/final-classes-by-default-why/) -- [Prefer Composition Over Inheritance](https://medium.com/better-programming/prefer-composition-over-inheritance-1602d5149ea1) - ---- - -# Additional resources - -- [Backend best practices](https://github.com/Sairyss/backend-best-practices) - more best practices that are used here -- [Full-stack application example](https://github.com/Sairyss/full-stack-application-example) - architecture example of a simple full stack application - -## Articles - -- [DDD, Hexagonal, Onion, Clean, CQRS, … How I put it all together](https://herbertograca.com/2017/11/16/explicit-architecture-01-ddd-hexagonal-onion-clean-cqrs-how-i-put-it-all-together) -- [Hexagonal Architecture](https://www.qwan.eu/2020/08/20/hexagonal-architecture.html) -- [Clean architecture series](https://medium.com/@pereiren/clean-architecture-series-part-1-f34ef6b04b62) -- [Clean architecture for the rest of us](https://pusher.com/tutorials/clean-architecture-introduction) -- [An illustrated guide to 12 Factor Apps](https://www.redhat.com/architect/12-factor-app) - -## Websites - -- [The Twelve-Factor App](https://12factor.net/) -- [Refactoring guru - Catalog of Design Patterns](https://refactoring.guru/design-patterns/catalog) - -## Blogs - -- [Vladimir Khorikov](https://enterprisecraftsmanship.com/) -- [Derek Comartin](https://codeopinion.com/) -- [Kamil Grzybek](https://www.kamilgrzybek.com/) -- [Martin Fowler](https://martinfowler.com/) -- [Khalil Stemmler](https://khalilstemmler.com) -- [Herberto Graca](https://herbertograca.com/) - -## Videos - -- [More Testable Code with the Hexagonal Architecture](https://youtu.be/ujb_O6myknY) -- [Playlist: Design Patterns Video Tutorial](https://youtube.com/playlist?list=PLF206E906175C7E07) -- [Playlist: Design Patterns in Object Oriented Programming](https://youtube.com/playlist?list=PLrhzvIcii6GNjpARdnO4ueTUAVR9eMBpc) -- [Herberto Graca - Making architecture explicit](https://www.youtube.com/watch?v=_yoZN9Sb3PM&feature=youtu.be) - -## Books - -- ["Domain-Driven Design: Tackling Complexity in the Heart of 
Software"](https://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215) by Eric Evans -- ["Secure by Design"](https://www.manning.com/books/secure-by-design) by Dan Bergh Johnsson, Daniel Deogun, Daniel Sawano -- ["Implementing Domain-Driven Design"](https://www.amazon.com/Implementing-Domain-Driven-Design-Vaughn-Vernon/dp/0321834577) by Vaughn Vernon -- ["Clean Architecture: A Craftsman's Guide to Software Structure and Design"](https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164/ref=sr_1_1?dchild=1&keywords=clean+architecture&qid=1605343702&s=books&sr=1-1) by Robert Martin diff --git a/docs/more/ddd/assets/dependency-graph.svg b/docs/more/ddd/assets/dependency-graph.svg deleted file mode 100644 index 48c23e1..0000000 --- a/docs/more/ddd/assets/dependency-graph.svg +++ /dev/null @@ -1,2033 +0,0 @@ - - - - - - -dependency-cruiser output - - -cluster_src - -src - - -cluster_src/configs - -configs - - -cluster_src/libs - -libs - - -cluster_src/libs/api - -api - - -cluster_src/libs/api/graphql - -graphql - - -cluster_src/libs/application - -application - - -cluster_src/libs/application/context - -context - - -cluster_src/libs/application/interceptors - -interceptors - - -cluster_src/libs/db - -db - - -cluster_src/libs/ddd - -ddd - - -cluster_src/libs/decorators - -decorators - - -cluster_src/libs/exceptions - -exceptions - - -cluster_src/libs/ports - -ports - - -cluster_src/libs/types - -types - - -cluster_src/libs/utils - -utils - - -cluster_src/modules - -modules - - -cluster_src/modules/user - -user - - -cluster_src/modules/user/commands - -commands - - -cluster_src/modules/user/commands/create-user - -create-user - - -cluster_src/modules/user/commands/create-user/graphql-example - -graphql-example - - -cluster_src/modules/user/commands/create-user/graphql-example/dtos - -dtos - - -cluster_src/modules/user/commands/delete-user - -delete-user - - -cluster_src/modules/user/database - -database - - -cluster_src/modules/user/domain - -domain - - -cluster_src/modules/user/domain/events - -events - - -cluster_src/modules/user/domain/value-objects - -value-objects - - -cluster_src/modules/user/dtos - -dtos - - -cluster_src/modules/user/dtos/graphql - -graphql - - -cluster_src/modules/user/queries - -queries - - -cluster_src/modules/user/queries/find-users - -find-users - - -cluster_src/modules/wallet - -wallet - - -cluster_src/modules/wallet/application - -application - - -cluster_src/modules/wallet/application/event-handlers - -event-handlers - - -cluster_src/modules/wallet/database - -database - - -cluster_src/modules/wallet/domain - -domain - - -cluster_src/modules/wallet/domain/events - -events - - - -src/app.module.ts - - -app.module.ts - - - - - -src/configs/database.config.ts - - -database.config.ts - - - - - -src/app.module.ts->src/configs/database.config.ts - - - - - -src/libs/application/context/ContextInterceptor.ts - - -ContextInterceptor.ts - - - - - -src/app.module.ts->src/libs/application/context/ContextInterceptor.ts - - - - - -src/libs/application/interceptors/exception.interceptor.ts - - -exception.interceptor.ts - - - - - -src/app.module.ts->src/libs/application/interceptors/exception.interceptor.ts - - - - - -src/modules/user/user.module.ts - - -user.module.ts - - - - - -src/app.module.ts->src/modules/user/user.module.ts - - - - - -src/modules/wallet/wallet.module.ts - - -wallet.module.ts - - - - - -src/app.module.ts->src/modules/wallet/wallet.module.ts - - - - - -src/libs/utils/dotenv.ts - - -dotenv.ts - - - - - 
-src/configs/database.config.ts->src/libs/utils/dotenv.ts - - - - - -src/libs/application/context/AppRequestContext.ts - - -AppRequestContext.ts - - - - - -src/libs/application/context/ContextInterceptor.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/api/api-error.response.ts - - -api-error.response.ts - - - - - -src/libs/application/interceptors/exception.interceptor.ts->src/libs/api/api-error.response.ts - - - - - -src/libs/application/interceptors/exception.interceptor.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/exceptions/index.ts - - -index.ts - - - - - -src/libs/application/interceptors/exception.interceptor.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/commands/create-user/create-user.cli.controller.ts - - -create-user.cli.controller.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/create-user/create-user.cli.controller.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts - - -create-user.http.controller.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/create-user/create-user.http.controller.ts - - - - - -src/modules/user/commands/create-user/create-user.message.controller.ts - - -create-user.message.controller.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/create-user/create-user.message.controller.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts - - -create-user.service.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/create-user/create-user.service.ts - - - - - -src/modules/user/user.di-tokens.ts - - -user.di-tokens.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/user.di-tokens.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts - - -create-user.graphql-resolver.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts - - - - - -src/modules/user/commands/delete-user/delete-user.http-controller.ts - - -delete-user.http-controller.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/delete-user/delete-user.http-controller.ts - - - - - -src/modules/user/commands/delete-user/delete-user.service.ts - - -delete-user.service.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/commands/delete-user/delete-user.service.ts - - - - - -src/modules/user/database/user.repository.ts - - -user.repository.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/database/user.repository.ts - - - - - -src/modules/user/user.mapper.ts - - -user.mapper.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/user.mapper.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts - - -find-users.graphql-resolver.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/queries/find-users/find-users.graphql-resolver.ts - - - - - -src/modules/user/queries/find-users/find-users.query-handler.ts - - -find-users.query-handler.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/queries/find-users/find-users.query-handler.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts - - -find-users.http.controller.ts - - - - - -src/modules/user/user.module.ts->src/modules/user/queries/find-users/find-users.http.controller.ts - - - - - -src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts - - 
-create-wallet-when-user-is-created.domain-event-handler.ts - - - - - -src/modules/wallet/wallet.module.ts->src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts - - - - - -src/modules/wallet/wallet.di-tokens.ts - - -wallet.di-tokens.ts - - - - - -src/modules/wallet/wallet.module.ts->src/modules/wallet/wallet.di-tokens.ts - - - - - -src/modules/wallet/database/wallet.repository.ts - - -wallet.repository.ts - - - - - -src/modules/wallet/wallet.module.ts->src/modules/wallet/database/wallet.repository.ts - - - - - -src/modules/wallet/wallet.mapper.ts - - -wallet.mapper.ts - - - - - -src/modules/wallet/wallet.module.ts->src/modules/wallet/wallet.mapper.ts - - - - - -src/configs/app.routes.ts - - -app.routes.ts - - - - - -src/libs/api/graphql/paginated.graphql-response.base.ts - - -paginated.graphql-response.base.ts - - - - - -src/libs/api/id.response.dto.ts - - -id.response.dto.ts - - - - - -src/libs/api/paginated-query.request.dto.ts - - -paginated-query.request.dto.ts - - - - - -src/libs/api/paginated.response.base.ts - - -paginated.response.base.ts - - - - - -src/libs/ddd/index.ts - - -index.ts - - - - - -src/libs/api/paginated.response.base.ts->src/libs/ddd/index.ts - - - - - -src/libs/ddd/aggregate-root.base.ts - - -aggregate-root.base.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/aggregate-root.base.ts - - - - - -src/libs/ddd/domain-event.base.ts - - -domain-event.base.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/domain-event.base.ts - - - - - -src/libs/ddd/entity.base.ts - - -entity.base.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/entity.base.ts - - - - - -src/libs/ddd/command.base.ts - - -command.base.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/command.base.ts - - - - - -src/libs/ddd/mapper.interface.ts - - -mapper.interface.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/mapper.interface.ts - - - - - -src/libs/ddd/repository.port.ts - - -repository.port.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/repository.port.ts - - - - - -src/libs/ddd/value-object.base.ts - - -value-object.base.ts - - - - - -src/libs/ddd/index.ts->src/libs/ddd/value-object.base.ts - - - - - -src/libs/api/response.base.ts - - -response.base.ts - - - - - -src/libs/api/response.base.ts->src/libs/api/id.response.dto.ts - - - - - -src/libs/exceptions/exception.base.ts - - -exception.base.ts - - - - - -src/libs/exceptions/index.ts->src/libs/exceptions/exception.base.ts - - - - - -src/libs/exceptions/exception.codes.ts - - -exception.codes.ts - - - - - -src/libs/exceptions/index.ts->src/libs/exceptions/exception.codes.ts - - - - - -src/libs/exceptions/exceptions.ts - - -exceptions.ts - - - - - -src/libs/exceptions/index.ts->src/libs/exceptions/exceptions.ts - - - - - - - -src/libs/db/sql-repository.base.ts - - -sql-repository.base.ts - - - - - -src/libs/db/sql-repository.base.ts->src/libs/ddd/index.ts - - - - - -src/libs/db/sql-repository.base.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/db/sql-repository.base.ts->src/libs/exceptions/index.ts - - - - - -src/libs/ports/logger.port.ts - - -logger.port.ts - - - - - -src/libs/db/sql-repository.base.ts->src/libs/ports/logger.port.ts - - - - - -src/libs/types/index.ts - - -index.ts - - - - - -src/libs/db/sql-repository.base.ts->src/libs/types/index.ts - - - - - -src/libs/types/deep-partial.type.ts - - -deep-partial.type.ts - - - - - -src/libs/types/index.ts->src/libs/types/deep-partial.type.ts - - - - - -src/libs/types/mutable.type.ts - - -mutable.type.ts - - - - - 
-src/libs/types/index.ts->src/libs/types/mutable.type.ts - - - - - -src/libs/types/non-function-properties.type.ts - - -non-function-properties.type.ts - - - - - -src/libs/types/index.ts->src/libs/types/non-function-properties.type.ts - - - - - -src/libs/types/object-literal.type.ts - - -object-literal.type.ts - - - - - -src/libs/types/index.ts->src/libs/types/object-literal.type.ts - - - - - -src/libs/types/require-one.type.ts - - -require-one.type.ts - - - - - -src/libs/types/index.ts->src/libs/types/require-one.type.ts - - - - - -src/libs/ddd/aggregate-root.base.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/ddd/aggregate-root.base.ts->src/libs/ports/logger.port.ts - - - - - -src/libs/ddd/aggregate-root.base.ts->src/libs/ddd/domain-event.base.ts - - - - - -src/libs/ddd/aggregate-root.base.ts->src/libs/ddd/entity.base.ts - - - - - -src/libs/ddd/domain-event.base.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/ddd/domain-event.base.ts->src/libs/exceptions/index.ts - - - - - -src/libs/guard.ts - - -guard.ts - - - - - -src/libs/ddd/domain-event.base.ts->src/libs/guard.ts - - - - - -src/libs/ddd/entity.base.ts->src/libs/exceptions/index.ts - - - - - -src/libs/ddd/entity.base.ts->src/libs/guard.ts - - - - - -src/libs/utils/index.ts - - -index.ts - - - - - -src/libs/ddd/entity.base.ts->src/libs/utils/index.ts - - - - - - - -src/libs/ddd/command.base.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/ddd/command.base.ts->src/libs/exceptions/index.ts - - - - - -src/libs/ddd/command.base.ts->src/libs/guard.ts - - - - - -src/libs/utils/convert-props-to-object.util.ts - - -convert-props-to-object.util.ts - - - - - -src/libs/utils/index.ts->src/libs/utils/convert-props-to-object.util.ts - - - - - - - -src/libs/ddd/mapper.interface.ts->src/libs/ddd/entity.base.ts - - - - - -src/libs/ddd/value-object.base.ts->src/libs/exceptions/index.ts - - - - - -src/libs/ddd/value-object.base.ts->src/libs/guard.ts - - - - - -src/libs/ddd/value-object.base.ts->src/libs/utils/index.ts - - - - - - - -src/libs/ddd/query.base.ts - - -query.base.ts - - - - - -src/libs/ddd/query.base.ts->src/libs/ddd/repository.port.ts - - - - - -src/libs/decorators/final.decorator.ts - - -final.decorator.ts - - - - - -src/libs/decorators/frozen.decorator.ts - - -frozen.decorator.ts - - - - - -src/libs/decorators/index.ts - - -index.ts - - - - - -src/libs/decorators/index.ts->src/libs/decorators/final.decorator.ts - - - - - -src/libs/decorators/index.ts->src/libs/decorators/frozen.decorator.ts - - - - - -src/libs/exceptions/exception.base.ts->src/libs/application/context/AppRequestContext.ts - - - - - -src/libs/exceptions/exceptions.ts->src/libs/exceptions/index.ts - - - - - - - -src/libs/exceptions/exceptions.ts->src/libs/exceptions/exception.base.ts - - - - - -src/libs/utils/convert-props-to-object.util.ts->src/libs/ddd/entity.base.ts - - - - - - - -src/libs/utils/convert-props-to-object.util.ts->src/libs/ddd/value-object.base.ts - - - - - - - -src/main.ts - - -main.ts - - - - - -src/main.ts->src/app.module.ts - - - - - -src/modules/user/commands/create-user/create-user.cli.controller.ts->src/libs/ports/logger.port.ts - - - - - -src/modules/user/commands/create-user/create-user.command.ts - - -create-user.command.ts - - - - - -src/modules/user/commands/create-user/create-user.cli.controller.ts->src/modules/user/commands/create-user/create-user.command.ts - - - - - -src/modules/user/commands/create-user/create-user.command.ts->src/libs/ddd/index.ts - - - - - 
-src/modules/user/commands/create-user/create-user.http.controller.ts->src/configs/app.routes.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/libs/api/api-error.response.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/libs/api/id.response.dto.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/modules/user/commands/create-user/create-user.command.ts - - - - - -src/modules/user/commands/create-user/create-user.request.dto.ts - - -create-user.request.dto.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/modules/user/commands/create-user/create-user.request.dto.ts - - - - - -src/modules/user/domain/user.errors.ts - - -user.errors.ts - - - - - -src/modules/user/commands/create-user/create-user.http.controller.ts->src/modules/user/domain/user.errors.ts - - - - - -src/modules/user/domain/user.errors.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/commands/create-user/create-user.message.controller.ts->src/libs/api/id.response.dto.ts - - - - - -src/modules/user/commands/create-user/create-user.message.controller.ts->src/modules/user/commands/create-user/create-user.command.ts - - - - - -src/modules/user/commands/create-user/create-user.message.controller.ts->src/modules/user/commands/create-user/create-user.request.dto.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/commands/create-user/create-user.command.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/domain/user.errors.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/user.di-tokens.ts - - - - - -src/modules/user/database/user.repository.port.ts - - -user.repository.port.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/database/user.repository.port.ts - - - - - -src/modules/user/domain/user.entity.ts - - -user.entity.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/domain/user.entity.ts - - - - - -src/modules/user/domain/value-objects/address.value-object.ts - - -address.value-object.ts - - - - - -src/modules/user/commands/create-user/create-user.service.ts->src/modules/user/domain/value-objects/address.value-object.ts - - - - - -src/modules/user/database/user.repository.port.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/database/user.repository.port.ts->src/modules/user/domain/user.entity.ts - - - - - -src/modules/user/domain/user.entity.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/value-objects/address.value-object.ts - - - - - -src/modules/user/domain/user.types.ts - - -user.types.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/user.types.ts - - - - - -src/modules/user/domain/events/user-address-updated.domain-event.ts - - -user-address-updated.domain-event.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/events/user-address-updated.domain-event.ts - - - - - -src/modules/user/domain/events/user-created.domain-event.ts - - 
-user-created.domain-event.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/events/user-created.domain-event.ts - - - - - -src/modules/user/domain/events/user-deleted.domain-event.ts - - -user-deleted.domain-event.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/events/user-deleted.domain-event.ts - - - - - -src/modules/user/domain/events/user-role-changed.domain-event.ts - - -user-role-changed.domain-event.ts - - - - - -src/modules/user/domain/user.entity.ts->src/modules/user/domain/events/user-role-changed.domain-event.ts - - - - - -src/modules/user/domain/value-objects/address.value-object.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/value-objects/address.value-object.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/domain/value-objects/address.value-object.ts->src/libs/guard.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts->src/modules/user/commands/create-user/create-user.command.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts->src/modules/user/domain/user.errors.ts - - - - - -src/modules/user/commands/create-user/graphql-example/dtos/create-user.gql-request.dto.ts - - -create-user.gql-request.dto.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts->src/modules/user/commands/create-user/graphql-example/dtos/create-user.gql-request.dto.ts - - - - - -src/modules/user/commands/create-user/graphql-example/dtos/id.gql-response.dto.ts - - -id.gql-response.dto.ts - - - - - -src/modules/user/commands/create-user/graphql-example/create-user.graphql-resolver.ts->src/modules/user/commands/create-user/graphql-example/dtos/id.gql-response.dto.ts - - - - - -src/modules/user/commands/delete-user/delete-user.http-controller.ts->src/configs/app.routes.ts - - - - - -src/modules/user/commands/delete-user/delete-user.http-controller.ts->src/libs/api/api-error.response.ts - - - - - -src/modules/user/commands/delete-user/delete-user.http-controller.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/commands/delete-user/delete-user.http-controller.ts->src/modules/user/commands/delete-user/delete-user.service.ts - - - - - -src/modules/user/commands/delete-user/delete-user.service.ts->src/libs/exceptions/index.ts - - - - - -src/modules/user/commands/delete-user/delete-user.service.ts->src/modules/user/user.di-tokens.ts - - - - - -src/modules/user/commands/delete-user/delete-user.service.ts->src/modules/user/database/user.repository.port.ts - - - - - -src/modules/user/database/user.repository.ts->src/libs/db/sql-repository.base.ts - - - - - -src/modules/user/database/user.repository.ts->src/modules/user/database/user.repository.port.ts - - - - - -src/modules/user/database/user.repository.ts->src/modules/user/domain/user.entity.ts - - - - - -src/modules/user/database/user.repository.ts->src/modules/user/domain/user.types.ts - - - - - -src/modules/user/database/user.repository.ts->src/modules/user/user.mapper.ts - - - - - - - -src/modules/user/domain/user.types.ts->src/modules/user/domain/value-objects/address.value-object.ts - - - - - -src/modules/user/user.mapper.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/user.mapper.ts->src/modules/user/domain/user.entity.ts - - - - - 
-src/modules/user/user.mapper.ts->src/modules/user/domain/value-objects/address.value-object.ts - - - - - -src/modules/user/user.mapper.ts->src/modules/user/database/user.repository.ts - - - - - - - -src/modules/user/dtos/user.response.dto.ts - - -user.response.dto.ts - - - - - -src/modules/user/user.mapper.ts->src/modules/user/dtos/user.response.dto.ts - - - - - -src/modules/user/domain/events/user-address-updated.domain-event.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/events/user-created.domain-event.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/events/user-deleted.domain-event.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/events/user-role-changed.domain-event.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/domain/events/user-role-changed.domain-event.ts->src/modules/user/domain/user.types.ts - - - - - -src/modules/user/dtos/graphql/user.graphql-response.dto.ts - - -user.graphql-response.dto.ts - - - - - -src/modules/user/dtos/graphql/user.graphql-response.dto.ts->src/libs/api/response.base.ts - - - - - -src/modules/user/dtos/graphql/user.paginated-gql-response.dto.ts - - -user.paginated-gql-response.dto.ts - - - - - -src/modules/user/dtos/graphql/user.paginated-gql-response.dto.ts->src/libs/api/graphql/paginated.graphql-response.base.ts - - - - - -src/modules/user/dtos/graphql/user.paginated-gql-response.dto.ts->src/modules/user/dtos/graphql/user.graphql-response.dto.ts - - - - - -src/modules/user/dtos/user.paginated.response.dto.ts - - -user.paginated.response.dto.ts - - - - - -src/modules/user/dtos/user.paginated.response.dto.ts->src/libs/api/paginated.response.base.ts - - - - - -src/modules/user/dtos/user.paginated.response.dto.ts->src/modules/user/dtos/user.response.dto.ts - - - - - -src/modules/user/dtos/user.response.dto.ts->src/libs/api/response.base.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/libs/api/response.base.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/libs/ddd/query.base.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/modules/user/database/user.repository.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/modules/user/dtos/graphql/user.paginated-gql-response.dto.ts - - - - - -src/modules/user/queries/find-users/find-users.graphql-resolver.ts->src/modules/user/queries/find-users/find-users.query-handler.ts - - - - - -src/modules/user/queries/find-users/find-users.query-handler.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/queries/find-users/find-users.query-handler.ts->src/libs/ddd/query.base.ts - - - - - -src/modules/user/queries/find-users/find-users.query-handler.ts->src/modules/user/database/user.repository.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/configs/app.routes.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/libs/api/paginated-query.request.dto.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/libs/ddd/index.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/libs/api/response.base.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/modules/user/database/user.repository.ts - - - - - 
-src/modules/user/queries/find-users/find-users.http.controller.ts->src/modules/user/dtos/user.paginated.response.dto.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/modules/user/queries/find-users/find-users.query-handler.ts - - - - - -src/modules/user/queries/find-users/find-users.request.dto.ts - - -find-users.request.dto.ts - - - - - -src/modules/user/queries/find-users/find-users.http.controller.ts->src/modules/user/queries/find-users/find-users.request.dto.ts - - - - - -src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts->src/modules/user/domain/events/user-created.domain-event.ts - - - - - -src/modules/wallet/domain/wallet.entity.ts - - -wallet.entity.ts - - - - - -src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts->src/modules/wallet/domain/wallet.entity.ts - - - - - -src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts->src/modules/wallet/wallet.di-tokens.ts - - - - - -src/modules/wallet/database/wallet.repository.port.ts - - -wallet.repository.port.ts - - - - - -src/modules/wallet/application/event-handlers/create-wallet-when-user-is-created.domain-event-handler.ts->src/modules/wallet/database/wallet.repository.port.ts - - - - - -src/modules/wallet/domain/wallet.entity.ts->src/libs/ddd/index.ts - - - - - -src/modules/wallet/domain/wallet.entity.ts->src/libs/exceptions/index.ts - - - - - -src/modules/wallet/domain/events/wallet-created.domain-event.ts - - -wallet-created.domain-event.ts - - - - - -src/modules/wallet/domain/wallet.entity.ts->src/modules/wallet/domain/events/wallet-created.domain-event.ts - - - - - -src/modules/wallet/domain/wallet.errors.ts - - -wallet.errors.ts - - - - - -src/modules/wallet/domain/wallet.entity.ts->src/modules/wallet/domain/wallet.errors.ts - - - - - -src/modules/wallet/database/wallet.repository.port.ts->src/libs/ddd/index.ts - - - - - -src/modules/wallet/database/wallet.repository.port.ts->src/modules/wallet/domain/wallet.entity.ts - - - - - -src/modules/wallet/database/wallet.repository.ts->src/libs/db/sql-repository.base.ts - - - - - -src/modules/wallet/database/wallet.repository.ts->src/modules/wallet/domain/wallet.entity.ts - - - - - -src/modules/wallet/database/wallet.repository.ts->src/modules/wallet/database/wallet.repository.port.ts - - - - - -src/modules/wallet/database/wallet.repository.ts->src/modules/wallet/wallet.mapper.ts - - - - - - - -src/modules/wallet/wallet.mapper.ts->src/libs/ddd/index.ts - - - - - -src/modules/wallet/wallet.mapper.ts->src/modules/wallet/domain/wallet.entity.ts - - - - - -src/modules/wallet/wallet.mapper.ts->src/modules/wallet/database/wallet.repository.ts - - - - - - - -src/modules/wallet/domain/events/wallet-created.domain-event.ts->src/libs/ddd/index.ts - - - - - -src/modules/wallet/domain/wallet.errors.ts->src/libs/exceptions/index.ts - - - - - diff --git a/docs/more/ddd/assets/images/DomainDrivenHexagon.png b/docs/more/ddd/assets/images/DomainDrivenHexagon.png deleted file mode 100644 index d7fdcb4..0000000 Binary files a/docs/more/ddd/assets/images/DomainDrivenHexagon.png and /dev/null differ diff --git a/docs/more/erupt/crud-show.png b/docs/more/erupt/crud-show.png deleted file mode 100644 index bc415f0..0000000 Binary files a/docs/more/erupt/crud-show.png and /dev/null differ diff --git a/docs/more/erupt/crud-simple-menu.png b/docs/more/erupt/crud-simple-menu.png deleted file mode 100644 index 
4628092..0000000 Binary files a/docs/more/erupt/crud-simple-menu.png and /dev/null differ diff --git a/docs/more/erupt/menu-main.png b/docs/more/erupt/menu-main.png deleted file mode 100644 index ca730f3..0000000 Binary files a/docs/more/erupt/menu-main.png and /dev/null differ diff --git a/docs/more/erupt/quick-start.md b/docs/more/erupt/quick-start.md deleted file mode 100644 index 0c7450d..0000000 --- a/docs/more/erupt/quick-start.md +++ /dev/null @@ -1,18 +0,0 @@ -# Erupt Quick Start - - -## Simple CRUD -单表操作 -代码: - -```java - -``` -## Menu 配置 - -![img.png](menu-main.png) - -![img_2.png](crud-simple-menu.png) - -## 功能结果 - -![img.png](crud-show.png) \ No newline at end of file diff --git a/docs/more/gradle/gradle-cheatsheet.md b/docs/more/gradle/gradle-cheatsheet.md deleted file mode 100644 index 33b90b4..0000000 --- a/docs/more/gradle/gradle-cheatsheet.md +++ /dev/null @@ -1,7 +0,0 @@ -# Gradle Cheat Sheet - -## Upgrade Gradle Version - -```shell -gradle wrapper --gradle-version 8.3 -``` \ No newline at end of file diff --git a/docs/more/project-structure.png b/docs/more/project-structure.png deleted file mode 100644 index 08aab13..0000000 Binary files a/docs/more/project-structure.png and /dev/null differ diff --git a/docs/more/qa-learn-java/What-to-do.md b/docs/more/qa-learn-java/What-to-do.md deleted file mode 100644 index 9b4ef12..0000000 --- a/docs/more/qa-learn-java/What-to-do.md +++ /dev/null @@ -1,30 +0,0 @@ -# 测试学习JAVA - -如何快速学习? - -- 学习不是目的,目的是可以解决你实际中的问题 -- 实际过程中有什么问题需要使用JAVA解决 - - 接口测试 - - 测试开发平台 - - 集成其他系统 - -## 先大致了解JAVA - -- 环境搭建 -- 基本语法 -- 常见代码运行方式 - -## 学习JAVA的检验供借鉴 - -- 基本操作库 -- 利用现有的仓库 -- 定义问题,快速阅读文档,解决问题 -- 通过例子学习/阅读源码 -- chatgpt - -## 推荐一些基本操作库 - -- [hutool] -- [](fluentqa-qabox) - -## 学习中前进 diff --git a/docs/more/qabox/1-mindmap.md b/docs/more/qabox/1-mindmap.md deleted file mode 100644 index e69de29..0000000 diff --git a/docs/more/qabox/application/6-error-handler.md b/docs/more/qabox/application/6-error-handler.md deleted file mode 100644 index d9d235e..0000000 --- a/docs/more/qabox/application/6-error-handler.md +++ /dev/null @@ -1,131 +0,0 @@ -# Error Handler - -- [core-errors](https://github.com/linux-china/java-error-messages-wizard.git) - -Java Error Messages Wizard - Write Good Error Message -====================================================== - -# Features - -* Error Code Design -* Standard Error Message Format -* Resource Bundle: properties file and i18n -* slf4j friendly -* IntelliJ IDEA friendly - -# Error Code Design - -Error code a unique string value for a kind of error, and includes 3 parts: - -* System/App short name: such as RST, APP1. Yes, Jira alike -* Component short name or code: such as LOGIN, 001 -* Status code: a 3 digital number to describe error's status, such as 404, 500. - Reference from HTTP Status Code. - -Error Code examples: - -* OSS-001-404 -* RST-002-500 -* UIC-LOGIN-500 - -# Error Message - -A good error message with following parts: - -* Context: What led to the error? What was the code trying to do when it failed? - where? -* The error itself: What exactly failed? description and reason -* Mitigation: What needs to be done in order to overcome the error? 
Solutions - -Fields for an error: - -* context: such as app name, component, status code -* description: Long(Short) to describe error -* because/reason: explain the reason with data -* documentedAt: error link -* solutions: possible solutions - -Message format for an -error: `long description(short desc): because/reason --- document link -- solutions` - -# Use properties file to save error code and message - -Example as following: - -```properties -ERROR-CODE:long description(short desc): because/reason --- document link -- solutions -RST-100400=Failed to log in system with email and password(Email login failed): can not find account with email {0} --- please refer https://example.com/login/byemail --- Solutions: 1. check your email 2. check your password -RST-100401=Failed to log in system with phone and pass(Phone login failed): can not find account with phone {0} --- please refer https://example.com/login/byphone --- Solutions: 1. check your phone 2. check your pass code in SMS -``` - -# FAQ - -### Why Choose HTTP Status Code as Error status code? - -Most developers know HTTP status code: 200, 404, 500 - -* Informational responses (100–199) -* Successful responses (200–299) -* Redirection messages (300–399) -* Client error responses (400–499) -* Server error responses (500–599) - -### Why Choose properties file to store error messages? - -Properties file is friendly for i18n and IDE friendly now - -* Code completion support for error code -* Error Code rename support -* Quick view support -* MessageFormat support -* Resource Bundle for i18n support - -Yes, you can choose Enum and POJO class, but some complication. - -If you use Rust, and Enum is good choice, for example `thiserror` -+ `error-stack` : - -```rust -use thiserror::Error as ThisError; - -/// errors for config component: app-100 -#[derive(ThisError, Debug)] -pub enum ConfigError { - #[error("APP-100404: config file not found: {0}")] - NotFound(String), - #[error("APP-100422: invalid JSON Format: {0}")] - Invalid(String), -} - -fn parse_config() -> Result { - let json_file = "config.json"; - let config = std::fs::read_to_string(json_file) - .report() - .change_context(ConfigError::NotFound(json_file.to_string()))?; - let map: ConfigMap = serde_json::from_str(&config) - .report() - .change_context(ConfigError::Invalid(json_file.to_string()))?; - Ok(map) -} -``` - -For more error code design with Rust, please -visit https://github.com/linux-china/rust-error-messages-wizard - -# References - -* What's in a Good Error - Message? 
https://www.morling.dev/blog/whats-in-a-good-error-message/ -* jdoctor: https://github.com/melix/jdoctor -* HTTP response status - codes: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status -* HTTP Status cheatsheet: https://devhints.io/http-status -* @PropertyKey support for slf4j message format - - https://youtrack.jetbrains.com/issue/IDEA-286726 -* @PrintFormat: annotation to printf-like methods - - https://youtrack.jetbrains.com/issue/IDEA-283556 - - -## API Errors - -- [api-url](https://github.com/alimate/errors-spring-boot-starter.git) \ No newline at end of file diff --git a/docs/more/ref.yaml b/docs/more/ref.yaml deleted file mode 100644 index 86826d9..0000000 --- a/docs/more/ref.yaml +++ /dev/null @@ -1,39 +0,0 @@ -repos: - - https://github.com/alibaba/COLA.git - - https://apis.how//products/web-design/ - - https://github.com/yegor256/requs.git - - https://github.com/Erudika/scoold.git - - https://github.com/mfatihercik/dsm.git -components: - - https://github.com/alimate/errors-spring-boot-starter.git - - https://github.com/InventivetalentDev/ReflectionHelper.git - - https://github.com/cronn/reflection-util.git - - https://github.com/xxDark/reflectionhooks.git - -data: - - https://github.com/modood/Administrative-divisions-of-China.git - - https://objenesis.org - - https://github.com/JCTools/JCTools.git - - https://github.com/perplexhub/rsql-jpa-specification.git - - https://github.com/bes2008/sqlhelper.git -MUST: - - https://github.com/javalin/javalin.git - -READ: - - https://github.com/bes2008/agileway.git - - https://github.com/bes2008/sqlhelper.git - - https://gitee.com/anyline/anyline.git - - https://gitee.com/clougence/hasor.git - - https://github.com/antelopesystems/crud-framework - - https://github.com/google/it-cert-automation-practice - - https://github.com/MaterializeInc/materialize - - https://github.com/wgzhao/Addax.git - - https://github.com/wgzhao/addax-ui -CI/CD: - - https://github.com/screwdriver-cd/ui - -data-structure: - - https://github.com/Scalified/tree - -mindmap: - - https://github.com/mindolph/Mindolph \ No newline at end of file diff --git a/docs/more/references-1.yaml b/docs/more/references-1.yaml deleted file mode 100644 index 0f17ffa..0000000 --- a/docs/more/references-1.yaml +++ /dev/null @@ -1,31 +0,0 @@ -products: - - https://www.getxray.app/ - - https://testrigor.com/ - - https://www.testing-whiz.com/ - - https://directus.cloud/fluentqa/projects?status=success - - https://strapi.io/ - - https://docs.strapi.io/developer-docs/latest/getting-started/quick-start.html - - https://www.directual.com/ - - https://theqalead.com/ - - https://softwareengineeringdaily.com/ - - https://manage.testmo.com/trial - - https://www.pixtastock.com/illustration/58163085 - - https://reqtest.com/testing-blog/software-quality-assurance/ - - https://intland.com/codebeamer/quality-assurance-software-testing/ - - https://www.scnsoft.com/software-testing/quality-management-optimization - - https://www.bmc.com/blogs/quality-assurance-software-testing/ - - https://content.intland.com/blog/modern-software-qa-the-importance-of-requirements-based-testing - - https://mobidev.biz/blog/what_is_the_value_brought_by_qa_to_your_software_product - - https://www.projectmanager.com/ - - https://www.altexsoft.com/blog/engineering/software-testing-qa-best-practices/ - - http://flammen.bg/software-qa-and-testing-services/ - - https://radixweb.com/blog/signs-you-need-a-software-qa-process-audit - - https://monday.com/blog/project-management/software-quality-assurance/ - - 
https://intland.com/codebeamer/quality-assurance-software-testing/ - - https://www.testing-whiz.com/ - - https://www.spec-qa.com/ - - https://www.gurock.com/testrail/qa-software/?utm_campaign=gg_dg_isr_can_search_generic_medium_intent&utm_source=google&utm_medium=cpc&utm_content=qa_software&utm_term=qa%20software&gclid=CjwKCAjw0dKXBhBPEiwA2bmObQFJVlBblQfYHFxAL3mKHYhZTdcahRnm6lyijUsOj8mK5NYHVYwSKRoCBCoQAvD_BwE - - https://www.altexsoft.com/whitepapers/quality-assurance-quality-control-and-testing-the-basics-of-software-quality-management/ - - https://www.thoughtco.com/ - - https://fluentqa.testmo.net/projects/view/1 - - https://docs.testmo.com/docs/ \ No newline at end of file diff --git a/docs/more/references.yaml b/docs/more/references.yaml deleted file mode 100644 index 6b60592..0000000 --- a/docs/more/references.yaml +++ /dev/null @@ -1,8 +0,0 @@ -java: - - category: project-template - repos: - - https://github.com/jshaptic/java-project-template.git - - - category: integration - repos: - - https://github.com/rh-messaging/jira-git-report.git diff --git a/docs/more/test-automation.yaml b/docs/more/test-automation.yaml deleted file mode 100644 index 0ecc107..0000000 --- a/docs/more/test-automation.yaml +++ /dev/null @@ -1,2 +0,0 @@ -api-tools: - - https://github.com/xyyxhcj/vpi.git \ No newline at end of file diff --git a/docs/more/todo.md b/docs/more/todo.md deleted file mode 100644 index 56b3ef5..0000000 --- a/docs/more/todo.md +++ /dev/null @@ -1,12 +0,0 @@ -# To Do List - -## Toolkits - -- [*] hutool integrations - -- [] Lealone Function Service - -## Table functions - -- [] EXCEL -- [] Table Integration for Low-Code Tools \ No newline at end of file diff --git a/docs/more/tutorials.yaml b/docs/more/tutorials.yaml deleted file mode 100644 index 6c27fc6..0000000 --- a/docs/more/tutorials.yaml +++ /dev/null @@ -1,2 +0,0 @@ -tutorials: - - https://github.com/AnghelLeonard/Hibernate-SpringBoot.git \ No newline at end of file diff --git a/docs/raw-materials/api-recorder/api-record-server.md b/docs/raw-materials/api-recorder/api-record-server.md deleted file mode 100644 index 4d69db8..0000000 --- a/docs/raw-materials/api-recorder/api-record-server.md +++ /dev/null @@ -1,166 +0,0 @@ -# API Record Server - -上面提到的接口录制保存到数据库, -1. 可以是本地 -2. 也可以是服务端 - -考虑到实际团队需要共享数据的情况,就进行了一个录制共享数据的后台原型。 - -下面是服务端页面的一个例子,还是一样,做这个后台主要是成本不高,实际也就是花了一天时间,当然每个人情况不同,这个需要额外考虑. - -大体效果: **基本增删改查** 都实现了. - -![img.png](search.png) -![img.png](img.png) - -目前数据已经记录下来了,服务端也可以保存了,那么这个数据可以共享了. 通过下载功能可以把所有的录制场景下的请求和返回数据通过excel下载. - -后续的修改和代码生成就是通过这个excel来进行了. 也可以只进行数据修改就可以,通过接口直接运行. - -## 实现这样一个后台,其实就定义了一个实体 - -实现这样一个后台,其实就定义了一个实体,然后运行通过Springboot运行就可以,启动后相应的菜单都可以配置, -这个会在后面在详细介绍下,目前先把录制数据/下载说完了. 
- -```java -@Data -public class ApiMonitorRecord extends MetaModel { - - @EruptField( - views = @View(title = "app"), - edit = @Edit( - title = "app应用名", - type = EditType.TAGS, search = @Search(vague = true), notNull = true, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct app from api_monitor_record" - ) - )) - private String app; - @EruptField( - views = @View(title = "场景名称"), - edit = @Edit(title = "场景名称", - type = EditType.TAGS, search = @Search(vague = true), notNull = true, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct scenario_name from api_monitor_record" - ) - )) - private String scenarioName; - @EruptField( - views = @View(title = "请求地址"), - edit = @Edit(title = "请求地址", notNull = true, search = @Search) - ) - private String requestUrl; - - @EruptField( - views = @View(title = "服务"), - edit = @Edit( - title = "服务", - type = EditType.TAGS, search = @Search(vague = true), notNull = true, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct service from api_monitor_record" - ) - ) - ) - - private String service; - @EruptField( - views = @View(title = "接口API"), - edit = @Edit(title = "接口API", notNull = true, search = @Search) - ) - private String api; - - @EruptField( - views = @View(title = "请求路径"), - edit = @Edit(title = "请求路径", notNull = true, search = @Search) - ) - private String path; - - @EruptField( - views = @View(title = "请求头"), - edit = @Edit(title = "请求头", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String requestHeaders; - - @EruptField( - views = @View(title = "方法"), - edit = @Edit(title = "方法", notNull = true, search = @Search) - ) - private String method; - - @EruptField( - views = @View(title = "请求报文", type = ViewType.CODE), - edit = @Edit(title = "请求报文", type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String requestBody; - - @EruptField( - views = @View(title = "response_headers"), - edit = @Edit(title = "responseHeaders", type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String responseHeaders; - - @EruptField( - views = @View(title = "status_code"), - edit = @Edit(title = "status_code", notNull = true, search = @Search) - ) - private int statusCode; - - @EruptField( - views = @View(title = "返回报文", type = ViewType.CODE), - edit = @Edit(title = "返回报文", type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String responseBody; - - -} - -``` - -## 工作量和效果 - -- 功能: - 1. 基本的增删改查都可以,下面是个编辑页面,支持代码高亮和JSON等 - ![img_1.png](edit.png) - 2. 支持不同录制场景的查询 -- 工作量: - - 这个server框架由于之前了解,所以这个时间我自己没有考虑在里面 - - 后台的构建我自己也就今天1个下午 -- 后续打算 - - 和自动化框架做对接,下载的EXCEL经过修改之后可以用作接口测试 - - 录制场景修改完的数据,可以用来造数据,通过管理后台实现 - - 自动化跑的CASE进行记录 - - 录制的数据和目前所有的API清单做一个对比,可以知道那些有接口测试覆盖了 - -其实按照目前的效果看,后续打算里面的内容都会在这几天里面可以完成一个原型。 - -## 不足和说明 - -目前肯定是比较粗糙的, 主要还是实现成本低,来验证思路是否符合实际情况的使用,是否真的可以提高效率. -也欢迎大家多提意见和建议,有人反馈是最开心的事情. - -后台代码在: [github](https://github.com/fluent-qa/fluentqa-workspace.git) -代码量其实真的没有太多夸张,真的很少. 
- -```shell -main -├── java -│   └── io -│   └── fluentqa -│   └── server -│   ├── QAWorkspaceApp.java -│   └── recorder -│   ├── handles -│   │   └── SqlTagFetchHandler.java -│   └── model -│   └── ApiMonitorRecord.java -└── resources - ├── application-dev.yaml - └── application.yaml - -``` diff --git a/docs/raw-materials/api-recorder/edit.png b/docs/raw-materials/api-recorder/edit.png deleted file mode 100644 index 4d5c057..0000000 Binary files a/docs/raw-materials/api-recorder/edit.png and /dev/null differ diff --git a/docs/raw-materials/api-recorder/img.png b/docs/raw-materials/api-recorder/img.png deleted file mode 100644 index 6fe8dfe..0000000 Binary files a/docs/raw-materials/api-recorder/img.png and /dev/null differ diff --git a/docs/raw-materials/api-recorder/search.png b/docs/raw-materials/api-recorder/search.png deleted file mode 100644 index 6f800fc..0000000 Binary files a/docs/raw-materials/api-recorder/search.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/1-api-collections.md b/docs/raw-materials/api/api-testing/1-api-collections.md deleted file mode 100644 index 555ccaf..0000000 --- a/docs/raw-materials/api/api-testing/1-api-collections.md +++ /dev/null @@ -1,45 +0,0 @@ -# API 接口定义 - -接口定义阶段主要是定义了系统间调用的方式. 如果从本地和远程API的角度理解的话: - -- 本地API调用,直接函数调用: 函数+参数 -```sh -func(arg1,arg2) -``` -- 远程API调用: 本质也是函数+参数 - * 使用远程协议如http/http2/rpc 传递 - * 函数+参数隐含在协议中 - * 协议层处理大部分可以通过框架实现,从而让远程调用看起来像本地调用一样 - -```shell -protocol(func(arg1,arg2)) -``` ---- - -## 使用 API设计工具 - -- stoplight studio -- insominia - -## HTTP api 来源 - -- Postman 文件 -- Swagger 文件 -- grpc -- 自定义协议 - -## HTTP API 设计和管理工具 -- 设计工具 - - spotlint -- api 调用工具 - - postman - - insomia - -## API统一收集方式 - -1. POSTMAN/SWAGGER文件上传统一管理 -2. 接口记录保存 - - - - diff --git a/docs/raw-materials/api/api-testing/2-api-test.md b/docs/raw-materials/api/api-testing/2-api-test.md deleted file mode 100644 index 74ada21..0000000 --- a/docs/raw-materials/api/api-testing/2-api-test.md +++ /dev/null @@ -1,9 +0,0 @@ -# API 测试代码生成 - -## 自定义协议生成POSTMAN -## 根据POSTMAN直接生成调用方式 -## 根据POSTMAN直接生成测试用例 - - - - diff --git a/docs/raw-materials/api/api-testing/3-api-test-data.md b/docs/raw-materials/api/api-testing/3-api-test-data.md deleted file mode 100644 index 9361d72..0000000 --- a/docs/raw-materials/api/api-testing/3-api-test-data.md +++ /dev/null @@ -1,7 +0,0 @@ -# API 测试数据 - -- 日常测试数据获取 -- 测试数据和接口定义合成 - - - diff --git a/docs/raw-materials/api/api-testing/4-api-testmgr.md b/docs/raw-materials/api/api-testing/4-api-testmgr.md deleted file mode 100644 index 2971ed3..0000000 --- a/docs/raw-materials/api/api-testing/4-api-testmgr.md +++ /dev/null @@ -1,19 +0,0 @@ -# API 测试管理工具 - -- 产品管理/模块管理: 方便管理不同的接口模块 -- 接口管理 - - 接口清单 - - 接口更新情况 - - 接口数据抓取 -- 接口代码生成 - - 客户端代码生成 -- 接口测试数据省生成 - - 接口测试数据EXCEL生成 -- 接口流程测试数据生成 - - 接口流程测试数据 - -## 接口管理 - -接口管理主要包括了: -1. - diff --git a/docs/raw-materials/api/api-testing/5-api-automation.md b/docs/raw-materials/api/api-testing/5-api-automation.md deleted file mode 100644 index b2fde00..0000000 --- a/docs/raw-materials/api/api-testing/5-api-automation.md +++ /dev/null @@ -1,17 +0,0 @@ -# API 测试 - -API测试大部分都可以通过自动化测试解决,这是我的结论。那么这个实现成本高不高呢? -我像说其实也没有那么高. - -以下是使用了各种小工具和自主开发完成的一些小功能组合而成,实际成本大家可以自行评估. - -首先把一个API的开发过程分成以下几个阶段: -1. 接口定义阶段 -2. 接口用例设计阶段 -3. 接口代码生成 -4. 接口测试数据生成 -5. 接口测试运行 -6. 接口测试报告 - -后面说一下每一个阶段不同的部门的产出物,以其测试需要做什么. 
- diff --git a/docs/raw-materials/api/api-testing/how/1-run-app.md b/docs/raw-materials/api/api-testing/how/1-run-app.md deleted file mode 100644 index 1f29de7..0000000 --- a/docs/raw-materials/api/api-testing/how/1-run-app.md +++ /dev/null @@ -1,18 +0,0 @@ -# Run APP - -1. 启动APP -```java -@SpringBootApplication -@EntityScan -@EruptScan -public class QAdminApp { - public static void main(String[] args) { - SpringApplication.run(QAdminApp.class); - } -} -``` - -依赖文件设定: - - -## 如何启动和打包 \ No newline at end of file diff --git "a/docs/raw-materials/api/api-testing/how/2-product/2-\344\272\247\345\223\201\351\205\215\347\275\256.md" "b/docs/raw-materials/api/api-testing/how/2-product/2-\344\272\247\345\223\201\351\205\215\347\275\256.md" deleted file mode 100644 index 4f538ac..0000000 --- "a/docs/raw-materials/api/api-testing/how/2-product/2-\344\272\247\345\223\201\351\205\215\347\275\256.md" +++ /dev/null @@ -1,26 +0,0 @@ -# 产品配置 -## 菜单配置 - -- 系统配置菜单 -![img.png](img.png) -![img_2.png](img_2.png) -![img_1.png](img_1.png) -![img_3.png](img_3.png) -![img_4.png](img_4.png) - -- 添加配置数据 -![img_5.png](img_5.png) -![img_6.png](img_6.png) - - -## 字典表配置 - -![img.png](master-data-1.png) -![img.png](master-data-2.png) - -## 产品配置 - -![img_7.png](img_7.png) -![img_8.png](img_8.png) -![img_9.png](img_9.png) -![img_10.png](img_10.png) \ No newline at end of file diff --git a/docs/raw-materials/api/api-testing/how/2-product/img.png b/docs/raw-materials/api/api-testing/how/2-product/img.png deleted file mode 100644 index 2d17cdd..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_1.png b/docs/raw-materials/api/api-testing/how/2-product/img_1.png deleted file mode 100644 index c1da6ae..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_1.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_10.png b/docs/raw-materials/api/api-testing/how/2-product/img_10.png deleted file mode 100644 index e27e65d..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_10.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_2.png b/docs/raw-materials/api/api-testing/how/2-product/img_2.png deleted file mode 100644 index bce0cf5..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_2.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_3.png b/docs/raw-materials/api/api-testing/how/2-product/img_3.png deleted file mode 100644 index 263bc87..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_3.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_4.png b/docs/raw-materials/api/api-testing/how/2-product/img_4.png deleted file mode 100644 index 141cf33..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_4.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_5.png b/docs/raw-materials/api/api-testing/how/2-product/img_5.png deleted file mode 100644 index 73af3bb..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_5.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_6.png b/docs/raw-materials/api/api-testing/how/2-product/img_6.png deleted file mode 100644 index 6c45a9c..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_6.png and 
/dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_7.png b/docs/raw-materials/api/api-testing/how/2-product/img_7.png deleted file mode 100644 index cf28a2e..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_7.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_8.png b/docs/raw-materials/api/api-testing/how/2-product/img_8.png deleted file mode 100644 index c8087e9..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_8.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/img_9.png b/docs/raw-materials/api/api-testing/how/2-product/img_9.png deleted file mode 100644 index a3abc78..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/img_9.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/master-data-1.png b/docs/raw-materials/api/api-testing/how/2-product/master-data-1.png deleted file mode 100644 index 2433365..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/master-data-1.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/2-product/master-data-2.png b/docs/raw-materials/api/api-testing/how/2-product/master-data-2.png deleted file mode 100644 index da3eec5..0000000 Binary files a/docs/raw-materials/api/api-testing/how/2-product/master-data-2.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/1-python-code-gen.md b/docs/raw-materials/api/api-testing/how/3-openapi/1-python-code-gen.md deleted file mode 100644 index 54067f2..0000000 --- a/docs/raw-materials/api/api-testing/how/3-openapi/1-python-code-gen.md +++ /dev/null @@ -1,22 +0,0 @@ -# 测试代码生成 - -- python - - -## skel 文件 - -- golang 生成postman文件 -![img_14.png](img_14.png) -![img_15.png](img_15.png) - -- python 根据postman生成python代码 -![img_12.png](img_12.png) -![img_9.png](img_9.png) -![img_10.png](img_10.png) -![img_11.png](img_11.png) -![img_13.png](img_13.png) - -## 测试报告 - -- ![img_16.png](img_16.png) -- ![img_17.png](img_17.png) \ No newline at end of file diff --git "a/docs/raw-materials/api/api-testing/how/3-openapi/3-api-\347\256\241\347\220\206.md" "b/docs/raw-materials/api/api-testing/how/3-openapi/3-api-\347\256\241\347\220\206.md" deleted file mode 100644 index cb6cce9..0000000 --- "a/docs/raw-materials/api/api-testing/how/3-openapi/3-api-\347\256\241\347\220\206.md" +++ /dev/null @@ -1,20 +0,0 @@ -# API 管理 - -## 文件上传模块配置 - -![img.png](img.png) -![img_1.png](img_1.png) -![img_2.png](img_2.png) -![img_3.png](img_3.png) - -## 接口清单 - -![img_4.png](img_4.png) -![img_5.png](img_5.png) -![img_6.png](img_6.png) -![img_7.png](img_7.png) - -## 测试数据直接在测试代码中运行 - -![img_8.png](img_8.png) -代码虽然多,但是都是生成的 \ No newline at end of file diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img.png b/docs/raw-materials/api/api-testing/how/3-openapi/img.png deleted file mode 100644 index c1eb89f..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_1.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_1.png deleted file mode 100644 index 0f9e083..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_1.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_10.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_10.png deleted file mode 100644 index a851277..0000000 Binary 
files a/docs/raw-materials/api/api-testing/how/3-openapi/img_10.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_11.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_11.png deleted file mode 100644 index 4b8bc21..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_11.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_12.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_12.png deleted file mode 100644 index 0603629..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_12.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_13.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_13.png deleted file mode 100644 index 6f51e22..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_13.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_14.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_14.png deleted file mode 100644 index 5fcfb36..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_14.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_15.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_15.png deleted file mode 100644 index e9af6cc..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_15.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_16.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_16.png deleted file mode 100644 index 7d349d6..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_16.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_17.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_17.png deleted file mode 100644 index 92da05f..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_17.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_2.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_2.png deleted file mode 100644 index 85a39f4..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_2.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_3.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_3.png deleted file mode 100644 index f62ef6b..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_3.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_4.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_4.png deleted file mode 100644 index 65f5f83..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_4.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_5.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_5.png deleted file mode 100644 index 2599ce7..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_5.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_6.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_6.png deleted file mode 100644 index ee92300..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_6.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_7.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_7.png 
deleted file mode 100644 index 25fe965..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_7.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_8.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_8.png deleted file mode 100644 index 07f054a..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_8.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/3-openapi/img_9.png b/docs/raw-materials/api/api-testing/how/3-openapi/img_9.png deleted file mode 100644 index 0d475cd..0000000 Binary files a/docs/raw-materials/api/api-testing/how/3-openapi/img_9.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/4-proxy/4-proxy-data.md b/docs/raw-materials/api/api-testing/how/4-proxy/4-proxy-data.md deleted file mode 100644 index e8d7dd7..0000000 --- a/docs/raw-materials/api/api-testing/how/4-proxy/4-proxy-data.md +++ /dev/null @@ -1,7 +0,0 @@ -# Proxy Data - - -![img.png](img.png) -![img_1.png](img_1.png) - -![img_2.png](img_2.png) diff --git a/docs/raw-materials/api/api-testing/how/4-proxy/img.png b/docs/raw-materials/api/api-testing/how/4-proxy/img.png deleted file mode 100644 index 23092f7..0000000 Binary files a/docs/raw-materials/api/api-testing/how/4-proxy/img.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/4-proxy/img_1.png b/docs/raw-materials/api/api-testing/how/4-proxy/img_1.png deleted file mode 100644 index 6d74409..0000000 Binary files a/docs/raw-materials/api/api-testing/how/4-proxy/img_1.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/how/4-proxy/img_2.png b/docs/raw-materials/api/api-testing/how/4-proxy/img_2.png deleted file mode 100644 index 8e196ce..0000000 Binary files a/docs/raw-materials/api/api-testing/how/4-proxy/img_2.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/0-overview.md b/docs/raw-materials/api/api-testing/overview/0-overview.md deleted file mode 100644 index cb22bae..0000000 --- a/docs/raw-materials/api/api-testing/overview/0-overview.md +++ /dev/null @@ -1,133 +0,0 @@ -# 接口测试行动派-API测试冒险之旅 - -> 纸上得来终觉浅,绝知此事要躬行 - -听了那么多理念,看了那么多文章,可以为什么我还是这样呢? 所以我决定自己动手,自己尝试,自己体会. -所以我觉得写出我的体会,写出我的感受,写出我的结果. - -在一个小公司进行的API测试冒险之旅,先开一个头,达成什么很重要,但是中间学到什么也很关键. -没有太多华丽的图标,只有现实的代码和截图, 解决问题在行动. - -## API接口测试尝试 - -每一次尝试都已需要一个目标,所以进行这次尝试的目标是: -1. 减少接口测试工作量,也就是代码量 -2. 能够让不同部门的人一起协作: - * 开发/前端确定好API接口定义 - * 测试可以根据API定义直接生成代码 - * 测试数据可以在日常测试工作中方便获取 - * 测试数据可以通过API定义中进行部分生成 - * 满足批量接口自动化的同时也可以进行随机的调用,每次POSTMAN改一些数据, - 填请求/请求体也是花时间的 -3. 作为TEAM Leader的需求: - * 市面上是有很多工具如metersphere/eolinkapi等等,但是同学们也有自我学习的需求 - * Team Leader的时间都是团队成员节省出来给你的,你需要给他们一些软性的回报和分享 - * 证明写代码并不是特别难,让同学消除代码恐惧感 - * 确实提高了效率和扩展性 -4. 低成本的实现以上内容 - -## 2. 实际实现的内容和效果 - -这是实际实现过程中的大体思路和想法: -![img_2.png](api-workflow.png) - -### 2.1 测试代码生成实现内容 - -1. 根据API开发定义接口文档生成postman文件和部分python代码 -![img_2.png](go-code-gen.png) -golang代码片段: -![img_2.png](go-codes.png) -2. 生成python代码: python返回和请求都是采用类的方式而不是json,方便维护 -python代码更好的结构化: -![python-structs.png](python-structs.png) -3. postman 文件可以少自己写一些很多请求,方便零时调用请求 -![postman.png](postman.png) -postman该有的前置脚本该有的也都有,: -![img_2.png](postman-prerequest.png) -4. 根据postman文件生成接口调用和测试代码: 同时通过postman文件的方式, - 让代码生成更通用化,而不是只正对公司自定义协议 -![img_2.png](python-api-tests.png) -![img_2.png](api-client.png) -![img_2.png](python-api-cases.png) - -### 2.2 测试数据生成和测试用例生成 - -测试数据生成主要有两块内容: -1. 通过协议生成的数据结构,这块在postman中可以看到 -2. 通过日常页面操作产生的数据获取 -3. 融合1和2的数据,生成测试数据 - -具体怎么做? -1. 
使用mitmproxy,获取运行测试过程中的数据,并且保存到数据库,算是某种录制方式 -2. 通过一个管理平台让同学们可以通过页面操作生成测试用例数据 -3. 测试用例数据可以直接在python代码中使用 - -同学问实现这些难吗?其实也没有那么难,因为有工具可以方便的给你用: - -- 录制部分的其实主要就一个python文件,核心代码就这么多: -![img_2.png](mitm-proxy.png) -- 管理平台代码其实也很少,使用了一个JAVA类似于低代码工具,也是很少代码就实现: -![img_2.png](api-monitor.png) - -### 2.3 接口清单变化和测试用例生成 - -接口可能有些变化,所以有必要让API可以有一个清单让同学们了解API的变化情况,具体怎么做? -1. 上传生成的POSTMAN文件保存到数据库 -![img_2.png](upload-postman.png) -2. 保存POSTMAN中的接口,并且标注出那些是新加的,那些是改变的,那些是已经知道的 -![img_2.png](api-list.png) -3. 测试数据生成: 已经知道接口定义和有部分录制数据,直接点击生成原始接口测试用例就可以 -![img_2.png](data-generator.png) -4. 生成的测试数据/测试用例可以编辑和导出: -![img_2.png](testcase-raw-data.png) -5. 下载测试用例数据之后,接可以在python代码中运行 -6. 如果有额外需要加的测试用例,可以直接在代码中做修改 - -## 3. 总结一些在实现这些化的成本 - -1. golang代码生成: 1.5天 - * 协议定义是开发完成的,我这边成本是零 - * 代码生成代码,由于开发也在使用代码生成,我在这个基础上加入postman和python,成本是1.5天左右 -2. python接口自动化代码: 5天 - * python接口自动化基础库实现花了4天时间,包括: - * httpx调用封装,支持http/http2 - * database访问封装: 使用ORM/pydantic 方式 - * 常用的工具类方式: copy/paste - * python接口代码生成器: 1天 - * jinja2模版实现 -3. mitm proxy数据录制: 1天 - * 学习mitmproxy的机制 - * 解析请求,保存数据库 -4. 接口后台管理工具: 3天 - * 使用一个JAVA开源的低代码工具,不用写前端代码 - * 产品配置管理 - * 接口清单管理 - * 接口数据管理 - * 测试用例生成 - * postman文件上传 -总体而言在我使用这些工具的过程中,花费不多,但是每个人实际情况可能不同,只是作为参考.不代表任何含义. - -## 4. 收获是什么? - -1. 工具有很多,如何组合起来使用满足自身需求 -2. python收获: 重新认识了新的python的一些库 - * poetry: python 包管理工具,比requirement好用 - * python 命令行工具构建: typer/fire - * python ORM: sqlmodel/sqlalchemy/repository pattern - * python 使用依赖注入方便写sdk - * python pydantic使用,如何更好的结构化数据 - * python对于测试人员来说学习的路径图 -3. 内部平台收获: 使用低代码工具能够极大的帮助开发效率 - * 使用低代码工具,一个JAVA类就可以实现前后端功能最简单的CRUD功能 -4. 一些观念的变化 - * 一遍一遍看语法,效率太低,花20%的精力了解80%现实中马上用到的东西 - * 语言和语言差别越来越小,都有类型 - * API测试,可以和命令行,RPC,函数都可以统一起来看,协议就是一次函数调用 - 函数就是,函数名+参数,协议只是把这个分散到了不同地方 - * 数据结构+算法,这个才是最不变的东西,为什么这么说,自己遇到的例子是: - * EXCEl读取到python/java 类,就一行代码给外部使用 - * freemind/xmind转化为数据库保存,也是同样道理 - * postman 文件转化保存到数据库也是同样的道理 -5. 一小点一小点的知识和代码使用,大部分可以在30分钟说完,最主要的是练习熟练程度 - 各种知识点的30分钟的介绍,正在准备中,希望可以给我们TEAM的同学有些帮助 -6. 
知己知彼,可以更深入的测试,在自己写的过程了解开发细节,也可以推测和你差不多的开发在哪里容易出错 \ No newline at end of file diff --git a/docs/raw-materials/api/api-testing/overview/api-client.png b/docs/raw-materials/api/api-testing/overview/api-client.png deleted file mode 100644 index aef7ac0..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-client.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-explain.png b/docs/raw-materials/api/api-testing/overview/api-explain.png deleted file mode 100644 index a1a72ba..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-explain.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-integration-2.png b/docs/raw-materials/api/api-testing/overview/api-integration-2.png deleted file mode 100644 index e3ff6fd..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-integration-2.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-integration.md b/docs/raw-materials/api/api-testing/overview/api-integration.md deleted file mode 100644 index 4c7ca5b..0000000 --- a/docs/raw-materials/api/api-testing/overview/api-integration.md +++ /dev/null @@ -1,6 +0,0 @@ -![img.png](api-integration.png) -![img.png](api-integration-2.png) -![img.png](img.png) -![img_1.png](img_1.png) - -https://www.excellentwebworld.com/ \ No newline at end of file diff --git a/docs/raw-materials/api/api-testing/overview/api-integration.png b/docs/raw-materials/api/api-testing/overview/api-integration.png deleted file mode 100644 index 9ea33e7..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-integration.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-list.png b/docs/raw-materials/api/api-testing/overview/api-list.png deleted file mode 100644 index 9eb73a1..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-list.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-monitor.png b/docs/raw-materials/api/api-testing/overview/api-monitor.png deleted file mode 100644 index eabf779..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-monitor.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/api-workflow.png b/docs/raw-materials/api/api-testing/overview/api-workflow.png deleted file mode 100644 index 73441e3..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/api-workflow.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/backup.md b/docs/raw-materials/api/api-testing/overview/backup.md deleted file mode 100644 index d155e56..0000000 --- a/docs/raw-materials/api/api-testing/overview/backup.md +++ /dev/null @@ -1,150 +0,0 @@ -# 接口测试行动派-API测试冒险之旅 - -> 纸上得来终觉浅,绝知此事要躬行 - -听了那么多理念,看了那么多文章,可以为什么我还是这样呢? 所以我决定自己动手,自己尝试,自己体会. -那些都是平台,那些框架,我们自己也可以动手试试,但是加上工资才是王道,不是吗? -所以我觉得写出我的代码,写出我的体会,写出我的感受,写出我的结果. - -在一个小公司进行的API测试冒险之旅,先开一个头,达成什么很重要,但是中间学到什么也很关键. -没有太多华丽的图标,只有现实的代码和截图, 解决问题在行动. - - -## API接口测试尝试 - -每一次尝试都已需要一个目标,所以进行这次尝试的目标是: -1. 减少接口测试工作量,也就是代码量 -2. 能够让不同部门的人一起协作: - * 开发/前端确定好API接口定义 - * 测试可以根据API定义直接生成代码 - * 测试数据可以在日常测试工作中方便获取 - * 测试数据可以通过API定义中进行部分生成 - * 满足批量接口自动化的同时也可以进行随机的调用,每次POSTMAN改一些数据, - 填请求/请求体也是花时间的 -3. 作为TEAM Leader的需求: - * 市面上是有很多工具如metersphere/eolinkapi等等,但是同学们也有自我学习的需求 - * Team Leader的时间都是团队成员节省出来给你的,你需要给他们一些软性的回报和分享 - * 证明写代码并不是特别难,让同学消除代码恐惧感 - * 确实提高了效率和扩展性 -4. 低成本的实现以上内容 - -## 2. 
实际实现的内容和效果 - -这是实际实现过程中的大体思路和想法: -![](/uploads/photo/2022/9f10c32e-4d63-4a5c-a2e1-e457b905cbc0.png!large) - -### 2.1 测试代码生成实现内容 - -1. 根据API开发定义接口文档生成postman文件和部分python代码 - ![](/uploads/photo/2022/2bf599bc-c762-4aa9-bc38-5b15b1c0ef89.png!large) - -golang代码片段: -![](/uploads/photo/2022/eee4a108-862e-4389-88d9-c4c168daa3de.png!large) - -2. 生成python代码: python返回和请求都是采用类的方式而不是json,方便维护 - python代码更好的结构化: - ![](/uploads/photo/2022/356d432b-1ed7-4dbb-a31c-810adad53257.png!large) - -3. postman 文件可以少自己写一些很多请求,方便零时调用请求 - ![](/uploads/photo/2022/0a372cfd-da64-4853-8202-ac0e46f3e6ac.png!large) - -postman该有的前置脚本该有的也都有,: -![](/uploads/photo/2022/7a0d1f02-0ee1-4bb6-86fc-d3db98674ba4.png!large) - -4. 根据postman文件生成接口调用和测试代码: 同时通过postman文件的方式, - 让代码生成更通用化,而不是只正对公司自定义协议 - ![](/uploads/photo/2022/30460816-3fe5-481a-8cf7-63c7f53e3491.png!large) - ![](/uploads/photo/2022/b60df7e5-961f-4d7b-8422-07727d3a0791.png!large) - ![](/uploads/photo/2022/914a59c8-ac0b-4c02-8f53-b70499e59f0e.png!large) - ![](/uploads/photo/2022/63b5c49c-94d4-41c7-a8f4-3a7ecdef4d3a.png!large) - - -### 2.2 测试数据生成和测试用例生成 - -测试数据生成主要有两块内容: -1. 通过协议生成的数据结构,这块在postman中可以看到 -2. 通过日常页面操作产生的数据获取 -3. 融合1和2的数据,生成测试数据 - -具体怎么做? -1. 使用mitmproxy,获取运行测试过程中的数据,并且保存到数据库,算是某种录制方式 -2. 通过一个管理平台让同学们可以通过页面操作生成测试用例数据 -3. 测试用例数据可以直接在python代码中使用 - -同学问实现这些难吗?其实也没有那么难,因为有工具可以方便的给你用: - -- 录制部分的其实主要就一个python文件,核心代码就这么多: - ![](/uploads/photo/2022/940f17a6-846f-4946-be78-e2383cd602fb.png!large) - -- 管理平台代码其实也很少,使用了一个JAVA类似于低代码工具,也是很少代码就实现: - ![](/uploads/photo/2022/88366023-a057-4d74-a3c3-6bb3a5e0a6a9.png!large) - - -### 2.3 接口清单变化和测试用例生成 - -接口可能有些变化,所以有必要让API可以有一个清单让同学们了解API的变化情况,具体怎么做? -1. 上传生成的POSTMAN文件保存到数据库 - ![](/uploads/photo/2022/5b7a58bd-2b6e-480d-aa11-a0ce71a76723.png!large) - -2. 保存POSTMAN中的接口,并且标注出那些是新加的,那些是改变的,那些是已经知道的 - ![](/uploads/photo/2022/42f099ae-16ee-408d-9386-ccb34148694b.png!large) - -3. 测试数据生成: 已经知道接口定义和有部分录制数据,直接点击生成原始接口测试用例就可以 - ![](/uploads/photo/2022/fb980dd5-bae4-4cc4-8aed-c6ca902b0fd0.png!large) - -4. 生成的测试数据/测试用例可以编辑和导出: - ![](/uploads/photo/2022/c8c93b92-3b25-4400-889d-d2c6f0812825.png!large) - -5. 下载测试用例数据之后,接可以在python代码中运行 -6. 如果有额外需要加的测试用例,可以直接在代码中做修改 - -## 3. 总结一些在实现这些化的成本 - -1. golang代码生成: 1.5天 - * 协议定义是开发完成的,我这边成本是零 - * 代码生成代码,由于开发也在使用代码生成,我在这个基础上加入postman和python,成本是1.5天左右 -2. python接口自动化代码: 5天 - * python接口自动化基础库实现花了4天时间,包括: - * httpx调用封装,支持http/http2 - * database访问封装: 使用ORM/pydantic 方式 - * 常用的工具类方式: copy/paste - * python接口代码生成器: 1天 - * jinja2模版实现 -3. mitm proxy数据录制: 1天 - * 学习mitmproxy的机制 - * 解析请求,保存数据库 -4. 接口后台管理工具: 3天 - * 使用一个JAVA开源的低代码工具,不用写前端代码 - * 产品配置管理 - * 接口清单管理 - * 接口数据管理 - * 测试用例生成 - * postman文件上传 - 总体而言在我使用这些工具的过程中,花费不多,但是每个人实际情况可能不同,只是作为参考.不代表任何含义. - -## 4. 收获是什么? - -1. 工具有很多,如何组合起来使用满足自身需求 -2. python收获: 重新认识了新的python的一些库 - * poetry: python 包管理工具,比requirement好用 - * python 命令行工具构建: typer/fire - * python ORM: sqlmodel/sqlalchemy/repository pattern - * python 使用依赖注入方便写sdk - * python pydantic使用,如何更好的结构化数据 - * python对于测试人员来说学习的路径图 -3. 内部平台收获: 使用低代码工具能够极大的帮助开发效率 - * 使用低代码工具,一个JAVA类就可以实现前后端功能最简单的CRUD功能 -4. 一些观念的变化 - * 一遍一遍看语法,效率太低,花20%的精力了解80%现实中马上用到的东西 - * 语言和语言差别越来越小,都有类型 - * API测试,可以和命令行,RPC,函数都可以统一起来看,协议就是一次函数调用 - 函数就是,函数名+参数,协议只是把这个分散到了不同地方 - * 数据结构+算法,这个才是最不变的东西,为什么这么说,自己遇到的例子是: - * EXCEl读取到python/java 类,就一行代码给外部使用 - * freemind/xmind转化为数据库保存,也是同样道理 - * postman 文件转化保存到数据库也是同样的道理 -5. 一小点一小点的知识和代码使用,大部分可以在30分钟说完,最主要的是练习熟练程度 - 各种知识点的30分钟的介绍,正在准备中,希望可以给我们TEAM的同学有些帮助 -6. 
知己知彼,可以更深入的测试,在自己写的过程了解开发细节,也可以推测和你差不多的开发在哪里容易出错 - -微信公众号也同时发布,欢迎关注:[不仅是软件测试](https://mp.weixin.qq.com/s?__biz=MzIxMzgzNjA3NA==&mid=2247484158&idx=1&sn=17fd09e2a05dbce4e8e01cb83cc45259&chksm=97b18abba0c603ad020ba073456839f099ea68a14d9febb805221052e8775572e1a9f33faceb&token=1168301534&lang=zh_CN#rd) \ No newline at end of file diff --git a/docs/raw-materials/api/api-testing/overview/data-generator.png b/docs/raw-materials/api/api-testing/overview/data-generator.png deleted file mode 100644 index 79c7930..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/data-generator.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/go-code-gen.png b/docs/raw-materials/api/api-testing/overview/go-code-gen.png deleted file mode 100644 index 87ca4af..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/go-code-gen.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/go-codes.png b/docs/raw-materials/api/api-testing/overview/go-codes.png deleted file mode 100644 index 1856c22..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/go-codes.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/img.png b/docs/raw-materials/api/api-testing/overview/img.png deleted file mode 100644 index 4f4b5af..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/img.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/img_1.png b/docs/raw-materials/api/api-testing/overview/img_1.png deleted file mode 100644 index 1a511a8..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/img_1.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/mitm-proxy.png b/docs/raw-materials/api/api-testing/overview/mitm-proxy.png deleted file mode 100644 index ceff796..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/mitm-proxy.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/postman-prerequest.png b/docs/raw-materials/api/api-testing/overview/postman-prerequest.png deleted file mode 100644 index 4d496e2..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/postman-prerequest.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/postman.png b/docs/raw-materials/api/api-testing/overview/postman.png deleted file mode 100644 index c37db6b..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/postman.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/python-api-cases.png b/docs/raw-materials/api/api-testing/overview/python-api-cases.png deleted file mode 100644 index 4fcac86..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/python-api-cases.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/python-api-tests.png b/docs/raw-materials/api/api-testing/overview/python-api-tests.png deleted file mode 100644 index c08ebf9..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/python-api-tests.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/python-structs.png b/docs/raw-materials/api/api-testing/overview/python-structs.png deleted file mode 100644 index 4161b16..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/python-structs.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/testcase-raw-data.png b/docs/raw-materials/api/api-testing/overview/testcase-raw-data.png deleted 
file mode 100644 index e192a30..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/testcase-raw-data.png and /dev/null differ diff --git a/docs/raw-materials/api/api-testing/overview/upload-postman.png b/docs/raw-materials/api/api-testing/overview/upload-postman.png deleted file mode 100644 index ae2b0fc..0000000 Binary files a/docs/raw-materials/api/api-testing/overview/upload-postman.png and /dev/null differ diff --git a/docs/raw-materials/api/openapi/1-openapi-diff.md b/docs/raw-materials/api/openapi/1-openapi-diff.md deleted file mode 100644 index c497d54..0000000 --- a/docs/raw-materials/api/openapi/1-openapi-diff.md +++ /dev/null @@ -1,37 +0,0 @@ -# OpenAPI Diff - -- Maven POM setting -```xml - - org.openapitools.openapidiff - openapi-diff-core - ${openapi-diff-version} - -``` - -- How to Get OpenAPI differences - * API Difference - * API Difference in HTMLRender - * API Difference in JSONRender - * API Difference in Markdown Render -```java - String originPetStore = "./petstore_v3.yml"; - String newPetStore = "./petstore_v2.json"; - - ChangedOpenApi diff = OpenApiCompare.fromLocations(originPetStore,newPetStore); - System.out.println(diff); - - //write to html - String html = new HtmlRender("Changelog", - "http://deepoove.com/swagger-diff/stylesheets/demo.css") - .render(diff); - FileUtils.writeStringToFile(new File("apiDifference.html"),html, - Charset.defaultCharset()); - String markdownRender = new MarkdownRender().render(diff); - FileUtils.writeStringToFile(new File("apiDifference.md"),markdownRender, - Charset.defaultCharset()); - - String jsonDiff = new JsonRender().render(diff); - FileUtils.writeStringToFile(new File("apiDifference.json"),jsonDiff, - Charset.defaultCharset()); -``` \ No newline at end of file diff --git a/docs/raw-materials/api/openapi/10-references.md b/docs/raw-materials/api/openapi/10-references.md deleted file mode 100644 index ccf77e3..0000000 --- a/docs/raw-materials/api/openapi/10-references.md +++ /dev/null @@ -1,13 +0,0 @@ -# References - -## OpenAPI Resource - -- [openapi github](https://github.com/OAI) -- [openapi-diff](https://github.com/OpenAPITools/openapi-diff) openapi diff tools to - compare different api version - -## github - -- [postman-gradle-runner](https://github.com/michaelruocco/gradle-postman-runner.git) -- [postman-integration](https://github.com/PortSwigger/postman-integration.git) -- [postman-openapi converter](https://github.com/apideck-libraries/portman.git) \ No newline at end of file diff --git a/docs/raw-materials/awesome/Performance-Engineers-DevOps-Banner.png b/docs/raw-materials/awesome/Performance-Engineers-DevOps-Banner.png deleted file mode 100644 index f1db921..0000000 Binary files a/docs/raw-materials/awesome/Performance-Engineers-DevOps-Banner.png and /dev/null differ diff --git a/docs/raw-materials/awesome/README.md b/docs/raw-materials/awesome/README.md deleted file mode 100644 index ba8c5fa..0000000 --- a/docs/raw-materials/awesome/README.md +++ /dev/null @@ -1,116 +0,0 @@ -# 📐 From Performance Engineering to DevOps - -[![contributions welcome](https://img.shields.io/badge/contributions-welcome-1EAEDB)]() -[![saythanks](https://img.shields.io/badge/say-thanks-1EAEDB.svg)](https://saythanks.io/to/catch.nkn%40gmail.com) -[![](https://img.shields.io/badge/license-MIT-0a0a0a.svg?style=flat&colorA=1EAEDB)](https://qainsights.com) -[![](https://img.shields.io/badge/%E2%9D%A4-QAInsights-0a0a0a.svg?style=flat&colorA=1EAEDB)](https://qainsights.com) 
-[![](https://img.shields.io/badge/%E2%9D%A4-YouTube%20Channel-0a0a0a.svg?style=flat&colorA=1EAEDB)](https://www.youtube.com/user/QAInsights?sub_confirmation=1) -[![](https://img.shields.io/badge/donate-paypal-1EAEDB)](https://www.paypal.com/paypalme/NAVEENKUMARN) - -![Performance Engineers DevOps Roadmap](/assets/Performance-Engineers-DevOps-Banner.png) - - -> Roadmap to becoming a DevOps from Performance Testing/Engineering background in 2020 - -This repository helps performance testers and engineers who wants to dive into DevOps and SRE world. - -The intention of this roadmap is not to list every software/solution out there, but give you a perspective. - -It lists the relevant links, books, courses, certifications and more. Feel free to submit a PR. - -

**The Purpose**

- -> The purpose of this roadmap is to give you an idea about the DevOps arena and to guide you if you are confused about what to do next. I will try to furnish the relevant information. If you feel something is misleading or suggestion, please submit a PR. - -# ⚡ Learning Roadmap - -This roadmap has been created using [Coggle](https://coggle.it/recommend/5f319149992aa26cd62beaae). Right click on the image to open in a new tab to zoom in/out. - -![Roadmap](/assets/Roadmap.png) - ---- - -# 📖 Read / Courses -* [LinkedIn SRE School](https://linkedin.github.io/school-of-sre/) 🆕 -* [Linux Command Line for Beginners](https://ubuntu.com/tutorials/command-line-for-beginners#1-overview) -* [AWS DevOps](https://aws.amazon.com/devops/) -* [Google DevOps](https://cloud.google.com/devops) -* [DevOps Practices and Principles from Microsoft](https://www.edx.org/course/devops-practices-and-principles) -* [Docker for Beginners](https://docker-curriculum.com/) -* [Learn Docker](https://learndocker.online/) -* [The Docker Handbook](https://github.com/fhsinchy/docker-handbook-projects) 🆕 -* [Kubernetes Basics](https://kubernetes.io/docs/tutorials/kubernetes-basics/) -* [Kubernetes Handbook](https://github.com/fhsinchy/kubernetes-handbook-projects) 🆕 -* [Kubernetes Bootcamp](https://www.cockroachlabs.com/kubernetes-bootcamp) 🆕 -* [Kubernetes: Up and Running: Dive into the Future of Infrastructure](https://amzn.to/2PuoSjx) -* [Kubernetes in Action](https://www.manning.com/books/kubernetes-in-action) -* [Kubernetes Patterns](https://www.redhat.com/en/engage/kubernetes-containers-architecture-s-201910240918) -* [Kubernetes Learning Path](https://azure.microsoft.com/en-us/resources/kubernetes-learning-path) 🆕 -* [Free Elastic Training](https://www.elastic.co/training/free) 🆕 -* [Site Reliability Engineering](https://landing.google.com/sre/) -* [Site Reliability Engineering: How Google Runs Production Systems](https://amzn.to/33yoIzZ) -* [Seeking SRE: Conversations About Running Production Systems at Scale](https://amzn.to/33x2VZk) -* [Chaos Engineering](https://www.gremlin.com/chaos-engineering/) 🆕 -* [Chaos Monkey](https://www.gremlin.com/chaos-monkey/) 🆕 -* [Gremlin SRE](https://www.gremlin.com/site-reliability-engineering) 🆕 -* [Fundamentals of Software Architecture: An Engineering Approach](https://amzn.to/3igKQTG) -* [Architecting with Google Compute Engine Specialization](https://www.coursera.org/specializations/gcp-architecture?) 
-* [The Phoenix Project: A Novel about IT, DevOps, and Helping Your Business Win](https://amzn.to/3gyctHe) -* [Prometheus LFS241](https://training.linuxfoundation.org/training/monitoring-systems-and-services-with-prometheus-lfs241/) -* [Kubernetes Tutorial for Beginners – Basic Concepts and Examples](https://spacelift.io/blog/kubernetes-tutorial) - ---- - -# 📺 Watch - -* [Start Kubernetes](https://gumroad.com/a/488174707/CgjFn) -* KodeKloud -* Linux Academy -* [Kube Academy](https://kube.academy/) -* [Whiz Labs](https://found.ee/JsuU) -* [Channel 9](https://channel9.msdn.com/Shows/DevOps-Lab) -* [Cognitive Class - Introduction to Containers, Kubernetes, and OpenShift](https://cognitiveclass.ai/courses/kubernetes-course) -* [Learn Kubernetes with your kids](https://redhat.lookbookhq.com/c/ne-bpmp_esi?x=Z3V20d&sc_cid=7013a000002gmsuAAA) 🆕 - ---- - -# 🏑 Play - -* [k3s](https://bit.ly/30Rkx0B) - Free credits during beta (Referral Link) -* [KataKoda](https://www.katacoda.com/) -* [Play with Docker](https://labs.play-with-docker.com/) -* [QwikLabs](https://www.qwiklabs.com/) -* [k8s basicLearning](https://github.com/knrt10/kubernetes-basicLearning) 🆕 -* [Git](https://learngitbranching.js.org/) 🆕 - ---- - -# 🎉 Show-off - -* [Certified Rancher Operator: Level 1](https://academy.rancher.com/courses/course-v1:RANCHER+K101+2019/about) 0️⃣ -* [Certified Kubernetes Administrator](https://qain.si/cka) -* [Certified Kubernetes Application Developer](https://qain.si/ckad) -* [Microsoft Certified: DevOps Engineer Expert](https://docs.microsoft.com/en-us/learn/certifications/devops-engineer) -* [Exam AZ-400: Designing and Implementing Microsoft DevOps Solutions](https://docs.microsoft.com/en-us/learn/certifications/exams/az-400) -* [AWS Certified DevOps Engineer - Professional](https://aws.amazon.com/certification/certified-devops-engineer-professional/) -* [Puppet Professional](https://puppet.com/learning-training/certification/) -* [Professional Cloud DevOps Engineer](https://cloud.google.com/certification/cloud-devops-engineer) -* [Gremlin Chaos Engineering Practitioner](https://www.gremlin.com/blog/announcing-the-gremlin-chaos-engineering-practitioner-certificate-program/) 0️⃣ - ---- - -# 📌 Others - -* [Dynatrace DevOps](https://www.dynatrace.com/resources/ebooks/what-is-devops-and-release-management/) -* [New Relic DevOps](https://newrelic.com/devops/) -* [New Relic AI](https://newr.ai/) 🆕 - ---- - -## 💲 Donate -☕ Buy me a tea - ---- - -> 🆕 - recently added -> 0️⃣ - free certification diff --git a/docs/raw-materials/awesome/Roadmap.png b/docs/raw-materials/awesome/Roadmap.png deleted file mode 100644 index dcfde76..0000000 Binary files a/docs/raw-materials/awesome/Roadmap.png and /dev/null differ diff --git a/docs/raw-materials/backup/qabox-alt/README.md b/docs/raw-materials/backup/qabox-alt/README.md deleted file mode 100644 index 9c006c9..0000000 --- a/docs/raw-materials/backup/qabox-alt/README.md +++ /dev/null @@ -1,4 +0,0 @@ -# README - -Springboot Alternative - diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/README.md b/docs/raw-materials/backup/qabox-alt/api-mock/README.md deleted file mode 100644 index 1c8b609..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/README.md +++ /dev/null @@ -1,1064 +0,0 @@ -# Mock.java使用说明手册 - - - -## 简介 - -这是一个仿照Mock.js语法的Java语言使用的假数据生成工具框架。 -部分方法与类介绍详细可查看JavaDoc文档(推荐先下载下来再看):[JavaDoc文档](helpDoc/index.html) - -码云生成的在线javaDoc文档:[在线文档](https://apidoc.gitee.com/ForteScarlet/Mock.java/) - -如果存在BUG或者有什么意见、建议,可以通过 issue 进行反馈。 - - -github: 
[github](https://github.com/ForteScarlet/Mock.java) - -gitee : [码云地址](https://gitee.com/ForteScarlet/Mock.java) - -此框架中不仅仅只可以作为假数据获取用,还有一些比较实用的工具类可以拿来单独使用。 - -*工具类介绍:工具类介绍 - -当前版本:![[maven](https://search.maven.org/artifact/io.gitee.ForteScarlet/mock.java)](https://img.shields.io/maven-central/v/io.gitee.ForteScarlet/mock.java) - -最低JDK版本:JDK8 - -※ 版本更新内容与预期更新计划详见于文档末尾 : 更新公告 - -# **WIKI** - -文档将会开始转移至WIKI处,转移完成后,此处README中的说明性文档将不再更新并择日删除,替换为简单的介绍与demo示例。 - -**wiki文档:[github wiki](https://github.com/ForteScarlet/Mock.java/wiki) or [gitee wiki](https://gitee.com/ForteScarlet/Mock.java/wikis/pages)** - -## 注意 -未来2.x版本将会使用与1.x版本不同的包路径。如果迭代版本请注意包路径的修改。 -仅为修改包路径,其余内容不变。 -如果有2.x的话 - -
- -## 友情链接 -|项目名称|项目介绍|项目地址| -|---|---|---| -|Mock.JDBC|基于Mock.java与JDBC向数据库插入假数据(暂时停工)|https://github.com/ForteScarlet/Mock.JDBC| - - -
- -## 使用方法 -### **Maven** - -在maven项目下,从pom.xml中导入以下地址: -> 最新版本以maven仓库中的地址为准。仓库地址:`https://mvnrepository.com/artifact/io.gitee.ForteScarlet/mock.java` - -```xml - - io.gitee.ForteScarlet - mock.java - ${version} - -``` - -### **Gradle** - -```gradle -compile group: 'io.gitee.ForteScarlet', name: 'mock.java', version: '${version}' -``` - -### **Jar** - -使用jar包导入的时候,记得同时把作为依赖的`commons-beanutils.commons-beanutils:1.9.3`中的jar包也导入进去。我上传了这些依赖在 [dependencies文件夹](./dependencies) 中。 - - - - - -### **使用** - -相信使用过Mock.js的各位大佬应该知道,在使用Mock.js的时候是用的JSON格式的参数。 -但是,Java可是没法直接识别JSON的啊! -所以,我们采用最接近JSON格式的方式:**Map集合**。 - -简单来说,就是将一个类的字段根据Mock.js那样的key-value的键值对转化为一个Map对象就好了!我习惯将这种Map对象称为 *字段映射表* 。 - -而且作为Java语言,数据类型是必须要多加考虑的问题。我在获取值的时候已经尽可能的增加了容错率,但是还是需要您注意数据类型的问题,请尽可能不要犯下将一个字符串赋值给整数这类难以防范的错误.. - -或许感觉上比JSON格式的使用要麻烦一些,但是这也是没有办法的事情嘛!假如您有更好的代替方式,希望您能告诉我 :) - - -### 设置字段映射的方式: - -#### 1·创建对象字段与随机值语法的映射关系(Map 类型的键值对) - -创建的这个Map,Key值代表了映射的字段名,value值代表了映射语法 -由于这毕竟与弱引用类型语言不同,所以在设置映射的时候请务必注意字段的数据类型。 - -`Map map = new HashMap<>();` - -#### 2·添加字段映射 - -字段映射中,value值所用到的 @函数 可以从 [JavaDoc文档](https://apidoc.gitee.com/ForteScarlet/Mock.java/) 中查阅 [**MockUtil**](https://apidoc.gitee.com/ForteScarlet/Mock.java/com/forte/util/utils/MockUtil.html) 类中的方法,MockUtil中的全部方法均可作为 @函数 出现在value值中。 - -> **再次提醒,请务必注意对应好字段的字段类型** - -```java -map.put("age","@age"); -map.put("list|2-3","@title"); -map.put("user","@name"); -...... -``` - -key值中,有三种写法: - -仅有字段映射、字段映射与整数部分区间参数、字段映射、整数部分区间参数与小数部分区间参数。 - -例如如下这么两个字段映射: - -```java -map.put("money1|10-40.2-4" , 0); -map.put("money2|10-40.2" , 0); -``` - -其中,字段名与区间参数之间的分割符为 **|** 符号,左边为字段名,右半边为区间参数。 - -区间参数中,整数部分与小数部分用 **.** 符号分割,左半边为整数部分区间参数,右半边为小数部分区间参数。 - -- ##### 仅有字段映射 - - 任务分配器首先会根据参数(value)的类型分配字段解析器,然后再根据字段类型进行取值。 - - 参数类型有一下几种情况: - - - **字符串类型**:如果存在一个或多个@函数,解析@函数并取值,(如果有多个@函数则会尝试对@函数的取值进行加法计算);如果不存在@函数或@函数不存在于MockUtil中的方法列表则将其会视为普通字符串。 - - **整数(Integer)、浮点数(Double)类型**:如果参数为Integer或Double类型,则字段值获取器会直接将此值作为默认值赋给字段。 - - **数组或集合**:如果参数是数组或集合类型,字段值获取器会从其中随机获取一个值赋予字段。 - - **Map集合**:如果参数是Map集合类型,则会对字段的类型进行判断,如果: - - 字段为Map类型,则直接将此Map作为字段值赋予字段,不做处理。 - - 字段为List类型,则字段值获取器会将将此Map集合封装至List集合中并返回。 - - 字段为其他任意类型,则任务分配器会将此Map视为此字段类型的字段映射集合进行解析并获取一个实例对象为字段赋值。**(※注:此字段映射同样会被Mock的映射集合记录下来。即嵌套的字段映射不需要单独再进行set了。)** - - **其他任意类型**:如果参数不是上面的任何类型,则字段值获取器会将此参数原样赋值,不做处理。 - - ```java - //假设以下字段映射的是User类 - Map map = new HashMap<>(); - map.put("age1" , "@age"); - map.put("age2" , 15); - map.put("age3" , new Integer(){1,2,3,4}); - map.put("name1" , "@name"); - map.put("name2" , "@title(2)"); - map.put("name3" , "这是一个名字"); - - //下面三个email字段的参数,如果是中文,必须放在单引号或双引号中才会生效,英文不受限制 - map.put("email1" , "@email('这是中文')"); - map.put("email2" , "@email('this is english')"); - map.put("email3" , "@email(this is english)"); - - //下面的friend字段的字段类型是一个Friend类,friendMap是对friend字段的映射,也就是嵌套映射 - //此friendMap的映射无需单独进行记录 - map.put("friend" , friendMap); - - //记录映射 - Mock.set(User.class, map); - //User类的映射被直接记录,可以获取 - MockObject userMockObject = Mock.get(User.class); - //Friend类的映射以嵌套的形式被记录过了,可以直接获取 - MockObject friendMockObject = Mock.get(Friend.class); - ``` - - - ##### 字段映射与仅整数部分的区间参数 - - 参数类型有一下几种情况: - > - - **字符串类型**:如果存在一个或多个@函数,解析@函数并取值,(如果有多个@函数则会尝试对@函数的取值进行加法计算);在存在@函数的情况下,区间参数将被忽略。 - - **※ 从`v1.4.2与v1.4.3`之后,当字段类型为Object类型(常见于创建Map类型对象)则会根据区间函数创建范围内大小的List集合。(详细见`v1.4.2与v1.4.3`更新日志)** - - - - 如果不存在@函数或@函数不存在于MockUtil中的方法列表则将其会视为普通字符串,然后根据整数参数区间获取一个随机数值并对此字符串进行重复输出。 - - - **整数(Integer)、浮点数(Double)类型**:如果参数为Integer或Double类型,则字段值获取器将区间参数作为方法参数,根据字段的类型使用随机函数获取对应的随机值。 - - 例如: - - ```java - //age为一个Integer类型的字段,等同于使用了@integer(2,4)函数 
- map.put("age|2-4" , 0); - //money为一个Double类型,等同于使用了@doubles(2,4)函数 - map.put("money|2-4" , 0); - ``` - - 则age将会被赋予一个2-4之间的随机整数(Integer),money将会被赋予一个2-4之间最大小数位数为0的浮点数(Double)。 - - - **数组或集合**:如果参数是数组或集合类型,任务分配器会判断字段的类型分配字段值获取器: - - - 字段类型为整数或浮点数,则区间参数将会被忽略,直接从参数中获取一个随机元素并赋值。 - - *如下所示三种情况的取值是完全相同的:* - - ```java - //age为一个Integer类型的字段 - map.put("age|2-4" , new Integer[]{1,2,3}); - map.put("age|2" , new Integer[]{1,2,3}); - map.put("age" , new Integer[]{1,2,3}); - ``` - - - 字段类型为数组或集合的时候,会根据区间参数获取一个随机数量,并从传入的参数中获取此数量的随机元素。 - - - **Map集合**:如果参数是Map集合类型,则任务分配器会对字段的类型进行判断,如果: - - - 字段为Map类型,则直接将此Map作为字段值赋予字段且忽略区间参数,不做处理。 - - - 字段为List类型,则字段值获取器会将将此Map集合封装至List集合并根据区间参数重复一个随即数量并返回。 - - - 字段为List 类型,即一个任意泛型的List类型的时候,任务分配器会将此Map视为此泛型类型的字段映射集合进行解析,再根据区间参数获取指定范围内数量的实例对象,并封装为List类型为字段赋值。**(※注:上文提到过,内嵌字段映射同样会被记录。)** - - - 字段为其他任意类型,则任务分配器会将此Map视为此字段类型的字段映射集合进行解析并获取一个实例对象为字段赋值,忽略区间参数。**(※注:上文提到过,内嵌字段映射同样会被记录。)** - - - - ※ 在`v1.4.3`版本之后,存在整数区间的Map映射机制存在改动(主要为结果对象为Map类型的时候)详见`v1.4.3`版本日志。 - - - **其他任意类型**: - - 如果字段是list类型或数组类型,则会根区间参数重复输出并为字段赋值。 - - 如果字段类型为其他未知类型,则会忽略区间参数并使用参数值作为默认值赋值。 - -```java - map.put("list|2-6" , "@title"); - map.put("age|10-40" , 2); -``` - - - 字段映射、整数区间参数和小数区间参数 - - 参数类型有一下几种情况: - - - **字符串类型**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **整数(Integer)、浮点数(Double)类型**:如果字段类型为整数,则会无视小数部分区间参数,与整数区间参数时的情况相同( 字段映射与仅整数部分的区间参数 )。 - - 如果字段类型为小数,则会根据区间参数获取一个指定区间内的随机小数,例如: - - ```java - // money为一个Double类型的字段,此映射等同于使用了@doubles(2,4,2,4)函数 - map.put("money|2-4.2-4" , 0); - // money为一个Double类型的字段,此映射等同于使用了@doubles(2,4,2)函数 - map.put("money|2-4.2" , 0); - ``` - - - **数组或集合**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **Map集合**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **其他任意类型**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - - -#### 3·获取假字段封装对象 - -通过Mock的get方法获取一个已经添加过映射记录的数据 - -* 首先使用set方法记录类的字段映射 - - ```java - //映射表尽可能是Sting,Object类型的 - Map map = new HashMap<>(); - //添加一个映射 - map.put("age|10-40" , 2); - //记录类的映射 - //1、使用javaBean封装 - Mock.set(User.class , map); - //2、或者直接使用Map类型,不再需要javaBean的class对象,但是需要指定一个映射名 - Mock.set("userMap", map); - ``` - - - * 然后使用get方法得到假对象封装类 - -```java -//已经记录过User类的映射,获取封装类 -//1、如果是使用的javaBean记录的,使用javaBean获取 -MockObject mockObject = Mock.get(User.class); -//2、或者你之前是使用map记录的,使用记录时保存的映射名获取 -//注:MockMapObject 对象实现了MockObject接口 -MockMapObject mockMapObject = Mock.get("userMap"); -``` - - * 根据MockObject中提供的API来获取你所需要的结果: - - - -```java -// 获取一个结果,并使用Optional类进行封装。 -Optional get(); -``` - -```java -// 获取一个结果 -T getOne(); -``` - -```java -// 获取指定数量的多个结果,返回List集合 -List getList(int num); -``` - - -```java -// 获取指定数量的多个结果,并根据给定规则进行转化,返回List集合 -List getList(int num , Function mapper); -``` - -```java -// 获取指定数量的多个结果,返回Set集合 -Set getSet(int num); -``` -```java -// 获取指定数量的多个结果,并根据给定规则进行转化,返回Set集合 -Set getSet(int num , Function mapper); -``` - - - ```java -// 获取指定数量的多个结果,并根据给定规则转化为Map集合 -Map getMap(int num , Function keyMapper, Function valueMapper); - ``` - - - - -​ -​ **※ 自1.3版本之后,我优化了`MockObject`接口内部结构,并增加了大量parallel方法与collect方法,您现在可以在1.3版本中更加灵活的对数据进行转化,或者根据数据量的需求自行决定是否需要使用并行线程进行对象创建。** - -## **自定义@函数** - -有时候,我提供的MockUtil中的方法可能无法满足您的需求,那么这时候,就需要一个可以对@函数进行扩展、加强的窗口。在v1.1版本中,我添加了这个功能。(这个功能测数量很少,可能会存在很多bug) - -### 1· 获取自定义@函数加载器 - -```java -//获取@函数加载器 -MethodLoader methodLoader = Mock.mockMethodLoader(); -``` - -函数加载器支持链式加载,也支持一次性加载 - -链式: - -```java -LoadResults loadResults = methodLoader - //添加指定类中的指定方法名的方法 - .append(Demo1.class, "testMethod") - //添加指定类中的多个指定方法名的方法 - .appendByNames(Demo2.class, new String[]{"method1" , "method2"}) - //添加指定类中的多个符合指定正则回则的方法 - .appendByRegex(Demo3.class, 
"[a-zA-Z]+") - //还有很多...敬请查阅API文档 - .load(); -``` - -> 使用链式加载的时候,请务必记住在结尾使用load()进行加载,否则方法集将无法被加载,而是一直留存在等待区。 - -非链式: - -```java -methodLoader.add(Demo1.class, "testMethod"); -``` - -通过以上代码可以发现,加载完成后都会有一个` LoadResults` 类作为返回值,这个类是在方法加载后的一个加载报告封装类,通过`LoadResults` 可以获取到刚刚加载的方法谁成功了,谁失败了,失败了的方法为什么失败等信息: - -```java - -Map> map = loadResults.loadResults();//加载的方法集根据成功与否分组 -Set successMethods = loadResults.loadSuccessResults();//加载成功的方法集 -Map whyFailMap = loadResults.whyFail();//加载失败的方法以及抛出的异常 -int successNum = loadResults.successNums();//成功的个数 -int failNum = loadResults.failNums();//失败的个数 -``` - -假若加载成功后,则此方法便可以直接在映射中直接用@开头作为使用@函数使用了~ - - - -# **注解形式映射** - -1.4版本之后我提供了两个可以使用在字段上的注解:`@MockValue` 和 `@MockArray` - -## @MockValue - -使用在类的字段上,参数: - -```java - /** - * 映射值,如果为空则视为无效 - */ - String value(); - - /* --- 1.6.0后增加 --- */ - - /** - * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值 - * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为 {@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。 - * @since 1.6.0 - */ - String param() default ""; - - /** - * 参数value的最终类型,在转化的时候会使用beanutils中的工具类 - {@link org.apache.commons.beanutils.ConvertUtils}进行类型转化, 默认为String类型。 - * @return - */ - Class valueType() default String.class; -``` - -也就是说,假设这个字段叫做:`field_A`,则映射结果大致相当于: - -```java -// 其中,${value()} 的最终结果值为通过ConvertUtils进行转化的结果。 -// 其中,[|${param()}]的存在与否取决于param()里有没有值 -xxxMap.put("${field_A}[|${param()}]", (${valueType()}) ${value()}) -``` - - - -用来指定此字段的映射值。例如: - -```java -public class User { - - // 相当于 ("name", "@cname") - @MockValue("@name") - private String name; - - // 相当于 ("age|20-40", 0) - @MockValue(value = "0", param = "20-40", valueType = Integer.class) - private Integer age; - - // 省略 getter & setter - -} -``` - - - - - -## @MockArray - -使用在类的字段上,参数: - -```java - /** - * 数组参数, 必填参数 - */ - String[] value(); - - /** - * 类型转化器实现类,需要存在无参构造 - * 默认转化为字符串,即默认不变 - */ - Class mapper() default ArrayMapperType.ToString.class; - - /* --- 1.6.0后增加 --- */ - - /** - * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值 - * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为 - {@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。 - * @since 1.6.0 - */ - String param() default ""; -``` - -其中,`mapper()`参数可选,其类型为`ArrayMapper`接口的的实现类,用于指定将字符串数组,也就是`value()`中的值进行转化的规则。此参数默认为不进行转化,即转化为字符串类型。 - -`ArrayMapper`接口中的抽象方法: - -```java - /** - * 给你一个数组长度,返回一个数组实例的function,用于数组的实例化获取 - * @return 数组实例获取函数,例如:Integer[]::new; 或者 size -> new Integer[size]; - */ - IntFunction getArrayParseFunction(); - - /** - * 将字符串转化为指定类型 - */ - T apply(String t); - -``` - -在对`ArrayMapper`接口进行实现的时候,请务必保留下无参构造用于对其进行实例化。 - - - -对于一些比较常见的类型转化,我提供了几个已经实现好的实现类。这些实现类以内部类的形式存在于`ArrayMapperType`接口中。 - -- `ArrayMapperType.ToString.class` - - 转化为字符串类型,即不进行转化 - -- `ArrayMapperType.ToInt.class` - - 转化为Integer类型 - -- `ArrayMapperType.ToLong.class` - - 转化为Long类型 - -- `ArrayMapperType.ToDouble.class` - - 转化为Double类型 - -例如: - -```java -public class User { - @MockArray(value = {"1", "2", "3"}, mapper = ArrayMapperType.ToInt.class) - private int age; - - // 省略 getter & setter -} -``` - - - - - -## **使用** - -使用也很简单,我在`Mock`中增加了4个方法,2个`set`方法 2个`reset`方法。 - -```java - /* --- 1.4版本之后增加 --- */ - - /** - * 通过注解来获取映射 - */ - public static void set(Class objClass); - - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void setWithOther(Class objClass, Map other); - - /** - * 通过注解来获取映射 - */ - public static void reset(Class objClass); - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void resetWithOther(Class objClass, Map other); -``` - - - -## **注意事项** - -### 注解优先级 - 
-假如你在同一个字段上同时使用了两个注解,则会优先使用`@MockValue`; - - -### 额外映射 - -可以发现,4个方法中各有一个方法需要提供额外参数,他会在注解映射创建完毕后进行添加,也就是假如额外参数和字段中有冲突的键,则额外参数的值将会覆盖注解映射值。 - - -# **映射扫描** - -1.6.0版本后,我更新了**映射扫描**与**映射代理**功能。感谢提出建议的朋友。[Issue#I1CCMT](https://gitee.com/ForteScarlet/Mock.java/issues/I1CCMT) - -在您使用注解形式映射的时候,是否有感觉到每个类都需要使用`Mock.set(...)`进行设置很麻烦?希望能够通过包扫描一键批量set?现在我增加了一个注解:`@MockBean`,将其标注在您的类上,此时再配合使用`Mock.scan(...)`方法即可扫描指定的一个或多个包路径中所有标注了`@MockBean`的javaBean。 - -对于`Mock.scan(...)`的方法定义如下: - -```java - /** - * 扫描包路径,加载标记了{@link com.forte.util.mapper.MockBean}注解的类。 - * - * @param classLoader nullable, 类加载器, null则默认为当前类加载器 - * @param withOther nullable, 假如扫描的类中存在某些类,你想要为它提供一些额外的参数,此函数用于获取对应class所需要添加的额外参数。可以为null - * @param reset 加载注解映射的时候是否使用reset - * @param packages emptyable, 要扫描的包路径列表, 为空则直接返回空set - * @return 扫描并加载成功的类 - */ - public static Set> scan(ClassLoader classLoader, Function, Map> withOther, boolean reset, String... packages) throws Exception; - -``` - -这么多参数?先别怕,我先简单介绍下这些参数: - -- classLoader:包扫描使用的类加载器。**可以为null。** -- withOther:一个Function函数,这个参数接收一个`Class`参数,返回一个`Map`结果,即获取一个对应类的额外参数。类似于注解映射中set方法的额外映射。**可以为null。** -- reset:即如果扫描到了已经被添加的映射,是否覆盖。 -- packages:需要扫描的包路径列表。 - -除了这个方法,我还提供了一些重载方法: - -```java - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(Function, Map> withOther, boolean reset, String... packages) throws Exception; - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(boolean reset, String... packages) throws Exception; - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法, reset默认为false - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(String... packages) throws Exception; -``` - - - -## **使用** - -所以一般情况下,你可以直接这么使用: - -```java -// 扫描两个包 -Mock.scan("forte.test2.beans", "forte.test1.beans", ...); -// 然后直接获取 -Mock.get(Xxxx.class); -// 使用 -``` - -# **映射代理** - -1.6.0版本后,我更新了**映射扫描**与**映射代理**功能。感谢提出建议的朋友。[Issue#I1CCMT](https://gitee.com/ForteScarlet/Mock.java/issues/I1CCMT) - -首先看一下Issue上提出的模拟场景: - -```java -// interface -public interface ServiceA{ - VoA methodA(); -} -// bean, can with @MockBean -public class VoA{ - @MockValue("@cname") - private String p1; -} -``` - - - -此时,接口中的`methodA()`方法的返回值`VoA`恰好是一个MockBean,这时候,我想要得到`ServiceA`的一个代理对象,使其能够通过`methodA()`得到`VoA`的实例对象。 - -ok,因此我添加了方法`Mock.proxy(...)`及其重载。方法定义如下: - -```java - /** - *
 为一个接口提供一个代理对象。此接口中,所有的抽象方法都会被扫描,假如它的返回值存在于Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * @param type    要代理的接口类型。
-     * @param factory 接口代理处理器的获取工厂。可自行实现。
-     * @param <T>  接口类型
-     * @return 代理结果
-     */
-    public static <T> T proxy(Class<T> type, MockProxyHandlerFactory factory);
-```
-
-此方法传入一个接口类型`Class type` 和一个动态代理处理器`MockProxyHandlerFactory factory`,来获取一个代理对象。
-
-`MockProxyHandlerFactory`是一个接口类型,只存在一个抽象方法:
-
-```java
-    /**
-     * 获取代理处理接口{@link InvocationHandler}实例
-     * @param mockObjectFunction 传入一个类型和一个可能为null的name字符串,获取一个mockObject对象。如果存在name,则会尝试先用name获取
-     * @return JDK动态代理所需要的代理处理器实例。
-     * @see InvocationHandler
-     */
-    InvocationHandler getHandler(BiFunction, String, MockObject> mockObjectFunction);
-
-```
-
-此接口定义了如何创建一个代理类。`InvocationHandler`是JDK为动态代理的创建所提供的一个接口,知道动态代理的人对他应该不会很陌生。
-
-但是如果不熟悉也没关系,我在内部提供了一个默认的实现`MockProxyHandlerFactoryImpl`,同时也为`Mock.proxy(...)`提供了一个重载方法:
-
-```java
-    /**
-     * 
 为一个接口提供一个代理对象。此接口中,所有的抽象方法都会被扫描,假如它的返回值存在于Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * 
 使用默认的接口代理处理器工厂{@link MockProxyHandlerFactoryImpl}。
-     * 
 默认处理工厂中,代理接口时,被代理的方法需要:
-     * 
 不是default方法。default方法会根据其自己的逻辑执行。
-     * 
 没有参数
-     * 
 没有标注{@code @MockProxy(ignore=true) ignore=true的时候代表忽略}
-     * 
-     * @see MockProxyHandlerFactoryImpl
-     * @param type    要代理的接口类型。
-     * @return 接口代理
-     */
-    public static <T> T proxy(Class<T> type);
-```
-
-因此,一般情况下,你可以直接这么使用:
-
-```java
-MockTestInterface proxy = Mock.proxy(MockTestInterface.class);
-```
-
-让我们来创建一个示例接口来看看:
-
-```java
-public interface MockTestInterface {
-
-    /**
-     * 指定List的泛型类型为User类型,长度为2-4之间
-     */
-    @MockProxy(size = {2, 4}, genericType = User.class)
-    List<User> list();
-
-    /**
-     * 默认情况下,长度为1
-     */
-    Teacher[] array();
-
-    /**
-     * 直接获取,不用注解。
-     */
-    Admin admin();
-    
-    /**
-     * 获取name为{@code mock_map}的mockObject, 基本上返回值都是Map类型。
-     */
-    @MockProxy(name = "mock_map")
-    Map map();
-
-    /**
-     * 忽略, 返回值会默认为null
-     * @return null
-     */
-    @MockProxy(ignore = true)
-    Admin justNullAdmin();
-}
-```
-
-
-
-可以看到,上面的这些抽象方法中,有一部分方法标注了注解`@MockProxy`。此注解参数如下:
-
-```java
-
-    /**
-     * 
 是否忽略此方法。如果为是,则接口的最终代理结果为返回一个null。
-     * 
 当然,如果获取不到对应的Mock类型,无论是否忽略都会返回null或者默认值。
-     * 
 如果是基础数据类型相关,数字类型,返回{@code 0 或 0.0}。
-     * 
 如果是基础数据类型相关,char类型,返回{@code ' '}。
-     * 
 如果是基础数据类型相关,boolean类型,返回{@code false}。
-     */
-    boolean ignore() default false;
-
-    /**
-     * 如果此参数存在值,则会优先尝试通过name获取MockObject对象。一般应用在返回值为Map类型的时候。
-     */
-    String name() default "";
-
-    /**
-     * 
 当接口返回值为数组或者集合的时候,此方法标记其返回值数量大小区间{@link [min, max], 即 max >= size >= min}。是数学上的闭区间。
-     * 
 如果此参数长度为0,则返回值为1。
-     * 
 如果参数长度为1,则相当于不是随机长度。
-     * 
 如果参数长度大于2,只取前两位。
-     */
-    int[] size() default {1,1};
-
-    /**
-     * 
 指定返回值类型,三种可能类型:list类型,array类型,Object其他任意类型。默认值为Unknown类型。当为Unknown类型的时候,会根据返回值类型自动判断。
-     * 
 当类型为list与array类型的时候,需要通过{@link #genericType()}方法指定泛型的类型,获取mock类型的时候将会通过此方法得到的类型来获取。
-     */
-    MockProxyType proxyType() default MockProxyType.UNKNOWN;
-
-
-    /**
-     * 假如类型为List或者数组类型,此处代表泛型的实际类型。
-     */
-    Class genericType() default Object.class;
-
-
-```
-
-
-
-简单汇总一下此注解的参数:
-
-- ignore:忽略这个方法。
-
-- name:指定mockObject的name。一般在返回值为Map类型的时候使用。
-
-- size:指定大小区间。只有在返回值为array或者List类型的时候才有用。
-
-- proxyType:指定返回值类型。一般情况下可以自动推断。
-
-- genericType:当返回值为List类型的时候,此参数指定他的泛型类型。array类型可以进行推断。
-
-  
-
-`@MockProxy`并不是必须的,但也不总是可以完全省略的,需要根据实际情况决定。一般来讲,如果返回值是一个任意的bean类型,则基本上可以省略;数组类型如果没有长度要求的话也可以省略;但是例如List、Map类型基本上是需要配置部分参数的。
-
-
-
-## **使用**
-
-接着上面的`MockTestInterface`,在使用的时候就是这个样子的:
-
-```java
-public static void main(String[] args) throws Exception {
-        // 扫描包
-        Mock.scan("com.test");
-        // 注册一个map类型的mockbean
-        Mock.set("mock_map", new HashMap(){{
-            put("name_map", "@cname");
-        }});
-
-        // 获取接口代理
-        final MockTestInterface proxy = Mock.proxy(MockTestInterface.class);
-
-        // 输出测试
-        System.out.println("proxy.admin(): " + proxy.admin());
-        System.out.println("proxy.justNullAdmin(): " + proxy.justNullAdmin());
-        System.out.println("proxy.list(): " + proxy.list());
-        System.out.println("proxy.array(): " + Arrays.toString(proxy.array()));
-        System.out.println("proxy.map(): " + proxy.map());
-
-}
-```
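
顺带一提,如果默认的`MockProxyHandlerFactoryImpl`不能满足需求,也可以自行实现`MockProxyHandlerFactory`,再通过`Mock.proxy(type, factory)`传入。下面是一个极简的示意:只按返回值类型获取`MockObject`,省略了default方法与`@MockProxy`注解的处理;原文签名中的泛型信息已丢失,这里按推测补全,请以实际源码为准。

```java
import com.forte.util.factory.MockProxyHandlerFactory;
import com.forte.util.mockbean.MockObject;

import java.lang.reflect.InvocationHandler;
import java.util.function.BiFunction;

public class SimpleMockProxyHandlerFactory implements MockProxyHandlerFactory {

    @Override
    public InvocationHandler getHandler(BiFunction<Class<?>, String, MockObject<?>> mockObjectFunction) {
        // 只根据方法返回值类型获取MockObject;获取不到时直接返回null
        return (proxy, method, args) -> {
            MockObject<?> mockObject = mockObjectFunction.apply(method.getReturnType(), null);
            return mockObject == null ? null : mockObject.getOne();
        };
    }
}
```

使用时与默认工厂类似:`MockTestInterface proxy = Mock.proxy(MockTestInterface.class, new SimpleMockProxyHandlerFactory());`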
-
-# #函数
-(v1.7.0更新,参阅wiki `8_#function&class parse`;简单示例可参考下文 v1.7.0 更新日志)
-
-
-
-# List区间(v1.7.0)
-(v1.7.0更新,参阅wiki `9_List interval parameter(v1.7.0)`;简单示例可参考下文 v1.7.0 更新日志)
-
-
-
-## **注意事项**
-
-- 尽可能别用泛型;如果返回值必须是List等泛型类型,请通过`@MockProxy`的`genericType`参数显式指定泛型的实际类型。
-
-- 每次调用`Mock.proxy(...)`都会重新生成一个动态代理对象,会影响性能,所以请尽可能将代理结果保存为单例复用,如下方示意。
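
一个简单的缓存示意(沿用上文的`MockTestInterface`,仅为示意):

```java
public final class MockProxyHolder {

    /** 类加载时只代理一次,之后全程复用同一个代理实例 */
    public static final MockTestInterface MOCK_TEST = Mock.proxy(MockTestInterface.class);

    private MockProxyHolder() {
    }
}
```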
-
-
-
-
-
-# **使用依赖列表**
-```xml
-<dependency>
-   <groupId>commons-beanutils</groupId>
-   <artifactId>commons-beanutils</artifactId>
-   <version>1.9.3</version>
-</dependency>
-```
-
-
-## 更新公告
-
-### **v1.9.2(2020/10/24)**
-- 追加一个区间参数`const`, 当区间参数为 `const`的时候,将不会对value值进行解析,而是直接原样赋值。
-`const`区间参数常用于对`Map`类型的`MockObject`的某些字段指定默认值。
-例如:
-```java
-        Map map = new HashMap<>();
-        // 区间参数为 'const'
-        map.put("dataLevel|const", new int[]{51, 52, 53});
-        Mock.set("usermap", map);
-        MockObject mockObject = Mock.get("usermap");
-        Map one = mockObject.getOne();
-        
-        // 输出一个 int[] 的toString,而不是一个随机的int。
-        System.out.println(one.get("dataLevel"));   
-```
-
-### **v1.9.1(2020/07/30)**
-- 修复`Mock.set(...)`使用Map类型的时候会报空指针的问题
-
-- MockUtil中追加方法`string(...)`与`stringUpper(...)`以获取纯大/小写的随机字符串([pull request:2](https://gitee.com/ForteScarlet/Mock.java/pulls/2))
-
-
-### **v1.9.0(2020/07/30)**
-- 优化实例的获取效率(尤其是以获取中文内容为主的时候)
-效率提升约3~4倍。
-测试参考:100w条实例获取,getList(100w)约16s, getListParallel(100w)约7s
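
一个简单的对比示意(假设`User`的映射已按前文方式记录;`getListParallel`即上文提到的并行方法,泛型写法为推测,请以实际API为准):

```java
MockObject<User> mockUser = Mock.get(User.class);

long start = System.currentTimeMillis();
List<User> serial = mockUser.getList(1_000_000);            // 串行创建
long mid = System.currentTimeMillis();
List<User> parallel = mockUser.getListParallel(1_000_000);  // 并行创建
long end = System.currentTimeMillis();

System.out.println("getList: " + (mid - start) + "ms");
System.out.println("getListParallel: " + (end - mid) + "ms");
```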
-
-
-### **v1.8.0(2020/07/13)**
-- 增加对某个类的父类字段的处理 
-fix [#I1NLT4](https://gitee.com/ForteScarlet/Mock.java/issues/I1NLT4)
-
-- 优化mock值的获取效率(大概提升了30倍),但是当值为字符串的时候,默认情况下不会再尝试将其视为JS脚本执行了。
-如果想要开启JS脚本尝试功能,请使用静态配置类`MockConfiguration.setEnableJsScriptEngine(true)`
-fix [#I1NLWA](https://gitee.com/ForteScarlet/Mock.java/issues/I1NLWA)
-
-
-`Mock`中增加三个`setAndGet(...)`方法,顾名思义,即整合先set后get的流程。
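
结合本次更新的一个简单示意(`User`沿用前文示例中的javaBean,`setAndGet`的具体重载以源码为准):

```java
// 默认不再把字符串值当作JS脚本尝试执行;如确有需要,可手动开启
MockConfiguration.setEnableJsScriptEngine(true);

Map<String, Object> map = new HashMap<>();
map.put("name", "@cname");
map.put("age|18-60", 0);

// setAndGet:先set再get,一步拿到MockObject
MockObject<User> mockObject = Mock.setAndGet(User.class, map);
System.out.println(mockObject.getOne());
```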
-
-
-### **v1.7.1(2020/04/11)**
-修复由于我的疏忽,导致在使用注解创建映射而映射类中存在没有注解的字段的时候,会出现空指针异常的问题。
-fix gitee issue [#I1E46D](https://gitee.com/ForteScarlet/Mock.java/issues/I1E46D)
-
-
-### **v1.7.0(2020/04/01)**
-- 增加一个“#函数”,其参数为已经添加进Mock中的映射名称,用于引用该映射。例如: 
-```java
-        Map map = new HashMap<>();
-        map.put("name", "@name");
-        // set 'name_map'
-        Mock.set("name_map", map);
-
-        Map userMap = new HashMap<>();
-        userMap.put("mapList", "#name_map");
-
-        //set 'user_map'
-        Mock.set("user_map", userMap);
-        MockObject userMapMock = Mock.get("user_map");
-
-        // show 
-        System.out.println(userMapMock.getOne());
-```
-- 增加对于`Map`中,value类型为`Class`类型的参数的解析。类似于上述的“#函数”,只不过参数不是`'#xxx'`而是一个指定的`Class`对象。
-例如:
-```java
-        Map map = new HashMap<>();
-        map.put("name", "@name");
-        map.put("user", User.class);           // will get User from Mock
-        map.put("userList|1-2.0", User.class); // will get User list from Mock
-```
-- 增加针对于`List`类型参数的规则修改。
-以前版本,假如一个字段叫做`users`, 是一个List类型,此时,我填入的参数为:
-```java
-map.put("users|1-2.3-4", otherMap);
-```
-这个时候,区间参数中的`3-4`将会被忽略不计。
-当前版本,由于加入了`Class解析`与`#函数`,使得各种不同类之间的嵌套成为可能(甚至是自己嵌套自己)
-此时,我修改了`List`类型的字段的区间参数规定,以上述例子为例,现在`3-4`并不会被忽略了,List结果的最终输出长度,会从`1-2`这个区间和`3-4`这个区间中随机选择一个。
-也就是说,假如你写了:
-```java
-map.put("users|1000.0", User.class);
-```
-那么最终的结果里,`users`字段的长度,要么是1000,要么是0。
-而如果你写了:
-```java
-map.put("users|5-10.200-300", User.class);
-```
-那么最终的结果里,`users`字段的长度,要么在5-10之间,要么在200-300之间。
-
-
-### **v1.6.0(2020/3/25)**
-
-**增加功能:映射扫描、映射代理**
-
-删除某些无用代码
-删除文档开始的一些废话
-简单改善部分代码
-增加一个注解`@MockBean`, 使用在类上,当使用包扫描功能的时候,只会扫描被标记了此注解的类。配合两个注解映射使用。
-注解`@MockValue`增加参数:`String param() default ""`、`Class valueType() default String.class`
-注解`@MockArray`增加参数:`String param() default ""`
-`Mock`中增加了一个方法`scan(Function, Map> withOther, boolean reset, String... packages)`以及部分重载方法,用来根据`@MockBean`注解来进行扫描与批量的注解映射注册。返回值为加载后的class列表
-
-增加一个注解`@MockProxy` 标记接口代理中,一些需要特殊处理的抽象方法,例如返回值为Map或者忽略参数等。
-`Mock`中增加了方法`proxy(...)`及其重载,用于获取上述接口的代理对象。
-
-着手准备编写wiki
-删除helpDoc文件夹
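
针对上面提到的`scan(...)`重载,一个带`withOther`参数的示意(包名沿用前文示例,额外映射仅覆盖`User`的`age`字段;该方法声明了`throws Exception`,泛型写法为推测):

```java
Map<String, Object> userExtra = new HashMap<>();
userExtra.put("age|30-40", 0);

Set<Class<?>> loaded = Mock.scan(
        // withOther:为指定的类追加(或覆盖)难以用注解表达的映射,其余类返回null即可
        clazz -> clazz == User.class ? userExtra : null,
        // reset:已存在的映射允许覆盖
        true,
        "forte.test1.beans", "forte.test2.beans");

System.out.println("loaded: " + loaded);
```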
-
-### **v1.5.2(2020/2/22)**
-修复在使用ChineseUtil的时候会在控制台打印所有的姓氏的问题
-
-
-### **v1.5.1(2019.12.13)**
-修复自定义函数添加无效的bug
-
-
-### **v1.5.0(2019.12.5)**
-变更MockMapObject的获取值类型为Map(原本是Map)
-本质上依旧获取的是Map 类型。
-内部增加一些参数以适应扩展开发。
-
-
-### **v1.4.4(2019.12.4)**
-优化内部Random操作,现在理论上Random相关操作的效率会高一些了。
-
-
-### **v1.4.3(2019.10.18)**
-
-在上一个版本的基础上又为Map类型的参数与Object类型参数增加了List值返回。
-现在可以更好的支持例如fastJson等Json工具了。
-(测试量较少,可能存在些许bug)
-
-
-
-
-### **v1.4.2(2019.10.18)**
-修复机制:当生成Map类型的值的时候,假如字段映射类似于这种格式:
-```json
-{
-    "a": {
-        "array|5-10":"@name"
-    }
-}
-```
-更新之前,`array`字段将会忽略区间参数`5-10`直接根据`@name`生成随机姓名,更新后则将会生成长度在5-10之间的list集合。
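
上面的JSON结构用Java的Map写法大致等价于下面的示意(映射名`demo_map`为随意取的名字):

```java
Map<String, Object> inner = new HashMap<>();
inner.put("array|5-10", "@name");

Map<String, Object> map = new HashMap<>();
map.put("a", inner);

Mock.set("demo_map", map);
// 更新后,结果中 a 对应的 array 将是长度在5-10之间的姓名列表
System.out.println(Mock.get("demo_map").getOne());
```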
-引申思考:如果参数不是字符串而是数字,如何判断是获取数组还是获取区间内的随机数字?
-于是在更新计划中追加一项以在特殊情况下指定生成的字段的数据类型
-
-
-修复当参数为List类型的时候,作为参数List没有被copy的bug。
-
-距离功能写完到现在,我意识到了曾经的自己十分喜欢使用异步并行流遍历,然而这不一定是效率最高的,
-所以在内部我修改了两处我看到的并行流遍历代码并修改为了单线程。
-
-
-
-
-### **v1.4(2019.8.26)**
-
-提供两个注解以实现注解形式的映射。(测试量较少不知道有没有bug)
-
-
-
-### **v1.3(2019.7.12)**
-
-优化`MockObject`接口内部结构,增加大量`parallel`(并行流)方法与`collect`方法。
-
-`parallel`相关方法将会在您创建对象的时候使用并行线程进行创建,当您需求的数据量较大的时候,此系列方法将会相对于原本的串行方法更加有效率。
-
-`collect`相关方法将会提供大量参照于`Stream`中的`collect`方法而制定的方法。假如您对于`Stream`中的`collect`方法十分的熟悉,那么此系列的方法将会使得您可以更加灵活的对数据列表进行操作。
-
-在接口结构更新后,接口中所有的方法全部基于`get()`方法而不需要其他实现。
-
-(虽然用户一般也不需要实现此接口。)
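
一个简单的转化示意(假设`User`的映射已通过`Mock.set(...)`记录,且存在`getName()`方法;泛型写法为推测):

```java
MockObject<User> mockUser = Mock.get(User.class);

// 获取10个结果,并按给定规则转化为姓名列表
List<String> names = mockUser.getList(10, User::getName);

// 获取10个结果,转化后放入Set去重
Set<String> nameSet = mockUser.getSet(10, User::getName);

System.out.println(names);
System.out.println(nameSet);
```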
-
-### **v1.2 (2019.2.27)**
-
-※ 与上一版本不兼容点:将MockObject类变更为接口类型
-
-支持直接获取Map类型,不再必须指定一个javaBean了
-
-@函数中的字符串参数可以支持中文了,但是中文请务必放在单引号或双引号之中才可以生效
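
例如(与前文的email示例一致):

```java
map.put("email1", "@email('这是中文')");      // 中文参数必须放在引号中
map.put("email2", "@email(this is english)"); // 英文参数不受限制
```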
-
-### **v1.1  (2019.1.4)**
-
-支持自定义方法的导入
-
-### **v1.0  (2018.12.20)**
-
-更新README.md文档。
-
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/old_readme.md b/docs/raw-materials/backup/qabox-alt/api-mock/old_readme.md
deleted file mode 100644
index 5fad09b..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/old_readme.md
+++ /dev/null
@@ -1,966 +0,0 @@
-# Mock.java使用说明手册
-
-
-
-## 简介
-
-这是一个仿照Mock.js语法的Java语言使用的假数据生成工具框架。
-部分方法与类介绍详细可查看JavaDoc文档(推荐先下载下来再看):[JavaDoc文档](helpDoc/index.html)
-
-码云生成的在线javaDoc文档:[在线文档](https://apidoc.gitee.com/ForteScarlet/Mock.java/)
-
-如果存在BUG或者有什么意见、建议,可以通过邮箱`ForteScarlet@163.com`进行反馈或者联系QQ`1149159218`.
-(记得注明身份哦~)
-
-github: [github](https://github.com/ForteScarlet/Mock.java)
-
-gitee : [码云地址](https://gitee.com/ForteScarlet/Mock.java)
-
-此框架中不仅仅只可以作为假数据获取用,还有一些比较实用的工具类可以拿来单独使用。
-
-*工具类介绍:工具类介绍
-
-当前版本:[![maven](https://img.shields.io/maven-central/v/io.gitee.ForteScarlet/mock.java)](https://search.maven.org/artifact/io.gitee.ForteScarlet/mock.java)
-
-最低JDK版本:JDK8
-
-※ 版本更新内容与预期更新计划详见于文档末尾 : 更新公告
-
-# **WIKI**
-
-文档将会开始转移至WIKI处,转移完成后,此处README中的说明性文档将不再更新并择日删除,替换为简单的介绍与demo示例。
-
-**wiki文档:[github wiki](https://github.com/ForteScarlet/Mock.java/wiki) or [gitee wiki](https://gitee.com/ForteScarlet/Mock.java/wikis/pages)**
-
-## 注意
-未来如果推出2.x版本,将会使用与1.x版本不同的包路径;迭代版本时请注意修改包路径,除此之外其余内容不变。
-
-
- -## 友情链接 -|项目名称|项目介绍|项目地址| -|---|---|---| -|Mock.JDBC|基于Mock.java与JDBC向数据库插入假数据(暂时停工)|https://github.com/ForteScarlet/Mock.JDBC| - - -
- -## 使用方法 -### **Maven** - -在maven项目下,从pom.xml中导入以下地址: -> 最新版本以maven仓库中的地址为准。仓库地址:`https://mvnrepository.com/artifact/io.gitee.ForteScarlet/mock.java` - -```xml - - io.gitee.ForteScarlet - mock.java - ${version} - -``` - -### **Gradle** - -```gradle -compile group: 'io.gitee.ForteScarlet', name: 'mock.java', version: '${version}' -``` - -### **Jar** - -使用jar包导入的时候,记得同时把作为依赖的`commons-beanutils.commons-beanutils:1.9.3`中的jar包也导入进去。我上传了这些依赖在 [dependencies文件夹](./dependencies) 中。 - - - - - -### **使用** - -相信使用过Mock.js的各位大佬应该知道,在使用Mock.js的时候是用的JSON格式的参数。 -但是,Java可是没法直接识别JSON的啊! -所以,我们采用最接近JSON格式的方式:**Map集合**。 - -简单来说,就是将一个类的字段根据Mock.js那样的key-value的键值对转化为一个Map对象就好了!我习惯将这种Map对象称为 *字段映射表* 。 - -而且作为Java语言,数据类型是必须要多加考虑的问题。我在获取值的时候已经尽可能的增加了容错率,但是还是需要您注意数据类型的问题,请尽可能不要犯下将一个字符串赋值给整数这类难以防范的错误.. - -或许感觉上比JSON格式的使用要麻烦一些,但是这也是没有办法的事情嘛!假如您有更好的代替方式,希望您能告诉我 :) - - -### 设置字段映射的方式: - -#### 1·创建对象字段与随机值语法的映射关系(Map 类型的键值对) - -创建的这个Map,Key值代表了映射的字段名,value值代表了映射语法 -由于这毕竟与弱引用类型语言不同,所以在设置映射的时候请务必注意字段的数据类型。 - -​ `Map map = new HashMap<>();` - -#### 2·添加字段映射 - -字段映射中,value值所用到的 @函数 可以从 [JavaDoc文档](https://apidoc.gitee.com/ForteScarlet/Mock.java/) 中查阅 [**MockUtil**](https://apidoc.gitee.com/ForteScarlet/Mock.java/com/forte/util/utils/MockUtil.html) 类中的方法,MockUtil中的全部方法均可作为 @函数 出现在value值中。 - -> **再次提醒,请务必注意对应好字段的字段类型** - -```java -map.put("age","@age"); -map.put("list|2-3","@title"); -map.put("user","@name"); -...... -``` - -key值中,有三种写法: - -仅有字段映射、字段映射与整数部分区间参数、字段映射、整数部分区间参数与小数部分区间参数。 - -例如如下这么两个字段映射: - -```java -map.put("money1|10-40.2-4" , 0); -map.put("money2|10-40.2" , 0); -``` - -其中,字段名与区间参数之间的分割符为 **|** 符号,左边为字段名,右半边为区间参数。 - -区间参数中,整数部分与小数部分用 **.** 符号分割,左半边为整数部分区间参数,右半边为小数部分区间参数。 - -- ##### 仅有字段映射 - - 任务分配器首先会根据参数(value)的类型分配字段解析器,然后再根据字段类型进行取值。 - - 参数类型有一下几种情况: - - - **字符串类型**:如果存在一个或多个@函数,解析@函数并取值,(如果有多个@函数则会尝试对@函数的取值进行加法计算);如果不存在@函数或@函数不存在于MockUtil中的方法列表则将其会视为普通字符串。 - - **整数(Integer)、浮点数(Double)类型**:如果参数为Integer或Double类型,则字段值获取器会直接将此值作为默认值赋给字段。 - - **数组或集合**:如果参数是数组或集合类型,字段值获取器会从其中随机获取一个值赋予字段。 - - **Map集合**:如果参数是Map集合类型,则会对字段的类型进行判断,如果: - - 字段为Map类型,则直接将此Map作为字段值赋予字段,不做处理。 - - 字段为List类型,则字段值获取器会将将此Map集合封装至List集合中并返回。 - - 字段为其他任意类型,则任务分配器会将此Map视为此字段类型的字段映射集合进行解析并获取一个实例对象为字段赋值。**(※注:此字段映射同样会被Mock的映射集合记录下来。即嵌套的字段映射不需要单独再进行set了。)** - - **其他任意类型**:如果参数不是上面的任何类型,则字段值获取器会将此参数原样赋值,不做处理。 - - ```java - //假设以下字段映射的是User类 - Map map = new HashMap<>(); - map.put("age1" , "@age"); - map.put("age2" , 15); - map.put("age3" , new Integer(){1,2,3,4}); - map.put("name1" , "@name"); - map.put("name2" , "@title(2)"); - map.put("name3" , "这是一个名字"); - - //下面三个email字段的参数,如果是中文,必须放在单引号或双引号中才会生效,英文不受限制 - map.put("email1" , "@email('这是中文')"); - map.put("email2" , "@email('this is english')"); - map.put("email3" , "@email(this is english)"); - - //下面的friend字段的字段类型是一个Friend类,friendMap是对friend字段的映射,也就是嵌套映射 - //此friendMap的映射无需单独进行记录 - map.put("friend" , friendMap); - - //记录映射 - Mock.set(User.class, map); - //User类的映射被直接记录,可以获取 - MockObject userMockObject = Mock.get(User.class); - //Friend类的映射以嵌套的形式被记录过了,可以直接获取 - MockObject friendMockObject = Mock.get(Friend.class); - ``` - - - ##### 字段映射与仅整数部分的区间参数 - - 参数类型有一下几种情况: - > - - **字符串类型**:如果存在一个或多个@函数,解析@函数并取值,(如果有多个@函数则会尝试对@函数的取值进行加法计算);在存在@函数的情况下,区间参数将被忽略。 - - **※ 从`v1.4.2与v1.4.3`之后,当字段类型为Object类型(常见于创建Map类型对象)则会根据区间函数创建范围内大小的List集合。(详细见`v1.4.2与v1.4.3`更新日志)** - - - - 如果不存在@函数或@函数不存在于MockUtil中的方法列表则将其会视为普通字符串,然后根据整数参数区间获取一个随机数值并对此字符串进行重复输出。 - - - **整数(Integer)、浮点数(Double)类型**:如果参数为Integer或Double类型,则字段值获取器将区间参数作为方法参数,根据字段的类型使用随机函数获取对应的随机值。 - - 例如: - - ```java - 
//age为一个Integer类型的字段,等同于使用了@integer(2,4)函数 - map.put("age|2-4" , 0); - //money为一个Double类型,等同于使用了@doubles(2,4)函数 - map.put("money|2-4" , 0); - ``` - - 则age将会被赋予一个2-4之间的随机整数(Integer),money将会被赋予一个2-4之间最大小数位数为0的浮点数(Double)。 - - - **数组或集合**:如果参数是数组或集合类型,任务分配器会判断字段的类型分配字段值获取器: - - - 字段类型为整数或浮点数,则区间参数将会被忽略,直接从参数中获取一个随机元素并赋值。 - - *如下所示三种情况的取值是完全相同的:* - - ```java - //age为一个Integer类型的字段 - map.put("age|2-4" , new Integer[]{1,2,3}); - map.put("age|2" , new Integer[]{1,2,3}); - map.put("age" , new Integer[]{1,2,3}); - ``` - - - 字段类型为数组或集合的时候,会根据区间参数获取一个随机数量,并从传入的参数中获取此数量的随机元素。 - - - **Map集合**:如果参数是Map集合类型,则任务分配器会对字段的类型进行判断,如果: - - - 字段为Map类型,则直接将此Map作为字段值赋予字段且忽略区间参数,不做处理。 - - - 字段为List类型,则字段值获取器会将将此Map集合封装至List集合并根据区间参数重复一个随即数量并返回。 - - - 字段为List 类型,即一个任意泛型的List类型的时候,任务分配器会将此Map视为此泛型类型的字段映射集合进行解析,再根据区间参数获取指定范围内数量的实例对象,并封装为List类型为字段赋值。**(※注:上文提到过,内嵌字段映射同样会被记录。)** - - - 字段为其他任意类型,则任务分配器会将此Map视为此字段类型的字段映射集合进行解析并获取一个实例对象为字段赋值,忽略区间参数。**(※注:上文提到过,内嵌字段映射同样会被记录。)** - - - - ※ 在`v1.4.3`版本之后,存在整数区间的Map映射机制存在改动(主要为结果对象为Map类型的时候)详见`v1.4.3`版本日志。 - - - **其他任意类型**: - - 如果字段是list类型或数组类型,则会根区间参数重复输出并为字段赋值。 - - 如果字段类型为其他未知类型,则会忽略区间参数并使用参数值作为默认值赋值。 - -```java - map.put("list|2-6" , "@title"); - map.put("age|10-40" , 2); -``` - - - 字段映射、整数区间参数和小数区间参数 - - 参数类型有一下几种情况: - - - **字符串类型**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **整数(Integer)、浮点数(Double)类型**:如果字段类型为整数,则会无视小数部分区间参数,与整数区间参数时的情况相同( 字段映射与仅整数部分的区间参数 )。 - - 如果字段类型为小数,则会根据区间参数获取一个指定区间内的随机小数,例如: - - ```java - // money为一个Double类型的字段,此映射等同于使用了@doubles(2,4,2,4)函数 - map.put("money|2-4.2-4" , 0); - // money为一个Double类型的字段,此映射等同于使用了@doubles(2,4,2)函数 - map.put("money|2-4.2" , 0); - ``` - - - **数组或集合**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **Map集合**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - **其他任意类型**:同仅整数区间参数时的情况。( 字段映射与仅整数部分的区间参数 )。 - - - - -#### 3·获取假字段封装对象 - -通过Mock的get方法获取一个已经添加过映射记录的数据 - -* 首先使用set方法记录类的字段映射 - - ```java - //映射表尽可能是Sting,Object类型的 - Map map = new HashMap<>(); - //添加一个映射 - map.put("age|10-40" , 2); - //记录类的映射 - //1、使用javaBean封装 - Mock.set(User.class , map); - //2、或者直接使用Map类型,不再需要javaBean的class对象,但是需要指定一个映射名 - Mock.set("userMap", map); - ``` - - - * 然后使用get方法得到假对象封装类 - -```java -//已经记录过User类的映射,获取封装类 -//1、如果是使用的javaBean记录的,使用javaBean获取 -MockObject mockObject = Mock.get(User.class); -//2、或者你之前是使用map记录的,使用记录时保存的映射名获取 -//注:MockMapObject 对象实现了MockObject接口 -MockMapObject mockMapObject = Mock.get("userMap"); -``` - - * 根据MockObject中提供的API来获取你所需要的结果: - - - -```java -// 获取一个结果,并使用Optional类进行封装。 -Optional get(); -``` - -```java -// 获取一个结果 -T getOne(); -``` - -```java -// 获取指定数量的多个结果,返回List集合 -List getList(int num); -``` - - -```java -// 获取指定数量的多个结果,并根据给定规则进行转化,返回List集合 -List getList(int num , Function mapper); -``` - -```java -// 获取指定数量的多个结果,返回Set集合 -Set getSet(int num); -``` -```java -// 获取指定数量的多个结果,并根据给定规则进行转化,返回Set集合 -Set getSet(int num , Function mapper); -``` - - - ```java -// 获取指定数量的多个结果,并根据给定规则转化为Map集合 -Map getMap(int num , Function keyMapper, Function valueMapper); - ``` - - - - -​ -​ **※ 自1.3版本之后,我优化了`MockObject`接口内部结构,并增加了大量parallel方法与collect方法,您现在可以在1.3版本中更加灵活的对数据进行转化,或者根据数据量的需求自行决定是否需要使用并行线程进行对象创建。** - -## **自定义@函数** - -有时候,我提供的MockUtil中的方法可能无法满足您的需求,那么这时候,就需要一个可以对@函数进行扩展、加强的窗口。在v1.1版本中,我添加了这个功能。(这个功能测数量很少,可能会存在很多bug) - -### 1· 获取自定义@函数加载器 - -```java -//获取@函数加载器 -MethodLoader methodLoader = Mock.mockMethodLoader(); -``` - -函数加载器支持链式加载,也支持一次性加载 - -链式: - -```java -LoadResults loadResults = methodLoader - //添加指定类中的指定方法名的方法 - .append(Demo1.class, "testMethod") - //添加指定类中的多个指定方法名的方法 - .appendByNames(Demo2.class, new String[]{"method1" , "method2"}) - 
//添加指定类中的多个符合指定正则回则的方法 - .appendByRegex(Demo3.class, "[a-zA-Z]+") - //还有很多...敬请查阅API文档 - .load(); -``` - -> 使用链式加载的时候,请务必记住在结尾使用load()进行加载,否则方法集将无法被加载,而是一直留存在等待区。 - -非链式: - -```java -methodLoader.add(Demo1.class, "testMethod"); -``` - -通过以上代码可以发现,加载完成后都会有一个` LoadResults` 类作为返回值,这个类是在方法加载后的一个加载报告封装类,通过`LoadResults` 可以获取到刚刚加载的方法谁成功了,谁失败了,失败了的方法为什么失败等信息: - -```java - -Map> map = loadResults.loadResults();//加载的方法集根据成功与否分组 -Set successMethods = loadResults.loadSuccessResults();//加载成功的方法集 -Map whyFailMap = loadResults.whyFail();//加载失败的方法以及抛出的异常 -int successNum = loadResults.successNums();//成功的个数 -int failNum = loadResults.failNums();//失败的个数 -``` - -假若加载成功后,则此方法便可以直接在映射中直接用@开头作为使用@函数使用了~ - - - -# **注解形式映射** - -1.4版本之后我提供了两个可以使用在字段上的注解:`@MockValue` 和 `@MockArray` - -## @MockValue - -使用在类的字段上,参数: - -```java - /** - * 映射值,如果为空则视为无效 - */ - String value(); - - /* --- 1.6.0后增加 --- */ - - /** - * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值 - * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为 {@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。 - * @since 1.6.0 - */ - String param() default ""; - - /** - * 参数value的最终类型,在转化的时候会使用beanutils中的工具类 - {@link org.apache.commons.beanutils.ConvertUtils}进行类型转化, 默认为String类型。 - * @return - */ - Class valueType() default String.class; -``` - -也就是说,假设这个字段叫做:`field_A`,则映射结果大致相当于: - -```java -// 其中,${value()} 的最终结果值为通过ConvertUtils进行转化的结果。 -// 其中,[|${param()}]的存在与否取决于param()里有没有值 -xxxMap.put("${field_A}[|${param()}]", (${valueType()}) ${value()}) -``` - - - -用来指定此字段的映射值。例如: - -```java -public class User { - - // 相当于 ("name", "@cname") - @MockValue("@name") - private String name; - - // 相当于 ("age|20-40", 0) - @MockValue(value = "0", param = "20-40", valueType = Integer.class) - private Integer age; - - // 省略 getter & setter - -} -``` - - - - - -## @MockArray - -使用在类的字段上,参数: - -```java - /** - * 数组参数, 必填参数 - */ - String[] value(); - - /** - * 类型转化器实现类,需要存在无参构造 - * 默认转化为字符串,即默认不变 - */ - Class mapper() default ArrayMapperType.ToString.class; - - /* --- 1.6.0后增加 --- */ - - /** - * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值 - * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为 - {@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。 - * @since 1.6.0 - */ - String param() default ""; -``` - -其中,`mapper()`参数可选,其类型为`ArrayMapper`接口的的实现类,用于指定将字符串数组,也就是`value()`中的值进行转化的规则。此参数默认为不进行转化,即转化为字符串类型。 - -`ArrayMapper`接口中的抽象方法: - -```java - /** - * 给你一个数组长度,返回一个数组实例的function,用于数组的实例化获取 - * @return 数组实例获取函数,例如:Integer[]::new; 或者 size -> new Integer[size]; - */ - IntFunction getArrayParseFunction(); - - /** - * 将字符串转化为指定类型 - */ - T apply(String t); - -``` - -在对`ArrayMapper`接口进行实现的时候,请务必保留下无参构造用于对其进行实例化。 - - - -对于一些比较常见的类型转化,我提供了几个已经实现好的实现类。这些实现类以内部类的形式存在于`ArrayMapperType`接口中。 - -- `ArrayMapperType.ToString.class` - - 转化为字符串类型,即不进行转化 - -- `ArrayMapperType.ToInt.class` - - 转化为Integer类型 - -- `ArrayMapperType.ToLong.class` - - 转化为Long类型 - -- `ArrayMapperType.ToDouble.class` - - 转化为Double类型 - -例如: - -```java -public class User { - @MockArray(value = {"1", "2", "3"}, mapper = ArrayMapperType.ToInt.class) - private int age; - - // 省略 getter & setter -} -``` - - - - - -## **使用** - -使用也很简单,我在`Mock`中增加了4个方法,2个`set`方法 2个`reset`方法。 - -```java - /* --- 1.4版本之后增加 --- */ - - /** - * 通过注解来获取映射 - */ - public static void set(Class objClass); - - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void setWithOther(Class objClass, Map other); - - /** - * 通过注解来获取映射 - */ - public static void reset(Class objClass); - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void resetWithOther(Class objClass, Map other); 
-``` - - - -## **注意事项** - -### 注解优先级 - -假如你在同一个字段上同时使用了两个注解,则会优先使用`@MockValue`; - - -### 额外映射 - -可以发现,4个方法中各有一个方法需要提供额外参数,他会在注解映射创建完毕后进行添加,也就是假如额外参数和字段中有冲突的键,则额外参数的值将会覆盖注解映射值。 - - -# **映射扫描** - -1.6.0版本后,我更新了**映射扫描**与**映射代理**功能。感谢提出建议的朋友。[Issue#I1CCMT](https://gitee.com/ForteScarlet/Mock.java/issues/I1CCMT) - -在您使用注解形式映射的时候,是否有感觉到每个类都需要使用`Mock.set(...)`进行设置很麻烦?希望能够通过包扫描一键批量set?现在我增加了一个注解:`@MockBean`,将其标注在您的类上,此时再配合使用`Mock.scan(...)`方法即可扫描指定的一个或多个包路径中所有标注了`@MockBean`的javaBean。 - -对于`Mock.scan(...)`的方法定义如下: - -```java - /** - * 扫描包路径,加载标记了{@link com.forte.util.mapper.MockBean}注解的类。 - * - * @param classLoader nullable, 类加载器, null则默认为当前类加载器 - * @param withOther nullable, 假如扫描的类中存在某些类,你想要为它提供一些额外的参数,此函数用于获取对应class所需要添加的额外参数。可以为null - * @param reset 加载注解映射的时候是否使用reset - * @param packages emptyable, 要扫描的包路径列表, 为空则直接返回空set - * @return 扫描并加载成功的类 - */ - public static Set> scan(ClassLoader classLoader, Function, Map> withOther, boolean reset, String... packages) throws Exception; - -``` - -这么多参数?先别怕,我先简单介绍下这些参数: - -- classLoader:包扫描使用的类加载器。**可以为null。** -- withOther:一个Function函数,这个参数接收一个`Class`参数,返回一个`Map`结果,即获取一个对应类的额外参数。类似于注解映射中set方法的额外映射。**可以为null。** -- reset:即如果扫描到了已经被添加的映射,是否覆盖。 -- packages:需要扫描的包路径列表。 - -除了这个方法,我还提供了一些重载方法: - -```java - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(Function, Map> withOther, boolean reset, String... packages) throws Exception; - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(boolean reset, String... packages) throws Exception; - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法, reset默认为false - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(String... packages) throws Exception; -``` - - - -## **使用** - -所以一般情况下,你可以直接这么使用: - -```java -// 扫描两个包 -Mock.scan("forte.test2.beans", "forte.test1.beans", ...); -// 然后直接获取 -Mock.get(Xxxx.class); -// 使用 -``` - -# **映射代理** - -1.6.0版本后,我更新了**映射扫描**与**映射代理**功能。感谢提出建议的朋友。[Issue#I1CCMT](https://gitee.com/ForteScarlet/Mock.java/issues/I1CCMT) - -首先看一下Issue上提出的模拟场景: - -```java -// interface -public interface ServiceA{ - VoA methodA(); -} -// bean, can with @MockBean -public class VoA{ - @MockValue("@cname") - private String p1; -} -``` - - - -此时,接口中的`methodA()`方法的返回值`VoA`恰好是一个MockBean,这时候,我想要得到`ServiceA`的一个代理对象,使其能够通过`methodA()`得到`VoA`的实例对象。 - -ok,因此我添加了方法`Mock.proxy(...)`及其重载。方法定义如下: - -```java - /** - *
 为一个接口提供一个代理对象。此接口中,所有的抽象方法都会被扫描,假如它的返回值存在于Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * @param type    要代理的接口类型。
-     * @param factory 接口代理处理器的获取工厂。可自行实现。
-     * @param  接口类型
-     * @return 代理结果
-     */
-    public static  T proxy(Class type, MockProxyHandlerFactory factory);
-```
-
-此方法传入一个接口类型`Class type` 和一个动态代理处理器`MockProxyHandlerFactory factory`,来获取一个代理对象。
-
-`MockProxyHandlerFactory`是一个接口类型,只存在一个抽象方法:
-
-```java
-    /**
-     * 获取代理处理接口{@link InvocationHandler}实例
-     * @param mockObjectFunction 传入一个类型和一个可能为null的name字符串,获取一个mockObject对象。如果存在name,则会尝试先用name获取
-     * @return JDK动态代理所需要的代理处理器示例。
-     * @see InvocationHandler
-     */
-    InvocationHandler getHandler(BiFunction, String, MockObject> mockObjectFunction);
-
-```
-
-此接口定义了如何创建一个代理类。`InvocationHandler`是JDK为动态代理的创建所提供的一个接口,知道动态代理的人对他应该不会很陌生。
-
-但是如果不熟悉也没关系,我在内部提供了一个默认的实现`MockProxyHandlerFactoryImpl`,同时也为`Mock.proxy(...)`提供了一个重载方法:
-
-```java
-    /**
-     * 
 为一个接口提供一个代理对象。此接口中,所有的抽象方法都会被扫描,假如它的返回值存在于Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * 
 使用默认的接口代理处理器工厂{@link MockProxyHandlerFactoryImpl}。
-     * 
 默认处理工厂中,代理接口时,被代理的方法需要:
-     * 
 不是default方法。default方法会根据其自己的逻辑执行。
-     * 
 没有参数
-     * 
 没有标注{@code @MockProxy(ignore=true) ignore=true的时候代表忽略}
-     * 
-     * @see MockProxyHandlerFactoryImpl
-     * @param type    要代理的接口类型。
-     * @return 接口代理
-     */
-    public static  T proxy(Class type);
-```
-
-因此,一般情况下,你可以直接这么使用:
-
-```java
-MockTestInterface proxy = Mock.proxy(MockTestInterface.class);
-```
-
-让我们来创建一个示例接口来看看:
-
-```java
-public interface MockTestInterface {
-
-    /**
-     * 指定List的泛型类型为User类型,长度为2-4之间
-     */
-    @MockProxy(size = {2, 4}, genericType = User.class)
-    List list();
-
-    /**
-     * 默认情况下,长度为1
-     */
-    Teacher[] array();
-
-    /**
-     * 直接获取,不用注解。
-     */
-    Admin admin();
-    
-    /**
-     * 获取name为{@code mock_map}的mockObject, 基本上返回值都是Map类型。
-     */
-    @MockProxy(name = "mock_map")
-    Map map();
-
-    /**
-     * 忽略, 返回值会默认为null
-     * @return null
-     */
-    @MockProxy(ignore = true)
-    Admin justNullAdmin();
-}
-```
-
-
-
-可以看到,上面的这些抽象方法中,有一部分方法标注了注解`@MockProxy`。此注解参数如下:
-
-```java
-
-    /**
-     * 
 是否忽略此方法。如果为是,则接口的最终代理结果为返回一个null。
-     * 
 当然,如果获取不到对应的Mock类型,无论是否忽略都会返回null或者默认值。
-     * 
 如果是基础数据类型相关,数字类型,返回{@code 0 或 0.0}。
-     * 
 如果是基础数据类型相关,char类型,返回{@code ' '}。
-     * 
 如果是基础数据类型相关,boolean类型,返回{@code false}。
-     */
-    boolean ignore() default false;
-
-    /**
-     * 如果此参数存在值,则会优先尝试通过name获取MockObject对象。一般应用在返回值为Map类型的时候。
-     */
-    String name() default "";
-
-    /**
-     * 
 当接口返回值为数组或者集合的时候,此方法标记其返回值数量大小区间{@link [min, max], 即 max >= size >= min}。是数学上的闭区间。
-     * 
 如果此参数长度为0,则返回值为1。
-     * 
 如果参数长度为1,则相当于不是随机长度。
-     * 
 如果参数长度大于2,只取前两位。
-     */
-    int[] size() default {1,1};
-
-    /**
-     * 
 指定返回值类型,三种可能类型:list类型,array类型,Object其他任意类型。默认值为Unknown类型。当为Unknown类型的时候,会根据返回值类型自动判断。
-     * 
 当类型为list与array类型的时候,需要通过{@link #genericType()}方法指定泛型的类型,获取mock类型的时候将会通过此方法得到的类型来获取。
-     */
-    MockProxyType proxyType() default MockProxyType.UNKNOWN;
-
-
-    /**
-     * 假如类型为List或者数组类型,此处代表泛型的实际类型。
-     */
-    Class genericType() default Object.class;
-
-
-```
-
-
-
-简单汇总一下此注解的参数:
-
-- ignore:忽略这个方法。
-
-- name:指定mockObject的name。一般在返回值为Map类型的时候使用。
-
-- size:指定大小区间。只有在返回值为array或者List类型的时候才有用。
-
-- proxyType:指定返回值类型。一般情况下可以自动推断。
-
-- genericType:当返回值为List类型的时候,此参数指定他的泛型类型。array类型可以进行推断。
-
-  
-
-`@MockProxy`并不是必须的,但也不是可能完全省略的,需要根据实际情况来。一般来讲,如果返回值是一个任意的bean类型,则基本上可以省略,而数组类型如果没有长度要求的话也可以省略,但是例如list、map类型基本上是需要配置部分参数的。
-
-
-
-## **使用**
-
-接着上面的`MockTestInterface`,在使用的时候就是这个样子的:
-
-```java
-public static void main(String[] args) throws Exception {
-        // 扫描包
-        Mock.scan("com.test");
-        // 注册一个map类型的mockbean
-        Mock.set("mock_map", new HashMap(){{
-            put("name_map", "@cname");
-        }});
-
-        // 获取接口代理
-        final MockTestInterface proxy = Mock.proxy(MockTestInterface.class);
-
-        // 输出测试
-        System.out.println("proxy.admin(): " + proxy.admin());
-        System.out.println("proxy.justNullAdmin(): " + proxy.justNullAdmin());
-        System.out.println("proxy.list(): " + proxy.list());
-        System.out.println("proxy.array(): " + Arrays.toString(proxy.array()));
-        System.out.println("proxy.map(): " + proxy.map());
-
-}
-```
-
-
-
-
-
-## **注意事项**
-
-- 尽可能别用泛型。
-
-- 每次使用`Mock.proxy(...)`都会去生成动态代理对象,会影响性能,所以尽可能保存成一个单例使用。
-
-
-
-
-
-# **使用依赖列表**
-```xml
-<dependency>
-   <groupId>commons-beanutils</groupId>
-   <artifactId>commons-beanutils</artifactId>
-   <version>1.9.3</version>
-</dependency>
-```
-
-## 更新公告
-
-### **v1.6.0(2020/3/25)**
-
-**增加功能:映射扫描、映射代理**
-
-删除某些无用代码
-删除文档开始的一些废话
-简单改善部分代码
-增加一个注解`@MockBean`, 使用在类上,当使用包扫描功能的时候,只会扫描被标记了此注解的类。配合两个注解映射使用。
-注解`@MockValue`增加参数:`String param() default ""`、`Class valueType() default String.class`
-注解`@MockArray`增加参数:`String param() default ""`
-`Mock`中增加了一个方法`scan(Function, Map> withOther, boolean reset, String... packages)`以及部分重载方法,用来根据`@MockBean`注解来进行扫描与批量的注解映射注册。返回值为加载后的class列表
-
-增加一个注解`@MockProxy` 标记接口代理中,一些需要特殊处理的抽象方法,例如返回值为Map或者忽略参数等。
-`Mock`中增加了一个方法
-
-着手准备编写wiki
-删除helpDoc文件夹
-
-### **v1.5.2(2020/2/22)**
-修复在使用ChineseUtil的时候会在控制台打印所有的姓氏的问题
-
-
-### **v1.5.1(2019.12.13)**
-修复自定义函数添加无效的bug
-
-
-### **v1.5.0(2019.12.5)**
-变更MockMapObject的获取值类型为Map(原本是Map)
-本质上依旧获取的是Map 类型。
-内部增加一些参数以适应扩展开发。
-
-
-### **v1.4.4(2019.12.4)**
-优化内部Random操作,现在理论上Random相关操作的效率会高一些了。
-
-
-### **v1.4.3(2019.10.18)**
-
-在上一个版本的基础上又为Map类型的参数与Object类型参数增加了List值返回。
-现在可以更好的支持例如fastJson等Json工具了。
-(测试量较少,可能存在些许bug)
-
-
-
-
-### **v1.4.2(2019.10.18)**
-修复机制:当生成Map类型的值的时候,假如字段映射类似于这种格式:
-```json
-{
-    "a": {
-        "array|5-10":"@name"
-    }
-}
-```
-更新之前,`array`字段将会忽略区间参数`5-10`直接根据`@name`生成随机姓名,更新后则将会生成长度在5-10之间的list集合。
-引申思考:如果参数不是字符串而是数字,如何判断是获取数组还是获取区间内的随机数字?
-于是在更新计划中追加一项以在特殊情况下指定生成的字段的数据类型
-
-
-修复当参数为List类型的时候,作为参数List没有被copy的bug。
-
-距离功能写完到现在,我意识到了曾经的自己十分喜欢使用异步并行流遍历,然而这不一定是效率最高的,
-所以在内部我修改了两处我看到的并行流遍历代码并修改为了单线程。
-
-
-
-
-### **v1.4(2019.8.26)**
-
-提供两个注解以实现注解形式的映射。(测试量较少不知道有没有bug)
-
-
-
-### **v1.3(2019.7.12)**
-
-优化`MockObject`接口内部接口,增加大量`parallel`(并行流)方法与`collect`方法。
-
-`parallel`相关方法将会在您创建对象的时候使用并行线程进行创建,当您需求的数据量较大的时候,此系列方法将会相对于原本的串行方法更加有效率。
-
-`collect`相关方法将会提供大量参照于`Stream`中的`collect`方法而制定的方法。假如您对于`Stream`中的`collect`方法十分的熟悉,那么此系列的方法将会使得您可以更加灵活的对数据列表进行操作。
-
-在接口结构更新后,接口中所有的方法全部基于`get()`方法而不需要其他实现。
-
-(虽然用户一般也不需要实现此接口。)
-
-### **v1.2 (2019.2.27)**
-
-※ 与上一版本不兼容点:将MockObject类变更为接口类型
-
-支持直接获取Map类型,不再必须指定一个javaBean了
-
-@函数中的字符串参数可以支持中文了,但是中文请务必放在单引号或双引号之中才可以生效
-
-### **v1.1  (2019.1.4)**
-
-支持自定义方法的导入
-
-### **v1.0  (2018.12.20)**
-
-更新README.md文档。
-
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/pom.xml b/docs/raw-materials/backup/qabox-alt/api-mock/pom.xml
deleted file mode 100644
index c88e46d..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/pom.xml
+++ /dev/null
@@ -1,324 +0,0 @@
-
-
-
-    
-        org.sonatype.oss
-        oss-parent
-        7
-    
-
-    4.0.0
-    io.gitee.ForteScarlet
-    mock.java
-
-    
-    
-    1.9.2
-    jar
-    Mock.java
-    https://github.com/ForteScarlet/Mock.java.git
-    Mock For Java, like mock.js
-
-    
-        UTF-8
-        1.8
-        1.8
-
-        
-        src/main/resources/META-INF/MANIFEST.MF
-    
-
-    
-        
-        
-        
-        
-            commons-beanutils
-            commons-beanutils
-            1.9.4
-        
-
-        
-        
-            dk.brics.automaton
-            automaton
-            1.11-8
-            true
-            test
-        
-
-
-    
-
-    
-        
-        
-            
-            
-                org.apache.maven.plugins
-                maven-assembly-plugin
-                2.6
-                
-                    
-                        make-assembly
-                        package
-                         single 
-                        
-                            
-                                jar-with-dependencies
-                            
-                        
-                    
-                
-            
-
-            
-            
-            
-                maven-deploy-plugin
-                2.8.2
-                
-                    
-                        default-deploy
-                        deploy
-                        
-                            deploy
-                        
-                    
-                
-            
-
-            
-            
-                org.sonatype.plugins
-                nexus-staging-maven-plugin
-                1.6.7
-                true
-                
-                    oss
-                    https://oss.sonatype.org/
-                    true
-                
-            
-
-            
-            
-                org.apache.maven.plugins
-                maven-scm-plugin
-                1.8.1
-            
-
-            
-            
-                org.apache.maven.plugins
-                maven-release-plugin
-                2.5.3
-                
-                    forked-path
-                    false
-                    -Psonatype-oss-release
-                    false
-                    false
-                    true
-                    
-                        .idea/
-                        .idea/*
-                        .dependencies/
-                        .dependencies/*
-                        test/
-                        test/*
-                        .idea/libraries/*
-                        pom.xml
-                        release-pom.xml
-
-                        jdonframework.iml
-                        JdonAccessory/jdon-hibernate3x/jdon-hibernate3x.iml
-                        
-                        JdonAccessory/jdon-jdbc/jdon-jdbc.iml
-                        JdonAccessory/jdon-remote/jdon-remote.iml
-                        JdonAccessory/jdon-struts1x/jdon-struts1x.iml
-                        
-                    
-                
-                
-                    
-                        org.apache.maven.plugins
-                        maven-scm-plugin
-                        1.8.1
-                    
-                
-            
-
-            
-            
-            
-                org.apache.maven.plugins
-                maven-compiler-plugin
-                3.7.0
-                
-                    1.8
-                    1.8
-                
-            
-
-            
-            
-                org.apache.maven.plugins
-                maven-source-plugin
-                2.2.1
-                
-                    
-                        attach-sources
-                        
-                            jar-no-fork
-                        
-                    
-                
-            
-
-            
-            
-                org.apache.maven.plugins
-                maven-javadoc-plugin
-                3.0.0
-                
-                    UTF-8
-                    UTF-8
-                    
-                        -Xdoclint:none
-                        --allow-script-in-comments
-                    
-                    none
-                
-                
-                    
-                        package
-                        
-                            jar
-                        
-                    
-                
-            
-
-            
-            
-                org.apache.maven.plugins
-                maven-gpg-plugin
-                1.5
-                
-                    
-                        sign-artifacts
-                        verify
-                        
-                            sign
-                        
-                    
-                
-            
-
-        
-        
-    
-
-    
-
-    
-        
-            The Apache License, Version 2.0
-            http://www.apache.org/licenses/LICENSE-2.0.txt
-        
-    
-
-    
-    
-        
-            
-            
-            
-            
-            
-            
-                a single programmer
-            
-            
-            
-            
-            
-            
-            
-            
-            
-            
-            
-            
-            ForteScarlet
-            
-            ForteScarlet@163.com
-        
-    
-
-    
-
-        
-            oss
-            https://oss.sonatype.org/content/repositories/snapshots
-        
-        
-            oss
-            https://oss.sonatype.org/service/local/staging/deploy/maven2/
-        
-
-    
-
-    
-        https://gitee.com/ForteScarlet/Mock.java
-    
-
-
-    
-        
-            release
-            
-                
-                    
-                        org.sonatype.plugins
-                        nexus-staging-maven-plugin
-                        1.6.7
-                        true
-                        
-                            oss
-                            https://oss.sonatype.org/
-                            true
-                        
-                    
-                    
-                        org.apache.maven.plugins
-                        maven-gpg-plugin
-                        1.5
-                        
-                            
-                                sign-artifacts
-                                verify
-                                
-                                    sign
-                                
-                            
-                        
-                    
-                
-            
-        
-    
-
-
-
\ No newline at end of file
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/Mock.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/Mock.java
deleted file mode 100644
index c1d5b0b..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/Mock.java
+++ /dev/null
@@ -1,492 +0,0 @@
-package com.forte.util;
-
-
-import com.forte.util.exception.MockException;
-import com.forte.util.factory.MockMapperFactory;
-import com.forte.util.factory.MockObjectFactory;
-import com.forte.util.factory.MockProxyHandlerFactory;
-import com.forte.util.factory.MockProxyHandlerFactoryImpl;
-import com.forte.util.loader.DefaultMockMethodLoader;
-import com.forte.util.loader.MethodLoader;
-import com.forte.util.mockbean.*;
-import com.forte.util.parser.ParameterParser;
-import com.forte.util.utils.ClassScanner;
-import com.forte.util.utils.MockUtil;
-import com.forte.util.utils.ProxyUtils;
-
-import java.lang.reflect.InvocationHandler;
-import java.lang.reflect.Method;
-import java.lang.reflect.Modifier;
-import java.util.*;
-import java.util.concurrent.ConcurrentHashMap;
-import java.util.function.Function;
-import java.util.function.Predicate;
-import java.util.stream.Collectors;
-
-/**
- * 

- * javaBean假数据生成工具 - *

- *

- * 使用静态方法:{@link #set(Class, Map)} 来添加一个类的假数据类型映射
- * 语法基本与Mock.js中的类似,字符串参数中可以使用@+方法名的方式指定随机方法(随机方法详见{@link MockUtil},此类也可以直接使用) - *

- *

- *

    - *
  • 使用的时候,请务必保证填充假数据的字段有他的getter方法
  • - *
  • 使用多层级赋值的时候,请注意保证多层级中涉及的深层对象有无参构造方法
  • - *
- *

- *

- * 为类中的引用类型对象赋值的时候,有两种方式: - *

    - *
  • - * - * map.set("user" , new HashMap) - * - * -> 即为字段再配置一个map映射集合 - *
  • - *
  • - * - * map.set("user.name" , "@cname") - * - * -> 使用"."分割,即使用多层级对象赋值,此方式需要保证引用类型的对象有无参构造,且字段有getter方法 - *
  • - *
- *

- * - * @author ForteScarlet - * @version 0.5-beta - */ -public class Mock { - - /* 静态代码块加载资源 */ - static { - //创建线程安全的map集合,保存全部映射记录 - MOCK_OBJECT = new ConcurrentHashMap<>(4); - MOCK_MAP = new ConcurrentHashMap<>(4); - - //创建map,这里的map理论上不需要线程同步 - Map mockUtilMethods; - - //加载这些方法,防止每次都使用反射去调用方法。 - //直接调用的话无法掌控参数,所以必须使用反射的形式进行调用 - Class mockUtilClass = MockUtil.class; - //只获取公共方法 - Method[] methods = mockUtilClass.getMethods(); - /* - 过滤Object中的方法、 - 将MockUtil中的全部方法名格式化 格式:方法名(参数类型class地址,参数类型class地址.....)、 - 转化为<方法名:方法>的map集合 - */ - mockUtilMethods = - Arrays.stream(methods) - //过滤掉Object中继承过来的方法 - .filter(m -> Arrays.stream(Object.class.getMethods()).noneMatch(om -> om.equals(m))) - //格式化方法名,格式:方法名(参数类型class地址,参数类型class地址.....) - .flatMap(m -> { - Map methodMap = new HashMap<>(); - //格式化方法名,并作为key - String key = m.getName() + "(" - + Arrays.stream(m.getParameterTypes()) - .map(Class::getName) - .collect(Collectors.joining(",")) + - ")"; - methodMap.put(key, m); - return methodMap.entrySet().stream(); - }).collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)); - - - //保存MockUtil中的全部方法 - MOCK_METHOD = mockUtilMethods; - } - - /** - * 保存全部记录的class与其对应的假对象{@link MockBean} - */ - private static final Map MOCK_OBJECT; - - /** - * Map类型假对象 - * TODO 后期考虑合并MOCK_OBJECT 和 MOCK_MAP两个字段 - */ - private static final Map MOCK_MAP; - - - /** - * MockUtil中的全部方法 - */ - private static final Map MOCK_METHOD; - - - /** - * 添加一个数据映射 - * - * @param objClass 映射类型 - * @param map 映射对应值 - * @return 映射结果表 - */ - public static MockBean setResult(Class objClass, Map map, boolean reset) { - //如果不是重新设置且此映射已经存在,并且objClass对象存在,将会抛出异常 - if (!reset && MOCK_OBJECT.get(objClass) != null) { - throw new MockException("此映射已存在!"); - } - - MockBean parser; - - //使用参数解析器进行解析 - parser = ParameterParser.parser(objClass, map); - - //如果类型不是Map类型,添加 - MOCK_OBJECT.put(objClass, new MockNormalObject<>(parser)); - - //提醒系统的垃圾回收 - System.gc(); - - return parser; - } - - /** - * 添加一个map类型的映射 - * - * @param resultName 映射名 - * @param map 映射值 - * @param reset 是否覆盖 - */ - public static MockMapBean setResult(String resultName, Map map, boolean reset) { - //如果不是重新设置且此映射已经存在,并且objClass对象存在,将会抛出异常 - if (!reset && MOCK_MAP.get(resultName) != null) { - throw new MockException("此映射已存在!this mock result has already exists."); - } - - MockMapBean parser; - - //使用参数解析器进行解析 - parser = ParameterParser.parser(map); - MOCK_MAP.put(resultName, MockObjectFactory.createMapObj(parser)); - - //提醒系统的垃圾回收 - System.gc(); - - return parser; - } - - - /** - * 添加数据记录,如果要添加的映射已存在,则会抛出异常 - * - * @param objClass 映射的class - * @param map

映射的规则对象

- *

- *

    - *
  • key:对应的字段
  • - *
  • - * value:映射参数,可以是: - *
      - *
    • 字符串
    • - *
    • 若干随机方法指令(指令详见{@link MockUtil})
    • - *
    • 整数(Integer)
    • - *
    • 浮点数(Double)
    • - *
    • 数组或集合类型
    • - *
    • Map集合类型(可作为新的映射,也可直接作为参数)
    • - *
    • 任意引用数据类型
    • - *
    - *
  • - *
- *

- *

- * 如果映射的对象中有多层级对象,支持使用多层级字段映射,例如:
- * - * map.put("friend.name" , "@cname"); - * - *

- */ - public static void set(Class objClass, Map map) { - //设置并保存映射,不可覆盖 - setResult(objClass, map, false); - } - - /** - * {@link #set(Class, Map)} and {@link #get(Class)} - */ - public static MockObject setAndGet(Class objClass, Map map){ - set(objClass, map); - return get(objClass); - } - - /** - * 通过注解来获取映射 - */ - public static void set(Class objClass) { - //获取映射Map - Map mapper = MockMapperFactory.getMapper(objClass); - setResult(objClass, mapper, false); - } - - /** - * {@link #set(Class, Map)} and {@link #get(Class)} - */ - public static MockObject setAndGet(Class objClass){ - set(objClass); - return get(objClass); - } - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void setWithOther(Class objClass, Map other) { - //获取映射Map - Map mapper = MockMapperFactory.getMapper(objClass, other); - setResult(objClass, mapper, false); - } - - - /** - * 添加数据记录,如果要添加的映射已存在,则会抛出异常 - * - * @param resultName - * @param map - */ - public static void set(String resultName, Map map) { - //设置并保存映射,不可覆盖 - setResult(resultName, map, false); - } - - - /** - * {@link #set(Class, Map)} and {@link #get(Class)} - */ - public static MockObject setAndGet(String resultName, Map map){ - set(resultName, map); - return get(resultName); - } - - - /** - * 添加数据记录,如果要添加的映射已存在,则会覆盖 - * - * @param objClass - * @param map - * @param - */ - public static void reset(Class objClass, Map map) { - //设置并保存映射 - setResult(objClass, map, true); - } - - /** - * 通过注解来获取映射 - */ - public static void reset(Class objClass) { - //获取映射Map - Map mapper = MockMapperFactory.getMapper(objClass); - setResult(objClass, mapper, true); - } - - /** - * 通过注解来获取映射, 并提供额外的、难以用注解进行表达的映射参数 - */ - public static void resetWithOther(Class objClass, Map other) { - //获取映射Map - Map mapper = MockMapperFactory.getMapper(objClass, other); - setResult(objClass, mapper, true); - } - - - /** - * 添加数据记录,如果要添加的映射已存在,则会覆盖 - * - * @param resultName - * @param map - * @param - */ - public static void reset(String resultName, Map map) { - //设置并保存映射 - setResult(resultName, map, true); - } - - - /** - * 获取一个实例对象 - * - * @param objClass - * @param - * @return - */ - public static MockObject get(Class objClass) { - return Optional.ofNullable(MOCK_OBJECT.get(objClass)).orElse(null); - } - - /** - * 获取一个实例对象 - * - * @param resultName - * @param - * @return - */ - public static MockObject get(String resultName) { - return Optional.ofNullable(MOCK_MAP.get(resultName)).orElse(null); - } - - /** - * 扫描包路径,加载标记了{@link com.forte.util.mapper.MockBean}注解的类。 - * - * @param classLoader nullable, 类加载器, null则默认为当前类加载器 - * @param withOther nullable, 假如扫描的类中存在某些类,你想要为它提供一些额外的参数,此函数用于获取对应class所需要添加的额外参数。可以为null - * @param reset 加载注解映射的时候是否使用reset - * @param packages emptyable, 要扫描的包路径列表, 为空则直接返回空set - * @return 扫描并加载成功的类 - * @throws Exception 包扫描过程中可能会出现一些例如类找不到等各种异常。需要进行处理。 - */ - public static Set> scan(ClassLoader classLoader, Function, Map> withOther, boolean reset, String... packages) throws Exception { - if (packages.length == 0) { - return new HashSet<>(); - } - // 包扫描器 - final ClassScanner scanner = classLoader == null ? 
new ClassScanner() : new ClassScanner(classLoader); - - // 扫描所有的包路径 - for (String p : packages) { - scanner.find(p, c -> c.getAnnotation(com.forte.util.mapper.MockBean.class) != null); - } - - // 扫描完了之后,load - final Set> classes = scanner.get(); - - classes.forEach(c -> { - if (withOther != null) { - if (reset) { - resetWithOther(c, withOther.apply(c)); - } else { - setWithOther(c, withOther.apply(c)); - } - } else { - if (reset) { - reset(c); - } else { - set(c); - } - } - }); - - return classes; - } - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(Function, Map> withOther, boolean reset, String... packages) throws Exception { - return scan(null, withOther, reset, packages); - } - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(boolean reset, String... packages) throws Exception { - return scan(null, null, reset, packages); - } - - /** - * {@link #scan(ClassLoader, Function, boolean, String...)}的重载方法 - * reset默认为false - * - * @see #scan(ClassLoader, Function, boolean, String...) - */ - public static Set> scan(String... packages) throws Exception { - return scan(null, null, false, packages); - } - - /** - *
 为一个接口提供一个代理对象。此接口中,所有的 抽象方法 都会被扫描,假如他的返回值存在与Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * @param type    要代理的接口类型。
-     * @param factory 接口代理处理器的获取工厂。可自行实现。
-     * @param  接口类型
-     * @return 代理结果
-     */
-    public static  T proxy(Class type, MockProxyHandlerFactory factory) {
-        // 验证是否为接口类型
-        if (!Modifier.isInterface(type.getModifiers())) {
-            throw new IllegalArgumentException("type ["+ type +"] is not an interface type.");
-        }
-
-        // 获取代理处理器
-        final InvocationHandler proxyHandler = factory.getHandler((returnType, name) -> {
-            MockObject mockObject = null;
-            if (name != null) {
-                mockObject = get(name);
-            }
-            if (mockObject == null) {
-                mockObject = get(returnType);
-            }
-            return mockObject;
-        });
-
-        // 返回结果
-        return ProxyUtils.proxy(type, proxyHandler);
-    }
-
-    /**
-     * 
 为一个接口提供一个代理对象。此接口中,所有的 抽象方法 都会被扫描,假如他的返回值存在与Mock中,则为其创建代理。
-     * 
 此方法默认不会为使用者保存单例,每次代理都会代理一个新的对象,因此如果有需要,请保存一个单例对象而不是频繁代理。
-     * 
 使用默认的接口代理处理器工厂{@link MockProxyHandlerFactoryImpl}。
-     * 
 默认处理工厂中,代理接口时,被代理的方法需要:
-     * 
 不是default方法。default方法会根据其自己的逻辑执行。
-     * 
 没有参数
-     * 
 没有标注{@code @MockProxy(ignore=true) ignore=true的时候代表忽略}
-     * 
-     * @see MockProxyHandlerFactoryImpl
-     * @param type    要代理的接口类型。
-     * @return 接口代理
-     */
-    public static > T proxy(C type) {
-        return proxy(type, new MockProxyHandlerFactoryImpl());
-    }
-
-    /**
-     * 获取方法加载器
-     *
-     * @return
-     */
-    public static MethodLoader mockMethodLoader() {
-        return new DefaultMockMethodLoader(MOCK_METHOD);
-    }
-
-    /**
-     * Deprecated
-     *
-     * @see #getMockMethods()
-     */
-    @Deprecated
-    public static Map _getMockMethod() {
-        return getMockMethods();
-    }
-
-    /**
-     * 获取Mock方法集合
-     *
-     * @return 全部已被加载的映射方法
-     */
-    public static Map getMockMethods() {
-        return new HashMap<>(MOCK_METHOD);
-    }
-
-
-    /**
-     * 根据过滤条件寻找指定的string-method
-     */
-    public static Map.Entry getMockMethodByFilter(Predicate> predicate){
-        for (Map.Entry entry : MOCK_METHOD.entrySet()) {
-            if(predicate.test(entry)){
-                return entry;
-            }
-        }
-        return null;
-    }
-
-
-}
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/MockConfiguration.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/MockConfiguration.java
deleted file mode 100644
index 701932c..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/MockConfiguration.java
+++ /dev/null
@@ -1,45 +0,0 @@
-package com.forte.util;
-import javax.script.ScriptEngine;
-
-/**
- *
- * 这是一个静态类,你可以随时随地修改他的配置。
- * 他的一些配置会在某些地方用到。
- *
- * @author ForteScarlet  
- */
-public class MockConfiguration {
-
-    /**
-     * 

是否启用JS脚本执行。 - *

版本1.8之后将脚本执行类{@link ScriptEngine}变更为了复用的单例形式, - * 效率提升了约2倍多(测试生成1000条数据,优化前:14s左右,优化后:6s左右) - *

6秒还是太慢了,因此现在修改为默认情况下不开启JS脚本执行, - * 未开启配置的情况下,任何字符串类型的值都不再会尝试进行JS脚本执行。 - *

关闭脚本的效率比开启脚本的效率高15倍左右(测试生成1000条数据,开启JS:6左右,关闭JS:0.4秒左右) - *

- *

- * - */ - private static boolean enableJsScriptEngine = false; - - /** - * 配置是否开启JS脚本执行 - */ - public static synchronized void setEnableJsScriptEngine(boolean enableJsScriptEngine){ - MockConfiguration.enableJsScriptEngine = enableJsScriptEngine; - } - - /** - * 获取是否开启JS脚本执行。 - * @return - */ - public static boolean isEnableJsScriptEngine(){ - return enableJsScriptEngine; - } - - - - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockException.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockException.java deleted file mode 100644 index 7f1092f..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockException.java +++ /dev/null @@ -1,26 +0,0 @@ -package com.forte.util.exception; - -/** - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/24 20:31 - * @since JDK1.8 - **/ -public class MockException extends RuntimeException { - public MockException() { - } - public MockException(String message) { - super(message); - } - - public MockException(String message, Throwable cause) { - super(message, cause); - } - - public MockException(Throwable cause) { - super(cause); - } - - public MockException(String message, Throwable cause, boolean enableSuppression, boolean writableStackTrace) { - super(message, cause, enableSuppression, writableStackTrace); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockParserException.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockParserException.java deleted file mode 100644 index b1444d9..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/MockParserException.java +++ /dev/null @@ -1,26 +0,0 @@ -package com.forte.util.exception; - -/** - * 假数据解析异常 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class MockParserException extends MockException{ - - public MockParserException() { - } - public MockParserException(String message) { - super(message); - } - - public MockParserException(String message, Throwable cause) { - super(message, cause); - } - - public MockParserException(Throwable cause) { - super(cause); - } - - public MockParserException(String message, Throwable cause, boolean enableSuppression, boolean writableStackTrace) { - super(message, cause, enableSuppression, writableStackTrace); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/ParameterSizeException.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/ParameterSizeException.java deleted file mode 100644 index 078c9bc..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/exception/ParameterSizeException.java +++ /dev/null @@ -1,12 +0,0 @@ -package com.forte.util.exception; - -/** - * 参数数量不符异常 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class ParameterSizeException extends MockException { - public ParameterSizeException() { - super("参数数量与方法参数数量不符!"); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockBeanFactory.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockBeanFactory.java deleted file mode 100644 index fea4ae7..0000000 --- 
a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockBeanFactory.java +++ /dev/null @@ -1,34 +0,0 @@ -package com.forte.util.factory; - -import com.forte.util.mockbean.MockBean; -import com.forte.util.mockbean.MockField; -import com.forte.util.mockbean.MockMapBean; - -/** - * MockBean的工厂 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2019/2/27 14:57 - */ -public class MockBeanFactory { - - /** - * 创建一个MockBean - * @param objectClass - * @param fields - * @param - * @return - */ - public static MockBean createMockBean(Class objectClass, MockField[] fields){ - return new MockBean<>(objectClass, fields); - } - - /** - * 创建一个MockMapBean - * @param fields - * @return - */ - public static MockMapBean createMockMapBean(MockField[] fields){ - return new MockMapBean(fields); - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockMapperFactory.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockMapperFactory.java deleted file mode 100644 index 8432d89..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockMapperFactory.java +++ /dev/null @@ -1,104 +0,0 @@ -package com.forte.util.factory; - -import com.forte.util.exception.MockException; -import com.forte.util.mapper.ArrayMapper; -import com.forte.util.mapper.MockArray; -import com.forte.util.mapper.MockValue; -import org.apache.commons.beanutils.ConvertUtils; - -import java.util.AbstractMap; -import java.util.Arrays; -import java.util.Map; -import java.util.Objects; -import java.util.function.Supplier; -import java.util.stream.Collectors; - -/** - * 生成映射Map的工厂 - * - * @author ForteScarlet <[email]ForteScarlet@163.com> - * @since JDK1.8 - **/ -public class MockMapperFactory { - - /** - * 通过一个类的注解来生成映射,然后再整合额外的指定参数 - * - * @param type 类型 - * @param other 额外参数,可以为null - */ - public static Map getMapper(Class type, Map other) { - Map mapper = Arrays.stream(type.getDeclaredFields()).map(f -> { - //由于目前只有两种注解,直接判断下就行了 - //优先使用MockValue的值 - Map.Entry, String> valueAndParam = getValue(f.getAnnotation(MockValue.class)); - if (valueAndParam == null) { - valueAndParam = getValue(f.getAnnotation(MockArray.class)); - } - // 无注解,返回null并由后续过滤 - if(valueAndParam == null){ - return null; - } - return new AbstractMap.SimpleEntry<>(f.getName() + valueAndParam.getValue(), valueAndParam.getKey().get()); - }).filter(Objects::nonNull).collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)); - - if(other != null){ - mapper.putAll(other); - } - return mapper; - } - - /** - * 通过一个类的注解来生成映射 - * - * @param type 类型 - */ - public static Map getMapper(Class type) { - return getMapper(type, null); - } - - /** - * 通过注解MockValue类型来获取value值 - * 如果为null或者值没有(保留空字符),则返回null - * @return {@code Entry, fieldParam>} , 返回一个entry,key为value中的值的获取函数,value为map映射中字段的区间参数值。 - */ - public static Map.Entry, String> getValue(MockValue mockValueAnnotation) { - if (mockValueAnnotation == null || mockValueAnnotation.value().length() <= 0) { - return null; - } - final String mapValue = mockValueAnnotation.value(); - final Class valueType = mockValueAnnotation.valueType(); - - Supplier valueGetter = valueType.equals(String.class) ? 
() -> mapValue : () -> ConvertUtils.convert(mapValue, valueType); - - String param = mockValueAnnotation.param().trim(); - if(param.length() > 0 && !param.startsWith("|")){ - param = "|" + param; - } - return new AbstractMap.SimpleEntry<>(valueGetter, param); - } - - /** - * 通过注解MockValue类型来获取value值 - * @return {@code Entry, fieldParam>} , 返回一个entry,key为value中的值的获取函数,value为map映射中字段的区间参数值。 - */ - public static Map.Entry, String> getValue(MockArray mockArrayAnnotation) { - if (mockArrayAnnotation == null) { - return null; - } - try { - ArrayMapper arrayMapper = mockArrayAnnotation.mapper().newInstance(); - final String[] mockArrayValue = mockArrayAnnotation.value(); - final Supplier valueGetter = () -> arrayMapper.map(mockArrayValue); - String param = mockArrayAnnotation.param().trim(); - if(param.length() > 0 && !param.startsWith("|")){ - param = "|" + param; - } - return new AbstractMap.SimpleEntry<>(valueGetter, param); - } catch (InstantiationException | IllegalAccessException e) { - throw new MockException("无法实例化数组转化器 Cannot instantiate an array converter :" + mockArrayAnnotation.mapper()); - } - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockObjectFactory.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockObjectFactory.java deleted file mode 100644 index 36cdf68..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockObjectFactory.java +++ /dev/null @@ -1,32 +0,0 @@ -package com.forte.util.factory; - -import com.forte.util.mockbean.*; - -/** - * MockObject对象工厂 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2019/2/27 14:38 - */ -public class MockObjectFactory { - - /** - * 创建一个普通mock对象 - * @param mockBean - * @param - * @return - */ - public static MockObject createNormalObj(MockBean mockBean){ - return new MockNormalObject<>(mockBean); - } - - - /** - * 创建一个map类型的mock对象 - * @param mockMapBean - * @return - */ - public static MockMapObject createMapObj(MockMapBean mockMapBean){ - return new MockMapObject(mockMapBean); - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactory.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactory.java deleted file mode 100644 index a8bcb58..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactory.java +++ /dev/null @@ -1,22 +0,0 @@ -package com.forte.util.factory; - -import com.forte.util.mockbean.MockObject; - -import java.lang.reflect.InvocationHandler; -import java.util.function.BiFunction; - -/** - * Mock接口代理对象工厂的接口定义 - * @author ForteScarlet - */ -public interface MockProxyHandlerFactory { - - /** - * 获取代理处理接口{@link InvocationHandler}实例 - * @param mockObjectFunction 传入一个类型和一个可能为null的name字符串,获取一个mockObject对象。如果存在name,则会尝试先用name获取 - * @return JDK动态代理所需要的代理处理器示例。 - * @see InvocationHandler - */ - InvocationHandler getHandler(BiFunction, String, MockObject> mockObjectFunction); - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactoryImpl.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactoryImpl.java deleted file mode 100644 index f8ab2d0..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/MockProxyHandlerFactoryImpl.java +++ 
/dev/null @@ -1,257 +0,0 @@ -package com.forte.util.factory; - -import com.forte.util.mapper.MockProxy; -import com.forte.util.mapper.MockProxyType; -import com.forte.util.mockbean.MockObject; -import com.forte.util.utils.FieldUtils; - -import java.lang.invoke.MethodHandles; -import java.lang.reflect.Constructor; -import java.lang.reflect.InvocationHandler; -import java.lang.reflect.Method; -import java.util.List; -import java.util.Map; -import java.util.concurrent.ConcurrentHashMap; -import java.util.function.BiFunction; - -/** - * {@link MockProxyHandlerFactory}的默认实现 - * - * @author ForteScarlet - */ -public class MockProxyHandlerFactoryImpl implements MockProxyHandlerFactory { - - /** - * 获取接口代理处理器实例 - * 首先,只扫描所有的抽象方法,default方法不会代理,而是执行它自己。 - * - * @param mockObjectFunction 传入一个类型,获取一个mockObject对象。如果是Map类型,则第二参数为map的名称,否则忽视第二参数。 - * @return 接口代理处理器实例 - */ - @Override - public InvocationHandler getHandler(BiFunction, String, MockObject> mockObjectFunction) { - return new DefaultMockProxyHandler(mockObjectFunction); - } - - - /** - * 默认的mock接口代理处理器实现 - */ - public static class DefaultMockProxyHandler implements InvocationHandler { - - - /** - * mockObject获取器 - */ - private BiFunction, String, MockObject> mockObjectFunction; - - /** - * 方法返回值缓存map。 - */ - private Map> methodReturnCacheMap; - - /** - * 返回值永远为null的缓存值 - */ - private final SimpleBean nullValueCache = new SimpleBean<>((p, m, o) -> null); - - /** - * 构造需要一个mockObject获取器 - */ - public DefaultMockProxyHandler(BiFunction, String, MockObject> mockObjectFunction) { - this.mockObjectFunction = mockObjectFunction; - this.methodReturnCacheMap = new ConcurrentHashMap<>(2); - } - - /** - * 记录缓存 - */ - private void saveCache(Method m, InvocationHandler handler) { - methodReturnCacheMap.put(m, new SimpleBean<>(handler)); - } - - - /** - * 记录缓存 - */ - private void saveNullCache(Method m) { - methodReturnCacheMap.put(m, nullValueCache); - } - - - private InvocationHandler getCache(Method m) { - final SimpleBean supplierAtomicReference = methodReturnCacheMap.get(m); - return supplierAtomicReference == null ? null : supplierAtomicReference.get(); - } - - /** - * 函数接口 - * - * @param method 第一参数 - * @param args 第二参数 - * @return 返回值 - * @throws Throwable 任意异常 - */ - @Override - public Object invoke(Object proxy, Method method, Object[] args) throws Throwable { - // 先尝试获取缓存 - final InvocationHandler cache = getCache(method); - if (cache != null) { - return cache.invoke(proxy, method, args); - } - - //如果是接口中的默认方法,使用特殊方法执行 - //代码源于网络: http://www.it1352.com/988865.html - if (method.isDefault()) { - Class declaringClass = method.getDeclaringClass(); - Constructor constructor = MethodHandles.Lookup.class.getDeclaredConstructor(Class.class, int.class); - constructor.setAccessible(true); - InvocationHandler defaultInvocationHandler = (p, m, o) -> constructor. - newInstance(declaringClass, MethodHandles.Lookup.PRIVATE). - unreflectSpecial(method, declaringClass). - bindTo(proxy). - invokeWithArguments(args); - - // 记录缓存 - saveCache(method, defaultInvocationHandler); - return defaultInvocationHandler.invoke(proxy, method, args); - } - - // 尝试获取@MockProxy注解 - final MockProxy mockProxyAnnotation = method.getAnnotation(MockProxy.class); - boolean ignore = mockProxyAnnotation != null && mockProxyAnnotation.ignore(); - // 返回值类型 - final Class returnType = method.getReturnType(); - - // 如果忽略此函数,则跳过 - if (ignore) { - return getDefaultResultAndCache(returnType, method); - } - - // 没有忽略,进行解析 - Class genericType = mockProxyAnnotation == null ? 
Object.class : mockProxyAnnotation.genericType(); - String name = mockProxyAnnotation == null ? null : mockProxyAnnotation.name().trim().length() == 0 ? null : mockProxyAnnotation.name().trim(); - int[] array = mockProxyAnnotation == null ? new int[]{1, 1} : mockProxyAnnotation.size(); - - // 准备参数 - MockProxyType proxyType = mockProxyAnnotation == null ? MockProxyType.UNKNOWN : mockProxyAnnotation.proxyType(); - // 如果是未知类型,根据返回值类型进行匹配。 - if(proxyType == MockProxyType.UNKNOWN){ - // 判断一下返回值的类型,如果是数组,转化为数组类型,如果是list,转化为list类型 - if(returnType.isArray()){ - final Class arrayComponentType = returnType.getComponentType(); - genericType = genericType.equals(Object.class) ? arrayComponentType : genericType; - proxyType = MockProxyType.ARRAY; - }else if(FieldUtils.isChild(returnType, List.class)){ - proxyType = MockProxyType.LIST; - }else{ - // 其他类型,认定为Object类型 - proxyType = MockProxyType.OBJECT; - } - } - - - if (array.length == 0) { - array = new int[]{1, 1}; - } - if (array.length == 1) { - array = new int[]{array[0], array[0]}; - } - if (array.length > 2) { - array = new int[]{array[0], array[1]}; - } - - - // 要获取的mock类型 - Class mockGetType = proxyType.selectTypeUse(returnType, genericType); - - - final MockObject mockObject = mockObjectFunction.apply(mockGetType, name); - - if (mockObject == null) { - // 获取不到mockObject, 获取默认返回值 - return getDefaultResultAndCache(returnType, method); - } - - // mockObject不为null,构建返回值 - - final int[] finalArr = array; - - final MockProxyType finalProxyType = proxyType; - - // 构建缓存函数 - InvocationHandler proxyHandler = (p, m, o) -> finalProxyType.buildReturnType(num -> { - if (num == 1) { - return new Object[]{mockObject.getOne()}; - } else if (num == 0) { - return new Object[0]; - } else if (num < 0) { - throw new IllegalArgumentException("size cannot be zero."); - } else { - return mockObject.getStream(num).toArray(Object[]::new); - } - }, mockGetType, finalArr[0], finalArr[1]); - - saveCache(method, proxyHandler); - return proxyHandler.invoke(proxy, method, args); - } - - /** - * 获取默认返回值并缓存 - * - * @param returnType 返回值类型 - * @param method 方法 - */ - private Object getDefaultResultAndCache(Class returnType, Method method) { - final Object defaultResult = getDefaultResult(returnType); - // 记录缓存 - if (defaultResult == null) { - saveNullCache(method); - } else { - InvocationHandler ignoreHandler = (p, m, o) -> defaultResult; - saveCache(method, ignoreHandler); - } - return defaultResult; - } - - - /** - * 获取默认返回值 - * - * @param returnType 返回值类型 - */ - private Object getDefaultResult(Class returnType) { - // char - if (returnType.equals(char.class)) { - return ' '; - } - // boolean - if (returnType.equals(boolean.class)) { - return false; - } - - // 浮点型 - Class[] basicFloatTypes = new Class[]{double.class, float.class}; - for (Class basicFloatType : basicFloatTypes) { - if (returnType.equals(basicFloatType)) { - return 0.0; - } - } - - // 整型 - Class[] basicNumberTypes = new Class[]{byte.class, short.class, int.class, long.class}; - for (Class basicNumberType : basicNumberTypes) { - if (returnType.equals(basicNumberType)) { - return 0; - } - } - - // 其他的返回null - return null; - } - - - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/SimpleBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/SimpleBean.java deleted file mode 100644 index 16d4897..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/factory/SimpleBean.java +++ /dev/null @@ 
-1,30 +0,0 @@ -package com.forte.util.factory; - -import java.util.function.Supplier; - -/** - * 就是一个类的一层封装类 - * @author ForteScarlet - */ -public class SimpleBean implements Supplier { - - private T bean = null; - - public SimpleBean(){ } - public SimpleBean(T bean){ - this.bean = bean; - } - - public void set(T bean){ - this.bean = bean; - } - - /** - * Gets a result. - * @return a result - */ - @Override - public T get() { - return bean; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ArrayFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ArrayFieldValueGetter.java deleted file mode 100644 index 097456d..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ArrayFieldValueGetter.java +++ /dev/null @@ -1,238 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; -import com.forte.util.utils.RandomUtil; - -import javax.script.ScriptException; - -/** - * 数组类型的字段值获取器,与{@link ListFieldValueGetter}十分相似,基本可是说是只有参数类似不同了 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class ArrayFieldValueGetter implements FieldValueGetter { - - /** - * 方法执行者们 - * 期望中 只有一个执行者 - * 不排除多个执行者的情况, - * 但是如果是多个执行者,List集合很大可能是String、Integer之类的基础数据类型 - */ - private final Invoker[] invokers; - - /** - * 区间参数,重复最终输出,参数期望中长度为2,0索引为最小值,1为最大值 - * 默认值为[1,1],即为不重复 - */ - private final Integer[] integerInterval; - - /** - * 多余字符 - * 集合字段的指令类参数中可能出现多余字符,当有多于字符的时候,list集合的类型有很大的概率是Sting、Integer之类的基础数据类型 - */ - private final String[] moreStrs; - - /** - * 获取一个数组 - * - * @return - */ - @Override - public Object[] value() { - //获取执行次数 - Integer min = integerInterval[0]; - Integer max = integerInterval[1]; - int num = (max == null ? 
min : RandomUtil.getNumberWithRight(min, max)); - //创建一个Object类型的List集合,用于保存数据 - Object[] list = new Object[num]; - //判断执行者的数量 - if (invokers.length > 1) { - - //当执行者数量大于1,执行方法 - getValueWhenInvokersMoreThan1(num, list); - - } else if (invokers.length <= 0) { - /* - * 没有执行者的情况,创建并返回一个空的字符串集合 - * 一般情况下不会出现空执行者的情况,就算是没有可执行方法也会有空值执行者 - */ - return new Object[0]; - } else { - - //当执行者的数量不大于1的时候,执行方法 - getValueWhenInvokerIs1(num, list); - } - //返回结果 - return list; - } - - /** - * 当执行者的数量超过1的时候 - * - * @param num - * @param list - */ - private void getValueWhenInvokersMoreThan1(int num, Object[] list) { - //执行者数量大于1的情况下,只能将全部执行结果的toString拼接,并尝试使用eval进行执行 - //如果eval可以执行,则保存eval中得到的结果,如果无法执行则返回拼接字符串 - StringBuilder sb = new StringBuilder(); - //开始遍历并执行 - try { - for (int i = 0; i < num; i++) { - //执行全部执行者 - for (int j = 0; j < invokers.length; j++) { - //如果有多余字符,先拼接多余字符 - if (moreStrs != null) { - sb.append(moreStrs[j]); - } - //拼接执行结果 - sb.append(invokers[j].invoke()); - } - //如果有多余字符且多余字符的数量比执行者多1 - //只要多余字符比执行者数量大,则说明多余字符的数量为执行者的数量+1 - if (moreStrs != null && moreStrs.length > invokers.length) { - //拼接多余字符的最后值 - sb.append(moreStrs[moreStrs.length - 1]); - } - String invokeStr = sb.toString(); - try { - //尝试使用eval进行执行 - Object eval = MethodUtil.eval(invokeStr); - //如果能执行成功,保存这个执行结果到集合 - list[i] = (eval); - } catch (ScriptException e) { - //如果执行失败,保存执行前的字符串 - list[i] = (invokeStr); - } - } - } catch (Exception e) { - throw new MockException(e); - } - } - - - /** - * 当执行者数量不大于1的时候 - * - * @param num - * @param list - */ - private void getValueWhenInvokerIs1(int num, Object[] list) { - - //执行者数量不大于1,即只有一个 - Invoker invoker = invokers[0]; - //尽管只有一个方法执行者,但是仍然可能存在多余字符 - //所以分两种情况 - //在有多余字符的情况下,处理方式类似于上面的多执行者 - if (moreStrs != null) { - //如果存在多余字符 - //准备拼接结果 - StringBuilder sb = new StringBuilder(); - //遍历num次数 - try { - for (int i = 0; i < num; i++) { - //先拼接多余字符,再拼接方法执行结果 - sb.append(moreStrs[i]); - //方法的执行结果 - sb.append(invoker.invoke()); - //如果多余字符有结尾,拼接 - //由于只有一个执行者,所以如果多余字符数量大于1就说明有尾部多余 - if (moreStrs.length > 1) { - //这里的元素索引不出意外的话,必定是2 - sb.append(moreStrs[moreStrs.length - 1]); - } - String invokeData = sb.toString(); - //尝试对结果进行eval - //如果执行成功,保存执行结果 - list[i] = (MethodUtil.evalCache(invokeData)); - } - } catch (Exception e) { - throw new MockException(e); - } - - } else { - //没有多余字符 - //只有一个方法执行者、且没有多余字符的情况是最稳定的情况。 - //指令参数类型情况下,这种类型只有及低的可能会出现类型异常(只要你的list类型没有填写错误) - //遍历num次数,执行执行者并将结果保存至list集合 - try { - for (int i = 0; i < num; i++) { - //执行结果 - list[i] = invoker.invoke(); - } - } catch (Exception e) { - throw new MockException(e); - } - } - } - - - /** - * 构造方法 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - public ArrayFieldValueGetter(Invoker[] invokers, Integer[] integerInterval, String[] moreStrs) { - this.invokers = invokers; - //如果多余字符长度为0,则赋值为null - this.moreStrs = moreStrs.length == 0 ? null : moreStrs; - //如果为true,则使用默认的数组 - boolean isNull = integerInterval == null || integerInterval.length > 2 || integerInterval[0] == null || integerInterval[1] == null; - if (isNull) { - this.integerInterval = new Integer[]{1, 1}; - } else { - this.integerInterval = integerInterval; - } - } - - /** - * 构造方法,区间参数默认为[1,1] - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - public ArrayFieldValueGetter(Invoker[] invokers, String[] moreStrs) { - this.invokers = invokers; - this.integerInterval = new Integer[]{1, 1}; - //如果多余字符长度为0,则赋值为null - this.moreStrs = moreStrs.length == 0 ? 
null : moreStrs; - } - - - /** - * 构造方法,没有多余字符 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - */ - public ArrayFieldValueGetter(Invoker[] invokers, Integer[] integerInterval) { - this.invokers = invokers; - //判断:数组为null || 长度大于2 || 左参数为null || 左右参数都为null - //如果为true,则使用默认的数组 - boolean isNull = integerInterval == null || integerInterval.length > 2 || integerInterval[0] == null || integerInterval[1] == null; - if (isNull) { - this.integerInterval = new Integer[]{1, 1}; - } else { - this.integerInterval = integerInterval; - } - //多余字符赋值为null - this.moreStrs = null; - } - - /** - * 构造方法,没有多余字符,区间参数默认为[1,1] - * - * @param invokers 方法执行者 - */ - public ArrayFieldValueGetter(Invoker[] invokers) { - this.invokers = invokers; - this.integerInterval = new Integer[]{1, 1}; - //多余字符赋值为null - this.moreStrs = null; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/DoubleFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/DoubleFieldValueGetter.java deleted file mode 100644 index 5d1ba40..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/DoubleFieldValueGetter.java +++ /dev/null @@ -1,39 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; - - -/** - * Double类型字段值获取器 - * 既然使用了此字段值获取器,则说明已经确定了字段的类型,则必定不会出现多个执行者。 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class DoubleFieldValueGetter implements FieldValueGetter { - - /** - * 方法执行者,用于获取double值 - * 方法执行者必定为1个 - */ - private Invoker invoker; - - @Override - public Double value() { - //直接返回执行结果 - try { - return (Double) invoker.invoke(); - } catch (Exception e) { - throw new MockException(e); - } - } - - - /** - * 构造方法,只需要一个方法执行者 - */ - public DoubleFieldValueGetter(Invoker invoker) { - this.invoker = invoker; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/EnumFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/EnumFieldValueGetter.java deleted file mode 100644 index 876aa70..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/EnumFieldValueGetter.java +++ /dev/null @@ -1,19 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -/** - * - * 枚举类型的字段值获取器 - * - * @deprecated 尚未实现完成 - * - * @author ForteScarlet - * @date 2020/8/1 - */ -@Deprecated -public class EnumFieldValueGetter> implements FieldValueGetter> { - - @Override - public Enum value() { - return null; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/FieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/FieldValueGetter.java deleted file mode 100644 index 7985d75..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/FieldValueGetter.java +++ /dev/null @@ -1,16 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -/** - * 字段值获取器 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -@FunctionalInterface -public interface FieldValueGetter { - - /** - * 获取这个字段的参数 - */ - T value(); - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/IntegerFieldValueGetter.java 
b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/IntegerFieldValueGetter.java deleted file mode 100644 index b17245d..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/IntegerFieldValueGetter.java +++ /dev/null @@ -1,42 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; - - -/** - * 整数类型字段值获取器 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class IntegerFieldValueGetter implements FieldValueGetter { - - /** - * 方法执行者用于获取整数类型的字段值 - * 执行者必然只有一个 - */ - private Invoker invoker; - - - /** - * 获取一个整数类型的字段值 - * @return - */ - @Override - public Integer value() { - try { - return (Integer) invoker.invoke(); - } catch (Exception e) { - throw new MockException(e); - } - } - - - /** - * 构造方法,只需要一个方法执行者 - */ - public IntegerFieldValueGetter(Invoker invoker) { - this.invoker = invoker; - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ListFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ListFieldValueGetter.java deleted file mode 100644 index 6f6cbd0..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ListFieldValueGetter.java +++ /dev/null @@ -1,324 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; -import com.forte.util.utils.RandomUtil; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.concurrent.ThreadLocalRandom; -import java.util.function.Supplier; - -/** - * List集合的字段值获取器 - * 有较大的可能出现一些类型异常,请注意一定按照规范填写 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class ListFieldValueGetter implements FieldValueGetter { - - - /** - * 方法执行者们 - * 期望中 只有一个执行者 - * 不排除多个执行者的情况, - * 但是如果是多个执行者,List集合很大可能是String、Integer之类的基础数据类型 - */ - private final Invoker[] invokers; - - /** - * 区间参数,重复最终输出,参数期望中长度为2,0索引为最小值,1为最大值 - * 默认值为[1,1],即为不重复 - * - * @since 1.7.0 当前版本替换为获取函数 - */ - private final Supplier integerIntervalSupplier; - - /** - * 多余字符 - * 集合字段的指令类参数中可能出现多余字符,当有多于字符的时候,list集合的类型有很大的概率是Sting、Integer之类的基础数据类型 - */ - private final String[] moreStrs; - - - /** - * 获取字段值 - */ - @Override - public List value() { - Integer[] integerInterval = integerIntervalSupplier.get(); - //创建一个Object类型的List集合,用于保存数据 - List list = new ArrayList(); - //获取执行次数 - Integer min = integerInterval[0]; - Integer max = integerInterval[1]; - int num = (max == null ? 
min : RandomUtil.getNumberWithRight(min, max)); - //判断执行者的数量 - if (invokers.length > 1) { - - //当执行者数量大于1,执行方法 - getValueWhenInvokersMoreThan1(num, list); - - } else if (invokers.length <= 0) { - /* - * 没有执行者的情况,创建并返回一个空的字符串集合 - * 一般情况下不会出现空执行者的情况,就算是没有可执行方法也会有空值执行者 - */ -// return new ArrayList(); - return list; - } else { - - //当执行者的数量不大于1的时候,执行方法 - getValueWhenInvokerIs1(num, list); - } - //返回结果 - return list; - } - - /** - * 当执行者的数量超过1的时候 - * - * @param num - * @param list - */ - private void getValueWhenInvokersMoreThan1(int num, List list) { - //执行者数量大于1的情况下,只能将全部执行结果的toString拼接,并尝试使用eval进行执行 - //如果eval可以执行,则保存eval中得到的结果,如果无法执行则返回拼接字符串 - StringBuilder sb = new StringBuilder(); - //开始遍历并执行 - try { - for (int i = 0; i < num; i++) { - //执行全部执行者 - for (int j = 0; j < invokers.length; j++) { - //如果有多余字符,先拼接多余字符 - if (moreStrs != null) { - sb.append(moreStrs[j]); - } - //拼接执行结果 - sb.append(invokers[j].invoke()); - } - - //如果有多余字符且多余字符的数量比执行者多1 - //只要多余字符比执行者数量大,则说明多余字符的数量为执行者的数量+1 - if (moreStrs != null && moreStrs.length > invokers.length) { - //拼接多余字符的最后值 - sb.append(moreStrs[moreStrs.length - 1]); - } - String invokeStr = sb.toString(); - //尝试使用eval进行执行 - Object eval = MethodUtil.evalCache(invokeStr); - //如果能执行成功,保存这个执行结果到集合 - list.add(eval); - } - } catch (Exception e) { - throw new MockException(e); - } - } - - - /** - * 当执行者数量不大于1的时候 - * - * @param num - * @param list - */ - private void getValueWhenInvokerIs1(int num, List list) { - //执行者数量不大于1,即只有一个 - Invoker invoker = invokers[0]; - //尽管只有一个方法执行者,但是仍然可能存在多余字符 - //所以分两种情况 - //在有多余字符的情况下,处理方式类似于上面的多执行者 - if (moreStrs != null) { - //如果存在多余字符 - //准备拼接结果 - StringBuilder sb = new StringBuilder(); - //遍历num次数 - try { - for (int i = 0; i < num; i++) { - //先拼接多余字符,再拼接方法执行结果 - sb.append(moreStrs[i]); - //方法的执行结果 - sb.append(invoker.invoke()); - //如果多余字符有结尾,拼接 - //由于只有一个执行者,所以如果多余字符数量大于1就说明有尾部多余 - if (moreStrs.length > 1) { - //这里的元素索引不出意外的话,必定是2 - sb.append(moreStrs[moreStrs.length - 1]); - } - String invokeData = sb.toString(); - //尝试对结果进行eval - //如果执行成功,保存执行结果 - Object eval = MethodUtil.evalCache(invokeData); - list.add(eval); - } - } catch (Exception e) { - throw new MockException(e); - } - } else { - //没有多余字符 - //只有一个方法执行者、且没有多余字符的情况是最稳定的情况。 - //指令参数类型情况下,这种类型只有及低的可能会出现类型异常(只要你的list类型没有填写错误) - //遍历num次数,执行执行者并将结果保存至list集合 - try { - for (int i = 0; i < num; i++) { - //执行结果 - list.add(invoker.invoke()); - } - } catch (Exception e) { - throw new MockException(e); - //如果执行出现错误,保存一个空值 null -// list.add(null); - } - } - } - - - /** - * 获取一个固定值的区间获取函数 - * - * @param a 固定区间 a - * @param b 固定区间 b - */ - static Supplier normalIntegerIntervalSupplier(int a, int b) { - Integer[] intervalsNew = new Integer[]{a, b}; - return () -> intervalsNew; - } - - /** - * 获取一个固定值的区间获取函数 - * - * @param intervals 固定区间 - */ - static Supplier normalIntegerIntervalSupplier(Integer[] intervals) { - Integer[] intervalsNew = Arrays.copyOf(intervals, intervals.length); - return () -> intervalsNew; - } - - /** - * 根据两个区间来随机获取其中一个区间的函数 - * - * @param intervals1 第一个区间 - * @param intervals2 第二个区间 - * @return 获取函数 - */ - static Supplier normalIntegerIntervalSupplier(Integer[] intervals1, Integer[] intervals2) { - Integer[] intervals1New = Arrays.copyOf(intervals1, intervals1.length); - Integer[] intervals2New = Arrays.copyOf(intervals2, intervals2.length); - return () -> ThreadLocalRandom.current().nextBoolean() ? 
intervals1New : intervals2New; - } - - - /** - * 构造方法 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - public ListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval, String[] moreStrs) { - this.invokers = invokers; - //如果多余字符长度为0,则赋值为null - this.moreStrs = moreStrs.length == 0 ? null : moreStrs; - //如果为true,则使用默认的数组 - boolean isNull = integerInterval == null || integerInterval.length > 2 || integerInterval[0] == null || integerInterval[1] == null; - if (isNull) { - this.integerIntervalSupplier = normalIntegerIntervalSupplier(1, 1); - } else { - this.integerIntervalSupplier = normalIntegerIntervalSupplier(integerInterval); - } - } - - /** - * 构造方法 - * - * @param invokers 方法执行者 - * @param integerInterval1 区间参数 - * @param integerInterval2 区间参数 - * @param moreStrs 多余字符 - */ - public ListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval1, Integer[] integerInterval2, String[] moreStrs) { - this.invokers = invokers; - //如果多余字符长度为0,则赋值为null - this.moreStrs = moreStrs.length == 0 ? null : moreStrs; - //如果为true,则使用默认的数组 - boolean isNull1 = integerInterval1 == null || integerInterval1.length > 2 || integerInterval1[0] == null || integerInterval1[1] == null; - boolean isNull2 = integerInterval2 == null || integerInterval2.length > 2 || integerInterval2[0] == null || integerInterval2[1] == null; - - Integer[] integerInterval1New = isNull1 ? new Integer[]{1, 1} : integerInterval1; - Integer[] integerInterval2New = isNull2 ? integerInterval1New : integerInterval2; - - this.integerIntervalSupplier = normalIntegerIntervalSupplier(integerInterval1New, integerInterval2New); - } - - /** - * 构造方法,区间参数默认为[1,1] - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - public ListFieldValueGetter(Invoker[] invokers, String[] moreStrs) { - this.invokers = invokers; - this.integerIntervalSupplier = normalIntegerIntervalSupplier(1, 1); - //如果多余字符长度为0,则赋值为null - this.moreStrs = moreStrs.length == 0 ? null : moreStrs; - } - - /** - * 构造方法,没有多余字符 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - */ - public ListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval) { - this.invokers = invokers; - //判断:数组为null || 长度大于2 || 左参数为null || 左右参数都为null - //如果为true,则使用默认的数组 - boolean isNull = integerInterval == null || integerInterval.length > 2 || integerInterval[0] == null || integerInterval[1] == null; - if (isNull) { - this.integerIntervalSupplier = normalIntegerIntervalSupplier(1, 1); - } else { - this.integerIntervalSupplier = normalIntegerIntervalSupplier(integerInterval); - } - //多余字符赋值为null - this.moreStrs = null; - } - - /** - * 构造方法,没有多余字符 - * - * @param invokers 方法执行者 - * @param integerInterval1 区间参数1 - * @param integerInterval2 区间参数2 - */ - public ListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval1, Integer[] integerInterval2) { - this.invokers = invokers; - //判断:数组为null || 长度大于2 || 左参数为null || 左右参数都为null - //如果为true,则使用默认的数组 - boolean isNull1 = integerInterval1 == null || integerInterval1.length > 2 || integerInterval1[0] == null || integerInterval1[1] == null; - boolean isNull2 = integerInterval2 == null || integerInterval2.length > 2 || integerInterval2[0] == null || integerInterval2[1] == null; - - Integer[] integerInterval1New = isNull1 ? new Integer[]{1, 1} : integerInterval1; - Integer[] integerInterval2New = isNull2 ? 
integerInterval1New : integerInterval2; - - this.integerIntervalSupplier = normalIntegerIntervalSupplier(integerInterval1New, integerInterval2New); - - //多余字符赋值为null - this.moreStrs = null; - } - - /** - * 构造方法,没有多余字符,区间参数默认为[1,1] - * - * @param invokers 方法执行者 - */ - public ListFieldValueGetter(Invoker[] invokers) { - this.invokers = invokers; - this.integerIntervalSupplier = normalIntegerIntervalSupplier(1, 1); - //多余字符赋值为null - this.moreStrs = null; - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ObjectFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ObjectFieldValueGetter.java deleted file mode 100644 index c541464..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/ObjectFieldValueGetter.java +++ /dev/null @@ -1,73 +0,0 @@ -package com.forte.util.fieldvaluegetter; - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; - -import javax.script.ScriptException; - -/** - * 字段类型为任意未知类型的时候 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class ObjectFieldValueGetter implements FieldValueGetter { - - /** - * 方法执行者,期望值是只有一个,但是不能保证,假如有多个参数的话,尝试使用加法运算进行相加 - */ - private final Invoker[] invokers; - - /** - * 获取值 - * - * @return - */ - @Override - public Object value() { - try { - //如果只有一个执行者 - if (invokers.length > 1) { - //不止一个,拼接结果为类js代码并执行eval() - //用于执行eval的拼接字符串 - StringBuilder evalString = new StringBuilder(); - //用于防止执行出现错误的直接返回用的字符串 - StringBuilder returnString = new StringBuilder(); - //遍历执行并拼接 - for (int i = 0; i < invokers.length; i++) { - if (i != 0) { - evalString.append("+"); - } - //执行结果 - Object invoke = invokers[i].invoke(); - evalString.append(invoke); - returnString.append(invoke); - } - //遍历结束,执行加法运算并返回结果 - String forEval = evalString.toString(); - try { - return MethodUtil.eval(forEval); - } catch (ScriptException e) { - //如果出现异常,则直接返回结果的拼接字符串 - return returnString.toString(); - } - }else { - //直接返回执行结果 - return invokers[0].invoke(); - } - } catch (Exception e) { - //出现异常,抛出 - throw new MockException(e); - } - } - - - /** - * 构造方法 - * - * @param invokers 方法执行者 - */ - public ObjectFieldValueGetter(Invoker[] invokers) { - this.invokers = invokers; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/StringFieldValueGetter.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/StringFieldValueGetter.java deleted file mode 100644 index 988e3c6..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/fieldvaluegetter/StringFieldValueGetter.java +++ /dev/null @@ -1,136 +0,0 @@ -package com.forte.util.fieldvaluegetter; - - -import com.forte.util.exception.MockException; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; -import com.forte.util.utils.RandomUtil; - -import java.util.Collections; -import java.util.function.Supplier; - -/** - * 字符串类型字段值的获取者 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class StringFieldValueGetter implements FieldValueGetter { - - /** - * 方法执行者,顺序与解析出来的方法和多余字符之间的顺序一致 - */ - private final Invoker[] invokers; - - /** - * 多余字符,如果不为null,则长度必然是 方法执行者invokers 的数量的+1或者相同 - */ - private final String[] moreStr; - - /** - * 区间参数,重复最终输出,参数期望中长度为2,0索引为最小值,1为最大值 - * 默认值为[1,1],即为不重复 - */ - private 
final Integer[] integerInterval; - - - /** - * 获取重复次数的获取函数。 - */ - private final Supplier timeSupplier; - - - /** - * 获取字段值 - * - * @return - */ - @Override - public String value() { - StringBuilder sb = new StringBuilder(32); - //同时遍历方法与多余字符,使用methods遍历 - int i = 0; - int invokerLength = invokers.length; - try { - for (; i < invokerLength; i++) { - //如果有多余字符,先存多余字符,后存执行结果 - if (moreStr != null) { - sb.append(moreStr[i]); - } - - sb.append(invokers[i].invoke()); - } - } catch (Exception e) { - throw new MockException(e); - } - //如果多余字符不为空 - //判断多余字符的数量: - // 如果数量相等,说明在最后的方法后面没有多余参数, - // 如果数量多1,则说明在最后的方法后面还有多余字符 - //如果尾部有多余字符,添加 - if (moreStr != null && moreStr.length > invokerLength) { - sb.append(moreStr[i]); - } - - //重复输出,次数为integerInterval的参数 - //如果没有右参数,重复次数则为左参数 -// int times; - int times = timeSupplier.get(); -// if(integerInterval[1] == null){ -// times = integerInterval[0]; -// }else{ -// int min = integerInterval[0]; -// int max = integerInterval[1]; -// times = RandomUtil.getNumberWithRight(min , max); -// } - - //有些少数情况,end中拼接后的字符串是可以作为简单JS代码执行的,在此处重复字符串之前,尝试使用eval进行执行 - String end = String.valueOf(MethodUtil.evalCache(sb.toString())); - - - //重复次数并返回 - if (times <= 1) { - return end; - } else { - return String.join("", Collections.nCopies(times, end)); - } - } - - - /** - * 构造 - * - * @param invokers - * @param moreStr - */ - public StringFieldValueGetter(Invoker[] invokers, String[] moreStr, Integer[] integerInterval) { - this.invokers = invokers; - this.moreStr = moreStr; - //判断:数组为null || 长度大于2 || 左参数为null || 左右参数都为null - //如果为true,则使用默认的数组 - boolean isNull = integerInterval == null || integerInterval.length > 2 || integerInterval[0] == null || integerInterval[1] == null; - if (isNull) { - this.integerInterval = new Integer[]{1, 1}; - this.timeSupplier = () -> 1; - } else { - this.integerInterval = integerInterval; - int min = integerInterval[0]; - int max = integerInterval[1]; - this.timeSupplier = () -> RandomUtil.getNumberWithRight(min, max); - } - } - - /** - * 构造,区间参数默认为[1-1] - * - * @param invokers - * @param moreStr - */ - public StringFieldValueGetter(Invoker[] invokers, String[] moreStr) { - this.invokers = invokers; - this.moreStr = moreStr; - //区间为默认值 - this.integerInterval = new Integer[]{1, 1}; - this.timeSupplier = () -> 1; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExFunction.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExFunction.java deleted file mode 100644 index af63774..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExFunction.java +++ /dev/null @@ -1,14 +0,0 @@ -package com.forte.util.function; - -/** - * 三个参数的function - * @author ForteScarlet - */ -@FunctionalInterface -public interface ExFunction { - - /** - * 接收三个参数,返回一个结果 - */ - R apply(T t, U1 u1, U2 u2); -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExProxyHandler.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExProxyHandler.java deleted file mode 100644 index 4bf8145..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/ExProxyHandler.java +++ /dev/null @@ -1,17 +0,0 @@ -package com.forte.util.function; - -/** - * 带着异常处理的BiFunction,用于构建动态代理的参数 - */ -@FunctionalInterface -public interface ExProxyHandler { - /** - * 函数接口 - * - * @param t 第一参数 - * @param u 第二参数 - * @return 返回值 - * @throws Throwable 任意异常 - 
*/ - R apply(T t, U u) throws Throwable; -} \ No newline at end of file diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/TypeParse.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/TypeParse.java deleted file mode 100644 index d47e20c..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/function/TypeParse.java +++ /dev/null @@ -1,23 +0,0 @@ -package com.forte.util.function; - -import com.forte.util.parser.FieldParser; - -/** - * 用于{@link com.forte.util.parser.ParameterParser}中,来注册各种参数类型的解析器。 - * 参数类型一般代表的是Map中的这个Object - * @author ForteScarlet - */ -@FunctionalInterface -public interface TypeParse { - - /** - * 接收部分参数,得到一个解析结果 - * @param objectClass 封装类型 - * @param fieldName 字段名称 - * @param intervalStr 区间字符串 - * @param value 参数 - * @return 字段解析器 - */ - FieldParser parse(Class objectClass, String fieldName, String intervalStr, Object value); - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ArrayElementInvoker.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ArrayElementInvoker.java deleted file mode 100644 index f5c0262..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ArrayElementInvoker.java +++ /dev/null @@ -1,28 +0,0 @@ -package com.forte.util.invoker; - -import com.forte.util.utils.RandomUtil; - -/** - * - * 数组类型的{@link ElementInvoker} - * - * @author ForteScarlet - * @date 2020/8/1 - */ -public class ArrayElementInvoker extends ElementInvoker { - - /** 集合参数 */ - private T[] arr; - /** - * 数组构造 - * @param arr - */ - public ArrayElementInvoker(T[] arr){ - this.arr = arr; - } - - @Override - public T getRandomElement() { - return RandomUtil.getRandomElement(arr); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ElementInvoker.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ElementInvoker.java deleted file mode 100644 index 043c428..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ElementInvoker.java +++ /dev/null @@ -1,33 +0,0 @@ -package com.forte.util.invoker; - -import java.util.List; - -/** - * 随机元素值执行者 - * 两个字段,一个有值,一个为null - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/7 20:37 - */ -public abstract class ElementInvoker implements Invoker { - - public abstract T getRandomElement(); - - /** - * 执行者,获取随机元素 - */ - @Override - public Object invoke() { - return getRandomElement(); - } - - public static ElementInvoker getInstance(T... 
array){ - return new ArrayElementInvoker<>(array); - } - - public static ElementInvoker getInstance(List list){ - return new ListElementInvoker<>(list); - } - - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/Invoker.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/Invoker.java deleted file mode 100644 index fff08ca..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/Invoker.java +++ /dev/null @@ -1,17 +0,0 @@ -package com.forte.util.invoker; - -/** - * 执行者接口,定义了一个执行者的函数,执行者会通过invoke()方法获得结果 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -@FunctionalInterface -public interface Invoker { - - /** - * 返回方法执行的结果 - * @return 获取执行结果 - * @throws Exception 可能会存在异常 - */ - Object invoke() throws Exception; -} - diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ListElementInvoker.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ListElementInvoker.java deleted file mode 100644 index 9c748cf..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/ListElementInvoker.java +++ /dev/null @@ -1,32 +0,0 @@ -package com.forte.util.invoker; - -import com.forte.util.utils.RandomUtil; - -import java.util.List; - -/** - * - * list类型的{@link ElementInvoker} - * - * @author ForteScarlet - * @date 2020/8/1 - */ -public class ListElementInvoker extends ElementInvoker { - /** 集合参数 */ - private List list; - - - - /** - * 集合构造 - * @param list - */ - public ListElementInvoker(List list){ - this.list = list; - } - - @Override - public T getRandomElement() { - return RandomUtil.getRandomElement(list); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/MethodInvoker.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/MethodInvoker.java deleted file mode 100644 index 9253c46..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/invoker/MethodInvoker.java +++ /dev/null @@ -1,101 +0,0 @@ -package com.forte.util.invoker; - -import com.forte.util.utils.MethodUtil; - -import java.lang.reflect.InvocationTargetException; -import java.lang.reflect.Method; - -/** - * 方法执行者,是{@link com.forte.util.invoker.Invoker}的是实现类,代表了一个方法的执行 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class MethodInvoker implements Invoker { - /** - * 执行方法的对象 - */ - private final Object obj; - /** - * 方法的名字 - */ - private final String methodName; - /** - * 方法的参数 - */ - private final Object[] args; - /** - * 方法的主体 - MockUtil中的静态方法 - */ - private final Method method; - - /** - * 获取一个普通的{@link MethodInvoker} - * @param obj 实例 - * @param args 参数列表 - * @param method 方法实例 - * @return {@link MethodInvoker} - */ - public static MethodInvoker getInstance(Object obj, Object[] args, Method method){ - return new MethodInvoker(obj, args, method); - } - - /** - * 获取一个常量{@link MethodInvoker} - * @param constValue 常量值 - * @return {@link MethodInvoker} - */ - public static MethodInvoker getInstance(Object constValue){ - return new ConstValueMethodInvoker(constValue); - } - - /** - * 执行方法 - * - * @return - * @throws InvocationTargetException - * @throws IllegalAccessException - */ - @Override - public Object invoke() throws InvocationTargetException, IllegalAccessException { - // 普通的执行者 - return MethodUtil.invoke(obj, args, method); - } - - /** - * 构造 - */ - 
MethodInvoker(Object obj, Object[] args, Method method) { - this.obj = obj; - if(method != null){ - this.methodName = method.getName(); - }else{ - this.methodName = null; - } - this.args = args; - this.method = method; - } - - - /** - * 常量值方法执行者 - */ - static class ConstValueMethodInvoker extends MethodInvoker { - /** - * 如果是一个空执行者,将会将对象返回 - */ - private final Object constValue; - - ConstValueMethodInvoker(Object constValue){ - super(null, null, null); - this.constValue = constValue; - } - - /** - * 返回常量值 - */ - @Override - public Object invoke() { - return constValue; - } - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/BranchResult.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/BranchResult.java deleted file mode 100644 index 880f878..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/BranchResult.java +++ /dev/null @@ -1,24 +0,0 @@ -package com.forte.util.loader; - -/** - * 直译:分支结果 - * 代表结果有着成功与否 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/26 17:53 - * @since JDK1.8 - **/ -public interface BranchResult extends Result { - - /** - * 判断是否成功 - * @return - */ - Boolean isSuccess(); - - /** - * 如果失败,为何失败 - * @return - */ - Exception why(); - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/DefaultMockMethodLoader.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/DefaultMockMethodLoader.java deleted file mode 100644 index a5fe0d2..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/DefaultMockMethodLoader.java +++ /dev/null @@ -1,369 +0,0 @@ -package com.forte.util.loader; - -import com.forte.util.Mock; - -import java.lang.reflect.Method; -import java.util.*; -import java.util.function.Predicate; -import java.util.stream.Collectors; - -/** - * 基础的假方法加载者 - * 加载的类或方法需要满足以下要求:
- *  • 加载的方法不可与{@link com.forte.util.utils.MockUtil}中出现的方法发生方法名相同,参数数量也相同的情况,如果发生此情况,将会抛出异常。
- *  • 方法必须有返回值(非void)
- * ※ 本类目前不保证线程安全 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/26 17:33 - * @since JDK1.8 - **/ -public class DefaultMockMethodLoader implements MethodLoader { - - /** 获取mock方法集,此时的方法集已经通过静态代码块初始化完毕 */ - private final Map MOCK_METHOD; - - /** 要加载的方法,方法不能重复,使用set */ - private Set waitingMethods = new HashSet<>(10); - - public DefaultMockMethodLoader(Map mockMethod){ - this.MOCK_METHOD = mockMethod; - } - - /** - * 根据方法名加载一个方法,如果方法名对应了多个方法,则会全部进行判断,因此可能会有多个方法 - * @param loadClz 指定类 - * @param methodName 方法名 - * @return 返回自身-链式 - */ - @Override - public MethodLoader append(Class loadClz, String methodName) { - //获取全部方法 - Method[] declaredMethods = loadClz.getDeclaredMethods(); - //返回结果 - return appends(Arrays.stream(declaredMethods).filter(m -> m.getName().equals(methodName)).toArray(Method[]::new)); - } - - /** - * 添加一个方法 - * @param method 要加载的方法 - * @return 返回自身-链式 - */ - @Override - public MethodLoader append(Method method) { - if(can(method)){ - this.waitingMethods.add(method); - } - return this; - } - - /** - * 加载多个方法 - * @param methods 要加载的方法列表 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appends(Method... methods) { - //筛选,方法名匹配且参数类型 - Set q = Arrays.stream(methods).filter(this::can).collect(Collectors.toSet()); - //添加结果 - this.waitingMethods.addAll(q); - return this; - } - - /** - * 加载类中的全部方法 - * @param loadClz 加载方法的类 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendAll(Class loadClz) { - return appends(loadClz.getDeclaredMethods()); - } - - /** - * 根据方法名筛选方法筛选 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendForNameFilter(Class loadClz, Predicate predicate) { - //先根据方法名筛选过滤 - Method[] q = Arrays.stream(loadClz.getDeclaredMethods()).filter(m -> predicate.test(m.getName())).toArray(Method[]::new); - return appends(q); - } - - /** - * 根据方法筛选方法筛选 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendForMethodFilter(Class loadClz, Predicate predicate) { - //根据条件筛选 - Method[] q = Arrays.stream(loadClz.getDeclaredMethods()).filter(predicate).toArray(Method[]::new); - return appends(q); - } - - /** - * 加载指定方法名的多个方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendByNames(Class loadClz, String[] names) { - //遍历每个名字并尝试加载 - for (String name : names) { - append(loadClz , name); - } - return this; - } - - /** - * 加载指定方法名的多个方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendByNames(Class loadClz, List names) { - return appendByNames(loadClz , names.toArray(new String[0])); - } - - /** - * 根据正则规则匹配方法中的方法名 - * @param loadClz 加载方法的类 - * @param regex 正则表达式 - * @return 返回自身-链式 - */ - @Override - public MethodLoader appendByRegex(Class loadClz, String regex) { - return appendForNameFilter(loadClz, s -> s.matches(regex)); - } - - /** - * 过滤 - * @param predicate 过滤规则 - * @return 过滤后的结果 - */ - @Override - public MethodLoader filter(Predicate predicate) { - //过滤并重新赋值 - waitingMethods = waitingMethods.stream().filter(predicate).collect(Collectors.toSet()); - return this; - } - - - /* ———————————————————————— 非链式方法 —————————————————————— */ - - /** - * 加载某类中指定方法名的方法。如果有重载方法将会全部判断 - * - * @param loadClz 指定类 - * @param methodName 方法名 - * @return 处理结果 - */ - @Override - public LoadResults add(Class loadClz, String methodName) { - //获取全部方法 - Method[] declaredMethods = 
loadClz.getDeclaredMethods(); - //返回结果 - return adds(declaredMethods); - - } - - /** - * 加载指定方法 - * - * @param method 要加载的方法 - * @return 处理结果 - */ - @Override - public LoadResults add(Method method) { - //返回结果 - return adds(method); - } - - /** - * 直接加载方法 - * - * @param methods 要加载的方法列表 - * @return 处理结果 - */ - @Override - public LoadResults adds(Method... methods) { - //筛选,方法名匹配且参数类型 - Set q = Arrays.stream(methods).filter(this::can).collect(Collectors.toSet()); - - //直接添加方法 - return load(q); - } - - /** - * 加载class中的全部方法 - * - * @param loadClz 加载方法的类 - * @return 处理结果 - */ - @Override - public LoadResults addAll(Class loadClz) { - return adds(loadClz.getDeclaredMethods()); - } - - /** - * 根据匹配规则对类中的方法名进行过滤 - * - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - @Override - public LoadResults addForNameFilter(Class loadClz, Predicate predicate) { - Method[] q = Arrays.stream(loadClz.getDeclaredMethods()).filter(m -> predicate.test(m.getName())).toArray(Method[]::new); - return adds(q); - } - - /** - * 根据匹配规则对类中的方法进行过滤 - * - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - @Override - public LoadResults addForMethodFilter(Class loadClz, Predicate predicate) { - //根据条件筛选 - Method[] q = Arrays.stream(loadClz.getDeclaredMethods()).filter(predicate).toArray(Method[]::new); - return adds(q); - } - - /** - * 根据方法名列表加载class中的指定方法 - * - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - @Override - public LoadResults addByNames(Class loadClz, String[] names) { - //根据名称获取全部方法 - Method[] methods = loadClz.getMethods(); - //将方法名匹配的方法留下 - Method[] endMethod = Arrays.stream(methods).filter(m -> Arrays.stream(names).anyMatch(n -> n.equals(m.getName()))).toArray(Method[]::new); - - //返回结果 - return adds(endMethod); - } - - /** - * 根据方法名列表加载class中的指定方法 - * - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - @Override - public LoadResults addByNames(Class loadClz, List names) { - return addByNames(loadClz, names.toArray(new String[0])); - } - - /** - * 根据正则对方法名匹配并加载class中符合条件的方法 - * - * @param loadClz 加载方法的类 - * @param regex 正则表达式 - * @return 处理结果 - */ - @Override - public LoadResults addByRegex(Class loadClz, String regex) { - return addForNameFilter(loadClz , s -> s.matches(regex)); - } - - /* ———————————————————————— 终结方法 —————————————————————— */ - - /** - * 要加载的内容是否为空 - * @return 是否为空 - */ - @Override - public boolean isEmpty() { - return waitingMethods.isEmpty(); - } - - /** - * 加载 - * @return 加载结果 - */ - @Override - public LoadResults load() { - return load(waitingMethods); - } - - - /** - * 将传入的方法加载至随机方法集中并返回添加结果报告 - * @param methods - * @return - */ - private LoadResults load(Set methods){ - //遍历要加载的方法并添加,并获取结果返回值 - Set> collect = methods.stream().flatMap(m -> { - Map methodMap = new HashMap<>(8); - //格式化方法名,并作为key - String key = m.getName() + "(" - + Arrays.stream(m.getParameterTypes()) - .map(Class::getName) - .collect(Collectors.joining(",")) + - ")"; - methodMap.put(key, m); - - //添加记录 - MOCK_METHOD.put(key, m); - - return methodMap.entrySet().stream(); - }).map(e -> { - //遍历所有的Entry对象 - try { - Method put = Mock._getMockMethod().put(e.getKey(), e.getValue()); - //如果添加成功则保存 - return MockMethodLoadResult.success(put); - } catch (Exception err) { - //如果添加失败则返回错误 - return MockMethodLoadResult.fail(e.getValue(), err); - } - }).collect(Collectors.toSet()); - - //返回结果集封装 - return new MockMethodLoadReport(collect); - } - - - /* ———————————————————————— 部分getter的api —————————————————————— */ - - - /** - * 等待加载的方法集 - 
* - * @return - */ - @Override - public Set waiting() { - return waitingMethods; - } - - /** - * 等待加载的方法集的数量 - * - * @return - */ - @Override - public int waitingNum() { - return waitingMethods.size(); - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/LoadResults.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/LoadResults.java deleted file mode 100644 index 88d1e54..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/LoadResults.java +++ /dev/null @@ -1,45 +0,0 @@ -package com.forte.util.loader; - -import java.lang.reflect.Method; -import java.util.Map; -import java.util.Set; - -/** - * 方法加载后的返回值接口 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/26 17:40 - * @since JDK1.8 - **/ -public interface LoadResults { - - /** - * 将成功加载的结果返回 - * @return 成功结果 - */ - Set loadSuccessResults(); - - /** - * 将结果作为结果返回,分为成功和失败两种结果 - * @return - */ - Map> loadResults(); - - /** - * 成功结果的数量 - * @return - */ - int successNums(); - - /** - * 失败的结果数量 - * @return - */ - int failNums(); - - /** - * 将加载错误的方法返回,并告知其失败原因 - * @return - */ - Map whyFail(); - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MethodLoader.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MethodLoader.java deleted file mode 100644 index 7e49ab6..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MethodLoader.java +++ /dev/null @@ -1,245 +0,0 @@ -package com.forte.util.loader; - -import com.forte.util.Mock; - -import java.lang.reflect.Method; -import java.util.Arrays; -import java.util.List; -import java.util.Map; -import java.util.Set; -import java.util.function.Predicate; -import java.util.stream.Collectors; - -/** - * 假方法加载类
- * 加载的类或方法需要满足以下要求:
- *  • 加载的方法不可与{@link com.forte.util.utils.MockUtil}中出现的方法发生方法名相同,参数数量也相同的情况,如果发生此情况,将会抛出异常。
- *  • 方法必须为有返回值的(非void)
- *  • 方法如果存在参数,请使用引用数据类型,避免使用基本数据类型
- * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/24 20:36 - * @version 1.1 - * @since JDK1.8 - **/ -public interface MethodLoader { - - /* —————————————————————————— 预载单方法 —————————————————————————— */ - - /** - * 加载某类中指定方法名的方法。如果有重载方法将会全部判断 - * @param loadClz 指定类 - * @param methodName 方法名 - * @return 处理结果 - */ - MethodLoader append(Class loadClz , String methodName); - - /** - * 加载指定方法 - * @param method 要加载的方法 - * @return 处理结果 - */ - MethodLoader append(Method method); - - /* —————————————————————————— 预载多方法 —————————————————————————— */ - - /** - * 直接加载方法 - * @param methods 要加载的方法列表 - * @return 处理结果 - */ - MethodLoader appends(Method... methods); - - - /** - * 加载class中的全部方法 - * @param loadClz 加载方法的类 - * @return 处理结果 - */ - MethodLoader appendAll(Class loadClz); - - /** - * 根据匹配规则对类中的方法名进行过滤 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - MethodLoader appendForNameFilter(Class loadClz , Predicate predicate); - - /** - * 根据匹配规则对类中的方法进行过滤 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - MethodLoader appendForMethodFilter(Class loadClz , Predicate predicate); - - /** - * 根据方法名列表加载class中的指定方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - MethodLoader appendByNames(Class loadClz , String[] names); - - /** - * 根据方法名列表加载class中的指定方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - MethodLoader appendByNames(Class loadClz , List names); - - /** - * 根据正则对方法名匹配并加载class中符合条件的方法 - * @param loadClz 加载方法的类 - * @param regex 正则表达式 - * @return 处理结果 - */ - MethodLoader appendByRegex(Class loadClz , String regex); - - /* —————————————————————————— 数据处理 —————————————————————————— */ - - /** - * 对方法进行过滤 - * @param predicate 过滤规则 - * @return 处理结果 - */ - MethodLoader filter(Predicate predicate); - - - /** - * 判断此方法是否可行 - * @param method - * @return - */ - default boolean can(Method method){ - //如果此方法没有返回值则直接返回false - if(method.getReturnType().equals(void.class)){ - return false; - } - //获取已经加载的方法 - Map methodMap = Mock._getMockMethod(); - String keyName = method.getName() + "(" + Arrays.stream(method.getParameters()).map(p -> p.getType().getTypeName()).collect(Collectors.joining(",")) + ")"; - return methodMap.entrySet().stream().noneMatch( - e -> keyName.equals(e.getKey()) - && (method.getParameters().length == e.getValue().getParameters().length) - ); - } - - - - /* —————————————————————————— 加载/终结方法 —————————————————————————— */ - - - /** - * 将预载内容加载至方法集 - * @return 加载成功数量 - */ - LoadResults load(); - - - - /* —————————————————————————— 加载/终结方法-非链式 —————————————————————————— */ - - /* —————————————————————————— 预载单方法 —————————————————————————— */ - - /** - * 加载某类中指定方法名的方法。如果有重载方法将会全部判断 - * @param loadClz 指定类 - * @param methodName 方法名 - * @return 处理结果 - */ - LoadResults add(Class loadClz , String methodName); - - /** - * 加载指定方法 - * @param method 要加载的方法 - * @return 处理结果 - */ - LoadResults add(Method method); - - /* —————————————————————————— 预载多方法 —————————————————————————— */ - - /** - * 直接加载方法 - * @param methods 要加载的方法列表 - * @return 处理结果 - */ - LoadResults adds(Method... 
methods); - - - /** - * 加载class中的全部方法 - * @param loadClz 加载方法的类 - * @return 处理结果 - */ - LoadResults addAll(Class loadClz); - - /** - * 根据匹配规则对类中的方法名进行过滤 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - LoadResults addForNameFilter(Class loadClz , Predicate predicate); - - /** - * 根据匹配规则对类中的方法进行过滤 - * @param loadClz 加载方法的类 - * @param predicate 匹配规则 - * @return 处理结果 - */ - LoadResults addForMethodFilter(Class loadClz , Predicate predicate); - - /** - * 根据方法名列表加载class中的指定方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - LoadResults addByNames(Class loadClz , String[] names); - - /** - * 根据方法名列表加载class中的指定方法 - * @param loadClz 加载方法的类 - * @param names 方法名列表 - * @return 处理结果 - */ - LoadResults addByNames(Class loadClz , List names); - - /** - * 根据正则对方法名匹配并加载class中符合条件的方法 - * @param loadClz 加载方法的类 - * @param regex 正则表达式 - * @return 处理结果 - */ - LoadResults addByRegex(Class loadClz , String regex); - - /* —————————————————————————— 一些getter方法等api —————————————————————————— */ - - /** - * 等待加载的方法集 - * @return - */ - Set waiting(); - - /** - * 等待加载的方法集的数量 - * @return - */ - int waitingNum(); - - /** - * 判断预载内容是否为空 - * @return 判断结果 - */ - boolean isEmpty(); - - - - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadReport.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadReport.java deleted file mode 100644 index 64b4765..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadReport.java +++ /dev/null @@ -1,79 +0,0 @@ -package com.forte.util.loader; - -import java.lang.reflect.Method; -import java.util.Map; -import java.util.Set; -import java.util.stream.Collectors; - -/** - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/27 17:10 - */ -public class MockMethodLoadReport implements LoadResults { - - - private final Set> RESULTS; - - /** - * 将成功加载的结果返回 - * - * @return 成功结果 - */ - @Override - public Set loadSuccessResults() { - //获取成功的方法列表 - return RESULTS.stream().filter(BranchResult::isSuccess).map(BranchResult::getResult).collect(Collectors.toSet()); - } - - /** - * 将结果作为结果返回,分为成功和失败两种结果 - * @return - * 成功与否的结果集 - */ - @Override - public Map> loadResults() { - return RESULTS.stream().collect(Collectors.groupingBy( - BranchResult::isSuccess , - Collectors.mapping(BranchResult::getResult , Collectors.toSet()) - )); - } - - /** - * 成功结果的数量 - * @return - * 成功结果的数量 - */ - @Override - public int successNums() { - return (int) RESULTS.stream().filter(BranchResult::isSuccess).count(); - } - - /** - * 失败的结果数量 - * @return 失败的结果数量 - */ - @Override - public int failNums() { - return (int) RESULTS.stream().filter(r -> !r.isSuccess()).count(); - } - - /** - * 将加载错误的方法返回,并告知其失败原因 - * @return 错误的方法以及失败原因 - */ - @Override - public Map whyFail() { - return RESULTS.stream().filter(re -> !re.isSuccess()).collect(Collectors.toMap(BranchResult::getResult, BranchResult::why)); - } - - - - /** - * 构造 - * @param results 结果集 - */ - MockMethodLoadReport(Set> results){ - this.RESULTS = results; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadResult.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadResult.java deleted file mode 100644 index be09195..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/MockMethodLoadResult.java 
+++ /dev/null @@ -1,69 +0,0 @@ -package com.forte.util.loader; - -import java.lang.reflect.Method; - -/** - * 方法的加载结果 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/26 17:45 - * @since JDK1.8 - **/ -class MockMethodLoadResult implements BranchResult { - - /** 结果 */ - private final Method result; - - /** 如果结果为成功的结果,则为true */ - private final Boolean success; - - /** 如果success为false,则此字段记录的结果的失败原因(异常) */ - private final Exception why; - - - @Override - public Boolean isSuccess() { - return this.success; - } - - @Override - public Exception why() { - return this.why; - } - - @Override - public Method getResult() { - return this.result; - } - - /** - * 工厂方法。获得一个成功的返回值 - * @param result 结果 - * @return 返回一个实例 - */ - public static MockMethodLoadResult success(Method result){ - return new MockMethodLoadResult(result , true , null); - } - - /** - * 工厂方法。获得一个失败的返回值 - * @param result 结果 - * @param why 为何失败 - * @return 返回一个实例 - */ - public static MockMethodLoadResult fail(Method result , Exception why){ - return new MockMethodLoadResult(result , false, why); - } - - /** - * 唯一私有构造 - * @param result 结果 - * @param success 是否成功 - * @param why 为何失败 - */ - private MockMethodLoadResult(Method result, Boolean success, Exception why){ - this.result = result; - this.success = success; - this.why = why; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/Result.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/Result.java deleted file mode 100644 index 8cc4f26..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/loader/Result.java +++ /dev/null @@ -1,19 +0,0 @@ -package com.forte.util.loader; - -/** - * 方法的返回结果 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2018/12/26 17:48 - * @since JDK1.8 - **/ -public interface Result { - - - /** - * 获取结果 - * @return 结果 - */ - T getResult(); - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapper.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapper.java deleted file mode 100644 index a1a4f9b..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapper.java +++ /dev/null @@ -1,31 +0,0 @@ -package com.forte.util.mapper; - -import java.util.Arrays; -import java.util.function.Function; -import java.util.function.IntFunction; - -/** - * 映射器,可以指定将字符串数组内的值转化为其他类型 - * 需要存在一个无参构造 - * @author ForteScarlet <[email]ForteScarlet@163.com> - * @since JDK1.8 - **/ -public interface ArrayMapper extends Function { - - /** - * 给你一个数组长度,返回一个数组实例的function,用于数组的实例化获取 - * @return 数组实例获取函数 - */ - IntFunction getArrayParseFunction(); - - /** - * 进行转化, 将String类型的数组转化为某指定类型 - * @param array String array - */ - default T[] map(String[] array){ - //真正被使用的对外接口 - return Arrays.stream(array).map(this).toArray(this.getArrayParseFunction()); - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapperType.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapperType.java deleted file mode 100644 index 7888b77..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/ArrayMapperType.java +++ /dev/null @@ -1,94 +0,0 @@ -package com.forte.util.mapper; - -import java.util.function.IntFunction; - -/** - * 部分指定好的转化器类型 - * 
接口类型,内部提供部分ArrayMapper实现类 - * @author ForteScarlet <[email]ForteScarlet@163.com> - * @since JDK1.8 - **/ -public interface ArrayMapperType { - -// /** 转化为String类型 */ -// Class TO_STRING = ToString.class; -// -// /** 转化为Integer类型 */ -// Class TO_INT = ToInt.class; -// -// /** 转化为Double类型 */ -// Class TO_DOUBLE = ToDouble.class; -// -// /** 转化为Long类型 */ -// Class TO_LONG = ToLong.class; - - /** - * 保持String类型不变 - */ - class ToString implements ArrayMapper { - - @Override - public IntFunction getArrayParseFunction() { - return String[]::new; - } - - @Override - public String apply(String s) { - return s; - } - } - - /** - * 转化为Int类型 - */ - class ToInt implements ArrayMapper { - @Override - public Integer apply(String s) { - return Integer.parseInt(s); - } - @Override - public IntFunction getArrayParseFunction() { - return Integer[]::new; - } - } - - /** - * 转化为Double类型 - */ - class ToDouble implements ArrayMapper { - @Override - public IntFunction getArrayParseFunction() { - return Double[]::new; - } - - @Override - public Double apply(String s) { - return Double.parseDouble(s); - } - } - - /** - * 转化为Long类型 - */ - class ToLong implements ArrayMapper { - - @Override - public IntFunction getArrayParseFunction() { - return Long[]::new; - } - - @Override - public Long apply(String s) { - return Long.parseLong(s); - } - } - - - - - - - - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockArray.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockArray.java deleted file mode 100644 index f83c79d..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockArray.java +++ /dev/null @@ -1,38 +0,0 @@ -package com.forte.util.mapper; - -import java.lang.annotation.ElementType; -import java.lang.annotation.Retention; -import java.lang.annotation.RetentionPolicy; -import java.lang.annotation.Target; - -/** - * - * 数组类型的注解,类型为整数类型, - * 可以指定一个转化规则使得这个字符串数组可以转化为其他类型 - * - * @author ForteScarlet <[email]ForteScarlet@163.com> - * @since JDK1.8 - **/ -@Retention(RetentionPolicy.RUNTIME) -@Target(ElementType.FIELD) //字段 -public @interface MockArray { - - /** - * 数组参数 - */ - String[] value(); - - /** - * 类型转化器实现类,需要存在无参构造 - * 默认不变 - */ - Class mapper() default ArrayMapperType.ToString.class; - - - /** - * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值 - * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为{@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。 - * @since 1.6.0 - */ - String param() default ""; -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockBean.java deleted file mode 100644 index f1d841e..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockBean.java +++ /dev/null @@ -1,16 +0,0 @@ -package com.forte.util.mapper; - -import java.lang.annotation.ElementType; -import java.lang.annotation.Retention; -import java.lang.annotation.RetentionPolicy; -import java.lang.annotation.Target; - -/** - * 当进行包扫描的时候,会扫描到所有标记了此注解的类 - * 暂时没有参数,仅用作标记用。 - * @author ForteScarlet - */ -@Retention(RetentionPolicy.RUNTIME) //注解会在class字节码文件中存在,在运行时可以通过反射获取到 -@Target({ElementType.TYPE}) //类 -public @interface MockBean { -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxy.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxy.java 
deleted file mode 100644 index 0065eb0..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxy.java +++ /dev/null @@ -1,51 +0,0 @@ -package com.forte.util.mapper; - -import java.lang.annotation.ElementType; -import java.lang.annotation.Retention; -import java.lang.annotation.RetentionPolicy; -import java.lang.annotation.Target; - -/** - * Mock接口代理注解标识,标记在接口的抽象方法上。抽象方法 - * @author ForteScarlet - */ -@Retention(RetentionPolicy.RUNTIME) //注解会在class字节码文件中存在,在运行时可以通过反射获取到 -@Target({ElementType.METHOD}) //接口、类、枚举、注解、方法 -public @interface MockProxy { - - /** - *
是否忽略此方法。如果为是,则接口的最终代理结果为返回一个null。<br>
-     * 当然,如果获取不到对应的Mock类型,无论是否忽略都会返回null或者默认值。<br>
-     * 如果是基础数据类型相关,数字类型,返回{@code 0}。<br>
-     * 如果是基础数据类型相关,char类型,返回{@code ' '}。<br>
-     * 如果是基础数据类型相关,boolean类型,返回{@code false}。<br>
-     */
-    boolean ignore() default false;
-
-    /**
-     * 如果此参数存在值,则会优先尝试通过name获取MockObject对象
-     */
-    String name() default "";
-
-    /**
-     * 当接口返回值为数组或者集合的时候,此方法标记其返回值数量大小区间{@code [min, max], 即 max >= size >= min}。是数学上的闭区间。<br>
-     * 如果此参数长度为0,则返回值为1。<br>
-     * 如果参数长度为1,则相当于不是随机长度。<br>
-     * 如果参数长度大于2,只取前两位。<br>
-     */
-    int[] size() default {1,1};
-
-    /**
-     * 指定返回值类型,三种可能类型:list类型,array类型,Object其他任意类型。默认值为Unknown类型。当为Unknown类型的时候,会根据返回值类型自动判断。<br>
-     * 当类型为list与array类型的时候,需要通过{@link #genericType()}方法指定泛型的类型,获取mock类型的时候将会通过此方法得到的类型来获取。<br>
-     */
-    MockProxyType proxyType() default MockProxyType.UNKNOWN;
-
-
-    /**
-     * 假如类型为List类型,此处代表泛型的实际类型。
-     */
-    Class genericType() default Object.class;
-
-
-}
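
For orientation, a minimal sketch of how the `@MockProxy` annotation above might sit on an interface. The `User` bean and the mock name `"user"` are illustrative assumptions, and the proxy factory that actually consumes these annotations is not part of this file:

```java
package com.example.mockproxy;

import com.forte.util.mapper.MockProxy;
import com.forte.util.mapper.MockProxyType;

import java.util.List;

// Hypothetical bean used only for illustration.
class User {
    private String name;
    private Integer age;
    // getters/setters omitted
}

// Each abstract method declares how its return value should be mocked.
interface UserMockApi {

    // Single object: leaving proxyType as UNKNOWN lets the type be inferred from the return type.
    @MockProxy(name = "user")
    User one();

    // A list of 2 to 5 users; the element type is supplied through genericType.
    @MockProxy(name = "user", proxyType = MockProxyType.LIST, size = {2, 5}, genericType = User.class)
    List<User> some();

    // Ignored: the generated proxy would simply return null here.
    @MockProxy(ignore = true)
    User skipped();
}
```
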
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxyType.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxyType.java
deleted file mode 100644
index 4b6d52b..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockProxyType.java
+++ /dev/null
@@ -1,121 +0,0 @@
-package com.forte.util.mapper;
-
-import com.forte.util.exception.MockException;
-import com.forte.util.function.ExFunction;
-import com.forte.util.utils.RandomUtil;
-
-import java.lang.reflect.Array;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.function.BiFunction;
-import java.util.function.Function;
-
-/**
- * 代理接口对象的时候可能存在的三种类型:list类型,Array类型,和Object类型
- *
- * @author  ForteScarlet 
- */
-public enum MockProxyType {
-
-    /**
-     * 未知类型。直接使用此类型将会抛出异常。
-     * 但是一般来讲,默认的代理工厂内部会进行判断。
-     */
-    UNKNOWN(
-            (mrt, grt) -> {
-                throw new MockException("unknown proxy type.");
-            },
-            (f, num, t) -> {
-                throw new MockException("unknown proxy type.");
-            }
-    ),
-
-    /**
-     * 其他任意Object
-     * 返回值选择会选方法返回值
-     * 构建返回值的时候,无视区间参数,只会传入1且只取第一个返回值。
-     */
-    OBJECT(
-            (mrt, grt) -> mrt,
-            (f, num, t) -> f.apply(1)[0]
-    ) {
-        @Override
-        protected int getRandomNum(int min, int max) {
-            return 1;
-        }
-    },
-    /**
-     * 数组类型
-     */
-    ARRAY(
-            (mrt, grt) -> grt,
-            (f, num, t) -> {
-                final Object[] array = f.apply(num);
-                final Object newArrayInstance = Array.newInstance(t, array.length);
-                for (int i = 0; i < array.length; i++) {
-                    Array.set(newArrayInstance, i, array[i]);
-                }
-                return newArrayInstance;
-            }
-    ),
-    /**
-     * list类型
-     */
-    LIST(
-            (mrt, grt) -> grt,
-            (f, num, t) -> {
-                final Object[] array = f.apply(num);
-                return new ArrayList<>(Arrays.asList(array));
-            }
-    );
-
-    /**
-     * 函数参数1代表方法的返回值类型,参数2代表注解上的泛型类型(如果存在)
-     * 返回值代表使用哪个类型来获取泛型
-     */
-    private final BiFunction, Class, Class> selectUseTypeFunction;
-    /**
-     * 参数1代表一个函数,这个函数接收一个数量参数,返回一定数量的结果对象
-     * 参数2代表具体数量
-     * 参数3代表bean的类型
-     * 参数4代表最终的返回结果
-     */
-    private final ExFunction, Integer, Class, Object> resultObjectFunction;
-
-    MockProxyType(
-            BiFunction, Class, Class> selectUseTypeFunction,
-            ExFunction, Integer, Class, Object> resultObjectFunction
-    ) {
-        this.selectUseTypeFunction = selectUseTypeFunction;
-        this.resultObjectFunction = resultObjectFunction;
-    }
-
-    /**
-     * 选择使用的类型
-     *
-     * @param methodReturnType 方法类型
-     * @param genericType      泛型类型
-     * @return
-     */
-    public Class selectTypeUse(Class methodReturnType, Class genericType) {
-        return selectUseTypeFunction.apply(methodReturnType, genericType);
-    }
-
-    /**
-     * 根据函数构建最终结果
-     * @param beanGetter bean获取器
-     * @param beanType   bean的类型,即mock所得类型
-     * @param min
-     * @param max
-     * @return
-     */
-    public Object buildReturnType(Function beanGetter, Class beanType, int min, int max) {
-        final int randomNum = getRandomNum(min, max);
-        return resultObjectFunction.apply(beanGetter, randomNum, beanType);
-    }
-
-    protected int getRandomNum(int min, int max) {
-        return RandomUtil.getNumberWithRight(min, max);
-    }
-
-}
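
The ARRAY branch above builds a correctly typed array reflectively with `java.lang.reflect.Array`. A self-contained sketch of that technique, independent of the mock classes (names here are assumptions for illustration only):

```java
import java.lang.reflect.Array;

public class TypedArrayCopyDemo {

    // Copy an untyped Object[] into a T[] whose component type is only known at runtime,
    // the same pattern the ARRAY constant uses to honour the configured element type.
    @SuppressWarnings("unchecked")
    static <T> T[] toTypedArray(Object[] source, Class<T> componentType) {
        T[] target = (T[]) Array.newInstance(componentType, source.length);
        for (int i = 0; i < source.length; i++) {
            Array.set(target, i, source[i]);
        }
        return target;
    }

    public static void main(String[] args) {
        Object[] raw = {"a", "b", "c"};
        String[] typed = toTypedArray(raw, String.class);
        System.out.println(typed.length + " elements of type " + typed.getClass().getComponentType());
    }
}
```
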
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockValue.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockValue.java
deleted file mode 100644
index 28e8163..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mapper/MockValue.java
+++ /dev/null
@@ -1,36 +0,0 @@
-package com.forte.util.mapper;
-
-import java.lang.annotation.ElementType;
-import java.lang.annotation.Retention;
-import java.lang.annotation.RetentionPolicy;
-import java.lang.annotation.Target;
-
-/**
- * 应用于注解映射, 使用在字段上
- * 映射值为字符串类型
- * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com>
- * @since JDK1.8
- **/
-@Retention(RetentionPolicy.RUNTIME)
-@Target(ElementType.FIELD) //字段
-public @interface MockValue {
-
-    /**
-     * 映射值,如果为空则视为无效
-     */
-    String value();
-
-    /**
-     * 区间参数,如果有值,则代表了字段之前的区间参数。默认没有值
-     * 例如当字段{@code age} 的注解参数为 {@code param = "10-20"} 的时候, 相当于字段值为{@code "age|10-20"}。参数中的那个竖线不需要写。写了也会被去除的。
-     * @since  1.6.0
-     */
-    String param() default "";
-
-    /**
-     * 参数value的最终类型,在转化的时候会使用beanutils中的工具类{@link org.apache.commons.beanutils.ConvertUtils}进行类型转化, 默认为String类型。
-     * @return
-     */
-    Class valueType() default String.class;
-
-}
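
Taken together with `@MockArray` and `@MockBean` shown earlier, a hedged sketch of how these field annotations might look on a DTO. The `"@integer(18,60)"` expression follows `MockUtil#integer(Integer, Integer)`; the field names and pool values are illustrative assumptions:

```java
package com.example.mockvalue;

import com.forte.util.mapper.MockArray;
import com.forte.util.mapper.MockBean;
import com.forte.util.mapper.MockValue;

// Hypothetical DTO; @MockBean marks it for package scanning.
@MockBean
public class UserVo {

    // Maps the field to a MockUtil expression and converts the result to Integer.
    @MockValue(value = "@integer(18,60)", valueType = Integer.class)
    private Integer age;

    // Array filled from this pool; per the annotation javadoc, param "1-3" plays the role of the "hobbies|1-3" interval.
    @MockArray(value = {"reading", "coding", "testing"}, param = "1-3")
    private String[] hobbies;

    public Integer getAge() { return age; }
    public void setAge(Integer age) { this.age = age; }
    public String[] getHobbies() { return hobbies; }
    public void setHobbies(String[] hobbies) { this.hobbies = hobbies; }
}
```
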
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ConstMockField.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ConstMockField.java
deleted file mode 100644
index 8a4e1f7..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ConstMockField.java
+++ /dev/null
@@ -1,19 +0,0 @@
-package com.forte.util.mockbean;
-
-import com.forte.util.fieldvaluegetter.FieldValueGetter;
-
-/**
- * @author  ForteScarlet 
- */
-public class ConstMockField extends MockField {
-    /**
-     * 构造
-     * @param objType
-     * @param fieldName 字段名称
-     * @param fieldConstValue 字段值获取器
-     * @param fieldType 字段类型
-     */
-    public ConstMockField(Class objType, String fieldName, Object fieldConstValue, Class fieldType) {
-        super(objType, fieldName, () -> fieldConstValue, fieldType);
-    }
-}
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockBean.java
deleted file mode 100644
index 3941467..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockBean.java
+++ /dev/null
@@ -1,84 +0,0 @@
-package com.forte.util.mockbean;
-
-import com.forte.util.exception.MockException;
-
-import java.util.Arrays;
-
-/**
- * 假对象的封装类,利用此类的getObject来获取一个对象
- *
- * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com>
- */
-public class MockBean {
-
-    /**
-     * 需要封装假数据的对象
-     */
-    protected Class objectClass;
-
-    /**
-     * 假对象的全部字段
-     */
-    protected MockField[] fields;
-
-    /**
-     * 获取对象一个对象
-     * @return
-     */
-    public T getObject() {
-        //先创建一个实例
-        T instance;
-        try {
-            instance = getObjectClass().newInstance();
-        } catch (InstantiationException | IllegalAccessException e) {
-            return null;
-        }
-        for (MockField field : fields) {
-            try {
-                field.setValue(instance);
-            } catch (Exception e) {
-                // ignored ?
-                // no
-                throw new MockException(e);
-            }
-        }
-        //返回这个实例
-        return instance;
-    }
-
-    /**
-     * 获取假字段集
-     */
-    public MockField[] getFields(){
-        return Arrays.copyOf(fields, fields.length);
-    }
-
-
-    public Class getObjectClass(){
-        return objectClass;
-    }
-
-
-    public MockBean parallel(){
-        return new ParallelMockBean<>(objectClass, Arrays.copyOf(fields, fields.length));
-    }
-
-
-    public MockBean sequential(){
-        return this;
-    }
-
-
-
-
-    /**
-     * 构造方法
-     *
-     * @param objectClass
-     * @param fields
-     */
-    public MockBean(Class objectClass, MockField[] fields) {
-        this.objectClass = objectClass;
-        this.fields = fields;
-    }
-}
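
The `getObject()` flow above (instantiate the target class, then push each `MockField` value into the new instance) can be summarized in a small stand-alone sketch, with plain `Supplier`s standing in for the field value getters; all names here are illustrative only:

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

// Minimal "bean template": register one value supplier per field, then stamp out fresh instances.
public class MiniMockBean<T> {

    private final Class<T> type;
    private final Map<String, Supplier<Object>> valueGetters = new LinkedHashMap<>();

    public MiniMockBean(Class<T> type) {
        this.type = type;
    }

    public MiniMockBean<T> field(String name, Supplier<Object> getter) {
        valueGetters.put(name, getter);
        return this;
    }

    // Same shape as MockBean.getObject(): create an instance, then assign every mock field.
    public T getObject() throws ReflectiveOperationException {
        T instance = type.getDeclaredConstructor().newInstance();
        for (Map.Entry<String, Supplier<Object>> e : valueGetters.entrySet()) {
            Field f = type.getDeclaredField(e.getKey());
            f.setAccessible(true);
            f.set(instance, e.getValue().get());
        }
        return instance;
    }
}
```
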
diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockField.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockField.java
deleted file mode 100644
index f980a13..0000000
--- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockField.java
+++ /dev/null
@@ -1,142 +0,0 @@
-package com.forte.util.mockbean;
-
-import com.forte.util.fieldvaluegetter.FieldValueGetter;
-import com.forte.util.utils.FieldUtils;
-
-import java.lang.reflect.Field;
-import java.lang.reflect.Method;
-
-/**
- * 假字段值的字段封装对象的抽象类
- * 假的字段有几种类型:
- * <ul>
- *     <li>字符串</li>
- *     <li>整数</li>
- *     <li>浮点数</li>
- *     <li>集合</li>
- *     <li>数组</li>
- *     <li>引用对象</li>
- * </ul>
- * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class MockField { - - private final Class objType; - - /** - * 字段值获取器 - */ - private FieldValueGetter valueGetter; - - /** - * 字段名称 - */ - private final String fieldName; - - /** - * 字段类型 - */ - private final Class fieldType; - - - private final Field field; - - - private final Method setterMethod; - - - - /** - * 为传入的对象的对应的参数赋值 - * 通过FieldUtils工具类使用setter方法赋值 - * @see FieldUtils - * - * @param - */ - public void setValue(Object bean) throws Exception { -// Object value = getValue(); -// if(setterMethod == null){ -// if(value.getClass().equals(fieldType)){ -// field.set(bean, value); -// }else{ -// field.set(bean, ConvertUtils.convert(value, fieldType)); -// } -// } - FieldUtils.objectSetter(bean, fieldName, getValue()); - } - - /** - * 获取字段的值 - * - * @return 字段的值 - */ - public Object getValue() { - return valueGetter.value(); - } - - /** - * 获取字段的名称 - * - * @return 字段名 - */ - public String getFieldName() { - return this.fieldName; - } - - - public Class getFieldType(){ - return this.fieldType; - } - - public Class getObjType() { - return objType; - } - - public FieldValueGetter getValueGetter() { - return valueGetter; - } - - public Field getField() { - return field; - } - - public Method getSetterMethod() { - return setterMethod; - } - - public void setValueGetter(FieldValueGetter valueGetter) { - this.valueGetter = valueGetter; - } - - /** - * 构造 - */ - public MockField(Class objType, String fieldName, FieldValueGetter fieldValueGetter, Class fieldType) { - this.objType = objType; - this.fieldName = fieldName; - this.valueGetter = fieldValueGetter; - this.fieldType = fieldType; - - // 获取field - if(objType != null){ - this.field = FieldUtils.getField(objType, fieldName); - this.field.setAccessible(true); - this.setterMethod = FieldUtils.getFieldSetter(objType, this.field); - }else{ - // 当类型为Map类型的时候,objType为null,因此field为null。 - this.field = null; - this.setterMethod = null; - } - - - - - - } - - @Override - public String toString() { - return "MockField(StringName='"+fieldName+"')"; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapBean.java deleted file mode 100644 index b7836b6..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapBean.java +++ /dev/null @@ -1,61 +0,0 @@ -package com.forte.util.mockbean; - -import java.util.LinkedHashMap; -import java.util.Map; -import java.util.function.BinaryOperator; -import java.util.stream.Collectors; - -/** - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date Created in 2019/2/14 16:52 - @since JDK1.8 - **/ -public class MockMapBean extends MockBean { - - - /** - * 重写MockBean的方法,返回Map封装对象 - * @return - */ - @Override - public Map getObject(){ - //假字段集 - MockField[] fields = this.getFields(); - - Map map = new LinkedHashMap<>(); - - for (MockField field : fields) { - map.merge(field.getFieldName(), field.getValue(), (old, val) -> throwingMerger()); - } - - return map; - -// return Arrays.stream(fields) -// .map(f -> new AbstractMap.SimpleEntry<>(f.getFieldName(), f.getValue())) -// .collect(Collectors.toMap( -// Map.Entry::getKey, Map.Entry::getValue, -// throwingMerger() , LinkedHashMap::new -// )); - } - - - /** - * 构造方法 - * - * @param fields - */ - public MockMapBean(MockField[] fields) { - super(Map.class, fields); - } - - /** - * 此方法来自 {@link 
Collectors#throwingMerger()} - * @param - * @return - */ - @SuppressWarnings("JavadocReference") - protected static BinaryOperator throwingMerger() { - return (u,v) -> { throw new IllegalStateException(String.format("Duplicate key %s", u)); }; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapObject.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapObject.java deleted file mode 100644 index 496d55e..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockMapObject.java +++ /dev/null @@ -1,48 +0,0 @@ -package com.forte.util.mockbean; - -import java.util.Map; - -/** - * - * Map类型的结果集合 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2019/2/27 14:39 - */ -public class MockMapObject implements MockObject { - - private final MockMapBean mockMapBean; - - @Override - public MockBean getMockBean() { - return mockMapBean; - } - -// /** -// * 返回获取结果的Optional封装类 -// * -// * @return -// */ -// @Override -// public Optional get() { -// return Optional.ofNullable(mockMapBean.getObject()); -// } - - /** - * - * @return - */ - @Override - public Map getOne() { - return mockMapBean.getObject(); - } - - /** - * 唯一构造 - * @param mockMapBean - */ - public MockMapObject(MockMapBean mockMapBean){ - this.mockMapBean = mockMapBean; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockNormalObject.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockNormalObject.java deleted file mode 100644 index 1ad1506..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockNormalObject.java +++ /dev/null @@ -1,40 +0,0 @@ -package com.forte.util.mockbean; - -/** - * 将{@link MockBean}封装并返回 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/11 16:11 - * @since JDK1.8 - **/ -public class MockNormalObject implements MockObject { - - private final MockBean mockBean; - - @Override - public MockBean getMockBean() { - return mockBean; - } - -// /** -// * 返回获取结果的Optional封装类 -// * @return -// */ -// @Override -// public Optional get(){ -// return Optional.ofNullable(mockBean.getObject()); -// } - - @Override - public T getOne() { - return mockBean.getObject(); - } - - /** - * 唯一构造 - * @param mockBean - */ - public MockNormalObject(MockBean mockBean){ - this.mockBean = mockBean; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockObject.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockObject.java deleted file mode 100644 index d0a60d8..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/MockObject.java +++ /dev/null @@ -1,315 +0,0 @@ -package com.forte.util.mockbean; - -import com.forte.util.utils.CollectorUtil; - -import java.util.*; -import java.util.function.Function; -import java.util.stream.Collector; -import java.util.stream.Collectors; -import java.util.stream.IntStream; -import java.util.stream.Stream; - -/** - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2019/2/27 14:33 - */ -public interface MockObject { - - /** - * 获取一个MockBean对象 - */ - MockBean getMockBean(); - - - /** - * 返回获取结果的Optional封装类 - */ - default Optional get(){ - return Optional.ofNullable(getOne()); - } - - - /** - * 获取一个实例对象 - */ - T getOne(); - - - - /** - * 
获取一个无限流 - */ - default Stream getStream(){ - return Stream.iterate(getOne(), i -> getOne()); - } - - /** - * 获取一个指定长度的流,等同于 getStream().limit(limit) - * @param limit - * @return - */ - default Stream getStream(int limit){ - return getStream().limit(limit); - } - - /** - * 获取一个并行无限流 - */ - default Stream getParallelStream(){ - return getStream().parallel(); - } - - /** - * 获取一个并行流 - */ - default Stream getParallelStream(int limit){ - return getParallelStream().limit(limit); - } - - /** - * 获取多个实例对象,作为list集合返回 - */ - default List getList(int num){ - ArrayList list = new ArrayList<>(num); - for (int i = 0; i < num; i++) { - list.add(getOne()); - } - return list; - } - - - /** - * 并行线程创建多个实例对象,作为list集合返回 - */ - default List getListParallel(int num){ - return collectParallel(num, Collectors.toList()); - } - - - /** - * 获取多个实例对象,根据转化规则转化后作为list集合返回 - * @param num - * @param mapper - * @param - * @return - */ - default List getList(int num , Function mapper){ - ArrayList list = new ArrayList<>(num); - for (int i = 0; i < num; i++) { - list.add(mapper.apply(getOne())); - } - return list; -// return collect(num, mapper, Collectors.toList()); - } - - - /** - * 获取多个实例对象,使用并行线程获取并根据指定规则进行转化,作为List返回 - * @param num - * @param mapper - * @param - * @return - */ - default List getListParallel(int num, Function mapper){ - return collectParallel(num, mapper, Collectors.toList()); - } - - - /** - * 获取多个实例对象,作为Set返回 - * @param num - * @return - */ - default Set getSet(int num){ - Set set = new LinkedHashSet<>((int) Math.ceil(num / 0.75)); - for (int i = 0; i < num; i++) { - set.add(getOne()); - } - return set; - } - - - /** - * 获取多个实例对象,使用并行流操作,作为Set返回 - * @param num - * @return - */ - default Set getSetParallel(int num){ - return collectParallel(num, Collectors.toSet()); - } - - - /** - * 获取多个实例对象,根据转化规则转化后作为Set返回 - */ - default Set getSet(int num , Function mapper){ - Set set = new LinkedHashSet<>((int) Math.ceil(num / 0.75)); - for (int i = 0; i < num; i++) { - set.add(mapper.apply(getOne())); - } - return set; - } - - - /** - * 获取多个实例对象,使用并行流根据转化规则进行转化后作为Set返回 - */ - default Set getSetParallel(int num, Function mapper){ - return collectParallel(num, mapper, Collectors.toSet()); - } - - - /** - * 获取多个实例对象,作为Map返回,需要指定Map的转化方式 - */ - default Map getMap(int num , Function keyMapper , Function valueMapper){ - Map map = new LinkedHashMap<>((int) Math.ceil(num / 0.75)); - for (int i = 0; i < num; i++) { - final T value = getOne(); - final K k = keyMapper.apply(value); - final V v = valueMapper.apply(value); - map.put(k, v); - } - return map; - } - - /** - * 获取多个实例对象,作为Map返回,需要指定Map的转化方式 - */ - default Map getMap(int num, Collector> collector){ - return collectToMap(num, collector); - } - - - /** - * 获取多个实例对象,作为Map返回,需要指定Map的转化方式 - */ - default Map getMapParallel(int num , Function keyMapper , Function valueMapper){ - return collectToMapParallel(num, keyMapper, valueMapper); - } - - /** - * 获取多个实例对象,作为Map返回,需要指定Map的转化方式 - */ - default Map getMapParallel(int num , Collector> collector){ - return collectToMapParallel(num, collector); - } - - - /** - * 串行collect - */ - default R collect(int num, Collector collector){ - return CollectorUtil.collector(num, this::getOne, collector); -// // 获取容器 -// A container = collector.supplier().get(); -// // 获取累加器 -// BiConsumer accumulator = collector.accumulator(); -// for (int i = 0; i < num; i++) { -// accumulator.accept(container, getOne()); -// } -// // 获取结果 -// return collector.finisher().apply(container); - // return getStream(num).collect(collector); - } - - /** - * 
带转化的串行collect - */ - default N collect(int num, Function mapper, Collector collector){ - return CollectorUtil.collector(num, this::getOne, mapper, collector); - // return getStream(num).map(mapper).collect(collector); - } - - /** - * 串行collect toMap - */ - default Map collectToMap(int num, Collector> collector){ - return CollectorUtil.collector(num, this::getOne, collector); -// return getStream(num).collect(collector); - } - - /** - * 串行collect toMap - * 默认使用{@link LinkedHashMap} - */ - default Map collectToMap(int num, Function keyFunction, Function valueFunction){ - Map map = new LinkedHashMap<>(); - T one; - for (int i = 0; i < num; i++) { - one = getOne(); - map.put(keyFunction.apply(one), valueFunction.apply(one)); - } - return map; - // return getStream(num).collect(Collectors.toMap(keyFunction, valueFunction)); - } - - /** - * 带转化的串行collect toMap - * 默认使用{@link LinkedHashMap} - */ - default Map collectToMap(int num, Function mapper, Collector> collector){ - return CollectorUtil.collector(num, this::getOne, mapper, collector); - // return getStream(num).map(mapper).collect(collector); - } - - /** - * 带转化的collect toMap - * 默认使用{@link LinkedHashMap} - */ - default Map collectToMap(int num, Function mapper, Function keyFunction, Function valueFunction){ - Map map = new LinkedHashMap<>(); - T one; - for (int i = 0; i < num; i++) { - one = getOne(); - map.put(keyFunction.apply(mapper.apply(one)), valueFunction.apply(mapper.apply(one))); - } - return map; - // return getStream(num).map(mapper).collect(Collectors.toMap(keyFunction, valueFunction)); - } - - - /** - * 并行collect - */ - default R collectParallel(int num, Collector collector){ - return IntStream.range(0, num).parallel().mapToObj(i -> getOne()).collect(collector); - } - - /** - * 带转化的并行collect - */ - default N collectParallel(int num, Function mapper, Collector collector){ - return IntStream.range(0, num).parallel().mapToObj(i -> mapper.apply(getOne())).collect(collector); - } - - /** - * 并行collect - */ - default Map collectToMapParallel(int num, Collector> collector){ - return IntStream.range(0, num).parallel().mapToObj(i -> getOne()).collect(collector); - } - - /** - * 并行collect - */ - default Map collectToMapParallel(int num, Function keyFunction, Function valueFunction){ - return IntStream.range(0, num).parallel().mapToObj(i -> getOne()).collect(Collectors.toMap(keyFunction, valueFunction)); - } - - /** - * 带转化的并行collect - */ - default Map collectToMapParallel(int num, Function mapper, Collector> collector){ - return IntStream.range(0, num).parallel().mapToObj(i -> mapper.apply(getOne())).collect(collector); - } - - /** - * 带转化的并行collect - */ - default Map collectToMapParallel(int num, Function mapper, Function keyFunction, Function valueFunction){ - return IntStream.range(0, num).parallel().mapToObj(i -> mapper.apply(getOne())).collect(Collectors.toMap(keyFunction, valueFunction)); - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockBean.java deleted file mode 100644 index ab7c701..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockBean.java +++ /dev/null @@ -1,55 +0,0 @@ -package com.forte.util.mockbean; - -import com.forte.util.exception.MockException; - -import java.util.Arrays; - -/** - * @author ForteScarlet - * @date 2020/7/29 - */ -public class ParallelMockBean extends MockBean { - - - /** 
- * 获取对象一个对象 - * @return - */ - public T getObject() { - //先创建一个实例 - T instance; - try { - instance = objectClass.newInstance(); - } catch (InstantiationException | IllegalAccessException e) { - return null; - } - Arrays.stream(fields).parallel().forEach(field -> { - try { - field.setValue(instance); - } catch (Exception e) { - // ignored ? - throw new MockException(e); - } - }); - //返回这个实例 - return instance; - } - - - public MockBean parallel(){ - return this; - } - - - public MockBean sequential(){ - return new MockBean<>(objectClass, Arrays.copyOf(fields, fields.length)); - } - - - /** - * 构造方法 - */ - public ParallelMockBean(Class objectClass, MockField[] fields) { - super(objectClass, fields); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockMapBean.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockMapBean.java deleted file mode 100644 index 331416a..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/mockbean/ParallelMockMapBean.java +++ /dev/null @@ -1,50 +0,0 @@ -package com.forte.util.mockbean; - -import java.util.AbstractMap; -import java.util.Arrays; -import java.util.LinkedHashMap; -import java.util.Map; -import java.util.stream.Collectors; - -/** - * @author ForteScarlet - * @date 2020/7/29 - */ -public class ParallelMockMapBean extends MockMapBean { - - /** - * 重写MockBean的方法,返回Map封装对象 - * @return - */ - @Override - public Map getObject(){ - //假字段集 - MockField[] fields = this.getFields(); - - return Arrays.stream(fields).parallel() - .map(f -> new AbstractMap.SimpleEntry<>(f.getFieldName(), f.getValue())) - .collect(Collectors.toMap( - Map.Entry::getKey, Map.Entry::getValue, - throwingMerger() , LinkedHashMap::new - )); - } - - - public MockMapBean parallel(){ - return this; - } - - - public MockMapBean sequential(){ - return new MockMapBean(Arrays.copyOf(fields, fields.length)); - } - - /** - * 构造方法 - * - * @param fields - */ - public ParallelMockMapBean(MockField[] fields) { - super(fields); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/package-info.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/package-info.java deleted file mode 100644 index a132a5b..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/package-info.java +++ /dev/null @@ -1,13 +0,0 @@ -/** - * - *

- * 除了所有的工具类以外唯一的对外接口: {@link com.forte.util.Mock} - *

- *

- * 可用在参数映射的方法列表: {@link com.forte.util.utils.MockUtil} - *

- * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/11 15:20 - * @since JDK1.8 - **/ -package com.forte.util; \ No newline at end of file diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ArraysParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ArraysParser.java deleted file mode 100644 index 44f70a8..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ArraysParser.java +++ /dev/null @@ -1,128 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; - -import java.util.Arrays; -import java.util.Optional; - -/** - * 数组参数解析器 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class ArraysParser extends BaseFieldParser { - - /** 参数传入的数组 */ - private final Object[] defaultArr; - - /** - * 当字段类型既不是集合也不是数组的解析 - * 说明从参数中随机获取一个并返回 - * - * @return - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - //创建一个数组元素获取执行者 - Invoker invoker = MethodUtil.createArrayElementInvoker(defaultArr); - return getObjectFieldValueGetter(invoker); - } - - /** - * 当字段是一个list类型集合的时候 - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //转化并返回结果 - return getListFieldValueGetter(); - } - - - /** - * 当字段是一个数组类型的时候 - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //转化并返回结果 - return getArrayFieldValueGetter(); - } - - - /** - * 获取数组字段值获取器 - * @return - */ - private FieldValueGetter getArrayFieldValueGetter(){ - //获取随机元素值执行者 - Invoker invoker = MethodUtil.createArrayElementInvoker(defaultArr); - //因为区间不可能为null,直接转化并返回 - return Optional.of(getIntervalData()).map(i -> { - //如果有区间参数,根据区间参数获取字段值获取器 - return getArrayFieldValueGetter(new Invoker[]{invoker}, i); - }).get(); - } - - /** - * 获取集合字段值获取器 - * @return - */ - private FieldValueGetter getListFieldValueGetter(){ - //获取随机元素值执行者 - Invoker invoker = MethodUtil.createArrayElementInvoker(defaultArr); - //因为区间不可能为null,直接转化并返回 - return Optional.of(getIntervalData()).map(i -> { - //如果有区间参数,根据区间参数获取字段值获取器 - return getListFieldValueGetter(new Invoker[]{invoker}, i); - }).get(); - } - - - - /** - * 获取区间参数区间,如果没有区间参数则返回区间[1,1] - * @return - */ - private Integer[] getIntervalData(){ - //获取参数 - Integer min = intervalMin; - Integer max = intervalMax; - - //判断区间参数 - if(min == null){ - //如果没左参数 - if(max == null){ - //如果右参数也没有,直接返回一个[1,1]的区间 - return new Integer[]{1,1}; - }else{ - //如果有右参数,参数同化 - min = max; - } - }else{ - //有左参数,判断右参数 - if(max == null){ - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min ,max}; - } - - - /** - * 构造 - * - * @param objectClass - * @param fieldName - * @param intervalStr - */ - public ArraysParser(Class objectClass, String fieldName, String intervalStr, Object[] defaultArr) { - super(objectClass, fieldName, intervalStr); - //参数数组,复制一份而并非使用原来的 ->浅拷贝 - this.defaultArr = Arrays.copyOf(defaultArr, defaultArr.length); - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/BaseFieldParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/BaseFieldParser.java deleted file mode 100644 index abfe484..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/BaseFieldParser.java +++ /dev/null @@ -1,936 +0,0 @@ -package 
com.forte.util.parser; - -import com.forte.util.Mock; -import com.forte.util.fieldvaluegetter.*; -import com.forte.util.invoker.Invoker; -import com.forte.util.mockbean.MockField; -import com.forte.util.utils.FieldUtils; -import com.forte.util.utils.MethodUtil; -import com.forte.util.utils.RegexUtil; - -import java.lang.reflect.Method; -import java.util.*; -import java.util.regex.Pattern; -import java.util.stream.Collectors; - -/** - * 所有字段解析器的抽象父类
- *
- *
- *
- * 每一个字段都有可能是集合或者数组类型的,但是每种类型的参数解析结果却又不同
- * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public abstract class BaseFieldParser implements FieldParser { - - /** - * 类的class对象 - */ - protected final Class objectClass; - - /** - * 字段名称 - */ - protected final String fieldName; - - /** - * 区间参数字符串min-max - */ - protected final Integer intervalMin, intervalMax; - - /** - * 区间参数字符串-小数位min-max(如果有的话) - */ - protected final Integer intervalDoubleMin, intervalDoubleMax; - - /** - * 此字段的class类型 - */ - protected final Class fieldClass; - - /** - * 获取一个封装好的MockField对象 - * 不可重写 - * @return 封装好的MockField对象 - */ - @Override - public final MockField getMockField(){ - //解析获取字段值获取器 - FieldValueGetter fieldValueGetter = parserForFieldValueGetter(); - - //创建一个MockField对象并返回 - return new MockField(objectClass, fieldName, fieldValueGetter, getFieldClass()); - } - - /** - * 获取字段类型值 - */ - protected Class getFieldClass(){ - return fieldClass; - } - - /** - * 进行解析,判断需要产生的字段类型 - */ - private FieldValueGetter parserForFieldValueGetter() { - //字段值获取器 - FieldValueGetter fieldValueGetter; - //判断这个字段的类型是不是List集合的形式 - //根据字段的class对象判断类型 - boolean isList = FieldUtils.isChild(fieldClass, List.class); - //如果是list集合类型 - if (isList) { - //是list集合形式的解析 - fieldValueGetter = parserForListFieldValueGetter(); - - } else if (fieldClass.isArray()) { - //是Array数组的解析 - fieldValueGetter = parserForArrayFieldValueGetter(); - - } else { - //不是list集合形式或Array数组的解析 - fieldValueGetter = parserForNotListOrArrayFieldValueGetter(); - } - - //返回结果 - return fieldValueGetter; - } - - - - /* * * —————————————— 为子类服务的辅助方法 —————————————— * * */ - - - /* —————————————— 正则与方法的解析 ———————————————— */ - - /** - * 看看有没有匹配的方法 - * - * @param name 方法名 - * @return 是否有 - */ - protected static boolean match(String name) { - //获取正则 - String regex = ".*" + getMethodRegex() + ".*"; - return name.matches(regex); - } - - /** - * 获取匹配的方法 - * @param name 方法名 - * @return 匹配的方法合集 - */ - protected static String[] getMethods(String name) { - //获取正则 - String regex = getMethodRegex(); - return RegexUtil.getMatcher(name, regex).toArray(new String[0]); - } - - /** - * 获取对匹配的方法进行切割之后的结果 - * - * @return - */ - protected static String[] getMethodsSplit(String name) { - //获取正则 - String regex = getMethodRegex(); - return name.split(regex); - } - - - /** - * 获取一个方法执行者 - * @param methodName - * @return - */ - protected static Invoker getOneMethodInvoker(String methodName){ - return getMethodInvoker(new String[]{methodName}).get(0); - } - - - private static final Pattern replaceForNameRegex = - Pattern.compile("@|(\\(\\))|(\\(((\\w+)|('.+')|(\".+\"))(\\, *((\\w+)|('.+')|(\".+\")))*\\))"); - -// /** -// * 为方法{@link #getMethodInvoker}服务,提供正则来获取方法名 -// * @return -// */ -// private static String getReplaceForNameRegex(){ -// return "@|(\\(\\))|(\\(((\\w+)|('.+')|(\".+\"))(\\,((\\w+)|('.+')|(\".+\")))*\\))"; -// } - /** - * 为方法{@link #getMethodInvoker}服务,提供正则来获取方法名 - * @return - */ - private static Pattern getReplaceForNameRegex(){ - return replaceForNameRegex; - } - - /** - * 解析方法字符串并获取方法执行者 - * @param methods - * 方法名列表 - * @return - */ - protected static List getMethodInvoker(String[] methods) { - //移除@符号、空括号、一个参数的括号、两个参数的括号 -// String replaceForNameRegex = "@|(\\(\\))|(\\(\\w+(\\,\\w+)*\\))"; - Pattern replaceForNameRegex = getReplaceForNameRegex(); -// String replaceForParamRegex = "[(@" + getMethodNameRegexs() + ")\\(\\)]"; -// String regex = "@" + getMethodNameRegexs() + "(((\\((\\w+|\\w+\\,\\w+)\\))|(\\(\\))|())?)"; - - List invokerList = new ArrayList<>(); - - //遍历方法,保证顺序 - for (String methodStr : 
methods) { - //获取方法名称 - String methodName; -// String methodName = replaceForNameRegex.matcher(methodStr).replaceAll(""); -// String methodName = methodStr.replaceAll(replaceForNameRegex, ""); - - - int head = 0; - int end = -1; - if(methodStr.charAt(0) == '@'){ - head = 1; - } - end = methodStr.indexOf("("); - if(end > -1){ - methodName = methodStr.substring(head, end); - }else { - methodName = methodStr.substring(head); - } - -// System.out.println(methodStr); -// System.out.println(methodName); - - //获取方法的参数 - String[] params = getMethodParams(methodStr); - //获取方法对象 - Method method = getMethodFromName(methodName, params.length); - //返回一个方法执行者,如果为没有对应的方法则创建一个空执行者,返回原本的字符串 - if (method == null) { - invokerList.add(MethodUtil.createNullMethodInvoker(methodStr)); - } else { - invokerList.add(MethodUtil.createMethodInvoker(null, params, method)); - } - } - - //返回方法执行者的集合 - return invokerList; - } - - - /** - * 获取匹配MockUtil中的方法的正则 - * - * @return - */ - protected static String getMethodRegex() { - String collect = getMethodNameRegexs(); - //全数量参数匹配、参数间空格匹配(参数为任意字符匹配)- 有bug -// String regex = "(@"+ collect +"+((\\((.+(\\,.+)*)\\))|(\\(\\))|()|))"; - //全数量参数匹配、参数\\w匹配 -// String regex = "(@" + collect + "+((\\((\\w+(\\,\\w+)*)\\))|(\\(\\))|())?)"; - - //尝试支持中文字符串 - String regex = "(@" + collect + "{1}((\\((((\\w)+|('.+')|(\".+\"))(\\, *((\\w)+|('.+')|(\".+\")))*)\\))|(\\(\\))|())?)"; - - //值匹配0-1-2个参数 -// String regex = "(@"+ collect +"+((\\((\\w+|\\w+\\,\\w+)\\))|(\\(\\))|())?)"; - return regex; - } - - - /** - * 从方法字符串中获取参数 - * @param methodStr 方法字符串 - * @return - */ - public static String[] getMethodParams(String methodStr) { - String replaceForParamRegex = "(@" + getMethodNameRegexs() + ")|\\(|\\)"; - String[] split = methodStr.replaceAll(replaceForParamRegex, "").split("( *)\\,( *)"); - return Arrays.stream(split).map(s -> s.trim().replaceAll("['\"]", "")).filter(s -> s.length() > 0).toArray(String[]::new); - } - - /** - * 通过名称获取方法 - * @param methodName 纯方法名称 - * @param paramsLength 参数数量 - * @return MockUtil中对应方法名与参数数量的方法对象,如果没有获取到则返回null - */ - public static Method getMethodFromName(String methodName, int paramsLength) { - //过滤出方法名匹配,参数长度也匹配的方法 - Method value = null; - -// try { - Map.Entry mockMethod = Mock.getMockMethodByFilter(entry -> -// entry.getKey().replaceAll("\\([\\w\\.\\,]*\\)", "").equals(methodName) - entry.getKey().startsWith(methodName) - && entry.getValue().getParameters().length == paramsLength); - if(mockMethod != null){ - value = mockMethod.getValue(); - } -// //取值,理论上来说最后的过滤结果应该只有一个结果 -// value = Mock.getMockMethods().entrySet().stream() -// .filter(e -> e.getKey().replaceAll("\\([\\w\\.\\,]*\\)","").equals(methodName) && e.getValue().getParameters().length == paramsLength).findFirst().get().getValue(); -// } catch (NoSuchElementException e) { -// } - - //返回结果 - return value; - } - - -// private static final Pattern allMethodNameRegex; - - /** - * 获取全部方法的正则匹配字符串 - * - * @return - */ - protected static String getMethodNameRegexs() { - return Mock.getMockMethods().keySet().stream().map(method -> method.replaceAll("(\\(.*\\))", "")).distinct().collect(Collectors.joining("|", "(", ")")); - } - - - - /* —————————————————— 获取各种参数获取器的方法 —————————————————————— */ - - /* —————————————————— Integer参数获取器的方法 —————————————————————— */ - /** - * 获取随机整数的方法名,有三种参数: - * {@link com.forte.util.utils.MockUtil#integer()}获取随机数字 0-9 - * {@link com.forte.util.utils.MockUtil#integer(Integer)}获取指定长度的随机数,※不可超过int最大上限 - * {@link com.forte.util.utils.MockUtil#integer(Integer, 
Integer)}获取指定区间[a,b]的随机数,※不可超过int最大上限 - */ - private static final String INTEGER_METHOD_NAME = "@integer"; - - /** - * 获取随机整数方法执行者,由于整数三种参数中方法的执行类型不同,则需要分为三个方法 - * @return - */ - private Invoker getIntegerMethodInvoker(Integer intIntervalMin , Integer intIntervalMax){ - //拼接出方法 - //获取方法字符串 - String methodStr = INTEGER_METHOD_NAME + - "(" + - intIntervalMin + - "," + - intIntervalMax + - ")" - //获取方法字符串 - ; - - //返回方法执行者 - return getOneMethodInvoker(methodStr); - } - /** - * 获取随机整数方法执行者,由于整数三种参数中方法的执行类型不同,则需要分为三个方法 - * @return - */ - private Invoker getIntegerMethodInvoker(Integer intIntervalLength){ - //拼接出方法 - StringBuilder methodBuilder = new StringBuilder(); - methodBuilder.append(INTEGER_METHOD_NAME); - methodBuilder.append("("); - methodBuilder.append(intIntervalLength); - methodBuilder.append(")"); - - //获取方法字符串 - String methodStr = methodBuilder.toString(); - - //返回方法执行者 - return getOneMethodInvoker(methodStr); - } - /** - * 获取随机整数方法执行者,由于整数三种参数中方法的执行类型不同,则需要分为三个方法 - * @return - */ - private Invoker getIntegerMethodInvoker(){ - //拼接出方法 - //获取方法字符串 - String methodStr = INTEGER_METHOD_NAME + "()"; - - //返回方法执行者 - return getOneMethodInvoker(methodStr); - } - - /** - * 获取一个整数类型字段值获取器 - * @param intIntervalMin - * @param intIntervalMax - * @return - */ - protected IntegerFieldValueGetter getIntegerFieldValueGetter(Integer intIntervalMin , Integer intIntervalMax){ - //获取一个整数类型字段值获取器 - //获取随机整数方法执行者 - Invoker integerMethodInvoker = getIntegerMethodInvoker(intIntervalMin, intIntervalMax); - - //创建整数类型字段值获取器 - return new IntegerFieldValueGetter(integerMethodInvoker); - } - - /** - * 获取一个整数类型字段值获取器 - * @param intIntervalLength - * 整数的长度 - * @return - */ - protected IntegerFieldValueGetter getIntegerFieldValueGetter(Integer intIntervalLength){ - //获取一个整数类型字段值获取器 - //获取随机整数方法执行者 - Invoker integerMethodInvoker = getIntegerMethodInvoker(intIntervalLength); - - //创建整数类型字段值获取器 - return new IntegerFieldValueGetter(integerMethodInvoker); - } - - /** - * 获取一个整数类型字段值获取器 - * @return - */ - protected IntegerFieldValueGetter getIntegerFieldValueGetter(){ - //获取一个整数类型字段值获取器 - //获取随机整数方法执行者 - Invoker integerMethodInvoker = this::getIntegerMethodInvoker; - - //创建整数类型字段值获取器 - return new IntegerFieldValueGetter(integerMethodInvoker); - } - - - /* —————————————————— Double参数获取器的方法 —————————————————————— */ - /** - * 获取随机小数的方法名,有4种参数: - * {@link com.forte.util.utils.MockUtil#doubles(Integer, Integer, Integer, Integer)}获取制定区间[a,b]的小数,指定小数位数[endL,endR],double类型 - * {@link com.forte.util.utils.MockUtil#doubles(Integer, Integer, Integer)}获取制定区间[a,b]的小数,指定小数位数[end],double类型 - * {@link com.forte.util.utils.MockUtil#doubles(Integer, Integer)}获取指定区间[a,b]的小数,默认小数位数为0,double类型 - * {@link com.forte.util.utils.MockUtil#doubles(Integer)}获取指定数值为a的小数,默认小数位数为0,double类型 - * - */ - private final String DOUBLE_METHOD_NAME = "@doubles"; - - - - /** - * 获取随机小数的方法执行者 - * @param intIntervalMin - * @param intIntervalMax - * @param doubleIntervalMin - * @param doubleIntervalMax - * @return - */ - private Invoker getDoublesMethodInvoker(Integer intIntervalMin, Integer intIntervalMax, Integer doubleIntervalMin, Integer doubleIntervalMax){ - StringBuilder strForMethod = new StringBuilder(); - strForMethod.append(DOUBLE_METHOD_NAME); - strForMethod.append("("); - strForMethod.append(intIntervalMin); - strForMethod.append(","); - strForMethod.append(intIntervalMax); - strForMethod.append(","); - strForMethod.append(doubleIntervalMin); - strForMethod.append(","); - strForMethod.append(doubleIntervalMax); - strForMethod.append(")"); - - String 
methodStr = strForMethod.toString(); - - //获取并返回方法执行者 - return getOneMethodInvoker(methodStr); - - } - - /** - * 获取小数字段值获取器 - * @param intInterval - * 整数部分数值 - * @return - */ - protected DoubleFieldValueGetter getDoubleFieldValueGetter(Integer intInterval){ - //获取方法执行者 - String methodName = DOUBLE_METHOD_NAME + "("+ intInterval +")"; - //获取方法执行者 - Invoker oneMethodInvoker = getOneMethodInvoker(methodName); - - //创建并返回double字段值获取器 - return new DoubleFieldValueGetter(oneMethodInvoker); - } - - /** - * 获取小数字段值获取器 - * @param intIntervalMin - * 整数部分区间最小值 - * @param intIntervalMax - * 整数部分区间最大值 - * @return - */ - protected DoubleFieldValueGetter getDoubleFieldValueGetter(Integer intIntervalMin , Integer intIntervalMax){ - return getDoubleFieldValueGetter(intIntervalMin , intIntervalMax , 0 , 0); - } - - /** - * 获取小数字段值获取器 - * @param intIntervalMin - * 整数部分最小数区间 - * @param intIntervalMax - * 整数部分最大数区间 - * @param doubleInterval - * 小数部分位数 - * @return - */ - protected DoubleFieldValueGetter getDoubleFieldValueGetter(Integer intIntervalMin , Integer intIntervalMax , Integer doubleInterval){ - return getDoubleFieldValueGetter(intIntervalMin , intIntervalMax , doubleInterval , doubleInterval); - } - - /** - * 获取小数字段值获取器 - * @param intIntervalMin - * 整数部分最小数区间 - * @param intIntervalMax - * 整数部分最大数区间 - * @param doubleIntervalMin - * 小数部分最小位数区间 - * @param doubleIntervalMax - * 小数部分最大位数区间 - * @return - */ - protected DoubleFieldValueGetter getDoubleFieldValueGetter(Integer intIntervalMin , Integer intIntervalMax , Integer doubleIntervalMin , Integer doubleIntervalMax){ - //有一下4种情况种可能, - //1 - 4个参数都有 - //2 - 整数部分没有右参数 - //3 - 小数部分没有右参数 - //4 - 整数和小数都没有右边参数 - - //先判断两部分的右参数,如果没有则赋值与左参数相同 - intIntervalMax = Optional.ofNullable(intIntervalMax).orElse(intIntervalMin); - doubleIntervalMax = Optional.ofNullable(doubleIntervalMax).orElse(intIntervalMin); - - - //现在只有一种情况:4个参数都有,拼接参数 - //获取方法执行者 - Invoker oneMethodInvoker =getDoublesMethodInvoker(intIntervalMin , intIntervalMax , doubleIntervalMin , doubleIntervalMax); - - //创建并返回double字段值获取器 - return new DoubleFieldValueGetter(oneMethodInvoker); - } - - - /* —————————————————— Array参数获取器的方法 —————————————————————— */ - - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getArrayFieldValueGetter(Invoker[] invokers, Integer[] integerInterval, String[] moreStrs) { - if (integerInterval == null) { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers, moreStrs); - } else { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers, integerInterval, moreStrs); - } - } - - /** - * 获取一个List类型字段值获取器 - * @param invokers - * 方法执行者 - * @param integerInterval - * 区间参数 - * @return - */ - protected FieldValueGetter getArrayFieldValueGetter(Invoker[] invokers, Integer[] integerInterval) { - if (integerInterval == null) { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers); - } else { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers, integerInterval); - } - } - - - /** - * 获取一个数组类字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getArrayFieldValueGetter(List invokers, Integer[] integerInterval, String[] moreStrs) { - Invoker[] invokersArr = invokers.toArray(new Invoker[0]); - if (integerInterval == null) { - //创建对象并返回 - return new ArrayFieldValueGetter(invokersArr, moreStrs); - } else { - //创建对象并返回 - return new ArrayFieldValueGetter(invokersArr, integerInterval, moreStrs); - } - } - - /** - * 
获取一个数组类字段值获取器 - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getArrayFieldValueGetter(Invoker[] invokers, String[] moreStrs) { - //创建对象并返回 - return getArrayFieldValueGetter(invokers, new Integer[]{intervalMin, intervalMax}, moreStrs); - } - - /** - * 获取一个数组类型字段值获取器 - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getArrayFieldValueGetter(List invokers, String[] moreStrs) { - //创建对象并返回 - return getArrayFieldValueGetter(invokers.toArray(new Invoker[0]), new Integer[]{intervalMin, intervalMax}, moreStrs); - } - - - /** - * 获取一个数组类型字段值获取器 - * - * @param invokers 方法执行者 - */ - protected FieldValueGetter getArrayFieldValueGetter(Invoker[] invokers) { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers, new Integer[]{intervalMin, intervalMax}); - } - - /** - * 获取一个数组类型字段值获取器 - * - * @param invokers 方法执行者 - */ - protected FieldValueGetter getArrayFieldValueGetter(List invokers) { - //创建对象并返回 - return new ArrayFieldValueGetter(invokers.toArray(new Invoker[0]), new Integer[]{intervalMin, intervalMax}); - } - - - - /* —————————————————— List参数获取器的方法 —————————————————————— */ - - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval, String[] moreStrs) { - if (integerInterval == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokers, moreStrs); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokers, integerInterval, moreStrs); - } - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval1 区间参数 - * @param integerInterval2 区间参数 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval1, Integer[] integerInterval2, String[] moreStrs) { - if (integerInterval1 == null && integerInterval2 == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokers, moreStrs); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokers, integerInterval1, integerInterval2, moreStrs); - } - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval) { - if (integerInterval == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokers); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokers, integerInterval); - } - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval1 区间参数 - * @param integerInterval2 区间参数 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers, Integer[] integerInterval1, Integer[] integerInterval2) { - if (integerInterval1 == null && integerInterval2 == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokers); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokers, integerInterval1, integerInterval2); - } - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getListFieldValueGetter(List invokers, Integer[] integerInterval, String[] moreStrs) { - Invoker[] invokersArr = invokers.toArray(new Invoker[0]); - if (integerInterval == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokersArr, moreStrs); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokersArr, integerInterval, moreStrs); - } - } - - 
/** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param integerInterval 区间参数 - */ - protected FieldValueGetter getListFieldValueGetter(List invokers, Integer[] integerInterval) { - Invoker[] invokersArr = invokers.toArray(new Invoker[0]); - if (integerInterval == null) { - //创建对象并返回 - return new ListFieldValueGetter(invokersArr); - } else { - //创建对象并返回 - return new ListFieldValueGetter(invokersArr, integerInterval); - } - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers, String[] moreStrs) { - //创建对象并返回 - return getListFieldValueGetter(invokers, new Integer[]{intervalMin, intervalMax}, moreStrs); - } - - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - * @param moreStrs 多余字符 - */ - protected FieldValueGetter getListFieldValueGetter(List invokers, String[] moreStrs) { - //创建对象并返回 - return getListFieldValueGetter(invokers.toArray(new Invoker[0]), new Integer[]{intervalMin, intervalMax}, moreStrs); - } - - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - */ - protected FieldValueGetter getListFieldValueGetter(Invoker[] invokers) { - //创建对象并返回 - return new ListFieldValueGetter(invokers, new Integer[]{intervalMin, intervalMax}); - } - - /** - * 获取一个List类型字段值获取器 - * - * @param invokers 方法执行者 - */ - protected FieldValueGetter getListFieldValueGetter(List invokers) { - //创建对象并返回 - return new ListFieldValueGetter(invokers.toArray(new Invoker[0]), new Integer[]{intervalMin, intervalMax}); - } - - - - - /* —————————————————— Object参数获取器的方法 —————————————————————— */ - - - /** - * 获取一个未知引用类型参数获取器 - * - * @return - */ - protected FieldValueGetter getObjectFieldValueGetter(Invoker... invokers) { - return new ObjectFieldValueGetter(invokers); - } - - /** - * 获取一个未知引用类型参数获取器 - * - * @return - */ - protected FieldValueGetter getObjectFieldValueGetter(List invokers) { - return new ObjectFieldValueGetter(invokers.toArray(new Invoker[0])); - } - - - /* —————————————————— String参数获取器的方法 —————————————————————— */ - - - /** - * 获取一个字符串参数获取器 - * - * @param invokers 方法执行者数组 - * @param methodsSplit 多余字符数组 - * @param intervalArray 区间参数 - * @return - */ - protected FieldValueGetter getStringFieldValueGetter(Invoker[] invokers, String[] methodsSplit, Integer[] intervalArray) { - //获取 StringFieldValueGetter参数获取器,如果有整数位的区间参数,添加参数 - //使用三元运算判断 - //如果为空||长度小于1||长度大于2||索引0为空 - boolean intervalArrayIsRight = !(intervalArray == null || intervalArray.length < 1 || intervalArray.length >= 2 || intervalArray[0] == null); - return intervalArrayIsRight ? - new StringFieldValueGetter(invokers, methodsSplit, intervalArray) - : - new StringFieldValueGetter(invokers, methodsSplit); - } - - - /** - * 获取一个字符串参数获取器 - * - * @param invokers 方法执行者数组 - * @param methodsSplit 多余字符数组 - * @return - */ - protected FieldValueGetter getStringFieldValueGetter(Invoker[] invokers, String[] methodsSplit) { - //获取 StringFieldValueGetter参数获取器,如果有整数位的区间参数,添加参数 - //使用三元运算判断 - return intervalMax == null ? 
- new StringFieldValueGetter(invokers, methodsSplit) - : - new StringFieldValueGetter(invokers, methodsSplit, new Integer[]{intervalMin, intervalMax}); - } - - /** - * 获取一个字符串参数获取器 - * - * @param invokers 方法执行者集合 - * @param methodsSplit 多余字符数组 - * @return - */ - protected FieldValueGetter getStringFieldValueGetter(List invokers, String[] methodsSplit) { - return getStringFieldValueGetter(invokers.toArray(new Invoker[0]), methodsSplit); - } - - /** - * 获取一个没有多余字符的字符串字段值获取器 - * - * @param invokers 方法执行者数组 - * @return - */ - protected FieldValueGetter getStringFieldValueGetter(Invoker[] invokers) { - return getStringFieldValueGetter(invokers, null); - } - - /** - * 获取一个没有多余字符的字符串字段值获取器 - * - * @param invokers 方法执行者集合 - * @return - */ - protected FieldValueGetter getStringFieldValueGetter(List invokers) { - return getStringFieldValueGetter(invokers, null); - } - - - - - /* ————————————————————— 构造 ————————————————————— */ - - - /** - * 构造 - * - * @param objectClass - * @param fieldName - * @param intervalStr - */ - public BaseFieldParser(Class objectClass, String fieldName, String intervalStr) { - //保存数据 - this.objectClass = objectClass; - this.fieldName = fieldName; - //获取此字段的数据类型 - if(objectClass != null) - this.fieldClass = FieldUtils.fieldClassGetter(objectClass, fieldName); - else - this.fieldClass = Object.class; - //解析区间参数,如果有的话 - if (intervalStr != null) { - //切割,看看有没有小数位的区间 - //期望中,切割后长度最多为2 - String[] split = intervalStr.split("\\."); - //整数位的区间参数 - String integerInterval = split[0].trim(); - if (integerInterval.length() > 0) { - //如果不是空的,切割并记录 - String[] splitIntInterval = integerInterval.split("-"); - this.intervalMin = Integer.parseInt(splitIntInterval[0]); - this.intervalMax = splitIntInterval.length > 1 ? Integer.parseInt(splitIntInterval[1]) : null; - } else { - //否则赋值为空 - this.intervalMin = null; - this.intervalMax = null; - } - - //如果切割后的长度不只一个,说明还有小数位数 - if (split.length > 1) { - String doubleInterval = split[1].trim(); - //如果小数位上有值 - if (doubleInterval.length() > 0) { - //如果不是空的,切割并记录 - String[] splitDouInterval = doubleInterval.split("\\-"); - this.intervalDoubleMin = Integer.parseInt(splitDouInterval[0]); - this.intervalDoubleMax = splitDouInterval.length > 1 ? 
Integer.parseInt(splitDouInterval[1]) : null; - } else { - //否则赋空值 - this.intervalDoubleMax = null; - this.intervalDoubleMin = null; - } - - } else { - //如果没有,赋值为空 - this.intervalDoubleMax = null; - this.intervalDoubleMin = null; - } - } else { - //如果没有,赋值为null - this.intervalMin = null; - this.intervalMax = null; - this.intervalDoubleMax = null; - this.intervalDoubleMin = null; - - } - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/DoubleParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/DoubleParser.java deleted file mode 100644 index ceec839..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/DoubleParser.java +++ /dev/null @@ -1,203 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; - -import java.util.ArrayList; -import java.util.Optional; - -/** - * double浮点型参数解析器,相对与指令解析,此解析器相对比较简单。 - * 只需要根据区间参数,获取相应的浮点数即可 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class DoubleParser extends BaseFieldParser { - - /** - * 参数默认值,唯一构造保证其必定有值,默认值为0 - */ - private Double defaultValue; - - - /** - * 当字段既不是集合也不是数组的时候,获取其字段值获取器 - * 字段不是集合也不是数组,且参数为Double,则说明区间参数代表了取值的随机范围 - * @return - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - //判断小数部分的区间参数 - 将参数判断提为单独的方法 - //获取区间参数 - Integer[][] intervalsData = getIntervalsData(); - - //整理结果 - //map 转化为doubleFieldValueGetter - //如果为空则返回默认值 - return Optional.ofNullable(intervalsData).map(i -> (FieldValueGetter) getDoubleFieldValueGetter(i[0][0], i[0][1], i[1][0], i[1][1])).orElse(() -> defaultValue); - } - - /** - * 当字段是集合的时候,获取其字段值获取器 - * 如果字段为集合,参数为Double - * 则区间参数变为List集合的长度区间,将无视小数部分的区间参数 - * list集合的值为默认值 - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //获取整数区间参数 - Integer[] intIntervalsData = getIntIntervalsData(); - //将区间参数转化为list集合,如果不存在则返回空集合 - //并返回 - return Optional.ofNullable(intIntervalsData).map(i -> { - //获取空值执行者 - Invoker nullMethodInvoker = MethodUtil.createNullMethodInvoker(defaultValue); - //返回结果 - return getListFieldValueGetter(new Invoker[]{nullMethodInvoker}, i); - }).orElse(ArrayList::new); - } - - /** - * 当字段是数组的时候,获取其字段值获取器 - * 如果字段为数组,参数为Double - * 则区间参数变为数组的长度区间,将无视小数部分的区间参数 - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - // 将结果整合为数组并返回 - // 方式类似于List集合,仅返回值不同 - //获取整数区间参数 - Integer[] intIntervalsData = getIntIntervalsData(); - //将区间参数转化为数组,如果不存在则返回空集合 - //并返回 - return Optional.ofNullable(intIntervalsData).map(i -> { - //获取空值执行者 - Invoker nullMethodInvoker = MethodUtil.createNullMethodInvoker(defaultValue); - //返回结果 - return getArrayFieldValueGetter(new Invoker[]{nullMethodInvoker}, i); - }).orElse(() -> new Double[0]); - } - - - - /** - * 仅获取整数部分的区间参数 - * @return - */ - private Integer[] getIntIntervalsData(){ - //准备数组 - Integer[] intIntervals = new Integer[2]; - - //初始化参数 - Integer intIntervalMin = intervalMin; - Integer intIntervalMax = intervalMax; - - //判断整数位区间参数 - if (intIntervalMin == null) { - //如果没有整数位左参数,判断是否存在右参数 - if (intIntervalMax == null) { - // 如果右参数也为null,直接返回null - return null; - } else { - // 有右参数,没有左参数,按照仅有左参数处理 - intIntervalMin = intIntervalMax; - } - }else{ - //有左参数,判断是否有右参数 - if (intIntervalMax == null) { - // 如果右参数为null,赋值为左参数 - intIntervalMax = intIntervalMin; - } - // 两参数都有,不变 - } - - //为结果赋值 - 
intIntervals[0] = intIntervalMin; - intIntervals[1] = intIntervalMax; - - //返回结果 - return intIntervals; - } - - /** - * 仅获取小数部分的区间参数 - * @return - */ - private Integer[] getDoubleIntervalsData(){ - //准备结果 - Integer[] doubleIntervals = new Integer[2]; - - //初始化参数 - Integer doubleIntervalMin = intervalDoubleMin; - Integer doubleIntervalMax = intervalDoubleMax; - - //判断小数区间,判断方法与整数基本一致 - if (doubleIntervalMin == null) { - //如果没有整数位左参数,判断是否存在右参数 - if (doubleIntervalMax == null) { - // 如果右参数也为null,将两值设为0 - doubleIntervalMin = doubleIntervalMax = 0; - } else { - // 有右参数,没有左参数,按照仅有左参数处理 - doubleIntervalMin = doubleIntervalMax; - } - }else{ - //有左参数,判断是否有右参数 - if (doubleIntervalMax == null) { - // 如果右参数为null,赋值为左参数 - doubleIntervalMax = doubleIntervalMin; - } - // 两参数都有,不变 - } - - - //为结果赋值 - doubleIntervals[0] = doubleIntervalMin; - doubleIntervals[1] = doubleIntervalMax; - - //返回结果 - return doubleIntervals; - } - - /** - * 获取转化好的区间参数 - * @return - * 返回值为二维数组, - * 索引0为整数部分的区间数组,[0]为左参数,[1]为右参数 - * 索引1为小数部分的区间参数,[0]为左参数,[1]为右参数 - * 假如返回值为null,则说明没有区间参数 - */ - private Integer[][] getIntervalsData(){ - Integer[] intIntervals = new Integer[2]; - Integer[] doubleIntervals = new Integer[2]; - Integer[][] intervals = new Integer[][]{intIntervals , doubleIntervals}; - - //获取整数部分区间参数,可能为null - Integer[] intIntervalsData = getIntIntervalsData(); - //获取小数部分区间参数,此结果必定不为null - Integer[] doubleIntervalsData = getDoubleIntervalsData(); - - //返回结果 - //如果整数区间不为null,则转化为二维数组并返回,如果不存在则返回null - return Optional.ofNullable(intIntervalsData).map(i -> new Integer[][]{intIntervalsData , doubleIntervalsData}).orElse(null); - } - - - /** - * 构造 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param defaultValue 默认值,如果未null则默认为0 - */ - public DoubleParser(Class objectClass, String fieldName, String intervalStr, Double defaultValue) { - super(objectClass, fieldName, intervalStr); - //如果默认值为null,赋值为0 - this.defaultValue = defaultValue == null ? 
0 : defaultValue; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/FieldParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/FieldParser.java deleted file mode 100644 index 7169d5b..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/FieldParser.java +++ /dev/null @@ -1,42 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.mockbean.MockField; - -/** - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public interface FieldParser { - - /** - * 若字段不是List集合时的解析方法 - * - * @return 字段值获取器 - */ - FieldValueGetter parserForNotListOrArrayFieldValueGetter(); - - - /** - * 若字段是集合时的解析方法 - * - * @return 字段值获取器 - */ - FieldValueGetter parserForListFieldValueGetter(); - - - /** - * 若字段是数组时的解析方法 - * - * @return 字段值获取器 - */ - FieldValueGetter parserForArrayFieldValueGetter(); - - - /** - * 获取一个假字段对象,为参数解析器准备 - * @return - */ - MockField getMockField(); - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/InstructionParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/InstructionParser.java deleted file mode 100644 index 2322803..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/InstructionParser.java +++ /dev/null @@ -1,241 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.Mock; -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.mockbean.MockObject; -import com.forte.util.utils.MethodUtil; - -import java.util.List; -import java.util.Map; - -/** - * 指令字段解析器 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class InstructionParser extends BaseFieldParser { - - /** - * 指令的字符串 - */ - private final String instructionStr; - - - /** - * 当字段是数组的时候,进行解析 - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //字段是数组类型时的解析方法 - //字段值获取器 - FieldValueGetter fieldValueGetter; - - // 如果指令参数开头为井号'#',则先判断其为一个已经存在的类型名称 - if(instructionStr.startsWith("#")){ - final MockObject mockObject = Mock.get(instructionStr.substring(1)); - if(mockObject != null){ - fieldValueGetter = getArrayFieldValueGetter(new Invoker[]{mockObject::getOne}); - return fieldValueGetter; - } - } - - //解析指令,查找指令中的@方法 - //先判断是否有匹配的方法 - boolean match = match(instructionStr); - if (match) { - //如果指令中有方法 - //解析出方法名 - String[] methods = getMethods(instructionStr); - //解析方法并获取方法执行者 - List invoker = getMethodInvoker(methods); - //获取多余字符 - String[] methodsSplit = getMethodsSplit(instructionStr); - //获取list类型字段值获取器 - fieldValueGetter = getArrayFieldValueGetter(invoker, methodsSplit); - - } else { - //如果没有@方法,则说明list集合是String集合或者是可以使用eval函数执行的js代码 - //则创建一个空执行者,将参数作为其输出值 - Invoker nullInvoker = MethodUtil.createNullMethodInvoker(instructionStr); - - //创建字段值获取器 - fieldValueGetter = getArrayFieldValueGetter(new Invoker[]{nullInvoker}); - } - - //返回字段值获取器 - return fieldValueGetter; - } - - /** - * 当字段是list集合的时候进行解析 - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //字段是数组类型时的解析方法 - //字段值获取器 - FieldValueGetter fieldValueGetter; - - // 如果指令参数开头为井号'#',则先判断其为一个已经存在的类型名称 - if(instructionStr.startsWith("#")){ - final MockObject mockObject = Mock.get(instructionStr.substring(1)); - if(mockObject != null){ - fieldValueGetter = getListFieldValueGetter(new 
Invoker[]{mockObject::getOne}); - return fieldValueGetter; - } - } - - //解析指令,查找指令中的@方法 - //先判断是否有匹配的方法 - boolean match = match(instructionStr); - if (match) { - //如果指令中有方法 - //解析出方法名 - String[] methods = getMethods(instructionStr); - //解析方法并获取方法执行者 - List invoker = getMethodInvoker(methods); - //获取多余字符 - String[] methodsSplit = getMethodsSplit(instructionStr); - //获取list类型字段值获取器 - fieldValueGetter = getListFieldValueGetter(invoker, methodsSplit); - - } else { - //如果没有@方法,则说明list集合是String集合或者是可以使用eval函数执行的js代码 - //则创建一个空执行者,将参数作为其输出值 - Invoker nullInvoker = MethodUtil.createNullMethodInvoker(instructionStr); - - //创建字段值获取器 - fieldValueGetter = getListFieldValueGetter(new Invoker[]{nullInvoker}); - } - - - //返回字符值获取器 - return fieldValueGetter; - } - - - /** - * 当字段既不是数组又不是集合的时候,进行解析 - * - * @return 字段值获取器 - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - /* - 假如字段类型为Object类型且存在区间参数,则认为这是一个需要转化为集合的类型,即认为字段类型为List类型,直接使用List字段值生成器 - 区间参数只要存在左参数即为存在 - */ - boolean isObjectToList = this.fieldClass.equals(Object.class) && (intervalMin != null); - if(isObjectToList){ - return this.parserForListFieldValueGetter(); - } - - - //字段值获取器 - FieldValueGetter fieldValueGetter; - - // 如果指令参数开头为井号'#',则先判断其为一个已经存在的类型名称 - if(instructionStr.startsWith("#")){ - final MockObject mockObject = Mock.get(instructionStr.substring(1)); - if(mockObject != null){ - fieldValueGetter = mockObject::getOne; - return fieldValueGetter; - } - } - - //解析指令,查找指令中的@方法 - //先判断是否有匹配的方法 - boolean match = match(instructionStr); - if (match) { - //如果存在指令方法 - //解析出方法名 - String[] methods = getMethods(instructionStr); - //如果有方法,解析方法,解析参数-由于做过判断,所以此处必然有方法 - - //解析方法并获取方法执行者 - List invoker = getMethodInvoker(methods); - - //获取多余字符 - String[] methodsSplit = getMethodsSplit(instructionStr); - //如果参数多余字符不为0,则参数类型必定为String且methodsSplit[]的长度必定为methods[]的长度+1或与methods[]的长度相等 - //则必定为字符串字段 - //有指令方法的时候,如果有区间参数,对字符串的最终输出进行重复 - if (methodsSplit.length > 0) { - //获取 StringFieldValueGetter字段值获取器,如果有整数位的区间参数,添加参数 - //使用三元运算判断 - fieldValueGetter = getStringFieldValueGetter(invoker, methodsSplit); - } else { - //如果没有多余字符,则字段可能不是字符串类型, - //判断字段的数据类型 - if (fieldClass.equals(String.class)) { - //如果是String类型的,使用StringFieldValueGetter字段值获取器 - fieldValueGetter = getStringFieldValueGetter(invoker); - } else { - //如果字段类型不是String,则不能用StringFieldValueGetter字段值获取器了 - //使用ObjectFieldValueGetter,不指定参数获取值 - fieldValueGetter = getObjectFieldValueGetter(invoker); - } - } - } else { - //如果没有能够匹配的@方法,则说明指令部分就是普通的字符串,创建一个方法执行者为空值的未知类型字段值获取器: ObjectFieldValueGetter - Integer[] intervalData = getIntervalData(); - //判断区间参数是否存在 - if(intervalData == null){ - //因为是没有@方法的普通字符串,没有多余字符,使用Object类型的字段值获取器即可 - fieldValueGetter = getObjectFieldValueGetter(MethodUtil.createNullMethodInvoker(instructionStr)); - }else{ - //有区间参数,获取一个对字符串重复输出的Invoker - fieldValueGetter = getStringFieldValueGetter(new Invoker[]{MethodUtil.createNullMethodInvoker(instructionStr)}); - } - - - } - //返回结果 - return fieldValueGetter; - } - - - /** - * 获取区间参数区间,如果没有区间参数则返回null - * @return - */ - private Integer[] getIntervalData(){ - //获取参数 - Integer min = intervalMin; - Integer max = intervalMax; - - //判断区间参数 - if(min == null){ - //如果没左参数 - if(max == null){ - //如果右参数也没有,直接返回一个[1,1]的区间 - return null; - }else{ - //如果有右参数,参数同化 - min = max; - } - }else{ - //有左参数,判断右参数 - if(max == null){ - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min ,max}; - } - - /** - * 构造 - * - * @param objectClass 类的class对象 - * @param fieldName 字段名称 - * @param intervalStr 
区间参数字符串 - * @param instructionStr 指令字符串 - */ - public InstructionParser(Class objectClass, String fieldName, String intervalStr, String instructionStr) { - super(objectClass, fieldName, intervalStr); - this.instructionStr = instructionStr; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/IntegerParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/IntegerParser.java deleted file mode 100644 index 1a26182..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/IntegerParser.java +++ /dev/null @@ -1,128 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; - -import java.util.ArrayList; -import java.util.Optional; - -/** - * Integer整数类型字段解析器 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class IntegerParser extends BaseFieldParser { - - /** 默认值 */ - private Integer defaultValue; - - - /** - * 当字段类型既不是list集合也不是数组的时候,获取字段值获取器 - * @return - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - //既不是list也不是数组,则结果为一个整数 - //获取区间参数并获取随机整数的方法执行者 - //判断区间参数 - if(intervalMin == null){ - //如果没有左参数,判断是否有右参数 - if(intervalMax == null){ - //如果右参数也为0,则直接返回默认值字段值获取器 -> 使用Lambda表达式 - return () -> defaultValue; - }else{ - //如果有右参数,将有参数视为唯一参数,获取字段值获取器 - return getIntegerFieldValueGetter(intervalMax); - } - }else{ - //如果有左参数,判断有没有右参数 - if(intervalMax == null){ - //如果没有右参数,则将左参数视为唯一参数,获取字段值获取器 - return getIntegerFieldValueGetter(intervalMin); - }else{ - //如果两参数都有,获取字段值获取器 - return getIntegerFieldValueGetter(intervalMin, intervalMax); - } - } - } - - - /** - * 当字段值是List集合时,获取字段值获取器 - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //字段值是List集合形式,则区间参数为list集合的长度,输出区间内数量的默认值 - //获取一个默认值方法执行者 - Invoker nullMethodInvoker = MethodUtil.createNullMethodInvoker(defaultValue); - //获取区间参数数组 - Integer[] interValData = getInterValData(); - //如果区间参数不为null,获取List字段值获取器,否则获取一个空的集合字段值获取器 - return Optional.ofNullable(interValData).map(i -> getListFieldValueGetter(new Invoker[]{nullMethodInvoker}, i)).orElse(ArrayList::new); - } - - - /** - * 当字段值是数组时,获取字段值获取器 - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //字段值是数组形式,则区间参数为数组的长度,输出区间内数量的默认值 - //获取一个默认值方法执行者 - Invoker nullMethodInvoker = MethodUtil.createNullMethodInvoker(defaultValue); - //获取区间参数数组 - Integer[] interValData = getInterValData(); - //如果区间参数不为null,获取数组字段值获取器,否则获取一个空的数组字段值获取器 - return Optional.ofNullable(interValData).map(i -> getArrayFieldValueGetter(new Invoker[]{nullMethodInvoker}, i)).orElse(() -> new Integer[0]); - } - - - /** - * 获取区间参数数组,用于获取List或数组类型的字段值获取器 - * 长度为2,分别代表最小值,最大值 - * @return - */ - private Integer[] getInterValData(){ - Integer min = intervalMin; - Integer max = intervalMax; - //判断区间参数 - if(min == null){ - //如果没有左参数,判断是否有右参数 - if(max == null){ - //如果右参数也为0,则直接返回null - return null; - }else{ - //如果有右参数,区间参数同化 - min = max; - } - }else{ - //如果有左参数,判断有没有右参数 - if(max == null){ - //如果没有右参数,区间参数同化 - max = min; - } - //如果两参数都有,参数不变 - } - - //返回区间参数数组 - return new Integer[]{min, max}; - } - - - /** - * 构造 - * - * @param objectClass - * @param fieldName - * @param intervalStr - */ - public IntegerParser(Class objectClass, String fieldName, String intervalStr, Integer defaultValue) { - super(objectClass, fieldName, intervalStr); - //默认值赋值 - this.defaultValue = 
defaultValue; - - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ListParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ListParser.java deleted file mode 100644 index dea26cc..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ListParser.java +++ /dev/null @@ -1,128 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.utils.MethodUtil; - -import java.util.ArrayList; -import java.util.List; -import java.util.Optional; - -/** - * List类型字段解析器 - * 解析方式与过程大致与{@link ArraysParser}相同 - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class ListParser extends BaseFieldParser { - - - /** 参数传入的数组 */ - private final List defaultList; - - /** - * 字段类型既不是list集合也不是数组的时候 - * 说明从默认集合中随机获取一个并返回 - * @return - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - //创建一个集合元素获取执行者 - Invoker invoker = MethodUtil.createListElementInvoker(defaultList); - return getObjectFieldValueGetter(invoker); - } - - /** - * 字段类型是list集合的时候 - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //转化并返回结果 - return getListFieldValueGetter(); - } - - /** - * 字段类型是数组的时候 - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //转化并返回结果 - return getArrayFieldValueGetter(); - } - - /** - * 获取数组字段值获取器 - * @return - */ - private FieldValueGetter getArrayFieldValueGetter(){ - //获取随机元素值执行者 - Invoker invoker = MethodUtil.createListElementInvoker(defaultList); - //因为区间不可能为null,直接转化并返回 - return Optional.of(getIntervalData()).map(i -> { - //如果有区间参数,根据区间参数获取字段值获取器 - return getArrayFieldValueGetter(new Invoker[]{invoker}, i); - }).get(); - } - - /** - * 获取集合字段值获取器 - * @return - */ - private FieldValueGetter getListFieldValueGetter(){ - //获取随机元素值执行者 - Invoker invoker = MethodUtil.createListElementInvoker(defaultList); - //因为区间不可能为null,直接转化并返回 - return Optional.of(getIntervalData()).map(i -> { - //如果有区间参数,根据区间参数获取字段值获取器 - return getListFieldValueGetter(new Invoker[]{invoker}, i); - }).get(); - } - - - - /** - * 获取区间参数区间,如果没有区间参数则返回区间[1,1] - * @return - */ - private Integer[] getIntervalData(){ - //获取参数 - Integer min = intervalMin; - Integer max = intervalMax; - - //判断区间参数 - if(min == null){ - //如果没左参数 - if(max == null){ - //如果右参数也没有,直接返回一个[1,1]的区间 - return new Integer[]{1,1}; - }else{ - //如果有右参数,参数同化 - min = max; - } - }else{ - //有左参数,判断右参数 - if(max == null){ - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min ,max}; - } - - - /** - * 构造 - * @param objectClass - * @param fieldName - * @param intervalStr - */ - public ListParser(Class objectClass, String fieldName, String intervalStr, List defaultList) { - super(objectClass, fieldName, intervalStr); - //参数集合,复制一份而并非使用原来的 ->浅拷贝 - this.defaultList = new ArrayList<>(defaultList); - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/MockObjectParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/MockObjectParser.java deleted file mode 100644 index ec3097b..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/MockObjectParser.java +++ /dev/null @@ -1,154 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import 
com.forte.util.invoker.Invoker; -import com.forte.util.mockbean.MockObject; - -import java.util.concurrent.atomic.AtomicReference; -import java.util.function.Supplier; - -/** - * Mock数据类型字段解析器 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class MockObjectParser extends BaseFieldParser { - - - /** - * 参数中的默认值 - */ - private Supplier> defaultValueSupplier; - - private AtomicReference> fieldValueGetter = new AtomicReference<>(null); - - /** - * 当字段类型既不是集合也不是数组的时候 - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - // 如果字段类型为Object且存在左区间参数,则认为这是一个List类型 - boolean isList = this.fieldClass.equals(Object.class) && (intervalMin != null); - - if (isList) { - return parserForListFieldValueGetter(); - } - //直接获取一个默认值参数获取器 - return () -> getObjectValue().get(); - } - - /** - * 如果字段类型是集合类型 - * - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //使用集合字段类型获取器 - return getListFieldValueGetter(new Invoker[]{() -> getObjectValue().get()}, getIntervalData(), getIntervalDoubleData()); - } - - /** - * 如果字段类型是数组类型 - * - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //使用集合字段类型获取器 - return getArrayFieldValueGetter(new Invoker[]{() -> getObjectValue().get()}, getIntervalData()); - } - - /** - * 得到获取objectValue的函数。 - * 初始化后,会决定最终是通过Object获取还是MockObject获取。 - */ - private Supplier getObjectValue(){ - return fieldValueGetter.updateAndGet(old -> { - if (old != null) { - return old; - } - final MockObject mockObject = defaultValueSupplier.get(); - if (mockObject == null) { - return () -> defaultValueSupplier; - } else { - return mockObject::getOne; - } - }); - } - - /** - * 获取区间参数区间,如果没有区间参数则返回区间[1,1] - */ - private Integer[] getIntervalData() { - //获取参数 - Integer min = intervalMin; - Integer max = intervalMax; - - //判断区间参数 - if (min == null) { - //如果没左参数 - if (max == null) { - //如果右参数也没有,直接返回一个[1,1]的区间 - return new Integer[]{1, 1}; - } else { - //如果有右参数,参数同化 - min = max; - } - } else { - //有左参数,判断右参数 - if (max == null) { - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min, max}; - } - - /** - * 获取小数区间参数区间,如果没有区间参数则返回区间[1,1] - */ - private Integer[] getIntervalDoubleData() { - //获取参数 - Integer min = intervalDoubleMin; - Integer max = intervalDoubleMax; - - //判断区间参数 - if (min == null) { - //如果没左参数 - if (max == null) { - //如果右参数也没有,直接返回一个[1,1]的区间 - return new Integer[]{1, 1}; - } else { - //如果有右参数,参数同化 - min = max; - } - } else { - //有左参数,判断右参数 - if (max == null) { - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min, max}; - } - - - /** - * 构造 - * - * @param objectClass 最终的mock对象类型 - * @param fieldName 字段名称 - * @param intervalStr 区间参数列表 - */ - public MockObjectParser(Class objectClass, String fieldName, String intervalStr, Supplier> mockObjectSupplier) { - super(objectClass, fieldName, intervalStr); - //赋值 - this.defaultValueSupplier = mockObjectSupplier; - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ObjectParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ObjectParser.java deleted file mode 100644 index 7e78f66..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ObjectParser.java +++ /dev/null @@ -1,99 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.invoker.Invoker; - -/** - * 引用数据类型字段解析器 - * @author 
ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -class ObjectParser extends BaseFieldParser { - - - /** 参数中的默认值 */ - private Object defaultValue; - - /** - * 当字段类型既不是集合也不是数组的时候 - * @return - */ - @Override - public FieldValueGetter parserForNotListOrArrayFieldValueGetter() { - // 如果字段类型为Object且存在左区间参数,则认为这是一个List类型 - boolean isList = this.fieldClass.equals(Object.class) && (intervalMin != null); - - if(isList){ - return parserForListFieldValueGetter(); - } - //直接获取一个默认值参数获取器 - return () -> defaultValue; - } - - /** - * 如果字段类型是集合类型 - * @return - */ - @Override - public FieldValueGetter parserForListFieldValueGetter() { - //使用集合字段类型获取器 - return getListFieldValueGetter(new Invoker[]{() -> defaultValue} , getIntervalData()); - } - - /** - * 如果字段类型是数组类型 - * @return - */ - @Override - public FieldValueGetter parserForArrayFieldValueGetter() { - //使用集合字段类型获取器 - return getArrayFieldValueGetter(new Invoker[]{() -> defaultValue} , getIntervalData()); - } - - - /** - * 获取区间参数区间,如果没有区间参数则返回区间[1,1] - * @return - */ - private Integer[] getIntervalData(){ - //获取参数 - Integer min = intervalMin; - Integer max = intervalMax; - - //判断区间参数 - if(min == null){ - //如果没左参数 - if(max == null){ - //如果右参数也没有,直接返回一个[1,1]的区间 - return new Integer[]{1,1}; - }else{ - //如果有右参数,参数同化 - min = max; - } - }else{ - //有左参数,判断右参数 - if(max == null){ - //没有右参数,同化 - max = min; - } - //否则都有,不变 - } - //返回结果 - return new Integer[]{min ,max}; - } - - - /** - * 构造 - * - * @param objectClass - * @param fieldName - * @param intervalStr - */ - public ObjectParser(Class objectClass, String fieldName, String intervalStr, Object defaultValue) { - super(objectClass, fieldName, intervalStr); - //赋值 - this.defaultValue = defaultValue; - - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ParameterParser.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ParameterParser.java deleted file mode 100644 index a28d6a1..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/parser/ParameterParser.java +++ /dev/null @@ -1,633 +0,0 @@ -package com.forte.util.parser; - -import com.forte.util.Mock; -import com.forte.util.factory.MockBeanFactory; -import com.forte.util.fieldvaluegetter.ArrayFieldValueGetter; -import com.forte.util.fieldvaluegetter.FieldValueGetter; -import com.forte.util.fieldvaluegetter.ListFieldValueGetter; -import com.forte.util.invoker.Invoker; -import com.forte.util.mockbean.ConstMockField; -import com.forte.util.mockbean.MockBean; -import com.forte.util.mockbean.MockField; -import com.forte.util.mockbean.MockMapBean; -import com.forte.util.utils.FieldUtils; - -import java.util.*; - -/** - * 参数解析器,用于解析用户填入的参数语法 - * 解析包({@link com.forte.util.parser})下唯一公共接口,为{@link Mock}解析用户参数 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class ParameterParser { - -// /** -// * MockUtil的方法集 -// */ -// @Deprecated -// private static final Map MOCK_METHOD = Mock._getMockMethod(); - - /* ———— 目前预期类型 ———— - * String / Double / Integer / Map / Object / array&list - */ - - /** - * TODO 注册各种解析器,不再使用switch分配解析器 - */ - // private static final Map, TypeParse> TYPE_PARSE_MAP = new HashMap<>(4); - - /* 内部使用的各类型的常量,用于switch语句,为参数分配解析器 */ - - private static final int TYPE_STRING = 0; - private static final int TYPE_DOUBLE = 1; - private static final int TYPE_INTEGER = 2; - private static final int TYPE_MAP = 3; - private static final int TYPE_OBJECT = 4; - private static final int TYPE_LIST = 5; - private static final int 
TYPE_ARRAY = 6; - - private static final int TYPE_CLASS = 7; - - /** 常量类型区间。 */ - private static final String CONST_INTERVAL = "const"; - - - /** - * 对参数进行解析-普通类型 - * - * @param objectClass 需要进行假数据封装的类对象 - * @param paramMap 参数集合 - */ - public static MockBean parser(Class objectClass, Map paramMap) { - //使用线程安全list集合 - List fields = new ArrayList<>(); - - Map copyParamsMap = new HashMap<>(paramMap); - - //单程遍历并解析 - copyParamsMap.forEach((key, value) -> { - //解析 - //切割名称,检测是否有区间函数 - String[] split = key.split("\\|"); - //字段名 - String fieldName = split[0]; - //区间参数字符串 - String intervalStr = split.length > 1 ? split[1] : null; - //如果对象不是Map类型且对象中不存在此字段,不进行解析 - if (FieldUtils.isFieldExist(objectClass, fieldName)) { - parser(objectClass, fieldName, intervalStr, value, fields); - } - }); - - - //解析结束,封装MockObject对象 - return getMockBean(objectClass, fields); - } - - /** - * 对参数进行解析-map类型 - * - * @param paramMap 参数集合 - */ - public static MockMapBean parser(Map paramMap) { - List fields = new ArrayList<>(); - - //遍历并解析-(多线程同步) - //如果是.entrySet().parallelStream().forEach的话,似乎会出现一个迷之bug - //如果结果没有任何输出语句打印控制台,会报NullPointer的错 - //已解决,需要使fields这个集合成为线程安全的集合 - /* - 2019/10/18 - 不是干什么都是多线程是好的的,所以遍历不在使用多线程遍历了~ - */ - Set> entries = paramMap.entrySet(); - for (Map.Entry entry : entries) { - String key = entry.getKey(); - Object value = entry.getValue(); - //解析 - //切割名称,检测是否有区间函数 - String[] split = key.split("\\|"); - //字段名 - String fieldName = split[0]; - //区间参数字符串 - String intervalStr = split.length > 1 ? split[1] : null; - //进行解析 - parser(null, fieldName, intervalStr, value, fields); - } - - //解析结束,封装MockObject对象 - return getMockMapBean(fields); - } - - - /** - * 解析 - * - * @param objectClass 封装类型 - * @param fieldName 字段名称 - * @param intervalStr 区间字符串。可能是 const 或者数值区间。 - * @param value 参数 - * @param fields 保存字段用的list - */ - private static void parser(Class objectClass, String fieldName, String intervalStr, Object value, List fields) { - /* - 判断参数类型 - 预期类型: - String / Double / Integer / Map> / object / array 。。。。。。 - */ - //准备字段解析器 - //准备假字段对象 - MockField mockField = null; - - if(intervalStr != null && intervalStr.equalsIgnoreCase(CONST_INTERVAL)) { - // 区间参数为 常量类型,直接使用ConstMockField - mockField = new ConstMockField<>(objectClass, fieldName, value, value.getClass()); - - } else { - int typeNum = typeReferee(value); - //根据字段参数分配解析器 - switch (typeNum) { - case TYPE_STRING: - //是字符串,使用指令解析器 - //获取假字段封装类 - mockField = stringTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_DOUBLE: - //是Double的浮点型,使用double浮点解析器 - //获取假字段封装类 - mockField = doubleTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_INTEGER: - //整数解析并获取假字段封装类 - mockField = integerTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_OBJECT: - //使用解析器,如果字段类型是集合或数组要重复输出 - mockField = objectTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_MAP: - //如果是一个Map集合,说明这个字段映射着另一个假对象 - //这个Map集合对应的映射类型应该必然是此字段的类型 - //获取假字段对象 - mockField = mapTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_ARRAY: - //如果value是数组类型,使用数组类型解析器进行解析 - mockField = arrayTypeParse(objectClass, fieldName, intervalStr, value); - break; - case TYPE_LIST: - //如果value是list集合类型,使用集合类型解析器解析 - mockField = listTypeParse(objectClass, fieldName, intervalStr, value); - break; - - case TYPE_CLASS: - // 如果value是class类型的 - mockField = classTypeParse(objectClass, fieldName, intervalStr, value); - break; - default: - System.out.println("无法解析映射类[ " + objectClass + " ]中的字段:" + 
fieldName); - break; - } - } - - - //添加假字段对象 - fields.add(mockField); - - } - - - - /* —————————————————————— 各个情况的解析方法 —————————————————— */ - - /** - * 字符串类型参数解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField stringTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //是字符串,使用指令解析器 - FieldParser fieldParser = new InstructionParser(objectClass, fieldName, intervalStr, (String) value); - //获取假字段封装类 - return fieldParser.getMockField(); - } - - /** - * double浮点类型解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField doubleTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //是Double的浮点型,使用double浮点解析器 - FieldParser fieldParser = new DoubleParser(objectClass, fieldName, intervalStr, (Double) value); - //获取假字段封装类 - return fieldParser.getMockField(); - } - - /** - * 整数类型解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField integerTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //准备字段解析器 - FieldParser fieldParser; - //如果是整数参数,判断区间参数是否有小数区间 - //如果有区间参数,进行判断 - if (intervalStr != null) { - String[] intervalSplit = intervalStr.split("\\."); - if (intervalSplit.length > 1) { - //如果切割'.'之后长度大于1,则说明有小数位数,使用浮点数解析器 - fieldParser = new DoubleParser(objectClass, fieldName, intervalStr, ((Integer) value) * 1.0); - } else { - //如果长度不大于1,则说明没有小数位数,使用整形解析器 - fieldParser = new IntegerParser(objectClass, fieldName, intervalStr, (Integer) value); - } - } else { - //如果没有区间参数,直接使用整数解析器(此处的intervalStr必定为null) - fieldParser = new IntegerParser(objectClass, fieldName, null, (Integer) value); - } - - //获取假字段封装类 - return fieldParser.getMockField(); - } - - - /** - * 未知的引用数据类型解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField objectTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { -// ObjectParser objectParser = new ObjectParser(objectClass, fieldName, intervalStr, value); - //返回假字段对象 - return new ObjectParser(objectClass, fieldName, intervalStr, value).getMockField(); - } - - - /** - * Map集合类型解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField mapTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - - - //解析区间字符串 - 只关心整数部分字符串 - //切割取整数位 - Integer intervalMin, intervalMax; - // 区间数组,左区间与右区间 - Integer[] integerInterval = new Integer[2]; - if (intervalStr != null) { - String integerIntervalStr = intervalStr.split("\\.")[0].trim(); - //如果不是空的,切割并记录 - String[] splitIntInterval = integerIntervalStr.split("-"); - intervalMin = Integer.parseInt(splitIntInterval[0]); - intervalMax = splitIntInterval.length > 1 ? Integer.parseInt(splitIntInterval[1]) : null; - integerInterval[0] = intervalMin; - integerInterval[1] = intervalMax == null ? 
intervalMin : intervalMax; - } else { -// intervalMin = intervalMax = null; - integerInterval = null; - } - - //如果是一个Map集合,说明这个字段映射着另一个假对象 - //也有可能只是一个普通的Map而不是映射关系 - //需要判断字段的类型,如果字段类型也是Map,则不进行映射解析而是转化为ObjectField - //这个Map集合对应的映射类型应当必然是此字段的类型 - //获取此字段的class类型 - Class fieldClass; - if (objectClass != null) { - fieldClass = FieldUtils.fieldClassGetter(objectClass, fieldName); - } else { - fieldClass = Object.class; - } - - - //判断类型 - if (FieldUtils.isChild(fieldClass, Map.class)) { - // 直接返回此对象作为假字段对象,不做处理 - return getDefaultObjectMockField(objectClass, fieldName, value, integerInterval, fieldClass); - } else if (FieldUtils.isChild(fieldClass, List.class) && FieldUtils.getListFieldGeneric(objectClass, fieldName).equals(Map.class)) { - //如果字段类型是List集合而且集合的泛型是Map类型,使用Object类型解析器 - ObjectParser objectParser = new ObjectParser(objectClass, fieldName, intervalStr, value); - return objectParser.getMockField(); - } else { - //将参数转化为Map类型 - Map fieldMap = (Map) value; - //如果字段不是Map类型 - //判断字段是否为list集合类型或数组类型 - if (FieldUtils.isChild(fieldClass, List.class)) { - //是list集合类型,获取集合的泛型类型 - Class fieldListGenericClass = FieldUtils.getListFieldGeneric(objectClass, fieldName); - //获取一个假对象 - //同时保存此对象的解析 - MockBean parser = Mock.setResult(fieldListGenericClass, fieldMap, true); - - FieldValueGetter fieldValueGetter = objectToListFieldValueGetter(parser, intervalStr); - return new MockField<>(objectClass, fieldName, fieldValueGetter, fieldClass); - } else if (fieldClass.isArray()) { - //是数组类型,获取数组的类型信息 - Class fieldArrayGeneric = FieldUtils.getArrayGeneric(fieldClass); - //获取一个假对象 -// MockBean parser = parser(fieldArrayGeneric, fieldMap); - //同时保存此对象的解析 - MockBean parser = Mock.setResult(fieldArrayGeneric, fieldMap, true); - - FieldValueGetter fieldValueGetter = objectToArrayFieldValueGetter(parser, intervalStr); - return new MockField<>(objectClass, fieldName, fieldValueGetter, fieldClass); - - } else { - // 如果字段类型为Object类型且存在区间参数,视为List类型处理 - if (fieldClass.equals(Object.class) && intervalStr != null) { - // 解析这个对象, 并作为Map对象 - MockMapBean mockBean = Mock.setResult("", fieldMap, true); - - FieldValueGetter fieldValueGetter = objectToListFieldValueGetter(mockBean, intervalStr); - return new MockField<>(objectClass, fieldName, fieldValueGetter, fieldClass); - } - - //得到一个假对象数据,封装为一个MockField -// MockBean parser = parser(fieldClass, fieldMap); - //同时保存此对象的解析 - if (objectClass == null) { - //如果为null,说明此为map类型对象的解析,则此处同样使用map类型的解析, result的名称使用"" - MockMapBean parser = Mock.setResult("", fieldMap, true); - return objectToField(null, fieldName, parser); - } else { - MockBean parser = Mock.setResult(fieldClass, fieldMap, true); - return objectToField(objectClass, fieldName, parser); - } - } - } - } - - - /** - * 数组类型参数解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField arrayTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //准备字段解析器 - FieldParser fieldParser; - //当参数为一个数组的时候,使用数组解析器 - fieldParser = new ArraysParser(objectClass, fieldName, intervalStr, (Object[]) value); - - //获取假字段封装类 - return fieldParser.getMockField(); - } - - /** - * 集合类型参数解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField listTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //准备字段解析器 - FieldParser fieldParser; - //如果参数是list集合类型的,使用list参数解析器 - fieldParser = new ListParser(objectClass, fieldName, intervalStr, 
(List) value); - - //获取假字段封装类 - return fieldParser.getMockField(); - } - - /** - * 集合类型参数解析 - * - * @param objectClass - * @param fieldName - * @param intervalStr - * @param value - * @return - */ - private static MockField classTypeParse(Class objectClass, String fieldName, String intervalStr, Object value) { - //准备字段解析器 - FieldParser fieldParser; - //如果参数是list集合类型的,使用list参数解析器 - fieldParser = new MockObjectParser(objectClass, fieldName, intervalStr, () -> Mock.get((Class) value)); - //获取假字段封装类 - return fieldParser.getMockField(); - } - - - /** - * 获取一个默认值假字段 - * - * @param fieldName 字段名 - * @param value 默认值 - * @param integerInterval 区间数组,如果存在的话 - * 如果存在区间函数,则使用ListFieldValueGetter进行构建 - * @return - */ - private static MockField getDefaultObjectMockField(Class objectClass, String fieldName, Object value, Integer[] integerInterval, Class fieldClass) { - if (integerInterval == null) { - return new MockField<>(objectClass, fieldName, () -> value, fieldClass); - } else { - return new MockField<>(objectClass, fieldName, new ListFieldValueGetter( - new Invoker[]{() -> value}, - integerInterval - ), fieldClass); - } - } - - /** - * 将一个假类对象封装为一个假字段对象 - * - * @param object - * @return - */ - private static MockField objectToField(Class objectClass, String fieldName, MockBean object) { - //使用lambda表达式,创建一个MOckField对象并返回 - return new MockField<>(objectClass, fieldName, object::getObject, object.getObjectClass()); - } - - - /** - * 将假字段对象转化为集合字段值获取器 - * @param object mockBean 对象 - * @param intervalStr 区间参数 - * @return - */ - private static FieldValueGetter objectToListFieldValueGetter(MockBean object, String intervalStr) { - //创建一个方法执行者 - Invoker invoker = object::getObject; - //获取区间参数 - Integer[] integers = intervalParse(intervalStr); - //获取集合字段值获取器 - return new ListFieldValueGetter(new Invoker[]{invoker}, integers); - } - - /** - * 将假字段对象转化为数组字段值获取器 - * - * @param object - * @param intervalStr - * @return - */ - private static FieldValueGetter objectToArrayFieldValueGetter(MockBean object, String intervalStr) { - //创建一个方法执行者 - Invoker invoker = object::getObject; - - //获取区间参数 - Integer[] integers = intervalParse(intervalStr); - //获取集合字段值获取器 - return new ArrayFieldValueGetter(new Invoker[]{invoker}, integers); - } - - /** - * 解析区间参数 - * - * @param intervalStr - * @return - */ - private static Integer[] intervalParse(String intervalStr) { - if (intervalStr == null) { - //如果没有区间参数,直接返回[1,1] - return new Integer[]{1, 1}; - } else { - //有区间参数,解析 - //切割,有可能有小数位的区间 - //期望中,切割后长度最多为2 - String[] split = intervalStr.split("\\."); - //整数位的区间参数 - String integerInterval = split[0].trim(); - if (integerInterval.length() > 0) { - //如果不是空的,切割 - String[] splitIntInterval = integerInterval.split("-"); - int intervalMin = Integer.parseInt(splitIntInterval[0]); - int intervalMax = splitIntInterval.length > 1 ? 
Integer.parseInt(splitIntInterval[1]) : 1; - return new Integer[]{intervalMin, intervalMax}; - } else { - //如果为空,返回[1,1] - return new Integer[]{1, 1}; - } - } - } - - - /** - * 获取一个MockBean - * - * @param - * @return - */ - private static MockBean getMockBean(Class objectObject, MockField[] fields) { - //返回封装结果 - return MockBeanFactory.createMockBean(objectObject, fields); - } - - /** - * 获取一个MockBean - */ - private static MockBean getMockBean(Class objectObject, List fields) { - //返回封装结果 - return getMockBean(objectObject, fields.toArray(new MockField[0])); - } - - /** - * 获取一个MockMapBean - * - * @param fields - * @return - */ - private static MockMapBean getMockMapBean(MockField[] fields) { - return MockBeanFactory.createMockMapBean(fields); - } - - - /** - * 获取一个MockMapBean - * - * @param fields - * @return - */ - private static MockMapBean getMockMapBean(List fields) { - return getMockMapBean(fields.toArray(new MockField[0])); - } - - /** - * 获取一个map类型封装对象 - */ - private static MockBean getMockMap(MockField[] fields) { - return new MockMapBean(fields); - } - - /** - * 获取一个map类型封装对象 - */ - private static MockBean getMockMap(List fields) { - return getMockMap(fields.toArray(new MockField[0])); - } - - - /** - * 判断这个类型在预期类型中是哪一个类型的 - * - * @return - */ - private static int typeReferee(Object object) { - //String 类型,属于指令 - if (object instanceof String) { - return TYPE_STRING; - } - //Integer 类型,属于整数 - if (object instanceof Integer) { - return TYPE_INTEGER; - } - //Double 类型,属于浮点数 - if (object instanceof Double) { - return TYPE_DOUBLE; - } - //Map类型,属于一个集合类或者对象类 - if (FieldUtils.isChild(object, Map.class)) { - return TYPE_MAP; - } - //List类型,可考虑将其转化为Array类型,减少工作量 - if (FieldUtils.isChild(object, List.class)) { - return TYPE_LIST; - } - //数组类型,属于数组类 - if (object.getClass().isArray()) { - return TYPE_ARRAY; - } - //Class类型,属于直接解析类型,除非解析不到 - if(object.getClass().equals(Class.class)){ - return TYPE_CLASS; - } - //其他情况,为一个未知的Object类型 - return TYPE_OBJECT; - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ChineseUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ChineseUtil.java deleted file mode 100644 index 2205819..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ChineseUtil.java +++ /dev/null @@ -1,214 +0,0 @@ -package com.forte.util.utils; - - -import java.nio.charset.Charset; -import java.util.concurrent.ThreadLocalRandom; - -/** - * 获取一个随机中文姓名 代码灵感来源于网络 讲道理,效果不是特别好 而且关于字符编码的转换也不确定处理的好 - * - * @author ForteScarlet - */ -public class ChineseUtil { - -// // 初始化百家姓数据 -// static { -// byte[][] end; -// try { -// // 获取字节文件输入流 -// InputStream inStream = ChineseUtil.class.getClassLoader().getResourceAsStream("mock/surnames"); -// BufferedReader reader = new BufferedReader(new InputStreamReader(inStream)); -// // 转化为byte数组 -// String[] lines = reader.lines().filter(l -> l.trim().length() > 0).toArray(String[]::new); -// byte[][] surnameBytes = new byte[lines.length][]; -// for (int index = 0; index < lines.length; index++) { -// String l = lines[index]; -// String[] split = l.replace(";", "").split(","); -// byte[] bytes = new byte[split.length]; -// for (int i = 0; i < split.length; i++) { -// bytes[i] = Byte.parseByte(split[i]); -// } -// surnameBytes[index] = bytes; -// String s = new String(bytes, StandardCharsets.UTF_8); -// } -// end = surnameBytes; -// } catch (Exception ignored) { -// end = null; -// } -// -// SURNAME_BYTES = end; -// } -// /** -// * 
百家姓对应的字节 -// */ -// private static final byte[][] SURNAME_BYTES; - - private static final Charset GBK = Charset.forName("GBK"); - -// public static final Charset UTF_8 = StandardCharsets.UTF_8; - - /** - * 百家姓 - */ - private static String[] Surname = {"赵", "钱", "孙", "李", "周", "吴", "郑", "王", "冯", "陈", "褚", "卫", "蒋", "沈", "韩", "杨", - "朱", "秦", "尤", "许", "何", "吕", "施", "张", "孔", "曹", "严", "华", "金", "魏", "陶", "姜", "戚", "谢", "邹", "喻", "柏", - "水", "窦", "章", "云", "苏", "潘", "葛", "奚", "范", "彭", "郎", "鲁", "韦", "昌", "马", "苗", "凤", "花", "方", "俞", "任", - "袁", "柳", "酆", "鲍", "史", "唐", "费", "廉", "岑", "薛", "雷", "贺", "倪", "汤", "滕", "殷", "罗", "毕", "郝", "邬", "安", - "常", "乐", "于", "时", "傅", "皮", "卞", "齐", "康", "伍", "余", "元", "卜", "顾", "孟", "平", "黄", "和", "穆", "萧", "尹", - "姚", "邵", "湛", "汪", "祁", "毛", "禹", "狄", "米", "贝", "明", "臧", "计", "伏", "成", "戴", "谈", "宋", "茅", "庞", "熊", - "纪", "舒", "屈", "项", "祝", "董", "梁", "杜", "阮", "蓝", "闵", "席", "季", "麻", "强", "贾", "路", "娄", "危", "江", "童", - "颜", "郭", "梅", "盛", "林", "刁", "钟", "徐", "邱", "骆", "高", "夏", "蔡", "田", "樊", "胡", "凌", "霍", "虞", "万", "支", - "柯", "昝", "管", "卢", "莫", "经", "房", "裘", "缪", "干", "解", "应", "宗", "丁", "宣", "贲", "邓", "郁", "单", "杭", "洪", - "包", "诸", "左", "石", "崔", "吉", "钮", "龚", "程", "嵇", "邢", "滑", "裴", "陆", "荣", "翁", "荀", "羊", "于", "惠", "甄", - "曲", "家", "封", "芮", "羿", "储", "靳", "汲", "邴", "糜", "松", "井", "段", "富", "巫", "乌", "焦", "巴", "弓", "牧", "隗", - "山", "谷", "车", "侯", "宓", "蓬", "全", "郗", "班", "仰", "秋", "仲", "伊", "宫", "宁", "仇", "栾", "暴", "甘", "钭", "厉", - "戎", "祖", "武", "符", "刘", "景", "詹", "束", "龙", "叶", "幸", "司", "韶", "郜", "黎", "蓟", "溥", "印", "宿", "白", "怀", - "蒲", "邰", "从", "鄂", "索", "咸", "籍", "赖", "卓", "蔺", "屠", "蒙", "池", "乔", "阴", "郁", "胥", "能", "苍", "双", "闻", - "莘", "党", "翟", "谭", "贡", "劳", "逄", "姬", "申", "扶", "堵", "冉", "宰", "郦", "雍", "却", "璩", "桑", "桂", "濮", "牛", - "寿", "通", "边", "扈", "燕", "冀", "浦", "尚", "农", "温", "别", "庄", "晏", "柴", "瞿", "阎", "充", "慕", "连", "茹", "习", - "宦", "艾", "鱼", "容", "向", "古", "易", "慎", "戈", "廖", "庾", "终", "暨", "居", "衡", "步", "都", "耿", "满", "弘", "匡", - "国", "文", "寇", "广", "禄", "阙", "东", "欧", "殳", "沃", "利", "蔚", "越", "夔", "隆", "师", "巩", "厍", "聂", "晁", "勾", - "敖", "融", "冷", "訾", "辛", "阚", "那", "简", "饶", "空", "曾", "毋", "沙", "乜", "养", "鞠", "须", "丰", "巢", "关", "蒯", - "相", "查", "后", "荆", "红", "游", "郏", "竺", "权", "逯", "盖", "益", "桓", "公", "仉", "督", "岳", "帅", "缑", "亢", "况", - "郈", "有", "琴", "归", "海", "晋", "楚", "闫", "法", "汝", "鄢", "涂", "钦", "商", "牟", "佘", "佴", "伯", "赏", "墨", "哈", - "谯", "篁", "年", "爱", "阳", "佟", "言", "福", "南", "火", "铁", "迟", "漆", "官", "冼", "真", "展", "繁", "檀", "祭", "密", - "敬", "揭", "舜", "楼", "疏", "冒", "浑", "挚", "胶", "随", "高", "皋", "原", "种", "练", "弥", "仓", "眭", "蹇", "覃", "阿", - "门", "恽", "来", "綦", "召", "仪", "风", "介", "巨", "木", "京", "狐", "郇", "虎", "枚", "抗", "达", "杞", "苌", "折", "麦", - "庆", "过", "竹", "端", "鲜", "皇", "亓", "老", "是", "秘", "畅", "邝", "还", "宾", "闾", "辜", "纵", "侴", "万俟", "司马", "上官", - "欧阳", "夏侯", "诸葛", "闻人", "东方", "赫连", "皇甫", "羊舌", "尉迟", "公羊", "澹台", "公冶", "宗正", "濮阳", "淳于", "单于", "太叔", "申屠", - "公孙", "仲孙", "轩辕", "令狐", "钟离", "宇文", "长孙", "慕容", "鲜于", "闾丘", "司徒", "司空", "兀官", "司寇", "南门", "呼延", "子车", "颛孙", - "端木", "巫马", "公西", "漆雕", "车正", "壤驷", "公良", "拓跋", "夹谷", "宰父", "谷梁", "段干", "百里", "东郭", "微生", "梁丘", "左丘", "东门", - "西门", "南宫", "第五", "公仪", "公乘", "太史", "仲长", "叔孙", "屈突", "尔朱", "东乡", "相里", "胡母", "司城", "张廖", "雍门", "毋丘", "贺兰", - "綦毋", "屋庐", "独孤", "南郭", "北宫", "王孙"}; - - - - /** - * 获取一个随机姓名 - * @see #getName() - */ - @Deprecated - public static String getName(String charsetName) { - return getName(Charset.forName(charsetName)); - } - 
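    /*
     * Editor's note — illustrative usage sketch only, not part of the original file:
     *
     *     String fullName = ChineseUtil.getName();         // random surname plus one or two given-name characters
     *     String[] five   = ChineseUtil.getFamilyName(5);  // five random surnames from the Surname table below
     *     String oneChar  = ChineseUtil.getChinese();      // one random character built from a random GBK 区位码 pair
     *
     * The charset-parameter overloads surrounding this note are deprecated; the
     * encoding argument is ignored and they simply delegate to the no-arg versions.
     */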
/** - * 获取一个随机姓名 - * @see #getName() - */ - @Deprecated - public static String getName(Charset charset) { - return getName(); - } - - /** - * 获得多个随机姓氏 - */ - public static String[] getFamilyName(int nums) { - String[] names = new String[nums]; - - int index; - for (int i = 0; i < nums; i++) { - // 获得一个随机的姓氏 - index = RandomUtil.getRandom().nextInt(Surname.length); - names[i] = Surname[index]; - } - - return names; - } - - /** - * 获得一个随机姓氏 - */ - public static String getFamilyName() { - return Surname[RandomUtil.getRandom().nextInt(Surname.length)]; - } - - /** - * 获取一个随机姓名 - */ - public static String getName() { - ThreadLocalRandom random = RandomUtil.getRandom(); - - // 获得一个随机的姓氏 - boolean two = random.nextBoolean(); - StringBuilder nameBuilder = new StringBuilder(1 + (two ? 2 : 1)).append(getFamilyName()); - /* 从常用字中选取一个或两个字作为名 */ - if (two) { - nameBuilder.append(getChinese()).append(getChinese()); - } else { - nameBuilder.append(getChinese()); - } - return nameBuilder.toString(); - } - - - /** - * 获取一个随机汉字 - * 指定编码已失效,理论上应该不需要指定编码 - * @see #getName() - */ - @Deprecated - public static String getChinese(String encoding) { - return getChinese(Charset.forName(encoding)); - } - - /** - * 获取一个汉字 - * 理论上应该不需要指定编码 - * @see #getName() - * - */ - @Deprecated - public static String getChinese(Charset charset) { - return getChinese(); - } - - /** - * 获取一个随机汉字 - */ - public static String getChinese() { - ThreadLocalRandom random = RandomUtil.getRandom(); - byte[] bArr = new byte[2]; - //区码,0xA0打头,从第16区开始,即0xB0=11*16=176,16~55一级汉字,56~87二级汉字 - // 176 ~ random(0~39) - bArr[0] = (byte) random.nextInt(176, 176 + 39); - //位码,0xA0打头,范围第1~94列 - // 161 ~ random(0~92) - bArr[1] = (byte) random.nextInt(161, 161 + 93); - // 区位码组合成汉字 - return new String(bArr, GBK); - } - - /** - * 获取一个随机汉字 - */ - public static String getChinese(int num) { - StringBuilder sb = new StringBuilder(num); - for (int i = 0; i < num; i++) { - sb.append(getChinese()); - } - return sb.toString(); - } - - /** - * 获取一个随机汉字 - * @see #getChinese(int) - */ - @Deprecated - public static String getChinese(int num, String encoding) { - return getChinese(num); - } - /** - * 获取一个随机汉字 - * @see #getChinese(int) - */ - @Deprecated - public static String getChinese(int num, Charset charset) { - return getChinese(num); - } - - /** - * 构造私有化 构造方法 - */ - private ChineseUtil() { } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ClassScanner.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ClassScanner.java deleted file mode 100644 index 7035e5c..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ClassScanner.java +++ /dev/null @@ -1,187 +0,0 @@ -package com.forte.util.utils; - - -import java.io.File; -import java.io.IOException; -import java.net.JarURLConnection; -import java.net.URI; -import java.net.URISyntaxException; -import java.net.URL; -import java.util.*; -import java.util.function.Predicate; -import java.util.jar.JarEntry; -import java.util.jar.JarFile; - - -/** - * 扫描包下路径 - * 包括本地文件和jar包文件 - * - * @author ljb - */ -public class ClassScanner { - - /** - * 储存结果的Set集合 - */ - private Set> eleStrategySet = new HashSet<>(); - - /** - * 默认使用的类加载器 - */ - private ClassLoader classLoader; - - /** - * 构造,使用当前类所在的加载器 - */ - public ClassScanner() { - //默认使用的类加载器 - this.classLoader = ClassScanner.class.getClassLoader(); - } - - /** - * 构造,提供一个类加载器 - * - * @param classLoader 类加载器 - */ - public ClassScanner(ClassLoader classLoader) { 
- Objects.requireNonNull(classLoader); - this.classLoader = classLoader; - } - - /** - * 根据过滤规则查询 - * - * @param classFilter class过滤规则 - * @throws ClassNotFoundException - */ - public ClassScanner find(String packageName, Predicate> classFilter) throws ClassNotFoundException, IOException, URISyntaxException { - eleStrategySet.addAll(addClass(packageName, classFilter)); - return this; - } - - /** - * 查询全部 - * @throws ClassNotFoundException - * @throws IOException - * @throws URISyntaxException - */ - public ClassScanner find(String packageName) throws ClassNotFoundException, IOException, URISyntaxException { - eleStrategySet.addAll(addClass(packageName, c -> true)); - return this; - } - - /** - * 获取包下所有实现了superStrategy的类并加入list - * - * @param classFilter class过滤器 - */ - private Set> addClass(String packageName, Predicate> classFilter) throws ClassNotFoundException, URISyntaxException, IOException { - final String path = packageName.replace(".", "/"); - URL url = classLoader.getResource(path); - //如果路径为null,抛出异常 - if (url == null) { - throw new RuntimeException("package url not exists: " + packageName); - } - - //路径字符串 - String protocol = url.getProtocol(); - //如果是文件类型,使用文件扫描 - if ("file".equals(protocol)) { - // 本地自己可见的代码 - return findClassLocal(packageName, classFilter); - //如果是jar包类型,使用jar包扫描 - } else if ("jar".equals(protocol)) { - // 引用jar包的代码 - return findClassJar(packageName, classFilter); - } - return Collections.emptySet(); - } - - /** - * 本地查找 - */ - private Set> findClassLocal(final String packName, final Predicate> classFilter) throws URISyntaxException { - Set> set = new HashSet<>(); - URI url = classLoader.getResource(packName.replace(".", "/")).toURI(); - - File file = new File(url); - final File[] files = file.listFiles(); - if (files != null) { - for (File chiFile : files) { - if (chiFile.isDirectory()) { - //如果是文件夹,递归扫描 - set.addAll(findClassLocal(packName + "." + chiFile.getName(), classFilter)); - } - if (chiFile.getName().endsWith(".class")) { - Class clazz = null; - try { - clazz = classLoader.loadClass(packName + "." 
+ chiFile.getName().replace(".class", "")); - } catch (ClassNotFoundException e) { - throw new RuntimeException(e); - } - if (clazz != null && classFilter.test(clazz)) { - set.add(clazz); - } - } - } - } - - return set; - } - - /** - * jar包查找 - */ - private Set> findClassJar(final String packName, final Predicate> classFilter) throws ClassNotFoundException, IOException { - Set> set = new HashSet<>(); - String pathName = packName.replace(".", "/"); - JarFile jarFile; - URL url = classLoader.getResource(pathName); - JarURLConnection jarURLConnection = (JarURLConnection) url.openConnection(); - jarFile = jarURLConnection.getJarFile(); - - Enumeration jarEntries = jarFile.entries(); - while (jarEntries.hasMoreElements()) { - JarEntry jarEntry = jarEntries.nextElement(); - String jarEntryName = jarEntry.getName(); - - if (jarEntryName.contains(pathName) && !jarEntryName.equals(pathName + "/")) { - //递归遍历子目录 - if (jarEntry.isDirectory()) { - String clazzName = jarEntry.getName().replace("/", "."); - int endIndex = clazzName.lastIndexOf("."); - String prefix = null; - if (endIndex > 0) { - prefix = clazzName.substring(0, endIndex); - } - set.addAll(findClassJar(prefix, classFilter)); - } - if (jarEntry.getName().endsWith(".class")) { - Class clazz = null; - clazz = classLoader.loadClass(jarEntry.getName().replace("/", ".").replace(".class", "")); - //判断,如果符合,添加 - if (clazz != null && classFilter.test(clazz)) { - set.add(clazz); - } - } - } - } - return set; - } - - /** - * 获取当前扫描到的结果 - */ - public Set> get() { - return new HashSet<>(this.eleStrategySet); - } - - /** - * 清空当前扫描结果集 - */ - public void clear(){ - this.eleStrategySet.clear(); - } - -} \ No newline at end of file diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/CollectorUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/CollectorUtil.java deleted file mode 100644 index 5bb57f8..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/CollectorUtil.java +++ /dev/null @@ -1,54 +0,0 @@ -package com.forte.util.utils; - -import java.util.function.BiConsumer; -import java.util.function.Function; -import java.util.function.Supplier; -import java.util.stream.Collector; - -/** - * - * 针对{@link java.util.stream.Collector}的操作工具类 - * - * @author ForteScarlet - * @date 2020/8/1 - */ -public class CollectorUtil { - - /** - * 在非stream环境下使用{@link Collector} - * @param num 获取数量 - * @param getter 单值获取器 - * @param collector 收集器 - */ - public static R collector(int num, Supplier getter, Collector collector){ - // 获取容器 - A container = collector.supplier().get(); - // 获取累加器 - BiConsumer accumulator = collector.accumulator(); - for (int i = 0; i < num; i++) { - accumulator.accept(container, getter.get()); - } - // 获取结果 - return collector.finisher().apply(container); - } - - /** - * 在非stream环境下使用{@link Collector} - * @param num 获取数量 - * @param getter 单值获取器 - * @param mapper 转化器 - * @param collector 收集器 - */ - public static N collector(int num, Supplier getter, Function mapper, Collector collector){ - // 获取容器 - A container = collector.supplier().get(); - // 获取累加器 - BiConsumer accumulator = collector.accumulator(); - for (int i = 0; i < num; i++) { - accumulator.accept(container, mapper.apply(getter.get())); - } - // 获取结果 - return collector.finisher().apply(container); - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/FieldUtils.java 
b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/FieldUtils.java deleted file mode 100644 index fbc1ffd..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/FieldUtils.java +++ /dev/null @@ -1,2073 +0,0 @@ -package com.forte.util.utils; - - - - -import java.lang.reflect.*; -import java.util.*; -import java.util.stream.Collectors; - -/** - *

- * 字段操作工具,提供丰富的方法,以反射的方式从对象中获取值或赋值。
- * 其中:
- * - objectGetter方法可以允许使用多级字段,例如"user.child.name"
- * - getExcelNum方法可以获取Excel中列的数字坐标,例如:"AA" => 27
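The "AA" => 27 example above is plain base-26 arithmetic with digits 1..26. A minimal standalone sketch of that documented conversion is shown below (the class and method names are illustrative only, not part of the deleted FieldUtils). Note that the deleted implementation seeds its accumulator with 1, which would appear to yield 28 for "AA", so this sketch follows the documented example instead:

```
// Illustrative sketch, not the original FieldUtils.getExcelNum:
// converts an Excel-style column label to its 1-based index,
// e.g. "A" -> 1, "Z" -> 26, "AA" -> 27.
public class ExcelColumnDemo {
    static long toColumnNumber(String col) {
        long result = 0;
        for (char c : col.toUpperCase().toCharArray()) {
            result = result * 26 + (c - 'A' + 1); // base-26 with digits 1..26
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(toColumnNumber("AA")); // 27
    }
}
```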
- *

- *
- *

- * 实现了缓存优化,使得效率大幅上升
- * 由于希望将其作为一个简单的工具类使用,因此全部使用内部类实现
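To make the multi-level getter and its cache concrete, here is a small self-contained sketch of the same idea under simplified assumptions (the names below are illustrative, not the original FieldUtils API): a dotted path such as "child.name" is resolved by walking public getters, and the resolved Method objects are cached so repeated lookups avoid the reflective scan that the benchmark results following this note measure.

```
// Illustrative sketch only (not the original FieldUtils): resolves a
// multi-level path such as "child.name" via public getters and caches
// the resolved Method objects keyed by "declaring class + field name".
import java.lang.reflect.Method;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class NestedGetterDemo {
    private static final Map<String, Method> GETTER_CACHE = new ConcurrentHashMap<>();

    static Object getPath(Object root, String path) throws Exception {
        Object current = root;
        for (String field : path.split("\\.")) {
            final Class<?> type = current.getClass();
            String key = type.getName() + "#" + field;
            Method getter = GETTER_CACHE.computeIfAbsent(key, k -> {
                try {
                    String name = "get" + Character.toUpperCase(field.charAt(0)) + field.substring(1);
                    return type.getMethod(name);
                } catch (NoSuchMethodException e) {
                    throw new IllegalArgumentException(e);
                }
            });
            current = getter.invoke(current); // step one level deeper
        }
        return current;
    }

    // Tiny fixture types, only for the demo.
    public static class Child { public String getName() { return "demo"; } }
    public static class User  { public Child getChild() { return new Child(); } }

    public static void main(String[] args) throws Exception {
        System.out.println(getPath(new User(), "child.name")); // demo
    }
}
```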

- *

- * 2018-12-15 多层级缓存优化-getter测试结果:
- *   次数:500w,深度:100
- *   没有缓存:总用时 1635s(1635950ms),平均用时 0.33ms
- *   有缓存:总用时 18s(18216ms),平均用时 0.00ms
- *
- * 2018-12-15 至 2018-12-16 多层级缓存优化-getter测试结果:
- *   次数:5亿,深度:200
- *   没有缓存:至终止程序为止用时 11 小时 50 分 13 秒,控制台可见进度约 6.89%,未执行完成,中途终止
- *   有缓存:总用时 3433s(3433008ms)
- *
- *
- *

- * - * @author ForteScarlet - */ -public class FieldUtils { - - - //静态代码块加载字母顺序 - static { - - Map wordNum = new HashMap<>(); - - for (int i = 1; i <= 26; i++) { - char c = (char) (97 + (i - 1)); - wordNum.put(c + "", i); - } - - //获取字母顺序表 - //保存 - WORD_NUMBER = wordNum; - - } - - /** - * 字母顺序表 - */ - private static final Map WORD_NUMBER; - - /** - * 单层字段缓存记录,使用线程安全map - */ - private static final Map> SINGLE_FIELD_CACHE_MAP = Collections.synchronizedMap(new HashMap<>()); - - /** - * 多层字段缓存记录,使用线程安全map - */ - private static final Map> LEVEL_FIELD_CACHE_MAP = Collections.synchronizedMap(new HashMap<>()); - - - /** - * 获取Excel中列的数字坐标
- * 例如:"AA" => 27 - * - * @param colStr Excel中的列坐标 - * @return 此列坐标对应的数字 - * @author ForteScarlet - */ - public static long getExcelNum(String colStr) { - //获取数组 - char[] array = colStr.toCharArray(); - //长度 - int length = array.length; - //初始数 - long end = 1; - - //倒序遍历,从小位开始遍历 - for (int i = array.length - 1; i >= 0; i--) { - //字母序号 - int num = WORD_NUMBER.get((array[i] + "").toLowerCase()); - //加成 - int addBuffer = length - i - 1; - //结果加成 - end += num * (int) (Math.pow(26, addBuffer)); - } - - //返回结果 - return end; - } - - /** - * 获取对象的所有字段(包括没有get方法的字段) - * - * @param tClass 对象 - * @return - * @author ForteScarlet - */ - public static List getFieldsWithoutGetter(Class tClass) { - Field[] fs = tClass.getDeclaredFields(); - return Arrays.asList(fs); - } - - - /** - * 获取对象的所有字段(包括没有get方法的字段) - * @param t - * @param - * @return - */ - public static List getFieldsWithoutGetter(T t) { - return getFieldsWithoutGetter(t.getClass()); - } - - - /** - * 获取对象的所有字段名(包括没有get方法的字段) - * @param t 对象 - * @return - * @author ForteScarlet - */ - public static List getFieldsNameWithoutGetter(T t) { - return getFieldsWithoutGetter(t).stream().map(Field::getName).collect(Collectors.toList()); - } - - /** - * 获取对象的所有字段 - * @param tClass 对象 - * @return - * @author ForteScarlet - */ - public static List getFieldsWithGetter(Class tClass) { - Field[] fs = tClass.getDeclaredFields(); - //返回时过滤掉没有get方法的字段 - return Arrays.stream(fs).filter(f -> Arrays.stream(tClass.getMethods()).anyMatch(m -> m.getName().equals("get" + headUpper(f.getName())))).collect(Collectors.toList()); - } - - /** - * 获取对象的所有字段 - * - * @param t - * @param - * @return - */ - public static List getFieldsWithGetter(T t) { - return getFieldsWithGetter(t.getClass()); - } - - /** - * 获取对象的所有字段名 - * - * @param t 对象 - * @return - * @author ForteScarlet - */ - public static List getFieldsNameWithGetter(T t) { - return getFieldsWithGetter(t).stream().map(Field::getName).collect(Collectors.toList()); - } - - - /** - * 获取类指定的字段对象,支持多层级获取 - * @param c - * @param fieldName - * @return - */ - public static Field fieldGetter(Class c , String fieldName){ - //先进行缓存判断,如果缓存中有记录,直接返回 - CacheField cacheField = getCacheField(c, fieldName); - if(cacheField != null){ - return cacheField.getField(); - } - - //可以获取多级字段的字段对象 - //使用'.'切割字段名 - String[] split = fieldName.split("\\."); - if(split.length == 1){ - //如果长度为1,则说明只有一级字段,直接获取字段对象 - return getField(c , fieldName); - }else{ - //如果不是1,则说明不止1层字段值 - //当前字段名 - String thisFieldName = split[0]; - //剩下的字段名称拼接 - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - String otherFieldName = String.join(".", list); - //递归获取 - //多层级也计入缓存 - Field field = fieldGetter(getFieldClass(c, thisFieldName), otherFieldName); - //计入缓存 - saveSingleCacheField(c , field , null , null); - return field; - } - } - - /** - * 获取字段的class类型,支持多层级获取 - * @param c - * @param fieldName - * @return - */ - public static Class fieldClassGetter(Class c , String fieldName){ - return fieldGetter(c , fieldName).getType(); - } - - - /** - * 获取字段的getter方法,单层级 - * @param whereIn - * @param fieldName - * @return - */ - public static Method getFieldGetter(Class whereIn , String fieldName){ - //先查询缓存 - Method cacheGetter = getCacheFieldGetter(whereIn, fieldName); - if(cacheGetter != null){ - return cacheGetter; - } - - //获取getter方法 - try { - Method getter = whereIn.getMethod("get" + headUpper(fieldName)); - //计入缓存 - saveSingleCacheFieldGetter(whereIn , fieldName , getter); - //返回结果 - return getter; - } catch 
(NoSuchMethodException e) { - //如果出现异常,返回null - return null; - } - } - - /** - * 获取字段的getter方法 - * @param whereIn - * 字段所在类 - * @param field - * 字段 - * @return - */ - public static Method getFieldGetter(Class whereIn , Field field){ - //获取getter方法 - return getFieldGetter(whereIn , field.getName()); - } - - /** - * 获取字段的getter方法 - * @param obj - * @param field - * @return - */ - public static Method getFieldGetter(Object obj, Field field){ - return getFieldGetter(obj.getClass() , field); - } - - - /** - * 获取字段的getter方法 - * @param obj - * @param fieldName - * @return - */ - public static Method getFieldGetter(Object obj, String fieldName){ - return getFieldGetter(obj.getClass() , fieldName); - } - - - /** - * 获取字段的setter方法 - * @param whereIn - * @param fieldName - * @return - */ - public static Method getFieldSetter(Class whereIn , String fieldName){ - //先查询缓存中是否存在 - Method cacheSetter = getCacheFieldSetter(whereIn, fieldName); - if(cacheSetter != null){ - return cacheSetter; - } - - - try { - //获取这个字段的Field对象 - Field field = getField(whereIn, fieldName); - // whereIn.getDeclaredField(fieldName); - Method setter = whereIn.getMethod("set" + headUpper(fieldName), field.getType()); - //计入缓存 - saveSingleCacheFieldSetter(whereIn , fieldName , setter); - //返回 - return setter; - } catch (NoSuchMethodException e) { -// e.printStackTrace(); - //如果出现异常,返回null - return null; - } - } - - /** - * 获取字段的setter方法 - * @param whereIn - * @param field - * @return - */ - public static Method getFieldSetter(Class whereIn , Field field){ - //先查询缓存 - Method cacheFieldSetter = getCacheFieldSetter(whereIn, field.getName()); - if(cacheFieldSetter != null){ - return cacheFieldSetter; - } - - try { - // a public method - Method setter = whereIn.getMethod("set" + headUpper(field.getName()), field.getType()); - //计入缓存 - saveSingleCacheFieldSetter(whereIn , field , setter); - //返回结果 - return setter; - } catch (NoSuchMethodException e) { - //如果出现异常,返回null - return null; - } - } - - /** - * 获取字段的setter方法 - * @param obj - * @param fieldName - * @return - */ - public static Method getFieldSetter(Object obj , String fieldName){ - return getFieldSetter(obj.getClass() , fieldName); - } - - /** - * 获取字段的setter方法 - * @param obj - * @param field - * @return - */ - public static Method getFieldSetter(Object obj , Field field){ - return getFieldSetter(obj.getClass() , field); - } - - - /** - * 获取字段的getter方法,支持多层级获取 - * @param objClass - * @param fieldName - * @return - * @throws NoSuchMethodException - */ - public static Method fieldGetterGetter(Class objClass, String fieldName) throws NoSuchMethodException { - //判断是否有用“.”分割 - String[] split = fieldName.split("\\."); - //如果分割后只有一个字段值,说明是单层 - if (split.length == 1) { - //获取其get方法,返回执行结果 - Method getter = getFieldGetter(objClass , fieldName); - if(getter == null){ - //抛出没有此方法异常 - throw new NoSuchMethodException("没有找到类["+ objClass +"]字段["+ fieldName +"]的getter方法"); - } - //返回获取结果 - return getter; - }else{ - //否则为多层级,深入获取 - //获取第一个字段名,拼接其余字段名并进行递归处理 - String field = split[0]; - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - String innerFieldName = String.join(".", list); - - //递归 - return fieldGetterGetter(fieldGetterGetter(objClass, field).getReturnType(), innerFieldName); - } - } - - /** - * 通过对象的getter获取字段数值 - * 支持类似“user.child”这种多层级的获取方式 - * 获取的字段必须有其对应的公共get方法 - * @param t - * @param fieldName - * @return - * @throws IllegalAccessException - * @throws NoSuchMethodException - * @throws InvocationTargetException - */ - public 
static Object objectGetter(Object t, String fieldName) throws IllegalAccessException, NoSuchMethodException, InvocationTargetException { - return objectGetter(t , t , fieldName , fieldName , 1); - } - - /** - * 通过对象的getter获取字段数值 - * 支持类似“user.child”这种多层级的获取方式 - * 获取的字段必须有其对应的公共get方法 - * @param t - * 被获取的对象 - * @param fieldName - * 字段名 - * @return - * @throws SecurityException - * @throws InvocationTargetException - * @throws IllegalArgumentException - * @throws IllegalAccessException - * @author ForteScarlet - */ - private static Object objectGetter(Object t, Object root, String fieldName, String realFieldName, int level) throws SecurityException, IllegalAccessException, IllegalArgumentException, InvocationTargetException, NoSuchMethodException { - - //先查询缓存 - CacheField cacheField = getCacheField(t.getClass(), fieldName); - if(cacheField != null){ - //如果有缓存,获取执行结果 - InvokeResult invokeResult = cacheField.fieldValue(t); - boolean success = invokeResult.isSuccess(); - if(success){ - return invokeResult.getInvoke(); - } - } - - //判断是否有用“.”分割 - String[] split = fieldName.split("\\."); - //如果分割后只有一个字段值,直接返回 - if (split.length == 1) { - - //获取其get方法,返回执行结果 - Method getter = getFieldGetter(t , fieldName); - if(getter == null){ - //抛出没有此方法异常 - throw new NoSuchMethodException("没有找到类["+ t.getClass() +"]字段["+ fieldName +"]的getter方法"); - } - //计入单缓存-当前 - SingleCacheField singleCacheField = saveSingleCacheFieldGetter(t.getClass(), fieldName, getter); - - //如果层数等级与真实字段相同且等级不为1,说明这是多层级字段的最终字段,保存 - if(level != 1 && level == realFieldName.split("\\.").length){ - saveLevelCacheField(root.getClass() , realFieldName , singleCacheField); - } - - - //返回执行结果 - return getter.invoke(t); - } else { - //否则为多层级字段,获取第一个字段名,拼接其余字段名并进行递归处理 - String field = split[0]; - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - String innerFieldName = String.join(".", list); - - - /* - 以下的getter获取方法均为当前字段的,即field字段 - */ - - //获取其get方法,返回执行结果 - Method getter = getFieldGetter(t , field); - if(getter == null){ - //抛出没有此方法异常 - throw new NoSuchMethodException("没有找到类["+ t.getClass() +"]字段["+ field +"]的getter方法"); - } - - //计入单缓存-当前-同时将getter方法提前计入缓存 - SingleCacheField singleCacheField = saveSingleCacheFieldGetter(t.getClass(), field, getter); - - - //此字段的实例对象-直接使用与上面相同的方式获取执行结果而不是使用自我调用 - Object innerObject = getter.invoke(t); - - /* - 保存此多层级字段的getter并保存至多层级缓存 - 假如目标是 bean.bean.bean.a - 则当前第一轮为 - realField: bean.bean.bean.a - field: bean - innerField: bean.bean.a - 则计入多层缓存的应当是bean - - 第二轮为 - realField: bean.bean.bean.a - field: bean - innerField: bean.a - 则计入多层缓存的应当是bean.bean - - 可见应记录层数,并截取与层数相同数量的字段层级并进行记录 - 层数默认开始为1 - */ -// //计入多层级缓存-当前 - //获取字段名-切割真实字段名 - String levelFieldName = Arrays.stream(realFieldName.split("\\.")).limit(level).collect(Collectors.joining(".")); - saveLevelCacheField(root.getClass() , levelFieldName , singleCacheField); - - - - //field必定为单层字段,获取field对应的对象,然后使用此对象进行递归 - return objectGetter(innerObject, root, innerFieldName, realFieldName, level+1); - } - } - - - - /** - * 通过对象的setter为字段赋值 - * 支持类似“user.child”这种多层级的赋值方式 - * 赋值的字段必须有其对应的公共set方法 - * 如果多层级对象中有非底层级字段为null,将会尝试为其创建一个新的实例 - * - * TODO 旧 - * - * @param t 对象 - * @param fieldName 需要赋值的字段 - * @param value 需要赋的值 - * @throws InvocationTargetException - * @throws IllegalAccessException - */ - public static void objectSetter(Object t, String fieldName, Object value) throws Exception { - //判断是否有用“.”分割 - String[] split = fieldName.split("\\."); - //如果分割后只有一个字段值,直接进行赋值 - if (split.length == 1) { - 
//获取其set方法,返回执行结果 - String setterName = "set" + FieldUtils.headUpper(fieldName); - //获取字段的setter方法 - Method setter = getFieldSetter(t , fieldName); - //如果没有setter,展示异常提醒 - // 直接抛出异常 - if(setter == null){ - String error = "没有找到["+ t.getClass() +"]中的字段["+ fieldName +"]的setter["+ setterName +"]方法,无法进行赋值"; - throw new RuntimeException(new NoSuchFieldException(error)); - }else{ - //赋值 - MethodUtil.invoke(t , new Object[]{value} , setter); - } - - } else { - //否则为多层级字段,获取第一个字段名,拼接其余字段名并进行递归处理 - String field = split[0]; - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - fieldName = String.join(".", list); - - //获取下一层的对象 - Object fieldObject = objectGetter(t, field); - if (fieldObject == null) { - //如果为null,创建一个此类型的实例 - fieldObject = getFieldClass(t, field).newInstance(); - //并为此对象赋值 - objectSetter(t, field, fieldObject); - } - //寻找下一层字段 - objectSetter(fieldObject, fieldName, value); - } - } - - /** - * 通过对象的setter为字段赋值 - * 支持类似“user.child”这种多层级的赋值方式 - * 赋值的字段必须有其对应的公共set方法 - * 如果多层级对象中有非底层级字段为null,将会尝试为其创建一个新的实例 - * @param t - * @param fieldName - * @param param - * @throws NoSuchMethodException - * @throws InstantiationException - * @throws IllegalAccessException - * @throws InvocationTargetException - */ - public static void objectSetter2(Object t, String fieldName, Object param) throws Exception { - // TODO 发现bug,此方法会导致缓存无法储存 - objectSetter(t , t , fieldName , fieldName , 1 , param); - } - - /** - * 通过对象的setter为字段赋值 - * 支持类似“user.child”这种多层级的赋值方式 - * 赋值的字段必须有其对应的公共set方法 - * 如果多层级对象中有非底层级字段为null,将会尝试为其创建一个新的实例 - * @param t - * @param root - * @param fieldName - * @param realFieldName - * @param level - * @param param - */ - private static void objectSetter(Object t, Object root, String fieldName, String realFieldName, int level, Object param) throws Exception { - // TODO 实现缓存的setter方法 - // TODO 存在严重bug,此方法会导致缓存无法储存和获取,导致getter效率大幅度下降 - //先查询缓存 - CacheField cacheField = getCacheField(t.getClass(), fieldName); - if(cacheField != null){ - //如果有缓存,获取执行结果 - InvokeResult invokeResult = cacheField.fieldValueSet(t, param); - if(invokeResult.isSuccess()){ - //如果赋值成功,结束方法 - return; - } - } - //判断是否有用“.”分割 - String[] split = fieldName.split("\\."); - //如果分割后只有一个字段值,直接进行赋值 - if(split.length == 1){ - //单层字段,获取setter方法 - Method setter = getFieldSetter(t, fieldName); - if(setter == null){ - //如果没有setter,抛出异常 - throw new NoSuchMethodException("没有找到类["+ t.getClass() +"]字段["+ fieldName +"]的setter方法"); - } - - - //如果有此方法,计入缓存-当前 - SingleCacheField singleCacheField = saveSingleCacheFieldSetter(t.getClass(), fieldName, setter); - //如果层数等级与真实字段相同且等级不为1,说明这是多层级字段的最终字段,保存 - if(level != 1 && level == realFieldName.split("\\.").length){ - saveLevelCacheField(root.getClass() , realFieldName , singleCacheField); - } - - //执行赋值 - MethodUtil.invoke(t, new Object[]{param}, setter); - }else{ - //否则为多层字段,获取第一个字段名 - String field = split[0]; - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - String newFieldName = String.join(".", list); - - - /* - 以下的setter获取方法均为当前字段的,即field字段 - */ - - //获取其get方法和set方法,返回执行结果 - //获取setter方法 - Method setter = getFieldSetter(t , field); - if(setter == null){ - //抛出没有此方法异常 - throw new NoSuchMethodException("没有找到类["+ t.getClass() +"]字段["+ field +"]的setter方法"); - } - - //计入单缓存-当前-同时将setter方法提前计入缓存 - SingleCacheField singleCacheField = saveSingleCacheFieldSetter(t.getClass(), field, setter); - - //获取此字段的实例对象,使用objectGetter方法,可以省去一部计入缓存的步骤 - Object fieldInstance = objectGetter(t, field); - - 
//如果当前字段的值为null,赋值 - if(fieldInstance == null){ - fieldInstance = getFieldClass(t, field).newInstance(); - //为当前字段对象赋一个新的值 - objectSetter(t, field, fieldInstance); - - } - - //将setter计入缓存 - //计入多层级缓存-当前 - //获取字段名-切割真实字段名 - String levelFieldName = Arrays.stream(realFieldName.split("\\.")).limit(level).collect(Collectors.joining(".")); - saveLevelCacheField(root.getClass() , levelFieldName , singleCacheField); - - - //寻找下一层字段 - objectSetter(fieldInstance, root, newFieldName, realFieldName, level+1, param); - - } - - } - - - - /** - * 获取对象指定字段对象 - * - * @param object 对象的class对象 - * @param fieldName 字段名称 - */ - public static Field getField(Object object, String fieldName) { - return getField(object.getClass(), fieldName); - } - - - /** - * 获取类指定字段对象 - * - * @param objectClass 类的class对象 - * @param fieldName 字段名称 - */ - public static Field getField(Class objectClass, String fieldName) { - //反射获取全部字段 - Field[] declaredFields = objectClass.getDeclaredFields(); - //遍历寻找此字段 - Field field = null; - for (Field f : declaredFields) { - //如果找到了,赋值并跳出循环 - if (f.getName().equals(fieldName)) { - field = f; - break; - } - } - - if(field == null){ - Class parent = objectClass.getSuperclass(); - if(parent != null && !parent.equals(Object.class)){ - field = getField(parent, fieldName); - } - } - - return field; - } - - - /** - * 获取指定类字段的类型class对象 - * - * @param objectClass 类class对象 - * @param fieldName 字段名称 - */ - public static Class getFieldClass(Class objectClass, String fieldName) { - return getField(objectClass, fieldName).getType(); - } - - /** - * 获取类指定字段的class对象 - * - * @param object 类实例 - * @param fieldName 字段名称 - * @return - */ - public static Class getFieldClass(Object object, String fieldName) { - return getFieldClass(object.getClass(), fieldName); - } - - - /** - * 通过Class对象判断是否存在此字段 - * - * @param tClass - * @param field - * @return - */ - public static boolean isFieldExist(Class tClass, String field) { - //判断是否为多层级字段 - String[] split = field.split("\\."); - if (split.length == 1) { - //如果只有一个,直接获取 - Field getField = getField(tClass, field); -// try { -// getField = ; -// getField = tClass.getDeclaredField(field); -// } catch (NoSuchFieldException ignored) { -// } - //如果存在,返回true,否则返回false - return getField != null; - } else { - //否则,存在多层级字段,先获取第一个字段并获得其类型,然后通过此类型的Class对象进一步判断 - String firstField = split[0]; - //获取字段对象 - Field getField = null; - try { - getField = tClass.getDeclaredField(firstField); - } catch (NoSuchFieldException e) { - e.printStackTrace(); - } - //如果获取失败,直接返回false - if (getField == null) { - return false; - } - - Class firstFieldType = getField.getType(); - //拼接剩余 - //移除第一个字段 - List list = Arrays.stream(split).collect(Collectors.toList()); - list.remove(0); - //拼接剩余 - String fieldName = list.stream().collect(Collectors.joining(".")); - //获取当前字段的Class对象 - Class fieldClass = getFieldClass(tClass, firstField); - //递归 - return isFieldExist(fieldClass, fieldName); - } - } - - /** - * 通过对象实例判断字段是否存在 - * - * @param obj 实例对象 - * @param field 查询字段 - * @return - * @throws NoSuchFieldException - */ - public static boolean isFieldExist(Object obj, String field) { - return isFieldExist(obj.getClass(), field); - } - - - - - - - /** - * 获取一个list字段的泛型类型
- * 这个字段必须是一个list类型的字段! - * @param listField 字段 - * @return - * @throws ClassNotFoundException - */ - public static Class getListFieldGeneric(Field listField) { - - ParameterizedType listGenericType = (ParameterizedType) listField.getGenericType(); - Type[] listActualTypeArguments = listGenericType.getActualTypeArguments(); - if (listActualTypeArguments.length == 0) { - //如果没有数据 - return null; - } else if (listActualTypeArguments.length == 1) { - //如果只有一种类型 - String typeName = listActualTypeArguments[0].getTypeName(); - //如果此类型存在泛型,移除泛型 - typeName = typeName.replaceAll("<[\\w\\.\\, ]+>" , ""); - try { - return Class.forName(typeName); - } catch (ClassNotFoundException e) { - //将异常转化为运行时 - throw new RuntimeException(e); - } - } else { - //如果多个类型,直接返回Object类型 - return Object.class; - } - } - - /** - * 获取数组的组件类型 - * @param trr - * @param - * @return - */ - public static Class getArrayGeneric(T[] trr){ - //获取组件类型 - return trr.getClass().getComponentType(); - } - - /** - * 获取数组的组件类型 - * @param - * @return - */ - public static Class getArrayGeneric(Class tClass){ - return tClass.getComponentType(); - } - - - /** - * 获取一个list字段的泛型类型
- * 这个字段必须是一个list类型的字段! - * @param c - * @param fieldName - * @return - * @throws ClassNotFoundException - */ - public static Class getListFieldGeneric(Class c , String fieldName) { - return getListFieldGeneric(fieldGetter(c , fieldName)); - } - - /** - * 获取一个list字段的泛型类型
- * 这个字段必须是一个list类型的字段! - * @param obj - * @param fieldName - * @return - * @throws ClassNotFoundException - */ - public static Class getListFieldGeneric(Object obj , String fieldName) { - return getListFieldGeneric(obj.getClass() , fieldName); - } - - - - /** - * 判断一个Class对象是否为另一个对象的实现类 - * - * @param child 进行寻找的子类 - * @param findFather 被寻找的父类 - * @return - */ - public static boolean isChild(Class child, Class findFather) { - //如果自身就是这个类,直接返回true - if (child.equals(findFather)) { - return true; - } - - /* - 两个方向,一个是向父继承类递归,一个是向接口递归 - */ - //子类继承的父类 - Class superClass = child.getSuperclass(); - //子类实现的接口 - Class[] interfaces = child.getInterfaces(); - //如果全部为null,直接返回false - if (superClass == null && interfaces.length == 0) { - return false; - } - //进行判断-先对当前存在的两类型进行判断 - if (superClass != null && superClass.equals(findFather)) { - //如果发现了,返回true - return true; - } - - //遍历接口并判断 - for (Class interClass : interfaces) { - if (interClass.equals(findFather)) { - return true; - } - } - - //如果当前的没有发现,递归查询 - //如果没有发现,递归父类寻找 - if (superClass != null && isChild(superClass, findFather)) { - return true; - } - - //如果父类递归没有找到,进行接口递归查询 - //遍历 - for (Class interClass : interfaces) { - if (isChild(interClass, findFather)) { - return true; - } - } - - //未查询到 - return false; - } - - /** - * 判断一个Class对象是否为另一个对象的实现类 - * - * @param child 进行寻找的子类的实现类 - * @param findFather 被寻找的父类 - * @return - */ - public static boolean isChild(Object child, Class findFather) { - return isChild(child.getClass(), findFather); - } - - - - /** - * 单词开头大写 - * - * @param str - * @return - * @author ForteScarlet - */ - public static String headUpper(String str) { - return Character.toUpperCase(str.charAt(0)) + str.substring(1); - } - - /** - * 获取类名 - * - * @param c - * @return - * @author ForteScarlet - */ - public static String getClassName(Class c) { - String name = c.getName(); - String[] split = name.split("\\."); - return split[split.length - 1]; - } - - /** - * 通过对象获取类名 - * - * @param o - * @return - * @author ForteScarlet - */ - public static String getClassName(Object o) { - return getClassName(o.getClass()); - } - - - /** - * 只要传入的参数中任意一个出现了null则会抛出空指针异常 - * @param all - */ - public static void allNonNull(Object... 
all){ - for(Object o : all){ - if(o == null){ - throw new NullPointerException(); - } - } - } - - - - /* —————————————————————————————————————— 缓存字段接口 ———————————————————————————————————— */ - - /** - * 缓存字段的接口,定义获取一个缓存字段的值的方法 - * @param - * 字段所属的类型 - */ - private static interface CacheField{ - /** - * 获取一个缓存字段的值 - * @param object - *用于获取字段值的实例 - * @return - */ - InvokeResult fieldValue(T object); - - /** - * 为缓存字段的值赋值 - * @param object - * @param param - * @return - */ - InvokeResult fieldValueSet(T object, Object param); - - /** - * 获取Getter方法 - * @return - */ - Method getGetter(); - - /** - * 获取Setter方法 - * @return - */ - Method getSetter(); - - /** - * 获取字段对象 - * @return - */ - Field getField(); - - } - - /* —————————————————————————————————————— 单缓存字段相关方法 *基本全部为本类内部使用* ———————————————————————————————————— */ - - - - /** - * 向缓存中增加一个字段记录并返回这个新的缓存字段 - * @param field - * @param fieldWhereClass - * @param getter - * @param setter - */ - private static SingleCacheField saveSingleCacheField(Class fieldWhereClass , Field field, Method getter , Method setter){ - if(field.getName().split("\\.").length > 1){ - throw new RuntimeException("字段["+ field +"]并非单层字段"); - } - //先查看是否有这个字段 - SingleCacheField singleCacheField = (SingleCacheField)getCacheField(fieldWhereClass, field.getName()); - if(singleCacheField != null){ - //如果已经有这个缓存了,查看getter和setter是否存在 - //准备好Cache,有可能会进行更新操作 - SingleCacheField cache; - Method cGetter = singleCacheField.getGetter(); - Method cSetter = singleCacheField.getSetter(); - //判断是否更新 - boolean isUpdate = false; - - //判断getter - if(cGetter == null && getter != null){ - //getter为空,且传入的getter不为空 - isUpdate = true; - cGetter = getter; - } - - //判断setter - if(cSetter == null && setter != null){ - //setter为空,且传入的setter不为空 - //变更状态,设置getter - isUpdate = true; - cSetter = setter; - } - - //判断是否更新 - if(isUpdate){ - cache = new SingleCacheField(fieldWhereClass, field, cGetter, cSetter); - updateSingleCacheField(cache); - return cache; - } - //如果不需要更新,直接返回获取值 - return singleCacheField; - - }else{ - //不存在,直接返回 - SingleCacheField newSingleCacheField = new SingleCacheField<>(fieldWhereClass, field, getter, setter); - updateSingleCacheField(newSingleCacheField); - return newSingleCacheField; - } - } - - /** - * 向缓存中增加一个字段记录并返回这个新的缓存字段 - * getter、setter均为null - * @param fieldWhereClass - * @param field - * @param - * @return - */ - private static SingleCacheField saveSingleCacheField(Class fieldWhereClass , Field field){ - return saveSingleCacheField(fieldWhereClass , field , null , null); - } - - /** - * 向缓存中增加一个字段记录 - * 不可使用多层级字段获取 - * @param fieldWhereClass - * @param fieldName - * @param getter - * @param setter - */ - private static SingleCacheField saveSingleCacheField(Class fieldWhereClass , String fieldName , Method getter , Method setter){ - return saveSingleCacheField(fieldWhereClass , getField(fieldWhereClass ,fieldName) , getter , setter); - } - - /** - * 向缓存中增加一个字段记录 - * 不可使用多层级字段获取 - * @param fieldWhereClass - * @param fieldName - * @param - * @return - */ - private static SingleCacheField saveSingleCacheField(Class fieldWhereClass , String fieldName){ - return saveSingleCacheField(fieldWhereClass , getField(fieldWhereClass ,fieldName)); - } - - - - - /** - * 储存一个getter方法 - * @param fieldWhereClass - * @param fieldName - * @param getter - * @param - */ - private static SingleCacheField saveSingleCacheFieldGetter(Class fieldWhereClass , String fieldName , Method getter){ - return saveSingleCacheField(fieldWhereClass , fieldName , getter , null); - } - - /** - * 储存一个getter方法 - 
* @param fieldWhereClass - * @param field - * @param getter - * @param - */ - private static SingleCacheField saveSingleCacheFieldGetter(Class fieldWhereClass , Field field , Method getter){ - return saveSingleCacheField(fieldWhereClass , field , getter , null); - } - - /** - * 储存一个setter方法 - * @param fieldWhereClass - * @param fieldName - * @param setter - * @param - */ - private static SingleCacheField saveSingleCacheFieldSetter(Class fieldWhereClass , String fieldName , Method setter){ - return saveSingleCacheField(fieldWhereClass , fieldName , null , setter); - } - - /** - * 储存一个setter方法 - * @param fieldWhereClass - * @param field - * @param setter - * @param - */ - private static SingleCacheField saveSingleCacheFieldSetter(Class fieldWhereClass , Field field , Method setter){ - return saveSingleCacheField(fieldWhereClass , field , null , setter); - } - - /** - * 更新一个或新增加一个字段缓存 - * @param newSingleCacheField - */ - private static void updateSingleCacheField(SingleCacheField newSingleCacheField){ - //字段所在类的Class对象 - Class fieldWhereClassIn = newSingleCacheField.getFieldWhereClassIn(); - //从缓存中获取整个Map集合 - HashMap cacheFields = SINGLE_FIELD_CACHE_MAP.get(fieldWhereClassIn); - if(cacheFields == null){ - //如果没有此类的相关记录 - //获取这个类的字段总数量 - int length = fieldWhereClassIn.getDeclaredFields().length; - HashMap hashMap = new HashMap<>(length); - //保存这个增加字段并添加至缓存 - hashMap.put(newSingleCacheField.getFieldName() , newSingleCacheField); - SINGLE_FIELD_CACHE_MAP.put(fieldWhereClassIn , hashMap); - }else{ - //有此类相关记录,保存或覆盖此字段信息 - cacheFields.put(newSingleCacheField.getFieldName() , newSingleCacheField); - } - } - - - /** - * 获取缓存中的字段的getter方法 - * @param fieldWhereClass - * @param fieldName - * @return - */ - private static Method getCacheFieldGetter(Class fieldWhereClass , String fieldName){ - //如果缓存中存在此字段,返回getter,否则返回null - return Optional.ofNullable(getCacheField(fieldWhereClass , fieldName)).map(CacheField::getGetter).orElse(null); - } - - /** - * 获取缓存中字段的setter方法 - * @param fieldWhereClass - * @param fieldName - * @return - */ - private static Method getCacheFieldSetter(Class fieldWhereClass , String fieldName){ - //如果缓存中存在此字段,返回getter,否则返回null - return Optional.ofNullable(getCacheField(fieldWhereClass , fieldName)).map(CacheField::getSetter).orElse(null); - } - - /* ———————————————————————————————————————— 单缓存字段内部类 ———————————————————————————————————— */ - - /** - * 单层字段缓存对象 - * 内部类,实现字段缓存,优化此工具类的效率 - * 字段的缓存,其中储存所在类的Class对象、字段名称、字段对象、字段类型、getter、setter - * *虽然为公共权限,但仅仅为了使其内部类可被外部访问而设* - */ - private static class SingleCacheField implements CacheField { - // 储存所在类的Class对象、字段名称、字段对象、字段类型、getter、setter - - /** 字段所在Class,不可变更 */ - private final Class FIELD_WHERE_CLASS; - - /** 字段名称 */ - private final String FIELD_NAME; - - /** 字段对象 */ - private final Field FIELD; - - /** 字段类型 */ - private final Class FIELD_TYPE; - - /** getter方法 */ - private final Method GETTER; - - /** setter方法 */ - private final Method SETTER; - - /* —————— 各种api —————— */ - - /** - * 通过实例对象获取字段值,返回一个封装类 - * success为是否成功的执行了 - * invoke代表执行的值 - * 如果执行失败,则invoke的值必然为null - * @param obj - * @return - */ - private InvokeResult objectGetter(Object obj){ - if(GETTER != null){ - try { - //执行getter方法 - Object invoke = GETTER.invoke(obj); - return InvokeResult.success(invoke); - } catch (Exception e) { - return InvokeResult.fail(); - } - }else{ - return InvokeResult.fail(); - } - } - - /** - * 通过一个实例对象设置字段值,返回一个封装类 - * setter没有返回值,则invoke必然为null - * 如果success为false,则说明方法执行出现错误或setter不存在 - * @param obj - * @param value - * @return - */ - 
private InvokeResult objectSetter(Object obj, Object value){ - if(SETTER != null){ - try { - MethodUtil.invoke(obj, new Object[]{value}, SETTER); - return InvokeResult.emptySuccess(); - } catch (InvocationTargetException | IllegalAccessException e) { - return InvokeResult.fail(); - } - }else{ - return InvokeResult.fail(); - } - } - - /** - * 获取字段所在类的Class对象 - * @return - */ - private Class getFieldWhereClassIn(){ - return this.FIELD_WHERE_CLASS; - } - - /** - * 获取字段名 - * @return - */ - private String getFieldName(){ - return FIELD_NAME; - } - - /** - * 获取字段 - * @return - */ - @Override - public Field getField(){ - return FIELD; - } - - /** - * 获取字段类型 - * @return - */ - private Class getFieldType(){ - return FIELD_TYPE; - } - - /** - * 获取getter方法 - * @return - */ - @Override - public Method getGetter(){ - return GETTER; - } - - /** - * 获取setter方法 - * @return - */ - @Override - public Method getSetter(){ - return SETTER; - } - - - /* - 重写equals方法和hashCode方法,使得字段名和所在Class类为区别本类的字段 - */ - - @Override - public boolean equals(Object o) { - if (this == o) { - return true; - } - if (o == null || getClass() != o.getClass()) { - return false; - } - SingleCacheField that = (SingleCacheField) o; - return FIELD_WHERE_CLASS.equals(that.FIELD_WHERE_CLASS) && - FIELD_NAME.equals(that.FIELD_NAME); - } - - @Override - public int hashCode() { - return Objects.hash(FIELD_WHERE_CLASS, FIELD_NAME); - } - - /** 构造-通过字段为字段名赋值 */ - private SingleCacheField(Class fieldWhereClass , Field field , Method getter , Method setter){ - //设置所属类 - this.FIELD_WHERE_CLASS = fieldWhereClass; - //设置字段相关 - this.FIELD = field; - this.FIELD_NAME = field.getName(); - this.FIELD_TYPE = field.getType(); - //获取此字段的getter方法 - this.GETTER = getter; - this.SETTER = setter; - } - - /** - * 获取字段的实例值 - * @param object - * @return - */ - @Override - public InvokeResult fieldValue(Object object) { - return objectGetter(object); - } - - /** - * 为字段赋值 - * @param object - * @param param - * @return - */ - @Override - public InvokeResult fieldValueSet(T object, Object param) { - //赋值 - try { - MethodUtil.invoke(object , new Object[]{param} , SETTER); - //如果没有出现异常,则说明方法执行成功,返回成功信息 - return InvokeResult.emptySuccess(); - } catch (InvocationTargetException | IllegalAccessException e) { - //如果出现异常,则说明方法执行错误,返回错误信息 - return InvokeResult.fail(); - } - } - - } - - - - - /* —————————————————————————————————————— 多层缓存对象相关方法 —————————————————————————————————————————— */ - - /** - * 尝试查询上层层级字段 - * @param thisLevelCacheField - * @param - * @param - * @return - */ - private static LevelCacheField findUpperLevelCacheField(LevelCacheField thisLevelCacheField){ - HashMap levelCacheFieldHashMap = LEVEL_FIELD_CACHE_MAP.get(thisLevelCacheField.getRootClass()); - try{ - LevelCacheField tryFoundUpper = levelCacheFieldHashMap.entrySet().parallelStream().filter(e -> (e.getValue().getLevel() + 1) == thisLevelCacheField.getLevel()).findFirst().map(Map.Entry::getValue).orElse(null); - return Optional.ofNullable(tryFoundUpper).orElse(null); - }catch (Exception ignore){ - //如果出现异常则说明没有查询到,不做处理,直接返回null - return null; - } - } - - /** - * 储存或覆盖一个多层级字段对象 - * 可选:上层字段、下层字段 - * 必要:根类、字段名、本类字段 - * @param rootClass - * 必要 根类 - * @param fieldName - * 必要 字段名 - * @param upper - * 可选 上层 - * @param lower - * 可选 下层 - * @param thisCacheField - * 必要 本类字段 - * @param - * 根类类型 - * @param - * 本类类型 - * @return - * 保存或更新的多层级对象 - */ - private static LevelCacheField saveLevelCacheField( Class rootClass , - String fieldName , - LevelCacheField upper , - LevelCacheField lower , - SingleCacheField 
thisCacheField){ - - - //通过根类获取 - HashMap levelCacheFieldHashMap = LEVEL_FIELD_CACHE_MAP.get(rootClass); - //判断有没有此根类的多级字段集 - if(levelCacheFieldHashMap != null){ - //有多层级字段集,根据字段名获取对象 - LevelCacheField levelCacheField = levelCacheFieldHashMap.get(fieldName); - - if(levelCacheField == null){ - //没有此多层级对象,创建并添加 - levelCacheField = new LevelCacheField<>(rootClass, fieldName, upper, lower, thisCacheField); - levelCacheFieldHashMap.put(fieldName , levelCacheField); - } - - //判断upper - if(levelCacheField.getUpperLevelField() == null){ - //如果原本没有上层 - //如果upper不为null,赋值 - if(upper != null){ - levelCacheField.setUpperLevelField(upper); - }else{ - //如果为null,则尝试从内存中寻找上层字段 - 根据level查找即可 - //如果有上层字段,则肯定同属一个根类,且上层level+1为此层level,此层level使用字段名切割判断 - LevelCacheField upperLevelCacheField = findUpperLevelCacheField(levelCacheField); - LevelCacheField tryFoundUpper = levelCacheFieldHashMap.entrySet().parallelStream().filter(e -> (e.getValue().getLevel() + 1) == fieldName.split("\\.").length).findFirst().map(Map.Entry::getValue).orElse(null); - //如果查询到了则赋值 - if(tryFoundUpper != null){ - levelCacheField.setUpperLevelField(tryFoundUpper); - } - } - } - - //判断lower - if(levelCacheField.getLowerLevelField() == null){ - - //如果lower不为null,赋值 - if(lower != null){ - levelCacheField.setLowerLevelField(lower); - }else{ - //如果lower为null,尝试从缓存中查找下层字段 - 根据level查找 - //如果有下层字段,则肯定同属一个根类,且下层level-1为此层level,此层level使用字段名切割判断 - try{ - LevelCacheField tryFoundLower = levelCacheFieldHashMap.entrySet().parallelStream().filter(e -> (e.getValue().getLevel() - 1) == fieldName.split("\\.").length).findFirst().map(Map.Entry::getValue).orElse(null); - levelCacheField.setLowerLevelField(tryFoundLower); - }catch (Exception ignore){ - //如果出现异常则说明没有查询到,不做处理 - } - } - } - - - //返回 - return levelCacheField; - - }else{ - //没有字段集,创建一个新的并保存 - HashMap newLevelCacheFieldHashMap = new HashMap<>(5); - //新的多层级缓存字段 - LevelCacheField newLevelCacheField = new LevelCacheField<>(rootClass, fieldName, null, null, thisCacheField); - //保存 - newLevelCacheFieldHashMap.put(fieldName, newLevelCacheField); - //根据根类记入缓存 - LEVEL_FIELD_CACHE_MAP.put(rootClass , newLevelCacheFieldHashMap); - - //返回 - return newLevelCacheField; - } - } - - /** - * 储存或覆盖一个多层级字段对象 - * 可选的上层字段、下层字段为null - * 必要:根类、字段名、本类字段 - * @param rootClass - * @param fieldName - * @param thisCacheField - * @param - * @param - * @return - */ - private static LevelCacheField saveLevelCacheField( Class rootClass , - String fieldName , - SingleCacheField thisCacheField){ - return saveLevelCacheField(rootClass, fieldName, null , null , thisCacheField); - } - - /** - * 储存或覆盖一个多层级字段对象的upper - * @param rootClass - * @param fieldName - * @param upper - * @param thisCacheField - * @param - * @param - * @return - */ - private static LevelCacheField saveLevelCacheFieldUpper( Class rootClass , - String fieldName , - LevelCacheField upper , - SingleCacheField thisCacheField){ - return saveLevelCacheField(rootClass, fieldName, upper , null , thisCacheField); - } - - /** - * 储存或覆盖一个多层级字段对象的lower - * @param rootClass - * @param fieldName - * @param lower - * @param thisCacheField - * @param - * @param - * @return - */ - private static LevelCacheField saveLevelCacheFieldLower( Class rootClass , - String fieldName , - LevelCacheField lower , - SingleCacheField thisCacheField){ - return saveLevelCacheField(rootClass, fieldName, null , lower , thisCacheField); - } - - - /* —————————————————————————————————————— 多层缓存对象内部类 —————————————————————————————————————————— */ - - /** - * 多层级字段缓存对象 - * 内部类,实现字段缓存,优化此工具类的效率 - * 
**由于多层级字段的缓存与获取很容易发生错误和异常,所以尽可能的将异常处理,提高容错性 - * @param 多层级缓存字段的根类类型 - * @param 多层级缓存字段的当前类类型 - */ - private static class LevelCacheField implements CacheField { - /* - 多层级,就要一环套一环 - 每层概念上都会有:上层、下层、当前层 - 当前层即为一个SingleCacheField对象 - - */ - - /** 多层级的根层所在类 */ - private final Class ROOT_CLASS; - - /** 字段名,唯一且不可变 */ - private final String FIELD_NAME; - - /** 上层对象 */ - private LevelCacheField upperLevelField; - - /** 下层 */ - private LevelCacheField lowerLevelField; - - /** 当前层 */ - private SingleCacheField thisLevelField; - - private final int LEVEL; - - /* —————————— 相关api —————————— */ - - /** - * 当前层所在类 - * @return - */ - public Class thisLevelClass(){ - return thisLevelField.getFieldWhereClassIn(); - } - - @Override - public Method getGetter() { - return thisLevelField.getGetter(); - } - - @Override - public Method getSetter() { - return thisLevelField.getSetter(); - } - - @Override - public Field getField() { - return thisLevelField.getField(); - } - - - /* —————————— getter setter —————————— */ - - /** - * 当前字段在根类中的层级数,1级为根字段,根据字段名的切割计算 - * @return - */ - public int getLevel(){ - return this.LEVEL; - } - - public Class getRootClass(){ - return ROOT_CLASS; - } - - public String getFieldName(){ - return FIELD_NAME; - } - - public LevelCacheField getUpperLevelField() { - return upperLevelField; - } - - /** - * 设置upper的同时设置对方的lower - * @param upperLevelField - */ - public void setUpperLevelField(LevelCacheField upperLevelField) { - if(this.upperLevelField != null){ - this.upperLevelField = upperLevelField; - //直接赋值,防止无限循环 - upperLevelField.lowerLevelField = this; - } - } - - public LevelCacheField getLowerLevelField() { - return lowerLevelField; - } - - /** - * 设置lower的同时设置对方的upper - * @param lowerLevelField - */ - public void setLowerLevelField(LevelCacheField lowerLevelField) { - if(lowerLevelField != null){ - this.lowerLevelField = lowerLevelField; - //直接赋值,防止无限循环 - lowerLevelField.upperLevelField = this; - } - } - - public SingleCacheField getThisLevelField() { - return thisLevelField; - } - - public void setThisLevelField(SingleCacheField thisLevelField) { - this.thisLevelField = thisLevelField; - } - - /* —————— toString、equals、hashcode —————— */ - - @Override - public String toString() { - return "LevelCacheField{" + - "rootClass=" + ROOT_CLASS + - ", fieldName='" + FIELD_NAME + '\'' + - ", thisLevelFieldClass=" + thisLevelField.getFieldWhereClassIn() + - '}'; - } - - @Override - public boolean equals(Object o) { - if (this == o) { - return true; - } - if (o == null || getClass() != o.getClass()) { - return false; - } - LevelCacheField that = (LevelCacheField) o; - return ROOT_CLASS.equals(that.ROOT_CLASS) && - FIELD_NAME.equals(that.FIELD_NAME); - } - - @Override - public int hashCode() { - return Objects.hash(ROOT_CLASS, FIELD_NAME); - } - - /* —————— 根类匹配方法 —————— */ - - /** - * 如果根类不匹配则抛出异常 - * @param myRoot - * @param inRoot - */ - private static void isSameRoot(Class myRoot , Class inRoot){ - if(!myRoot.equals(inRoot)){ - throw new NotSameRootException(myRoot , inRoot); - } - - } - - /** 构造 */ - public LevelCacheField(Class rootClass , String fieldName, LevelCacheField upperLevelField, LevelCacheField lowerLevelField, SingleCacheField thisLevelField){ - //验证必要参数 - allNonNull(rootClass , fieldName , thisLevelField); - - //根层Class - this.ROOT_CLASS = rootClass; - - //为字段名赋值,不可为null - this.FIELD_NAME = fieldName; - - //为层级赋值 - 根据字段名切割‘.’计算层级数 - this.LEVEL = fieldName.split("\\.").length; - - //上层对象,可为null - //如果不为null,则根层类必须相同 - if(upperLevelField != null){ - isSameRoot(rootClass , 
upperLevelField.getRootClass()); - this.upperLevelField = upperLevelField; - }else{ - this.upperLevelField = null; - } - - //下层对象,可为null - //如果不为null,则根层类必须相同 - if(lowerLevelField != null){ - isSameRoot(rootClass , lowerLevelField.getRootClass()); - this.lowerLevelField = lowerLevelField; - }else{ - this.lowerLevelField = null; - } - - //当前层字段对象,不可为null - this.thisLevelField = thisLevelField; - - } - - /** - * 使用根类具体对象返回当前层级的当前字段的值 - * 从上层对象遍历至此,由远到近 - * @return - */ - private Object thisLevelFieldGetter(R r){ - //如果有上层对象 - if(upperLevelField != null){ - //如果有上层对象,则先获取上层对象的字段值,再根据上层对象的字段值返回对象获取当前字段值 - //上层字段值的获取对象的类型为根类类型 - //得到上层对象的值,使用上层对象获取本类对象 - Object upperInvoke = upperLevelField.thisLevelFieldGetter(r); - if(upperInvoke != null){ - try { - //使用上层字段的值执行getter - return getThisLevelField().getGetter().invoke(upperInvoke); - } catch (IllegalAccessException | InvocationTargetException e) { - // 出现异常,展示异常并返回一个null - e.printStackTrace(); - return null; - } - }else{ - //如果上层返回值为null,则直接返回null - return null; - } - }else{ - //如果没有上层对象 - //如果level不为1,则尝试查询上层,如果查询不到则返回null - if(LEVEL != 1){ - // 查询上层字段 - LevelCacheField upperLevelCacheField = findUpperLevelCacheField(this); - if(upperLevelCacheField != null){ - //如果查询到了,赋值并重新获取 - this.upperLevelField = upperLevelCacheField; - return thisLevelFieldGetter(r); - }else{ - //如果没有找到上层对象,返回null - return null; - } - }else{ - //如果level为1没有上层对象,认为当前即为根类,获取当前层字段值 - InvokeResult invokeResult = thisLevelField.objectGetter(r); - //抛出异常并不合适,这里选择不做处理,直接返回null - return invokeResult.getInvoke(); - } - } - } - - - /** - * 为字段赋值 - * @param object - * @param param - * @return - */ - private Boolean thisLevelFieldSetter(R object, Object param) { - //如果有上层字段, - if(upperLevelField != null){ - //如果有上层字段值,则先获取上层字段的字段值,再根据上层字段的值为当前字段赋值 - Object upperInvoke = upperLevelField.thisLevelFieldGetter(object); - //如果上层对象的获取值为null,尝试为上层字段赋一个新的实例对象 - if(upperInvoke == null){ - try { - //获取上层对象的SETTER方法和字段类型 - Method upperSetter = upperLevelField.getSetter(); - Object upperInstance = upperLevelField.getThisLevelField().getFieldType().newInstance(); - //为上层字段的上层字段赋值 - //获取上上层对象的实例 - InvokeResult upperUpperInvokeResult = upperLevelField.upperLevelField.fieldValue(object); - //如果获取失败或没有值,直接返回null - if(upperUpperInvokeResult.isSuccess() && upperUpperInvokeResult.getInvoke() != null){ - MethodUtil.invoke(upperUpperInvokeResult, new Object[]{upperInstance}, upperSetter); - //赋值之后重新获取 - return thisLevelFieldSetter(object , param); - }else{ - return false; - } - } catch (Exception e) { - e.printStackTrace(); - //若是出现异常,则说明上层对象无法赋值 - //没有上层对象则无法为本层对象赋值,直接返回false - return false; - } - }else{ - try { - //上层对象的获取值不为null,使用上层对象的返回值赋值 - Method setter = thisLevelField.getSetter(); - MethodUtil.invoke(object, new Object[]{param}, setter); - return true; - } catch (Exception e) { - //如果出现异常,则说明赋值出现错误,返回false - e.printStackTrace(); - return false; - } - } - }else{ - //如果没有上层对象,判断层级 - if(LEVEL != 1){ - //如果层级不为1,则尝试查询上层,如果查询不到则返回null - LevelCacheField upperLevelCacheField = findUpperLevelCacheField(this); - if(upperLevelCacheField != null){ - //如果查询到了,赋值并重新查询 - this.upperLevelField = upperLevelCacheField; - return thisLevelFieldSetter(object, param); - }else{ - //没有查询到,直接返回false - return false; - } - }else{ - //层级为1,则本类即为根类,直接赋值并返回赋值结果 - InvokeResult invokeResult = thisLevelField.objectSetter(object, param); - return invokeResult.isSuccess(); - } - } - } - - - /** - * 获取字段的实例 - * @param object - * @return - */ - @Override - public InvokeResult fieldValue(R object) { - Object o = 
thisLevelFieldGetter(object); - if(o != null){ - return InvokeResult.success(o); - }else{ - return InvokeResult.fail(); - } - } - - /** - * 为字段赋值并封装结果 - * @param r - * @param param - * @return - */ - @Override - public InvokeResult fieldValueSet(R r , Object param){ - Boolean isSuccess = thisLevelFieldSetter(r, param); - if(isSuccess){ - return InvokeResult.emptySuccess(); - }else{ - return InvokeResult.fail(); - } - } - - } - - - - - - /* ———————————————————————————————————— 缓存获取 封装类 —————————————————————————————————— */ - - /** - * 内部类 - * 方法执行的返回值封装类,此类除了重写方法以外的唯一公共接口 - */ - public static class InvokeResult{ - private final boolean success; - private final Object invoke; - private InvokeResult(boolean success , Object invoke){ - this.success = success; - this.invoke = invoke; - } - - /* ---- factory ----*/ - - static InvokeResult emptySuccess(){ - return new InvokeResult(true , null); - } - static InvokeResult success(Object invoke){ - return new InvokeResult(true , invoke); - } - static InvokeResult fail(){ - return new InvokeResult(false, null); - } - - /* ---- getter ---- */ - - public boolean isSuccess() { - return success; - } - public Object getInvoke() { - return invoke; - } - } - - /* ———————————————————————————————————— 缓存获取 方法 —————————————————————————————————— */ - - /** - * 从缓存中获取一个缓存字段 - * @param fieldWhereClass - * @return - */ - private static CacheField getCacheField(Class fieldWhereClass , String fieldName){ - //判断这是单层字段还是多层字段 - String[] split = fieldName.split("\\."); - if(split.length == 1){ - //是单层的 - HashMap singleCacheFieldHashMap = SINGLE_FIELD_CACHE_MAP.get(fieldWhereClass); - //如果有此字段的信息,返回获取值,如果没有直接返回null - SingleCacheField singleCacheField = Optional.ofNullable(singleCacheFieldHashMap).map(m -> m.get(fieldName)).orElse(null); - return singleCacheField; - }else{ - //如果长度不为1,则认为是多层对象 - HashMap levelCacheFieldHashMap = LEVEL_FIELD_CACHE_MAP.get(fieldWhereClass); - //如果有此字段的信息,返回获取值,如果没有直接返回null - LevelCacheField levelCacheField = Optional.ofNullable(levelCacheFieldHashMap).map(m -> m.get(fieldName)).orElse(null); - return levelCacheField; - } - } - - - - /* ———————————————————————————————————— 缓存异常 ———————————————————————————————————— */ - - /** - * 多层级对象根类不匹配异常 - */ - private static class NotSameRootException extends RuntimeException{ - NotSameRootException(Class myRoot , Class inRoot){ - super("多层级对象的根类不同!当前根类:["+ myRoot +"] 无法设置根类为["+ inRoot +"]的多层级对象!"); - } - } - - /** - * 多层级字段上层无返回值异常 - */ - private static class UpperLevelNoResultException extends RuntimeException{ - UpperLevelNoResultException(LevelCacheField upper){ - super("上层对象没有返回值:" + upper); - } - } - - - /** - * 构造私有化 - */ - private FieldUtils(){} - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MethodUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MethodUtil.java deleted file mode 100644 index ee24c19..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MethodUtil.java +++ /dev/null @@ -1,193 +0,0 @@ -package com.forte.util.utils; - -import com.forte.util.MockConfiguration; -import com.forte.util.exception.ParameterSizeException; -import com.forte.util.invoker.ElementInvoker; -import com.forte.util.invoker.Invoker; -import com.forte.util.invoker.MethodInvoker; -import org.apache.commons.beanutils.ConvertUtils; - -import javax.script.ScriptEngine; -import javax.script.ScriptEngineManager; -import javax.script.ScriptException; -import 
java.lang.reflect.InvocationTargetException; -import java.lang.reflect.Method; -import java.lang.reflect.Parameter; -import java.util.Arrays; -import java.util.List; -import java.util.stream.Collectors; - -/** - * 方法执行工具 - * - * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - */ -public class MethodUtil { - - - /** - * 执行一个方法,可以为基本的数据类型进行转化 - * - * @param obj - * @param args - * @param method - * @return - * @throws InvocationTargetException - * @throws IllegalAccessException - */ - public static Object invoke(Object obj, Object[] args, Method method) throws InvocationTargetException, IllegalAccessException { - //获取参数的数据类型数组,准备转化数据类型 - Parameter[] parameters = method.getParameters(); - //如果传入参数与方法参数数量不符 ,抛出异常 - //不知道是否能识别 String... args 这种参数 - if (args.length != parameters.length) { - throw new ParameterSizeException(); - } - //创建一个新的Object数组保存转化后的参数,如果使用原数组的话会抛异常:ArrayStoreException - Object[] newArr = new Object[args.length]; - //遍历参数并转化 - for (int i = 0; i < parameters.length; i++) { - //使用BeanUtils的数据类型器对参数的数据类型进行转化 - //保存至新的参数集 - Class paramType = parameters[i].getType(); - Object arg = args[i]; - if(arg.getClass().equals(paramType)){ - newArr[i] = arg; - }else{ - newArr[i] = ConvertUtils.convert(arg, paramType); - } - } - - //返回方法的执行结果 - return method.invoke(obj, newArr); - } - - /** - * 执行一个方法,可以为基本的数据类型进行转化 - * - * @param obj - * @param args - * @param methodName 方法名 - * @return - * @throws NoSuchMethodException - */ - public static Object invoke(Object obj, Object[] args, String methodName) throws NoSuchMethodException { - //通过反射获取此方法 - Method[] methods = Arrays.stream(obj.getClass().getMethods()).filter(m -> m.getName().equals(methodName) && m.getParameters().length == args.length).toArray(Method[]::new); - for (Method m : methods) { - try { - return invoke(obj, args, m); - } catch (Exception e) { - e.printStackTrace(); - } - } - - throw new NoSuchMethodException(); - } - - - /** - * Filter out the Object Methods
- * 过滤掉Object中继承来的方法 - * @param methods 需要过滤的方法列表 - */ - public static List getOriginal(List methods){ - return methods.stream().parallel().filter(m -> Arrays.stream(Object.class.getMethods()).noneMatch(om -> om.equals(m))).collect(Collectors.toList()); - } - - - /** - * 将一个方法存至缓存 - */ - private void saveChcheMethod() { - //TODO 完成方法的缓存,缓存方法的获取方式:根据类:方法名获取、字段获取 - - } - - - //创建一个js脚本执行器 - private static ScriptEngineManager manager = new ScriptEngineManager(); - private static ScriptEngine se = manager.getEngineByName("js"); - - /** - * js中的eval函数,应该是只能进行简单的计算 - * 利用js脚本完成 - * - * @param str 需要进行eval执行的函数 - * @return 执行后的结果 - */ - public static Object eval(String str) throws ScriptException { - //脚本执行并返回结果 - if(MockConfiguration.isEnableJsScriptEngine()){ - return se.eval(str); - }else{ - // 未开启脚本执行,直接返回 - return str; - } - } - - /** - * js中的eval函数,应该是只能进行简单的计算 - * 利用js脚本完成 - * - * @param str 需要进行eval执行的函数 - * @return 执行后的结果 - */ - public static Object evalCache(String str) { - try { - return eval(str); - } catch (ScriptException ignore) { - return str; - } -// //脚本执行并返回结果 -// if(MockConfiguration.isEnableJsScriptEngine()){ -// try { -// return se.eval(str); -// } catch (ScriptException ignore) { -// return str; -// } -// }else{ -// // 未开启脚本执行,直接返回 -// return str; -// } - } - - /** - * 创建一个方法执行者 - * - * @return - */ - public static Invoker createMethodInvoker(Object obj, Object[] args, Method method) { - return MethodInvoker.getInstance(obj, args, method); - } - - /** - * 创建一个方法为空的方法执行者,代表没有方法,将会输出指定的字符串 - * - * @return - */ - public static Invoker createNullMethodInvoker(Object nullValue) { - return MethodInvoker.getInstance(nullValue); - } - - /** - * 创建一个数组元素获取执行者 - * @param arr - * @return - */ - public static Invoker createArrayElementInvoker(Object[] arr){ - return ElementInvoker.getInstance(arr); - } - - /** - * 创建一个集合元素获取执行者 - * @param list - * @return - */ - public static Invoker createListElementInvoker(List list){ - return ElementInvoker.getInstance(list); - } - -} - - diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MockUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MockUtil.java deleted file mode 100644 index 9698094..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/MockUtil.java +++ /dev/null @@ -1,1100 +0,0 @@ -package com.forte.util.utils; - -import java.text.SimpleDateFormat; -import java.util.Arrays; -import java.util.Calendar; -import java.util.Date; -import java.util.Random; - -/** - *

- * 随机数据助手,可能会用到的所有随机方法
- * 此类所有的方法,只要在方法名前加上'@' 即可在Mock中作为映射指令 - * 例如: - *

- * - *

map.put("name" , "@cname");

- *

map.put("age" , "@age");

- *

map.put("place" , "@ctitle(2,5)");

- *
- *

- *

- * - *

※以下列表仅供参考,一切以方法内实际参数为准,此注释有些情况可能未及时更新。
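A rough sketch of how such string directives can be wired up is shown below. It only illustrates the idea; the registry, class and method names are assumptions for the demo, not the deleted mock library's real API:

```
// Minimal sketch of the directive idea described above (not the real Mock API):
// string directives such as "@cname" or "@age" are looked up in a registry of
// value suppliers, so a template map can be expanded into concrete mock data.
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ThreadLocalRandom;
import java.util.function.Supplier;

public class DirectiveDemo {
    private static final Map<String, Supplier<Object>> REGISTRY = new HashMap<>();
    static {
        REGISTRY.put("@cname", () -> "张三");                                       // stand-in for a random Chinese name
        REGISTRY.put("@age",   () -> ThreadLocalRandom.current().nextInt(12, 81));  // 12-80, as documented
    }

    static Map<String, Object> expand(Map<String, String> template) {
        Map<String, Object> result = new HashMap<>();
        template.forEach((k, v) ->
                result.put(k, REGISTRY.getOrDefault(v, () -> v).get())); // unknown directives pass through unchanged
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> template = new HashMap<>();
        template.put("name", "@cname");
        template.put("age",  "@age");
        System.out.println(expand(template)); // e.g. {name=张三, age=42}
    }
}
```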

- * - *

--名称、title等相关

- *
    - *
- *   {@link MockUtil#cname()} 获取一个中文姓名
- *   {@link MockUtil#cnames(Integer, Integer)} 获取指定数量区间[min, max]个随机中文名
- *   {@link MockUtil#cnames(Integer)} 获取指定数量个随机中文名
- *   {@link MockUtil#name()} 获取一个英文姓名
- *   {@link MockUtil#names(Integer, Integer)} 获取指定数量区间[min, max]个随机英文姓名
- *   {@link MockUtil#names(Integer)} 获取指定数量num个随机英文姓名
- *   {@link MockUtil#ctitle()} 获取一段中文,长度3-5
- *   {@link MockUtil#ctitle(Integer)} 获取一段指定长度的中文
- *   {@link MockUtil#ctitle(Integer)} 获取指定数量区间个随机汉字,区间[min,max]
- *   {@link MockUtil#title()} 获取一段英文,长度5-10
- *   {@link MockUtil#title(Integer)} 获取一段指定长度的英文
- *   {@link MockUtil#title(Integer)} 获取指定数量区间个随机英文,区间[min,max]
- *   {@link MockUtil#UUID()} 获取一个UUID
- *
- * --date相关
- *   {@link MockUtil#date()} 获取随机日期,1990 - 现在
- *   {@link MockUtil#toDateStr()} 返回一个随机日期的字符串
- *   {@link MockUtil#time(String)} 返回一个随机时间的字符串
- *   {@link MockUtil#time()} 返回一个随机时间的字符串,格式为HH:mm:ss
- *   {@link MockUtil#toDateTime(String)} 返回一个随机日期时间的字符串
- *   {@link MockUtil#toDateTime()} 返回一个随机日期时间的字符串,格式为yyyy-dd-MM HH:mm:ss
- *
- * --number相关
- *   {@link MockUtil#age()} 获取一个随机年龄 12-80
- *   {@link MockUtil#integer()} 获取随机数字 0-9
- *   {@link MockUtil#integer(Integer)} 获取指定长度的随机数,※不可超过int最大上限
- *   {@link MockUtil#integer(Integer, Integer)} 获取指定区间[a,b]的随机数,※不可超过int最大上限
- *   {@link MockUtil#doubles(Integer, Integer, Integer, Integer)} 获取指定区间[a,b]的小数,指定小数位数[endL,endR],double类型
- *   {@link MockUtil#doubles(Integer, Integer, Integer)} 获取指定区间[a,b]的小数,指定小数位数[end],double类型
- *   {@link MockUtil#doubles(Integer, Integer)} 获取指定区间[a,b]的小数,默认小数位数为0,double类型
- *   {@link MockUtil#doubles(Integer)} 获取指定数值为a的小数,默认小数位数为0,double类型
- *   {@link MockUtil#UUNUM()} 获取一个32位的随机数字
- *   {@link MockUtil#getNumber(Integer)} 获取任意长度的随机整数
- *   {@link MockUtil#getDouble(Integer, Integer)} 获取指定位的小数
- *   {@link MockUtil#getDouble(Integer, Integer, Integer, Integer)} 获取指定位的小数
- *   {@link MockUtil#UUDOUBLE()} 获取32位小数,小数为2位
- *
- * --boolean相关
- *   {@link MockUtil#bool()} 返回一个随机布尔值
- *   {@link MockUtil#bool(double)} 根据概率返回布尔值
- *
- * --String character相关
- *   {@link MockUtil#character()} 获取一个随机字符
- *   {@link MockUtil#character(Character[]...)} 在提供的字符字典(数组)中随机返回一个字符
- *   {@link MockUtil#word(Integer)} 返回一个随机的假单词,指定长度区间[min,max]
- *   {@link MockUtil#word(Integer, Integer)} 返回一个随机的假单词,指定长度
- *   {@link MockUtil#word()} 返回一个随机的假单词
- *   {@link MockUtil#cword(Integer)} 返回一个随机的假中文词语,指定长度区间[min,max]
- *   {@link MockUtil#cword(Integer)} 返回一个随机的假中文词语,指定长度
- *   {@link MockUtil#cword()} 返回一个随机的假中文词语
- *
- * --color相关
- *   {@link MockUtil#color()} 获取一个随机颜色的16进制代码
- *
- * --text相关
- *   {@link MockUtil#sentence(Integer, Integer)} 随机假英文句子,句子中的单词数量为参数的区间[min,max]
- *   {@link MockUtil#sentence(Integer)} 返回指定长度的句子
- *   {@link MockUtil#sentence()} 返回长度为12-18的句子
- *   {@link MockUtil#csentence(Integer, Integer)} 随机假中文句子,句子中的单词数量为参数的区间[min,max]
- *   {@link MockUtil#csentence(Integer)} 返回指定长度的中文句子
- *   {@link MockUtil#csentence()} 返回长度为5-10的中文句子
- *   {@link MockUtil#paragraph(Integer, Integer)} 返回一个文本,文中句子数量为参数区间[min,max]
- *   {@link MockUtil#paragraph(Integer)} 返回指定句子数量的文本
- *   {@link MockUtil#paragraph()} 返回一个有3-7个句子的文本
- *   {@link MockUtil#cparagraph(Integer, Integer)} 返回一个文本,文中句子数量为参数区间[min,max]
- *   {@link MockUtil#cparagraph(Integer)} 返回指定句子数量的文本
- *   {@link MockUtil#cparagraph()} 返回一个有3-7个句子的文本
- *
- * --web相关
- *   {@link MockUtil#ip()} 获取一个随机IP
- *   {@link MockUtil#tId()} 获取一个随机的顶级域名
- *   {@link MockUtil#email(String, String)} 返回一个随机邮箱,可以指定邮箱的名称(@后面的名字)和顶级域名
- *   {@link MockUtil#email(String)} 返回一个随机邮箱,可以指定邮箱的名称(@后面的名字)
- *   {@link MockUtil#email()} 返回一个随机邮箱
  • - * {@link MockUtil#domain(String)}随机生成一个域名,可指定顶级域名 - *
  • - *
  • - * {@link MockUtil#domain()}随机生成一个域名 - *
  • - *
  • - * {@link MockUtil#url(String)}随机一个url路径,可指定域名 - *
  • - *
  • - * {@link MockUtil#url()}随机一个url - *
  • - *
- * - * 注意:此类在进行方法重载的时候不应出现参数数量相同的重载方法 - * - * @author ForteScarlet - */ -@SuppressWarnings({"unused", "SpellCheckingInspection"}) -public class MockUtil { - - - /* —————————— 默认参数 ———————————— */ - /** - * {@link #date()}默认使用的格式化参数 - */ - private static final String DATE_FORMAT; - - private static final SimpleDateFormat SIMPLE_DATE_FORMAT; - - - /** - * {@link #time()}默认使用的格式化参数 - */ - private static final String TIME_FORMAT; - - private static final SimpleDateFormat SIMPLE_DATETIME_FORMAT; - - /** - * {@link #toDateTime()}默认使用的格式化参数 - */ - private static final String DATETIME_FORMAT; - - private static final SimpleDateFormat SIMPLE_TIME_FORMAT; - - /** - * 顶级域名合集 - */ - private static final String[] DOMAINS; - - - //静态代码块加载资源 - static { - // 加载定义域名合集 - String domainStr = "top,xyz,xin,vip,win,red,net,org,wang,gov,edu,mil,biz,name,info,mobi,pro,travel,club,museum,int,aero,post,rec,asia"; - DOMAINS = domainStr.split(","); - - // 日期格式化 - DATE_FORMAT = "yyyy-dd-MM"; - SIMPLE_DATE_FORMAT = new SimpleDateFormat(DATE_FORMAT); - - DATETIME_FORMAT = "yyyy-dd-MM HH:mm:ss"; - SIMPLE_DATETIME_FORMAT = new SimpleDateFormat(DATE_FORMAT); - - TIME_FORMAT = "HH:mm:ss"; - SIMPLE_TIME_FORMAT = new SimpleDateFormat(DATE_FORMAT); - - } - - - /* —————————— name/chinese/cname —————————— */ - - /** - * 获取一个随机中文名称 - */ - public static String cname() { - return ChineseUtil.getName(); - } - - - /** - * 获取指定数量区间[min , max]个随机中文名 - * - * @param min 最小数量 - * @param max 最大数量 - * @return - */ - public static String[] cnames(Integer min, Integer max) { - //获取随机数量 - int num = RandomUtil.getNumberWithRight(min, max); - String[] names = new String[num]; - //遍历并获取 - for (int i = 0; i < num; i++) { - names[i] = cname(); - } - //返回结果 - return names; - } - - /** - * 获取指定数量个随机中文名 - * - * @return - */ - public static String[] cnames(Integer num) { - return cnames(num, num); - } - - /** - * 随机获取一个中文姓氏 - 百家姓中获取 - */ - public static String cfirstName() { - return ChineseUtil.getFamilyName(); - } - - /** - * 获取一个随机英文姓名-两个开头大写的英文字母(title(2,7)+" "+title(2,7)) - */ - public static String name() { - int min = 2, max = 7; - return title(min, max) + " " + title(min, max); - } - - /** - * 获取指定数量区间[min, max]个随机英文姓名 - * - * @param min 最少数量 - * @param max 最大数量 - * @return - */ - public static String[] names(Integer min, Integer max) { - //获取随机数量 - int num = RandomUtil.getNumberWithRight(min, max); - String[] names = new String[num]; - //遍历并获取 - for (int i = 0; i < num; i++) { - names[i] = name(); - } - //返回结果 - return names; - } - - /** - * 获取指定数量num个随机英文姓名 - * - * @param num 获取数量 - * @return - */ - public static String[] names(Integer num) { - return names(num, num); - } - - /** - * 获取3-5个随机汉字 - */ - public static String ctitle() { - return ctitle(3, 5); - } - - - /** - * 获取指定数量个随机汉字 - * - * @param num - */ - public static String ctitle(Integer num) { - return ChineseUtil.getChinese(num); - } - - /** - * 获取指定数量区间个随机汉字,区间[min,max] - * - * @param min 最少数量 - * @param max 最大数量 - */ - public static String ctitle(Integer min, Integer max) { - return ChineseUtil.getChinese(RandomUtil.getNumberWithRight(min, max)); - } - - - /** - * 获取5-10长度的英文字符串,开头大写 - */ - public static String title() { - return title(5, 10); - } - - /** - * 获取指定长度的英文字符串,开头大写 - * - * @param num - */ - public static String title(Integer num) { - return title(num, num); - } - - /** - * 获取指定长度的英文字符串,开头大写 - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String title(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - String title = 
RandomUtil.getRandomString(num, false); - //全部小写,开头大写 - return FieldUtils.headUpper(title); - } - - /** - * 获取指定长度的英文字符串,纯小写 - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String string(Integer min, Integer max) { - return word(min, max); - } - - /** - * 获取指定长度的英文字符串,纯小写 - * - * @param num - */ - public static String string(Integer num) { - return string(num, num); - } - - /** - * 获取5-10长度的英文字符串,纯小写 - */ - public static String string() { - return string(5, 10); - } - - /** - * 获取指定长度的英文字符串,纯大写 - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String stringUpper(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - return RandomUtil.getRandomUpperString(num); - } - - /** - * 获取指定长度的英文字符串,纯大写 - * - * @param num - */ - public static String stringUpper(Integer num) { - return stringUpper(num, num); - } - /** - * 获取5-10长度的英文字符串,纯大写 - */ - public static String stringUpper() { - return stringUpper(5, 10); - } - - - - - /** - * 获取一个UUID - */ - public static String UUID() { - return RandomUtil.getUUID(); - } - - - - /* —————————— date —————————— */ - - /** - * 获取随机日期 - * 时间:1990 - 现在 - */ - public static Date date() { - Calendar calendar = Calendar.getInstance(); - //设置年份等参数 - int nowYear = calendar.get(Calendar.YEAR); - int nowDay = calendar.get(Calendar.DAY_OF_YEAR); - - //设置随机年份 - calendar.set(Calendar.YEAR, RandomUtil.getNumberWithRight(1990, nowYear)); - //设置随机日期 - calendar.set(Calendar.DAY_OF_YEAR, RandomUtil.getNumberWithRight(1, nowDay)); - //设置随机小时 - calendar.set(Calendar.HOUR_OF_DAY, RandomUtil.getNumberWithRight(1, 24)); - //设置随机分钟 - calendar.set(Calendar.MINUTE, RandomUtil.getNumberWithRight(1, 60)); - //设置随机秒 - calendar.set(Calendar.SECOND, RandomUtil.getNumberWithRight(1, 60)); - - //返回随机日期 - return calendar.getTime(); - } - - /** - * 返回一个随机日期的字符串 - * - * @param format - */ - public static String toDateStr(String format) { - return new SimpleDateFormat(format).format(date()); - } - - /** - * 返回一个随机日期的字符串,格式为yyyy-dd-MM - */ - public static String toDateStr() { - return SIMPLE_DATE_FORMAT.format(date()); - } - - /** - * 返回一个随机时间的字符串 - * - * @param format - */ - public static String time(String format) { - return new SimpleDateFormat(format).format(date()); - } - - /** - * 返回一个随机时间的字符串,格式为HH:mm:ss - */ - public static String time() { - return SIMPLE_TIME_FORMAT.format(date()); - } - - /** - * 返回一个随机时间日期的字符串 - * - * @param format - */ - public static String toDateTime(String format) { - return new SimpleDateFormat(format).format(date()); - } - - /** - * 返回一个随机日期时间的字符串,格式为yyyy-dd-MM HH:mm:ss - */ - public static String toDateTime() { - return SIMPLE_DATETIME_FORMAT.format(date()); - } - - /* —————————— number age —————————— */ - - /** - * 获取一个随机年龄 - * 12 - 80 - */ - public static Integer age() { - return RandomUtil.getNumberWithRight(12, 80); - } - - /** - * 获取随机数字 - * 0-9 - */ - public static Integer integer() { - return RandomUtil.getNumber(1); - } - - /** - * 获取指定长度的随机数 - * - * @param length 长度,长度请不要超过整数型上限。
如果需要获取无限长度的整数请使用{@link MockUtil#getNumber(Integer)} - */ - public static Integer integer(Integer length) { - return RandomUtil.getNumber(length); - } - - /** - * 获取指定区间[a,b]的随机数 - * - * @param a 最小值 - * @param b 最大值 - * @return - */ - public static Integer integer(Integer a, Integer b) { - return RandomUtil.getNumberWithRight(a, b); - } - - - /** - * 获取制定区间[a,b]的小数,指定小数位数[endL,endR],double类型 - * - * @param a 整数部分的最小值 - * @param b 整数部分的最大值 - * @param endL 小数部分位数最小值 - * @param endR 小数部分位数最大值 - * @return - */ - public static Double doubles(Integer a, Integer b, Integer endL, Integer endR) { - int integer = integer(a, b); - //获取小数位数值 - int end = RandomUtil.getNumberWithRight(endL, endR); - double dou = Double.parseDouble(RandomUtil.toFixed(RandomUtil.getRandom().nextDouble(), end)); - return integer + dou; - } - - /** - * 获取指定区间[a,b]的小数,指定小数位数[end],double类型 - * - * @param a - * @param b - * @param end - * @return - */ - public static Double doubles(Integer a, Integer b, Integer end) { - return doubles(a, b, end, end); - } - - /** - * 获取指定区间[a,b]的小数,默认小数位数为0,double类型 - * - * @param a - * @param b - * @return - */ - public static Double doubles(Integer a, Integer b) { - return doubles(a, b, 0, 0); - } - - /** - * 获取指定数值为a的小数,默认小数位数为0,double类型 - * - * @param a - * @return - */ - public static Double doubles(Integer a) { - return a * 1.0; - } - - - /** - * 获取一个32位的随机数字 - */ - public static String UUNUM() { - int length = 32; - StringBuilder sb = new StringBuilder(length); - for (int i = 0; i < length; i++) { - sb.append(integer()); - } - return sb.toString(); - } - - /** - * 获取任意长度的随机整数 - * - * @param length - */ - public static String getNumber(Integer length) { - return getNumber(length, length); - } - - /** - * 获取任意长度的随机整数 - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String getNumber(Integer min, Integer max) { - //获取长度 - int length = RandomUtil.getNumberWithRight(min, max); - StringBuilder sb = new StringBuilder(length); - for (int i = 0; i < length; i++) { - sb.append(integer()); - } - return sb.toString(); - } - - - /** - * 获取指定位的小数 - * - * @param intLength 整数部分的长度 - * @param douLength 保留小数位数 - */ - public static String getDouble(Integer intLength, Integer douLength) { - return getDouble(intLength, intLength, douLength, douLength); - } - - - /** - * 获取指定位的小数的最大区间 - * - * @param intMinLength 整数部分的长度最小值 - * @param intMaxLength 整数部分的长度最大值 - * @param douMinLength 保留小数位数最小值 - * @param douMaxLength 保留小数位数最大值 - */ - public static String getDouble(Integer intMinLength, Integer intMaxLength, Integer douMinLength, Integer douMaxLength) { - //先获取整数位 - return getNumber(intMinLength, intMaxLength) + - "." + - getNumber(douMinLength, douMaxLength); - } - - - /** - * 获取32位小数,小数为2位 - */ - public static String UUDOUBLE() { - return getDouble(32, 2); - } - - - - /* ——————————————————————String character code—————————————————————— */ - - /** - * 获取一个随机字符 - */ - public static Character character() { - return RandomUtil.getRandomChar(); - } - - /** - * 在提供的字典(数组中)随机 返回 - * - * @param dic - */ - public static Character character(Character[]... 
dic) { - //合并集合 - Character[] characters = Arrays.stream(dic).flatMap(Arrays::stream).toArray(Character[]::new); - return characters[RandomUtil.getNumber(characters.length)]; - } - - /** - * 返回一个随机的假单词 - */ - public static String word() { - return word(3, 12); - } - - /** - * 返回一个随机的假单词,指定长度 - * - * @param length 指定长度 - */ - public static String word(Integer length) { - return RandomUtil.getRandomString(length, false); - } - - /** - * 返回一个随机的假单词,指定长度区间[min,max] - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String word(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - return RandomUtil.getRandomString(num, false); - } - - /** - * 返回一个随机的假中文词语,指定长度区间[min,max] - * - * @param min 最小长度 - * @param max 最大长度 - */ - public static String cword(Integer min, Integer max) { - return ctitle(min, max); - } - - /** - * 返回一个随机的假中文词语,指定长度 - * - * @param length 单词长度 - */ - public static String cword(Integer length) { - return ctitle(length); - } - - /** - * 返回一个随机的假中文词语,长度2-4 - */ - public static String cword() { - return ctitle(2, 4); - } - - - /* —————————————————————— color —————————————————————— */ - - /** - * 获取一个随机颜色的16进制代码 - */ - public static String color() { - return RandomUtil.randomColor$hexString(); - } - - /* —————————————————————— boolean —————————————————————— */ - - /** - * 返回一个随机布尔值 - */ - public static Boolean bool() { - return RandomUtil.getRandom().nextBoolean(); - } - - /** - * 根据概率返回布尔值 - * - * @param prob 返回true的概率,建议取值区间:0-1 - */ - public static Boolean bool(double prob) { - return RandomUtil.getProbability(prob); - } - - /* —————————————————————— text —————————————————————— */ - - /** - * 随机假英文句子,句子中的单词数量为参数的区间[min,max] - * - * @param min 单词最少数量 - * @param max 单词最多数量 - */ - public static String sentence(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - StringBuilder sb = new StringBuilder(num); - for (int i = 1; i <= num; i++) { - //首句子字母大写 - sb.append(i == 0 ? FieldUtils.headUpper(word()) : word()); - if (i != num) { - sb.append(' '); - } else { - //30%概率为!结尾 - if (RandomUtil.getProbability(0.3)) { - sb.append("! "); - //否则30%概率?结尾 - } else if (RandomUtil.getProbability(0.3)) { - sb.append("? "); - //否则。结尾 - } else { - sb.append(". 
"); - } - } - } - return sb.toString(); - } - - /** - * 返回指定长度的句子 - * - * @param length - */ - public static String sentence(Integer length) { - return sentence(length, length); - } - - /** - * 返回长度为12-18长度的句子 - */ - public static String sentence() { - return sentence(12, 18); - } - - /** - * 随机假中文句子,句子中的单词数量为参数的区间[min,max] - * - * @param min 单词最少数量 - * @param max 单词最多数量 - */ - public static String csentence(Integer min, Integer max) { - StringBuilder sb = new StringBuilder(); - int num = RandomUtil.getNumberWithRight(min, max); - for (int i = 1; i <= num; i++) { - //首句子字母大写 - sb.append(cword()); - if (i == num) { - //30%概率为!结尾 - if (RandomUtil.getProbability(0.3)) { - sb.append("!"); - //否则30%概率?结尾 - } else if (RandomUtil.getProbability(0.3)) { - sb.append("?"); - //否则。结尾 - } else { - sb.append("。"); - } - } - } - return sb.toString(); - } - - - /** - * 返回指定长度的中文句子 - * - * @param length - */ - public static String csentence(Integer length) { - return csentence(length, length); - } - - /** - * 返回长度为5-10长度的中文句子 - */ - public static String csentence() { - return csentence(5, 10); - } - - /** - * 返回一个文本,文中句子数量为参数区间[min,max] - * - * @param min - * @param max - */ - public static String paragraph(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - StringBuilder sb = new StringBuilder(num); - for (int i = 1; i <= num; i++) { - sb.append(sentence()); - } - return sb.toString(); - } - - /** - * 返回指定句子数量的文本 - * - * @param length - */ - public static String paragraph(Integer length) { - return paragraph(length, length); - } - - /** - * 返回一个有3-7个句子的文本 - */ - public static String paragraph() { - return paragraph(3, 7); - } - - /** - * 返回一个文本,文中句子数量为参数区间[min,max] - * - * @param min 最小数量 - * @param max 最大数量 - */ - public static String cparagraph(Integer min, Integer max) { - int num = RandomUtil.getNumberWithRight(min, max); - StringBuilder sb = new StringBuilder(num); - for (int i = 1; i <= num; i++) { - sb.append(csentence()); - } - return sb.toString(); - } - - /** - * 返回指定句子数量的文本 - * - * @param length - */ - public static String cparagraph(Integer length) { - return cparagraph(length, length); - } - - /** - * 返回一个有3-7个句子的文本 - */ - public static String cparagraph() { - return cparagraph(3, 7); - } - - - /* —————————————————————— web —————————————————————— */ - - /** - * 获取一个随机IP - */ - public static String ip() { - Random random = RandomUtil.getRandom(); - return (random.nextInt(255) + 1) + - "." + - (random.nextInt(255) + 1) + - '.' + - (random.nextInt(255) + 1) + - '.' + - (random.nextInt(255) + 1); - } - - /** - * 获取一个随机的顶级域名 - */ - public static String tId() { - return RandomUtil.getRandomElement(DOMAINS); - } - - /** - * 返回一个随机邮箱,可以指定邮箱的名称(@后面的名字)和顶级域名 - */ - public static String email(String emailName, String tid) { - - return word() + - '@' + - emailName + - '.' + - tid; - } - - /** - * 返回一个随机邮箱,可以指定邮箱的名称(@后面的名字) - */ - public static String email(String emailName) { - return email(emailName, tId()); - } - - /** - * 返回一个随机邮箱 - */ - public static String email() { - return email(word(), tId()); - } - - /** - * 随机生成一个域名,可指定顶级域名 - * - * @param tid 指定顶级域名 - */ - public static String domain(String tid) { - if (RandomUtil.getRandom().nextBoolean()) { - return "www." + word() + "." + tid; - } - return word() + '.' 
+ tid; - } - - /** - * 随机生成一个域名 - */ - public static String domain() { - return domain(tId()); - } - - /** - * 随机一个url路径,可指定域名 - * - * @param domainName 指定域名 - */ - public static String url(String domainName) { - StringBuilder sb = new StringBuilder(32); - //url前半部分 - sb.append("http://").append(domainName).append('/').append(word()); - //每次有0.2的概率再追加一层路径 - while (RandomUtil.getProbability(0.2)) { - sb.append('/').append(word()); - } - return sb.toString(); - } - - /** - * 随机一个url - */ - public static String url() { - return url(domain()); - } - - - /* —————————————————————— 构造 —————————————————————— */ - - /** - * 构造私有化 - */ - private MockUtil() { } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ProxyUtils.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ProxyUtils.java deleted file mode 100644 index 8fd2ef4..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/ProxyUtils.java +++ /dev/null @@ -1,46 +0,0 @@ -package com.forte.util.utils; - -import com.forte.util.function.ExProxyHandler; - -import java.lang.reflect.InvocationHandler; -import java.lang.reflect.Method; -import java.lang.reflect.Modifier; -import java.lang.reflect.Proxy; - -/** - * 动态代理工具类,为一个接口类型创建动态代理对象 - * @author ForteScarlet - */ -public class ProxyUtils { - - /** - * 为一个接口类型创建动态代理对象。 - * @param type 接口的类型 - * @param proxyHandler 动态代理的逻辑处理类 - * @param - * @return - */ - public static T proxy(Class type, ExProxyHandler proxyHandler){ - if (!Modifier.isInterface(type.getModifiers())) { - throw new IllegalArgumentException("type ["+ type +"] is not a interface type."); - } - return (T) Proxy.newProxyInstance(type.getClassLoader(), new Class[]{type}, (p, m, o) -> proxyHandler.apply(m, o)); - } - - /** - * 为一个接口类型创建动态代理对象。 - * @param type 接口的类型 - * @param proxyHandler 动态代理的逻辑处理类 - * @param - * @return - */ - public static T proxy(Class type, InvocationHandler proxyHandler){ - if (!Modifier.isInterface(type.getModifiers())) { - throw new IllegalArgumentException("type ["+ type +"] is not a interface type."); - } - return (T) Proxy.newProxyInstance(type.getClassLoader(), new Class[]{type}, proxyHandler); - } - - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RandomUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RandomUtil.java deleted file mode 100644 index cb08d21..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RandomUtil.java +++ /dev/null @@ -1,365 +0,0 @@ -package com.forte.util.utils; - -import java.awt.*; -import java.text.DecimalFormat; -import java.util.List; -import java.util.Objects; -import java.util.Random; -import java.util.UUID; -import java.util.concurrent.ThreadLocalRandom; - -/** - * 随机值获取工具类 - * @author ForteScarlet - */ -public class RandomUtil { - - /** - * 保存一个单例Random - */ - private static final Random LOCAL_RANDOM = new Random(); - - /** - * 获取一个Random实例。 - * 是一个线程ThreadLocalRandom对象。 - */ - public static ThreadLocalRandom getRandom() { - return ThreadLocalRandom.current(); - } - - /** - * 获取一个单例的Random对象。 - */ - public static Random getLocalRandom() { - return LOCAL_RANDOM; - } - - /* ——————————————————————— getNumber : 获取随机长度字母(仅数字,尽量不要超过int的最大数上限长度) ——————————————————————————— */ - - /** - * 获取长度为4的随机数 - * - * @return - */ - public static int getNumber() { - return getNumber(4); - } - - /** - * 获取指定长度的随机数 - * @param length 数字的长度 - * 
@return - */ - public static int getNumber(int length) { - length--; - int pow = (int) Math.pow(10, length); - ThreadLocalRandom random = getRandom(); - if (length >= 1) { - //参照算法:random.nextInt(9000)+1000; - //(9 * pow) - int nextInt = (pow << 3) + pow; - return (random.nextInt(nextInt) + pow); - } else { - return random.nextInt(10); - } - } - - /** - * 获取一个随机整数 - * - * @return - */ - public static int getInteger() { - return getRandom().nextInt(10); - } - - /** - * 获取某个区间中的随机数[a,b) - * - * @return - */ - public static int getNumber(int a, int b) { - return getRandom().nextInt(a, b); - } - - - /** - * 获取某个区间中的随机数[a,b] - * - * @param a min number - * @param b max number - */ - public static int getNumberWithRight(int a, int b) { - return getNumber(a, b+1); - } - - /** - * @see #getNumberWithRight(int, int) - */ - @Deprecated - public static int getNumber$right(int a, int b) { - return getNumberWithRight(a, b); - } - - - /* ——————————————————————— getCode : 获取随机code(字母和数字) ——————————————————————————— */ - - - /** - * 获取随机code,包含数字和字母 - * - * @param length - * @return - */ - public static String getCode(int length) { - StringBuilder s = new StringBuilder(length); - Random r = getRandom(); - for (int i = 1; i <= length; i++) { - if (r.nextBoolean()) { - //0.5的概率为0-9的数字 - s.append(r.nextInt()); - } else { - //0.5的概率为字母,其中大写0.25,小写0.25 - if (r.nextBoolean()) { - //小写 - s.append(getRandomChar()); - } else { - //大写 - s.append(Character.toUpperCase(getRandomChar())); - } - } - } - return s.toString(); - } - - /** - * 获取一个4位数的随机code,字母小写 - * - * @return - */ - public static String getCode() { - return getCode(4); - } - - - /* ——————————————————————— getUUID : 获取随机UUID,java.util自带的UUID方法 ——————————————————————————— */ - - - /** - * 获取UUID.toString - * - * @return - */ - public static String getUUID() { - return UUID.randomUUID().toString(); - } - - - /* ——————————————————————— getRandomChar : 获取随机字符(单个字母) ——————————————————————————— */ - - /** - * 获取一个随机英文字符,小写 - */ - public static char getRandomChar() { - return (char) (RandomUtil.getRandom().nextInt(26) + 97); - } - - /** - * 获取一个随机英文字符,大写 - */ - public static char getRandomUpperChar() { - return (char) (RandomUtil.getRandom().nextInt(26) + 65); - } - - - /* ———————————————————— getRandomString : 获取随机字符串 ———————————————————————— */ - - /** - * 得到一串纯大写的字符串 - * @param length 字符串长度 - */ - public static String getRandomUpperString(int length){ - char[] crr = new char[length]; - for (int i = 0; i < length; i++) { - crr[i] = getRandomUpperChar(); - } - return String.valueOf(crr); - } - - /** - * 获取一串指定长度的随机字符串 - * - * @param length 字符串长度 - * @param randomCase 是否开启随机大小写 - * @return - */ - public static String getRandomString(int length, boolean randomCase) { - char[] crr = new char[length]; - for (int i = 0; i < length; i++) { - //如果开启了随机大写,则有概率将字符转为大写 1/2 - if (randomCase) { - crr[i] = RandomUtil.getRandom().nextBoolean() ? 
getRandomChar() : getRandomUpperChar(); - } else { - crr[i] = getRandomChar(); - } - } - return String.valueOf(crr); - } - - /** - * 获取一串指定长度的随机字符串,默认小写 - * - * @param length 字符串长度 - * @return - */ - public static String getRandomString(int length) { - return getRandomString(length, false); - } - - - /** - * 获取一串长度为32的字符串,默认小写 - */ - public static String getRandomString() { - return getRandomString(32, false); - } - - /** - * 数字小数保留 - * - * @param dnum 需要保留的小数 - * @param length 小数保留位数 - * @return - */ - public static String toFixed(Number dnum, int length) { - StringBuilder sb = new StringBuilder(2 + length).append("#."); - //遍历并设置位数 - for (int i = 0; i < length; i++) { - sb.append('0'); - } - - //返回结果 - String formatStr = sb.toString(); - sb = new StringBuilder(); - String douStr = numFormat(dnum, formatStr); - sb.append(douStr); - if(douStr.startsWith(".")){ - //如果开头是点,说明首位是0,补位 - sb.append('0').append(douStr); - } - return sb.toString(); - } - - - /** - * 自定义数字格式化 - * - * @param dnum - * @param formatStr - * @return - */ - public static String numFormat(Number dnum, String formatStr) { - return new DecimalFormat(formatStr).format(dnum); - } - - /* ———————————————————— getColor : 获取随机颜色 ———————————————————————— */ - - /** - * 返回一个随机颜色 - * - * @return - */ - public static Color randomColor() { - int[] arr = randomColor$intArr(); - return new Color(arr[0], arr[1], arr[2]); - } - - /** - * 返回一个长度为三的数组,三位分别代表了颜色的R、G、B - * - * @return - */ - public static int[] randomColor$intArr() { - final int[] arr = new int[3]; - Random random = RandomUtil.getRandom(); - arr[0] = random.nextInt(256); - arr[1] = random.nextInt(256); - arr[2] = random.nextInt(256); - return arr; - } - - /** - * 返回16进制颜色代码 - * - * @return - */ - public static String randomColor$hexString() { - int[] arr = randomColor$intArr(); - StringBuilder sb = new StringBuilder(); - String r = Integer.toHexString(arr[0]); - r = r.length() == 1 ? '0' + r : r; - - String g = Integer.toHexString(arr[1]); - g = g.length() == 1 ? '0' + g : g; - - String b = Integer.toHexString(arr[2]); - b = b.length() == 1 ? '0' + b : b; - sb.append("#") - .append(r) - .append(g) - .append(b); - return sb.toString(); - } - - /* ———————————————————— getProbability : 根据概率获取boolean ———————————————————————— */ - - /** - * 根据概率获取boolean,区间:[probL , probR] - * - * @param probL 概率百分比区间的左参数,取值范围为0-1之间,对应了0%和100% - * @param probR 概率百分比区间的右参数,取值范围为0-1之间,对应了0%和100% - * @return - */ - public static Boolean getProbability(double probL, double probR) { - double v = RandomUtil.getRandom().nextDouble(); - if (v >= probL && v <= probR) { - return true; - } - return false; - } - - /** - * 根据概率获取boolean,区间:[0 , prob] - * 填入的参数即为概率的百分比 - * - * @param prob 概率百分比的小数形式,参数范围0-1 - * @return - */ - public static Boolean getProbability(double prob) { - return getProbability(0, prob); - } - - /* ———————————————————— getRandomElement : 从数组或者集合中获取一个随机元素 ———————————————————————— */ - - /** - * 从数组中返回一个随机元素 - * @param trr 数组 - * @return 随机元素 - */ - public static T getRandomElement(T[] trr) { - Objects.requireNonNull(trr); - return trr.length == 0 ? null : trr[RandomUtil.getRandom().nextInt(trr.length)]; - } - - /** - * 从集合中返回一个随机元素, ru如果数组为空则返回null - * @param trr 集合 - * @return 随机元素 - */ - public static T getRandomElement(List trr) { - Objects.requireNonNull(trr); - return trr.size() == 0 ? 
null : trr.get(RandomUtil.getRandom().nextInt(trr.size())); - } - - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RegexUtil.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RegexUtil.java deleted file mode 100644 index 3db37df..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/RegexUtil.java +++ /dev/null @@ -1,33 +0,0 @@ -package com.forte.util.utils; - -import java.util.ArrayList; -import java.util.List; -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -/** - * 正则表达式匹配两个字符串之间的内容 - * 代码来源于网络 - * - * @author 来源:https://www.cnblogs.com/jimmy-c/p/4139664.html - */ -public class RegexUtil { - - public static List getMatcher(String source, String regex) { - - /* - Pattern: 一个Pattern是一个正则表达式经编译后的表现模式。 - Matcher: 一个Matcher对象是一个状态机器,它依据Pattern对象做为匹配模式对字符串展开匹配检查。 - */ - Pattern pattern = Pattern.compile(regex); - Matcher matcher = pattern.matcher(source); - //记录匹配的数量并创建数组 - List end = new ArrayList<>(); - //遍历并保存结果集 - while (matcher.find()) { - end.add(matcher.group()); - } - //返回结果 - return end; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/SingleFactory.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/SingleFactory.java deleted file mode 100644 index 87afbe4..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/java/com/forte/util/utils/SingleFactory.java +++ /dev/null @@ -1,347 +0,0 @@ -package com.forte.util.utils; - -import java.lang.reflect.InvocationTargetException; -import java.util.*; -import java.util.concurrent.atomic.AtomicReference; -import java.util.function.Supplier; - -/** - * 基于CAS原理的单例工厂,提供了丰富的方法来记录或获取一个单例实例
- * The singleton is implemented as a lazy-initialization singleton based on CAS optimistic locking.
- * All methods are final static methods, so this class is abstract and there is no need to extend it.
- * @author ForteScarlet <[163邮箱地址]ForteScarlet@163.com> - * @date 2018/12/19 15:59 - * @version 2.0 - * @since JDK1.8 - **/ -public abstract class SingleFactory { - - /** - * 单例仓库-线程安全Map - */ - private static final Map SINGLE_MAP = Collections.synchronizedMap(new HashMap<>()); - - /** - * 获取单例,如果没有此类的记录则返回空 - */ - public static final T get(Class clz){ - return Optional.ofNullable(SINGLE_MAP.get(clz)).map(s -> (T)s.get()).orElse(null); - } - - /** - * 获取单例,如果没有则尝试使用反射获取一个新的,将会被记录。 - * 如果创建失败将会抛出相应的异常 - * @param - * @return - */ - public static final T getOrNew(Class clz){ - return getOrNew(clz, (Object[]) null); - } - - public static final T getOrNew(Class clz , Object... params){ - return Optional.ofNullable(SINGLE_MAP.get(clz)).map(s -> (T)s.get()).orElseGet(() -> { - //如果没有,记录 - return set(clz, () -> { - try { - //判断参数数量 - if(params != null && params.length > 0){ - //如果数量大于0 - //获取参数的class数组 - Class[] classes = Arrays.stream(params).map(Object::getClass).toArray(Class[]::new); - //尝试使用反射获取一个新对象 - return clz.getConstructor(classes).newInstance(params); - }else{ - //没有参数,直接获取 - return clz.newInstance(); - } - } catch (InstantiationException | IllegalAccessException | InvocationTargetException | NoSuchMethodException e) { - throw new RuntimeException(e); - } - }, false).get(); - }); - } - - - /** - * 如果存在则获取,不存在则赋值 - * @param clz - * @param t - * @param - * @return - */ - public static final T getOrSet(Class clz, T t){ - return Optional.ofNullable((T)get(clz)).orElseGet(() -> setAndGet(clz, t)); - } - - /** - * 如果存在则获取,不存在则赋值,不指定class对象 - * @param t - * @param - * @return - */ - public static final T getOrSet(T t){ - return Optional.ofNullable((T)get(t.getClass())).orElseGet(() -> setAndGet(t)); - } - - - /** - * 如果存在则获取,不存在则赋值 - * @param clz - * @param supplier - * @param - * @return - */ - public static final T getOrSet(Class clz, Supplier supplier){ - return Optional.ofNullable((T)get(clz)).orElseGet(() -> setAndGet(clz, supplier)); - } - - /** - * 如果存在则获取,不存在则赋值,不指定class对象 - * @param supplier - * @param - * @return - */ - public static final T getOrSet(Supplier supplier){ - return Optional.ofNullable((T)get(supplier.get().getClass())).orElseGet(() -> setAndGet(supplier)); - } - - /** - * 重设一个单例 - * @param clz - * @param t - * @param - */ - public static final void reset(Class clz , T t){ - set(clz, t, true); - } - - - /** - * 重设一个单例,不指定class - * @param t - * @param - */ - public static final void reset(T t){ - reset(t.getClass(), t); - } - - /** - * 重设一个单例并获取单例实例 - * @param clz - * @param t - * @param - * @return - */ - public static final T resetAndGet(Class clz , T t){ - return set(clz, t, true).get(); - } - - - /** - * 重设一个单例并获取单例实例,不指定class - * @param t - * @param - * @return - */ - public static final T resetAndGet(T t){ - return resetAndGet((Class)t.getClass(), t); - } - - /** - * 重设一个单例 - * @param clz - * @param supplier - * @param - */ - public static final void reset(Class clz , Supplier supplier){ - set(clz, supplier, true); - } - - - /** - * 重设一个单例,不指定class - * @param supplier - * @param - */ - public static final void reset(Supplier supplier){ - reset(supplier.get().getClass(), supplier); - } - - /** - * 重设一个单例并获取单例实例 - * @param clz - * @param supplier - * @param - * @return - */ - public static final T resetAndGet(Class clz , Supplier supplier){ - return set(clz, supplier, true).get(); - } - - /** - * 重设一个单例并获取单例实例,不指定class - * @param supplier - * @param - * @return - */ - public static final T resetAndGet(Supplier supplier){ - return 
resetAndGet((Class)supplier.get().getClass(), supplier); - } - - - /** - * 记录一个单例 - * @param clz - * @param t - * @param - */ - public static final void set(Class clz , T t){ - set(clz, t, false); - } - - /** - * 记录一个单例,不指定class - * @param t - * @param - */ - public static final void set(T t){ - set(t.getClass(), t); - } - - /** - * 记录一个单例并获取单例实例 - * @param clz - * @param t - * @param - * @return - */ - public static final T setAndGet(Class clz , T t){ - return set(clz, t, false).get(); - } - - - /** - * 记录一个单例并获取单例实例,不指定class - * @param t - * @param - * @return - */ - public static final T setAndGet(T t){ - return setAndGet((Class)t.getClass(), t); - } - - /** - * 记录一个单例 - * @param clz - * @param supplier - * @param - */ - public static final void set(Class clz , Supplier supplier){ - set(clz, supplier, false); - } - - /** - * 记录一个单例,不指定class - * @param supplier - * @param - */ - public static final void set(Supplier supplier){ - set(supplier.get().getClass(), supplier); - } - - /** - * 记录一个单例并获取单例实例 - * @param clz - * @param supplier - * @param - * @return - */ - public static final T setAndGet(Class clz , Supplier supplier){ - return set(clz, supplier, false).get(); - } - - /** - * 记录一个单例,不指定class - * @param supplier - * @param - * @return - */ - public static final T setAndGet(Supplier supplier){ - return setAndGet((Class)supplier.get().getClass(), supplier); - } - - - /** - * 记录一个单例对象或重设一个单例-对象 - * @param clz - * @param t - * @param - */ - private static final SingleBean set(Class clz , T t , boolean reset){ - return set(clz , () -> t , reset); - } - - /** - * 记录一个单例对象或重设一个单例-函数接口 - * 赋值相关重载方法的根方法 - * 使用synchronized标记 - * @param clz - * @param supplier - * @param - */ - private static synchronized final SingleBean set(Class clz , Supplier supplier , boolean reset){ - SingleBean singleBean = SINGLE_MAP.get(clz); - if(!reset && singleBean != null){ - //如果已经存在,直接返回此对象 - return singleBean; - } - - //创建新对象 - singleBean = new SingleBean<>(supplier); - SINGLE_MAP.put(clz, singleBean); - return singleBean; - } - - - /** - * 内部单例类,应当是基于CAS原理的懒汉式单例类 - * @param - */ - private static final class SingleBean { - /** - * 获取实例的方法 - */ - private final Supplier supplier; - private AtomicReference single = new AtomicReference<>(); - - /** - * 获取单例 - */ - private T get() { - for (;;) { - //获取 - T currect = single.get(); - //如果存在直接返回 - if (currect != null) { - return currect; - } - //创建 - currect = supplier.get(); - //原子赋值 - if (single.compareAndSet(null, currect)) { - return currect; - } - } - } - - /** - * 构造 - * - * @param supplier - */ - private SingleBean(Supplier supplier) { - this.supplier = supplier; - } - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/resources/mock/surnames b/docs/raw-materials/backup/qabox-alt/api-mock/src/main/resources/mock/surnames deleted file mode 100644 index 544515b..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/main/resources/mock/surnames +++ /dev/null @@ -1,598 +0,0 @@ --24,-75,-75 --23,-110,-79 --27,-83,-103 --26,-99,-114 --27,-111,-88 --27,-112,-76 --23,-125,-111 --25,-114,-117 --27,-122,-81 --23,-103,-120 --24,-92,-102 --27,-115,-85 --24,-110,-117 --26,-78,-120 --23,-97,-87 --26,-99,-88 --26,-100,-79 --25,-89,-90 --27,-80,-92 --24,-82,-72 --28,-67,-107 --27,-112,-107 --26,-106,-67 --27,-68,-96 --27,-83,-108 --26,-101,-71 --28,-72,-91 --27,-115,-114 --23,-121,-111 --23,-83,-113 --23,-103,-74 --27,-89,-100 --26,-120,-102 --24,-80,-94 --23,-126,-71 --27,-106,-69 --26,-97,-113 --26,-80,-76 --25,-86,-90 --25,-85,-96 
--28,-70,-111 --24,-117,-113 --26,-67,-104 --24,-111,-101 --27,-91,-102 --24,-116,-125 --27,-67,-83 --23,-125,-114 --23,-78,-127 --23,-97,-90 --26,-104,-116 --23,-87,-84 --24,-117,-105 --27,-121,-92 --24,-118,-79 --26,-106,-71 --28,-65,-98 --28,-69,-69 --24,-94,-127 --26,-97,-77 --23,-123,-122 --23,-78,-115 --27,-113,-78 --27,-108,-112 --24,-76,-71 --27,-69,-119 --27,-78,-111 --24,-106,-101 --23,-101,-73 --24,-76,-70 --27,-128,-86 --26,-79,-92 --26,-69,-107 --26,-82,-73 --25,-67,-105 --26,-81,-107 --23,-125,-99 --23,-126,-84 --27,-82,-119 --27,-72,-72 --28,-71,-112 --28,-70,-114 --26,-105,-74 --27,-126,-123 --25,-102,-82 --27,-115,-98 --23,-67,-112 --27,-70,-73 --28,-68,-115 --28,-67,-103 --27,-123,-125 --27,-115,-100 --23,-95,-66 --27,-83,-97 --27,-71,-77 --23,-69,-124 --27,-110,-116 --25,-87,-122 --24,-112,-89 --27,-80,-71 --27,-89,-102 --23,-126,-75 --26,-71,-101 --26,-79,-86 --25,-91,-127 --26,-81,-101 --25,-90,-71 --25,-117,-124 --25,-79,-77 --24,-76,-99 --26,-104,-114 --24,-121,-89 --24,-82,-95 --28,-68,-113 --26,-120,-112 --26,-120,-76 --24,-80,-120 --27,-82,-117 --24,-116,-123 --27,-70,-98 --25,-122,-118 --25,-70,-86 --24,-120,-110 --27,-79,-120 --23,-95,-71 --25,-91,-99 --24,-111,-93 --26,-94,-127 --26,-99,-100 --23,-104,-82 --24,-109,-99 --23,-105,-75 --27,-72,-83 --27,-83,-93 --23,-70,-69 --27,-68,-70 --24,-76,-66 --24,-73,-81 --27,-88,-124 --27,-115,-79 --26,-79,-97 --25,-85,-91 --23,-94,-100 --23,-125,-83 --26,-94,-123 --25,-101,-101 --26,-98,-105 --27,-120,-127 --23,-110,-97 --27,-66,-112 --23,-126,-79 --23,-86,-122 --23,-85,-104 --27,-92,-113 --24,-108,-95 --25,-108,-80 --26,-88,-118 --24,-125,-95 --27,-121,-116 --23,-100,-115 --24,-103,-98 --28,-72,-121 --26,-108,-81 --26,-97,-81 --26,-104,-99 --25,-82,-95 --27,-115,-94 --24,-114,-85 --25,-69,-113 --26,-120,-65 --24,-93,-104 --25,-68,-86 --27,-71,-78 --24,-89,-93 --27,-70,-108 --27,-82,-105 --28,-72,-127 --27,-82,-93 --24,-76,-78 --23,-126,-109 --23,-125,-127 --27,-115,-107 --26,-99,-83 --26,-76,-86 --27,-116,-123 --24,-81,-72 --27,-73,-90 --25,-97,-77 --27,-76,-108 --27,-112,-119 --23,-110,-82 --23,-66,-102 --25,-88,-117 --27,-75,-121 --23,-126,-94 --26,-69,-111 --24,-93,-76 --23,-103,-122 --24,-115,-93 --25,-65,-127 --24,-115,-128 --25,-66,-118 --28,-70,-114 --26,-125,-96 --25,-108,-124 --26,-101,-78 --27,-82,-74 --27,-80,-127 --24,-118,-82 --25,-66,-65 --27,-126,-88 --23,-99,-77 --26,-79,-78 --23,-126,-76 --25,-77,-100 --26,-99,-66 --28,-70,-107 --26,-82,-75 --27,-81,-116 --27,-73,-85 --28,-71,-116 --25,-124,-90 --27,-73,-76 --27,-68,-109 --25,-119,-89 --23,-102,-105 --27,-79,-79 --24,-80,-73 --24,-67,-90 --28,-66,-81 --27,-82,-109 --24,-109,-84 --27,-123,-88 --23,-125,-105 --25,-113,-83 --28,-69,-80 --25,-89,-117 --28,-69,-78 --28,-68,-118 --27,-82,-85 --27,-82,-127 --28,-69,-121 --26,-96,-66 --26,-102,-76 --25,-108,-104 --23,-110,-83 --27,-114,-119 --26,-120,-114 --25,-91,-106 --26,-83,-90 --25,-84,-90 --27,-120,-104 --26,-103,-81 --24,-87,-71 --26,-99,-97 --23,-66,-103 --27,-113,-74 --27,-71,-72 --27,-113,-72 --23,-97,-74 --23,-125,-100 --23,-69,-114 --24,-109,-97 --26,-70,-91 --27,-115,-80 --27,-82,-65 --25,-103,-67 --26,-128,-128 --24,-110,-78 --23,-126,-80 --28,-69,-114 --23,-124,-126 --25,-76,-94 --27,-110,-72 --25,-79,-115 --24,-75,-106 --27,-115,-109 --24,-108,-70 --27,-79,-96 --24,-110,-103 --26,-79,-96 --28,-71,-108 --23,-104,-76 --23,-125,-127 --24,-125,-91 --24,-125,-67 --24,-117,-115 --27,-113,-116 --23,-105,-69 --24,-114,-104 --27,-123,-102 --25,-65,-97 --24,-80,-83 --24,-76,-95 --27,-118,-77 
--23,-128,-124 --27,-89,-84 --25,-108,-77 --26,-119,-74 --27,-96,-75 --27,-122,-119 --27,-82,-80 --23,-125,-90 --23,-101,-115 --27,-115,-76 --25,-110,-87 --26,-95,-111 --26,-95,-126 --26,-65,-82 --25,-119,-101 --27,-81,-65 --23,-128,-102 --24,-66,-71 --26,-119,-120 --25,-121,-107 --27,-122,-128 --26,-75,-90 --27,-80,-102 --27,-122,-100 --26,-72,-87 --27,-120,-85 --27,-70,-124 --26,-103,-113 --26,-97,-76 --25,-98,-65 --23,-104,-114 --27,-123,-123 --26,-123,-107 --24,-65,-98 --24,-116,-71 --28,-71,-96 --27,-82,-90 --24,-119,-66 --23,-79,-68 --27,-82,-71 --27,-112,-111 --27,-113,-92 --26,-104,-109 --26,-123,-114 --26,-120,-120 --27,-69,-106 --27,-70,-66 --25,-69,-120 --26,-102,-88 --27,-79,-123 --24,-95,-95 --26,-83,-91 --23,-125,-67 --24,-128,-65 --26,-69,-95 --27,-68,-104 --27,-116,-95 --27,-101,-67 --26,-106,-121 --27,-81,-121 --27,-71,-65 --25,-90,-124 --23,-104,-103 --28,-72,-100 --26,-84,-89 --26,-82,-77 --26,-78,-125 --27,-120,-87 --24,-108,-102 --24,-74,-118 --27,-92,-108 --23,-102,-122 --27,-72,-120 --27,-73,-87 --27,-114,-115 --24,-127,-126 --26,-103,-127 --27,-117,-66 --26,-107,-106 --24,-98,-115 --27,-122,-73 --24,-88,-66 --24,-66,-101 --23,-104,-102 --23,-126,-93 --25,-82,-128 --23,-91,-74 --25,-87,-70 --26,-101,-66 --26,-81,-117 --26,-78,-103 --28,-71,-100 --27,-123,-69 --23,-98,-96 --23,-95,-69 --28,-72,-80 --27,-73,-94 --27,-123,-77 --24,-110,-81 --25,-101,-72 --26,-97,-91 --27,-112,-114 --24,-115,-122 --25,-70,-94 --26,-72,-72 --23,-125,-113 --25,-85,-70 --26,-99,-125 --23,-128,-81 --25,-101,-106 --25,-101,-118 --26,-95,-109 --27,-123,-84 --28,-69,-119 --25,-99,-93 --27,-78,-77 --27,-72,-123 --25,-68,-111 --28,-70,-94 --27,-122,-75 --23,-125,-120 --26,-100,-119 --25,-112,-76 --27,-67,-110 --26,-75,-73 --26,-103,-117 --26,-91,-102 --23,-105,-85 --26,-77,-107 --26,-79,-99 --23,-124,-94 --26,-74,-126 --23,-110,-90 --27,-107,-122 --25,-119,-97 --28,-67,-104 --28,-67,-76 --28,-68,-81 --24,-75,-113 --27,-94,-88 --27,-109,-120 --24,-80,-81 --25,-81,-127 --27,-71,-76 --25,-120,-79 --23,-104,-77 --28,-67,-97 --24,-88,-128 --25,-90,-113 --27,-115,-105 --25,-127,-85 --23,-109,-127 --24,-65,-97 --26,-68,-122 --27,-82,-104 --27,-122,-68 --25,-100,-97 --27,-79,-107 --25,-71,-127 --26,-86,-128 --25,-91,-83 --27,-81,-122 --26,-107,-84 --26,-113,-83 --24,-120,-100 --26,-91,-68 --25,-106,-113 --27,-122,-110 --26,-75,-111 --26,-116,-102 --24,-125,-74 --23,-102,-113 --23,-85,-104 --25,-102,-117 --27,-114,-97 --25,-89,-115 --25,-69,-125 --27,-68,-91 --28,-69,-109 --25,-100,-83 --24,-71,-121 --24,-90,-125 --23,-104,-65 --23,-105,-88 --26,-127,-67 --26,-99,-91 --25,-74,-90 --27,-113,-84 --28,-69,-86 --23,-93,-114 --28,-69,-117 --27,-73,-88 --26,-100,-88 --28,-70,-84 --25,-117,-112 --23,-125,-121 --24,-103,-114 --26,-98,-102 --26,-118,-105 --24,-66,-66 --26,-99,-98 --24,-117,-116 --26,-118,-104 --23,-70,-90 --27,-70,-122 --24,-65,-121 --25,-85,-71 --25,-85,-81 --23,-78,-100 --25,-102,-121 --28,-70,-109 --24,-128,-127 --26,-104,-81 --25,-89,-104 --25,-107,-123 --23,-126,-99 --24,-65,-104 --27,-82,-66 --23,-105,-66 --24,-66,-100 --25,-70,-75 --28,-66,-76 --28,-72,-121,-28,-65,-97 --27,-113,-72,-23,-87,-84 --28,-72,-118,-27,-82,-104 --26,-84,-89,-23,-104,-77 --27,-92,-113,-28,-66,-81 --24,-81,-72,-24,-111,-101 --23,-105,-69,-28,-70,-70 --28,-72,-100,-26,-106,-71 --24,-75,-85,-24,-65,-98 --25,-102,-121,-25,-108,-85 --25,-66,-118,-24,-120,-116 --27,-80,-119,-24,-65,-97 --27,-123,-84,-25,-66,-118 --26,-66,-71,-27,-113,-80 --27,-123,-84,-27,-122,-74 --27,-82,-105,-26,-83,-93 --26,-65,-82,-23,-104,-77 
--26,-73,-77,-28,-70,-114 --27,-115,-107,-28,-70,-114 --27,-92,-86,-27,-113,-108 --25,-108,-77,-27,-79,-96 --27,-123,-84,-27,-83,-103 --28,-69,-78,-27,-83,-103 --24,-67,-87,-24,-66,-107 --28,-69,-92,-25,-117,-112 --23,-110,-97,-25,-90,-69 --27,-82,-121,-26,-106,-121 --23,-107,-65,-27,-83,-103 --26,-123,-107,-27,-82,-71 --23,-78,-100,-28,-70,-114 --23,-105,-66,-28,-72,-104 --27,-113,-72,-27,-66,-110 --27,-113,-72,-25,-87,-70 --27,-123,-128,-27,-82,-104 --27,-113,-72,-27,-81,-121 --27,-115,-105,-23,-105,-88 --27,-111,-68,-27,-69,-74 --27,-83,-112,-24,-67,-90 --23,-94,-101,-27,-83,-103 --25,-85,-81,-26,-100,-88 --27,-73,-85,-23,-87,-84 --27,-123,-84,-24,-91,-65 --26,-68,-122,-23,-101,-107 --24,-67,-90,-26,-83,-93 --27,-93,-92,-23,-87,-73 --27,-123,-84,-24,-119,-81 --26,-117,-109,-24,-73,-117 --27,-92,-71,-24,-80,-73 --27,-82,-80,-25,-120,-74 --24,-80,-73,-26,-94,-127 --26,-82,-75,-27,-71,-78 --25,-103,-66,-23,-121,-116 --28,-72,-100,-23,-125,-83 --27,-66,-82,-25,-108,-97 --26,-94,-127,-28,-72,-104 --27,-73,-90,-28,-72,-104 --28,-72,-100,-23,-105,-88 --24,-91,-65,-23,-105,-88 --27,-115,-105,-27,-82,-85 --25,-84,-84,-28,-70,-108 --27,-123,-84,-28,-69,-86 --27,-123,-84,-28,-71,-104 --27,-92,-86,-27,-113,-78 --28,-69,-78,-23,-107,-65 --27,-113,-108,-27,-83,-103 --27,-79,-120,-25,-86,-127 --27,-80,-108,-26,-100,-79 --28,-72,-100,-28,-71,-95 --25,-101,-72,-23,-121,-116 --24,-125,-95,-26,-81,-115 --27,-113,-72,-27,-97,-114 --27,-68,-96,-27,-69,-106 --23,-101,-115,-23,-105,-88 --26,-81,-117,-28,-72,-104 --24,-76,-70,-27,-123,-80 --25,-74,-90,-26,-81,-117 --27,-79,-117,-27,-70,-112 --25,-117,-84,-27,-83,-92 --27,-115,-105,-23,-125,-83 --27,-116,-105,-27,-82,-85 --25,-114,-117,-27,-83,-103 diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/RegexMockTest.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/RegexMockTest.java deleted file mode 100644 index 3763182..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/RegexMockTest.java +++ /dev/null @@ -1,42 +0,0 @@ -package test; - -/** - * @author ForteScarlet - * @date 2020/7/30 - */ -public class RegexMockTest { - - /** - * 根据正则规则生成随机字符串 - * 如:[a-z][A-Z][0-9]{32} 生成32位包含大写小写数字的字符串 - * like @regex('[a-z][A-Z][0-9]{32}') - * - * TODO 目前存在的缺陷: - * 1 导入了额外的依赖 - * 2 无法试别例如\\d、\\w等特殊字符 - * @param regex 正则规则 - */ - @Deprecated - public static String regex(String regex) { - Xeger xeger = Xeger.getInstance(regex); - return xeger.generate(); - } - - /** - * 获取指定数量num个随机字符串 - * TODO 目前存在的缺陷: - * 1 导入了额外的依赖 - * 2 无法试别例如\\d、\\w等特殊字符 - * @param regex z正则规则 - * @param num 获取数量 - * @return - */ - @Deprecated - public static String[] regexs(String regex, Integer num) { - String[] regexs = new String[num]; - for (int i = 0; i < num; i++) { - regexs[i] = regex(regex); - } - return regexs; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/SchemeImprovePointVo.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/SchemeImprovePointVo.java deleted file mode 100644 index a52e4b6..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/SchemeImprovePointVo.java +++ /dev/null @@ -1,175 +0,0 @@ -package test; - -public class SchemeImprovePointVo { - - /** - * 弱项ID - */ - private Integer weaknessId; - /** - * 方案序号 - */ - private String schemeCode; - /** - * schemeName - */ - private String schemeName; - /** - * 方案类别(频次) - */ - private String schemeCategoryFrequency; - /** - * 方案类别(内容) - */ - private String schemeCategoryContent; - /** - 
* 方案类别(提升) - */ - private String schemeCategoryPromote; - - /** - * 改善点序号 - */ - private String improvePointCode; - /** - * 改善点名称 - */ - private String improvePointName; - /** - * 改善点类别(频次) - */ - private String improvePointCategoryFrequency; - /** - * 改善点类别(内容) - */ - private String improvePointCategoryContent; - /** - * 负责人 - */ - private String principal; - /** - * 执行步骤 - */ - private String executionSteps; - /** - * 提报周期 - */ - private String reportingPeriod; - /** - * 提报材料 - */ - private String reportingMaterials; - - - public Integer getWeaknessId() { - return weaknessId; - } - - public void setWeaknessId(Integer weaknessId) { - this.weaknessId = weaknessId; - } - - public String getSchemeCode() { - return schemeCode; - } - - public void setSchemeCode(String schemeCode) { - this.schemeCode = schemeCode; - } - - public String getSchemeName() { - return schemeName; - } - - public void setSchemeName(String schemeName) { - this.schemeName = schemeName; - } - - public String getSchemeCategoryFrequency() { - return schemeCategoryFrequency; - } - - public void setSchemeCategoryFrequency(String schemeCategoryFrequency) { - this.schemeCategoryFrequency = schemeCategoryFrequency; - } - - public String getSchemeCategoryContent() { - return schemeCategoryContent; - } - - public void setSchemeCategoryContent(String schemeCategoryContent) { - this.schemeCategoryContent = schemeCategoryContent; - } - - public String getSchemeCategoryPromote() { - return schemeCategoryPromote; - } - - public void setSchemeCategoryPromote(String schemeCategoryPromote) { - this.schemeCategoryPromote = schemeCategoryPromote; - } - - public String getImprovePointCode() { - return improvePointCode; - } - - public void setImprovePointCode(String improvePointCode) { - this.improvePointCode = improvePointCode; - } - - public String getImprovePointName() { - return improvePointName; - } - - public void setImprovePointName(String improvePointName) { - this.improvePointName = improvePointName; - } - - public String getImprovePointCategoryFrequency() { - return improvePointCategoryFrequency; - } - - public void setImprovePointCategoryFrequency(String improvePointCategoryFrequency) { - this.improvePointCategoryFrequency = improvePointCategoryFrequency; - } - - public String getImprovePointCategoryContent() { - return improvePointCategoryContent; - } - - public void setImprovePointCategoryContent(String improvePointCategoryContent) { - this.improvePointCategoryContent = improvePointCategoryContent; - } - - public String getPrincipal() { - return principal; - } - - public void setPrincipal(String principal) { - this.principal = principal; - } - - public String getExecutionSteps() { - return executionSteps; - } - - public void setExecutionSteps(String executionSteps) { - this.executionSteps = executionSteps; - } - - public String getReportingPeriod() { - return reportingPeriod; - } - - public void setReportingPeriod(String reportingPeriod) { - this.reportingPeriod = reportingPeriod; - } - - public String getReportingMaterials() { - return reportingMaterials; - } - - public void setReportingMaterials(String reportingMaterials) { - this.reportingMaterials = reportingMaterials; - } -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Test3.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Test3.java deleted file mode 100644 index 832beb5..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Test3.java +++ /dev/null @@ -1,142 +0,0 @@ -package test; /** - * @author 
ForteScarlet(ForteScarlet @ 163.com) - * @since JDK1.8 - **/ - -import com.forte.util.Mock; -import com.forte.util.MockConfiguration; -import com.forte.util.mockbean.MockObject; - -import java.util.HashMap; -import java.util.Map; - -/** - * - * @author ForteScarlet - */ -public class Test3 { - public static void main(String[] args) { - - MockConfiguration.setEnableJsScriptEngine(false); - - Map map = getMap(); - - Mock.set(SchemeImprovePointVo.class, map); - MockObject mockObject = Mock.get(SchemeImprovePointVo.class); - mockObject.getOne(); - - // 异步流获取列表 -// long s = System.currentTimeMillis(); -// mockObject.getListParallel(10_0000); -// long e = System.currentTimeMillis(); -// System.out.println(e - s); - /* - 2690 2877 2805 - */ - - // 同步流获取列表 -// long s2 = System.currentTimeMillis(); -// mockObject.getList(10_0000); -// long e2 = System.currentTimeMillis(); -// System.out.println(e2 - s2); - /* - 3429 3793 3787 4120 4133 - */ - -// // 同步流获取大量数据 -// long s3 = System.currentTimeMillis(); -// mockObject.getList(100_0000); -// long e3 = System.currentTimeMillis(); -// System.out.println(e3 - s3); -// /* -// 45349 39445 -// */ - -// // 异步流获取大量数据 -// long s4 = System.currentTimeMillis(); -// mockObject.getListParallel(100_0000); -// long e4 = System.currentTimeMillis(); -// System.out.println(e4 - s4); -// /* -// 24543 24339 -// */ - - -// // 异步流获取大量数据 -// int size5 = 10_0000; -// List list = new ArrayList<>(size5); -// MockBean parallelMockBean = mockObject.getMockBean().parallel(); -// long s5 = System.currentTimeMillis(); -// for (int i = 0; i < size5; i++) { -// list.add(parallelMockBean.getObject()); -// } -// long e5 = System.currentTimeMillis(); -// System.out.println(e5 - s5); -// /* -// 4641 4760 4742 -// */ - - -// // v1.9.0 优化后 -// long s6 = System.currentTimeMillis(); -// mockObject.getList(10_0000); -// long e6 = System.currentTimeMillis(); -// System.out.println(e6 - s6); -// /* -// 2088 1775 1818 1748 1898 -// */ - - - - // v1.9.0 优化后 - // 同步流大量数据(100w) - // 再大一些的数据,例如1kw,推荐使用流而不是list。 -// long s7 = System.currentTimeMillis(); -// mockObject.getList(100_0000); -// long e7 = System.currentTimeMillis(); -// System.out.println(e7 - s7); - /* - 15534 16139 15522 - 15736 16138 15481 - 15782 15995 16403 - avg: 15858.89ms | ~16s - */ - -// // v1.9.0 优化后 - // 异步流大量数据(100w) - long s8 = System.currentTimeMillis(); - mockObject.getListParallel(100_0000); - long e8 = System.currentTimeMillis(); - System.out.println(e8 - s8); -// /* -// 6378 7313 7235 -// 7143 6654 6634 -// 7080 7203 6777 -// avg: 6935.22 | ~7s -// */ - - - - - } - - - public static Map getMap(){ - Map map = new HashMap<>(); - map.put("weaknessId", "@integer(1, 999999)"); - map.put("schemeCode", "@string"); - map.put("schemeName", "@ctitle(4,16)"); - map.put("schemeCategoryFrequency", "@string"); - map.put("schemeCategoryContent", "@string"); - map.put("schemeCategoryPromote", "@string"); - map.put("improvePointName", "@ctitle(8,20)"); - map.put("improvePointCategoryFrequency", "@string"); - map.put("principal", "@ctitle(2,8)"); - map.put("executionSteps", "@ctitle(10,100)"); - map.put("reportingPeriod", "@ctitle(2,4)"); - map.put("reportingMaterials", "@ctitle(8,40)"); - return map; - - } - -} diff --git a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Xeger.java b/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Xeger.java deleted file mode 100644 index cce08e9..0000000 --- a/docs/raw-materials/backup/qabox-alt/api-mock/src/test/java/test/Xeger.java +++ /dev/null @@ -1,224 +0,0 @@ -/** - 
* Copyright 2009 Wilfred Springer - * Copyright 2012 Jason Pell - * Copyright 2013 Antonio García-Domínguez - *

- * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - *

- * http://www.apache.org/licenses/LICENSE-2.0 - *

- * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -package test; - -import dk.brics.automaton.Automaton; -import dk.brics.automaton.RegExp; -import dk.brics.automaton.State; -import dk.brics.automaton.Transition; - -import java.util.List; -import java.util.Optional; -import java.util.Random; -import java.util.concurrent.ThreadLocalRandom; -import java.util.stream.Collectors; - -/** - * An object that will generate text from a regular expression. In a way, it's the opposite of a regular expression - * matcher: an instance of this class will produce text that is guaranteed to match the regular expression passed in. - * - * 存在部分缺陷 - * - */ -@Deprecated -public class Xeger { - - public static class FailedRandomWalkException extends Exception { - FailedRandomWalkException(String message) { - super(message); - } - } - - private final Automaton automaton; - private Random random; - -// private volatile static Xeger instance; - - /** - * Constructs a new instance, accepting the regular expression and the randomizer. - * - * @param regex The regular expression. (Not null.) - * @param random The object that will randomize the way the String is generated. (Not null.) - * @throws IllegalArgumentException If the regular expression is invalid. - */ - private Xeger(String regex, Random random) { - assert regex != null; - assert random != null; - this.automaton = new RegExp(regex).toAutomaton(); - this.random = random; - } - - /** - * As {@link Xeger (String, Random)}, creating a {@link Random} instance - * implicityly. - */ - private Xeger(String regex) { - this(regex, ThreadLocalRandom.current()); - } - - public static Xeger getInstance(String regex) { - return new Xeger(regex); - } - - - /** - * Generates a random String that is guaranteed to match the regular expression passed to the constructor. - */ - public String generate() { - StringBuilder builder = new StringBuilder(); - generate(builder, automaton.getInitialState()); - return builder.toString(); - } - - /** - * Attempts to generate a random String using a random walk of length between minLength and - * maxLength steps. - *

- * A target length will be randomly generated within this range, and a random walk of at least that length - * will be attempted. The walk will initially avoid states with no outgoing transitions until the target - * length is reached: from then onwards, it will consider all transitions equally, and stop as soon as an - * accept state has been reached or the maximum walk length has been exceeded. If the minimum length is not - * reached or the maximum walk length has been exceeded, a {@link FailedRandomWalkException} exception will - * be thrown. Callers could catch this exception to try again if desired. - * - * @param minLength Minimum length for the range. - * @param maxLength Maximum length for the range. - * @throws FailedRandomWalkException The minimum random walk length was not reached, or the maximum random walk - * length was exceeded. - */ - public String generate(int minLength, int maxLength) throws FailedRandomWalkException { - final StringBuilder builder = new StringBuilder(); - int walkLength = 0; - State state = automaton.getInitialState(); - - // First get to the uniformly distributed target length - final int targetLength = Xeger.getRandomInt(minLength, maxLength, random); - while (walkLength < targetLength) { - List transitions = state.getSortedTransitions(false); - if (transitions.size() == 0) { - if (walkLength >= minLength) { - assert state.isAccept(); - return builder.toString(); - } else { - throw new FailedRandomWalkException(String.format( - "Reached accept state before minimum length (current = %d < min = %d)", - walkLength, minLength)); - } - } - - // Try to prefer non-final transitions if possible at this first stage - List nonFinalTransitions = transitions.stream() - .filter(t -> !t.getDest().getTransitions().isEmpty()).collect(Collectors.toList()); - if (!nonFinalTransitions.isEmpty()) { - transitions = nonFinalTransitions; - } - - final int option = Xeger.getRandomInt(0, transitions.size() - 1, random); - final Transition transition = transitions.get(option); - appendChoice(builder, transition); - state = transition.getDest(); - ++walkLength; - } - - // Now, get to an accept state - while (!state.isAccept() && walkLength < maxLength) { - List transitions = state.getSortedTransitions(false); - if (transitions.size() == 0) { - assert state.isAccept(); - return builder.toString(); - } - - final int option = Xeger.getRandomInt(0, transitions.size() - 1, random); - final Transition transition = transitions.get(option); - appendChoice(builder, transition); - state = transition.getDest(); - ++walkLength; - } - - if (state.isAccept()) { - return builder.toString(); - } else { - throw new FailedRandomWalkException(String.format( - "Exceeded maximum walk length (%d) before reaching an accept state: " + - "target length was %d (min length = %d)", - maxLength, targetLength, minLength)); - } - } - - private Optional appendRandomChoice(StringBuilder builder, State state, int minLength, int walkLength) throws FailedRandomWalkException { - List transitions = state.getSortedTransitions(false); - if (transitions.size() == 0) { - if (walkLength >= minLength) { - return Optional.empty(); - } else { - throw new FailedRandomWalkException(String.format( - "Reached accept state before minimum length (current = %d < min = %d)", - walkLength, minLength)); - } - } - - final int option = Xeger.getRandomInt(0, transitions.size() - 1, random); - final Transition transition = transitions.get(option); - appendChoice(builder, transition); - return Optional.of(transition); - } - - private 
void generate(StringBuilder builder, State state) { - List transitions = state.getSortedTransitions(false); - if (transitions.size() == 0) { - assert state.isAccept(); - return; - } - int nroptions = state.isAccept() ? transitions.size() : transitions.size() - 1; - int option = Xeger.getRandomInt(0, nroptions, random); - if (state.isAccept() && option == 0) { // 0 is considered stop - return; - } - // Moving on to next transition - Transition transition = transitions.get(option - (state.isAccept() ? 1 : 0)); - appendChoice(builder, transition); - generate(builder, transition.getDest()); - } - - private void appendChoice(StringBuilder builder, Transition transition) { - char c = (char) Xeger.getRandomInt(transition.getMin(), transition.getMax(), random); - builder.append(c); - } - - public Random getRandom() { - return random; - } - - public void setRandom(Random random) { - this.random = random; - } - - /** - * Generates a random number within the given bounds. - * - * @param min The minimum number (inclusive). - * @param max The maximum number (inclusive). - * @param random The object used as the randomizer. - * @return A random number in the given range. - */ - static int getRandomInt(int min, int max, Random random) { - // Use random.nextInt as it guarantees a uniform distribution - int maxForRandom = max - min + 1; - return random.nextInt(maxForRandom) + min; - } -} \ No newline at end of file diff --git a/docs/raw-materials/backup/qabox-alt/pom.xml b/docs/raw-materials/backup/qabox-alt/pom.xml deleted file mode 100644 index b501817..0000000 --- a/docs/raw-materials/backup/qabox-alt/pom.xml +++ /dev/null @@ -1,43 +0,0 @@ - - - - qabox-java - io.fluentqa - 1.0-SNAPSHOT - - 4.0.0 - - qabox-alt - pom - - qabox-proteus - qabox-mapper - qabox-excel - qabox-pdfs - qabox-proxyee - - - - - - - 18 - 18 - UTF-8 - 2.13.3 - - - - - - com.fasterxml.jackson - jackson-bom - ${jackson.version} - import - pom - - - - \ No newline at end of file diff --git a/docs/raw-materials/cicd/maven-gradle.md b/docs/raw-materials/cicd/maven-gradle.md deleted file mode 100644 index 37ee43a..0000000 --- a/docs/raw-materials/cicd/maven-gradle.md +++ /dev/null @@ -1,6 +0,0 @@ -# JAVA Build tools - -- MAVEN -- Gradle - -## \ No newline at end of file diff --git a/docs/raw-materials/cicd/maven-wrapper.md b/docs/raw-materials/cicd/maven-wrapper.md deleted file mode 100644 index cb6db43..0000000 --- a/docs/raw-materials/cicd/maven-wrapper.md +++ /dev/null @@ -1,23 +0,0 @@ -# MAVEN Wrapper setup - -```shell -mvn -N wrapper:wrapper -``` -生成了: -- .mvnw -- .mvn -- mvnw.cmd - -```shell -README.md fluentqa-parent mvnw pom.xml -docs fluentqa-thirdparty mvnw.cmd -``` - - -## mvnw commands - -```shell -./mvnw clean all -``` - -其实命令和maven的命令基本一样. \ No newline at end of file diff --git a/docs/raw-materials/cicd/quality-setting.md b/docs/raw-materials/cicd/quality-setting.md deleted file mode 100644 index c5c167c..0000000 --- a/docs/raw-materials/cicd/quality-setting.md +++ /dev/null @@ -1,112 +0,0 @@ -# README - -A Java Template Project support: - -- [X] MAVEN Java Lib Template -- [] MAVEN JAVA UI Testing Template -- [] MAVEN Springboot Template -- [] Github Action -- [] Code Coverage -- [] statistics Analysis -- [] CI/CD Pipeline Support - -## Java Project - -In real dev activity, there are a few things included in daily workflow: - -1. Unit Testing -2. Code Coverage -3. Test Report -4. Code statistics -5. Version Checker -6. jenkins pipeline -7. Docker files -8. K8S support 9 ...... 
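
As a usage note for the `test/Xeger.java` class removed above: it turns a regular expression into strings that are guaranteed to match that expression. Below is a minimal, hedged sketch of how it could be called; the demo class name and the sample regex are hypothetical, the `Xeger.getInstance`, `generate()`, `generate(min, max)` and `FailedRandomWalkException` calls come from the deleted source, and it assumes the `dk.brics.automaton` dependency is on the classpath (the class is marked `@Deprecated` in the source and documented as having known flaws).

```java
// Minimal usage sketch for the deleted test.Xeger class (hypothetical demo, sample regex assumed).
import test.Xeger;

public class XegerDemo {
    public static void main(String[] args) {
        // Factory method from the source; constructors are private.
        Xeger xeger = Xeger.getInstance("[A-Z]{2}-[0-9]{4}");

        // Produces a string guaranteed to match the regular expression.
        System.out.println(xeger.generate());

        // Bounded random walk variant; the source suggests callers may retry on failure.
        try {
            System.out.println(xeger.generate(5, 10));
        } catch (Xeger.FailedRandomWalkException e) {
            System.out.println("Random walk failed, could retry: " + e.getMessage());
        }
    }
}
```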
- -This template project is to target to make setting project easier. - -## CheckStyle - -- [spotless] -- [checkstyle](https://github.com/checkstyle/checkstyle) -- [checkstyle-github](https://github.com/checkstyle) -- [google java format](https://github.com/google/google-java-format) -- [google-style-precommit-check](https://github.com/maltzj/google-style-precommit-hook) -- [google style format maven plugin](https://github.com/Cosium/git-code-format-maven-plugin) - -with google check style setting, and maven command is ***mvn checkstyle:check*** - -```xml - - - org.apache.maven.plugins - maven-checkstyle-plugin - 3.1.1 - - google_checks.xml - UTF-8 - true - true - false - - - - validate - validate - - check - - - - -``` - -## spotless check - -- [spotless](https://github.com/diffplug/spotless/) - -command for running spotless check: - -```shell -mvn spotless:check -mvn spotless:apply -``` - -## JUNIT setup - -## maven surefire plugin setup - -- [maven-surefire](https://maven.apache.org/surefire/maven-surefire-plugin/index.html) - -## Sonar: TODO - -- [sonarqube](https://www.sonarqube.org/) -- [sonar source](https://www.sonarsource.com/) -- [sonar plugins marketplace](https://www.sonarplugins.com/) - -## Unit Testing coverage - -- [junit]() -- [testng]() - -## Github CICD: TODO - -- [Action](../github/workflows/build.yml) -- [dependablebot](../github/dependabot.yml) - -## Gitlab-cicd - -- [gitlab-cicd](https://docs.gitlab.com/ee/ci/yaml/README.html) - -## Application Security Scanner - -please refer,[gitlab-security scan](https://docs.gitlab.com/ee/user/application_security/security_dashboard/index.html) - -## Gradle To Maven: TODO - -reference: [gradle to maven](https://www.baeldung.com/gradle-build-to-maven-pom) - - -## Examples - -- [Security & QA, by L1NNA Lab](https://github.com/CISC-CMPE-327) \ No newline at end of file diff --git a/docs/raw-materials/cicd/scripts.sh b/docs/raw-materials/cicd/scripts.sh deleted file mode 100644 index 0f460a6..0000000 --- a/docs/raw-materials/cicd/scripts.sh +++ /dev/null @@ -1,4 +0,0 @@ -#!/usr/bin/env bash - -mvn clean checkstyle:check -mvn clean checkstyle:checkstyle-aggregate \ No newline at end of file diff --git a/docs/raw-materials/cicd/spring-gradle-list.md b/docs/raw-materials/cicd/spring-gradle-list.md deleted file mode 100644 index 585efe2..0000000 --- a/docs/raw-materials/cicd/spring-gradle-list.md +++ /dev/null @@ -1,93 +0,0 @@ -# Getting Started - -### Reference Documentation -For further reference, please consider the following sections: - -* [Official Gradle documentation](https://docs.gradle.org) -* [Spring Boot Gradle Plugin Reference Guide](https://docs.spring.io/spring-boot/docs/2.6.1/gradle-plugin/reference/html/) -* [Create an OCI image](https://docs.spring.io/spring-boot/docs/2.6.1/gradle-plugin/reference/html/#build-image) -* [Spring Integration AMQP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/amqp.html) -* [Spring Integration JDBC Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jdbc.html) -* [Spring Integration JPA Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jpa.html) -* [Spring Integration Redis Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/redis.html) -* [Spring Integration Test Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/testing.html) -* [Spring Integration Apache Kafka Module Reference 
Guide](https://docs.spring.io/spring-integration/reference/html/kafka.html) -* [Spring Integration Security Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/security.html) -* [Spring Integration HTTP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/http.html) -* [Spring Integration STOMP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/stomp.html) -* [Spring Integration WebSocket Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/websocket.html) -* [Spring Web](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-developing-web-applications) -* [Rest Repositories](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-use-exposing-spring-data-repositories-rest-endpoint) -* [Spring Boot DevTools](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#using-boot-devtools) -* [Spring Configuration Processor](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#configuration-metadata-annotation-processor) -* [Vaadin](https://vaadin.com/spring) -* [Apache Freemarker](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Groovy Templates](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Spring Security](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-security) -* [Spring LDAP](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-ldap) -* [JDBC API](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-sql) -* [Spring Data JPA](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jpa-and-spring-data) -* [Spring Data JDBC](https://docs.spring.io/spring-data/jdbc/docs/current/reference/html/) -* [MyBatis Framework](https://mybatis.org/spring-boot-starter/mybatis-spring-boot-autoconfigure/) -* [Flyway Migration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-execute-flyway-database-migrations-on-startup) -* [JOOQ Access Layer](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jooq) -* [Spring Data Redis (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-redis) -* [Resilience4J](https://cloud.spring.io/spring-cloud-static/spring-cloud-circuitbreaker/current/reference/html) -* [Spring Data Elasticsearch (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-elasticsearch) -* [Spring Boot Actuator](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#production-ready) -* [Spring cache abstraction](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-caching) -* [Wavefront for Spring Boot documentation](https://docs.wavefront.com/wavefront_springboot.html) -* [Wavefront for Spring Boot repository](https://github.com/wavefrontHQ/wavefront-spring-boot) -* [Function](https://cloud.spring.io/spring-cloud-function/) -* [Vault Client Quick Start](https://docs.spring.io/spring-cloud-vault/docs/current/reference/html/#client-side-usage) -* [Spring for RabbitMQ](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-amqp) -* [Spring Integration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-integration) -* [Apache 
Kafka Streams Support](https://docs.spring.io/spring-kafka/docs/current/reference/html/_reference.html#kafka-streams) -* [Apache Kafka Streams Binding Capabilities of Spring Cloud Stream](https://docs.spring.io/spring-cloud-stream/docs/current/reference/htmlsingle/#_kafka_streams_binding_capabilities_of_spring_cloud_stream) -* [Spring for Apache Kafka](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-kafka) -* [WebSocket](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-websockets) -* [Mustache](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Thymeleaf](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) - -### Guides -The following guides illustrate how to use some features concretely: - -* [Building a RESTful Web Service](https://spring.io/guides/gs/rest-service/) -* [Serving Web Content with Spring MVC](https://spring.io/guides/gs/serving-web-content/) -* [Building REST services with Spring](https://spring.io/guides/tutorials/bookmarks/) -* [Accessing JPA Data with REST](https://spring.io/guides/gs/accessing-data-rest/) -* [Accessing Neo4j Data with REST](https://spring.io/guides/gs/accessing-neo4j-data-rest/) -* [Accessing MongoDB Data with REST](https://spring.io/guides/gs/accessing-mongodb-data-rest/) -* [Creating CRUD UI with Vaadin](https://spring.io/guides/gs/crud-with-vaadin/) -* [Securing a Web Application](https://spring.io/guides/gs/securing-web/) -* [Spring Boot and OAuth2](https://spring.io/guides/tutorials/spring-boot-oauth2/) -* [Authenticating a User with LDAP](https://spring.io/guides/gs/authenticating-ldap/) -* [Accessing Relational Data using JDBC with Spring](https://spring.io/guides/gs/relational-data-access/) -* [Managing Transactions](https://spring.io/guides/gs/managing-transactions/) -* [Accessing Data with JPA](https://spring.io/guides/gs/accessing-data-jpa/) -* [Using Spring Data JDBC](https://github.com/spring-projects/spring-data-examples/tree/master/jdbc/basics) -* [MyBatis Quick Start](https://github.com/mybatis/spring-boot-starter/wiki/Quick-Start) -* [Accessing data with MySQL](https://spring.io/guides/gs/accessing-data-mysql/) -* [Messaging with Redis](https://spring.io/guides/gs/messaging-redis/) -* [Building a RESTful Web Service with Spring Boot Actuator](https://spring.io/guides/gs/actuator-service/) -* [Caching Data with Spring](https://spring.io/guides/gs/caching/) -* [Messaging with RabbitMQ](https://spring.io/guides/gs/messaging-rabbitmq/) -* [Integrating Data](https://spring.io/guides/gs/integration/) -* [Samples for using Apache Kafka Streams with Spring Cloud stream](https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/kafka-streams-samples) -* [Using WebSocket to build an interactive web application](https://spring.io/guides/gs/messaging-stomp-websocket/) -* [Handling Form Submission](https://spring.io/guides/gs/handling-form-submission/) - -### Additional Links -These additional references should also help you: - -* [Gradle Build Scans – insights for your project's build](https://scans.gradle.com#gradle) -* [Various sample apps using Spring Cloud Function](https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples) - -## Observability with Wavefront - -If you don't have a Wavefront account, the starter will create a freemium account for you. 
-The URL to access the Wavefront Service dashboard is logged on startup. - -You can also access your dashboard using the `/actuator/wavefront` endpoint. - -Finally, you can opt-in for distributed tracing by adding the Spring Cloud Sleuth starter. diff --git a/docs/raw-materials/cicd/spring-maven-list.md b/docs/raw-materials/cicd/spring-maven-list.md deleted file mode 100644 index eb850df..0000000 --- a/docs/raw-materials/cicd/spring-maven-list.md +++ /dev/null @@ -1,92 +0,0 @@ -# Getting Started - -### Reference Documentation -For further reference, please consider the following sections: - -* [Official Apache Maven documentation](https://maven.apache.org/guides/index.html) -* [Spring Boot Maven Plugin Reference Guide](https://docs.spring.io/spring-boot/docs/2.6.1/maven-plugin/reference/html/) -* [Create an OCI image](https://docs.spring.io/spring-boot/docs/2.6.1/maven-plugin/reference/html/#build-image) -* [Spring Integration AMQP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/amqp.html) -* [Spring Integration JDBC Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jdbc.html) -* [Spring Integration JPA Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/jpa.html) -* [Spring Integration Redis Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/redis.html) -* [Spring Integration Test Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/testing.html) -* [Spring Integration Apache Kafka Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/kafka.html) -* [Spring Integration Security Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/security.html) -* [Spring Integration HTTP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/http.html) -* [Spring Integration STOMP Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/stomp.html) -* [Spring Integration WebSocket Module Reference Guide](https://docs.spring.io/spring-integration/reference/html/websocket.html) -* [Spring Web](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-developing-web-applications) -* [Rest Repositories](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-use-exposing-spring-data-repositories-rest-endpoint) -* [Spring Boot DevTools](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#using-boot-devtools) -* [Spring Configuration Processor](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#configuration-metadata-annotation-processor) -* [Vaadin](https://vaadin.com/spring) -* [Apache Freemarker](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Groovy Templates](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Spring Security](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-security) -* [Spring LDAP](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-ldap) -* [JDBC API](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-sql) -* [Spring Data JPA](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jpa-and-spring-data) -* [Spring Data JDBC](https://docs.spring.io/spring-data/jdbc/docs/current/reference/html/) -* [MyBatis 
Framework](https://mybatis.org/spring-boot-starter/mybatis-spring-boot-autoconfigure/) -* [Flyway Migration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#howto-execute-flyway-database-migrations-on-startup) -* [JOOQ Access Layer](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-jooq) -* [Spring Data Redis (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-redis) -* [Resilience4J](https://cloud.spring.io/spring-cloud-static/spring-cloud-circuitbreaker/current/reference/html) -* [Spring Data Elasticsearch (Access+Driver)](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-elasticsearch) -* [Spring Boot Actuator](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#production-ready) -* [Spring cache abstraction](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-caching) -* [Wavefront for Spring Boot documentation](https://docs.wavefront.com/wavefront_springboot.html) -* [Wavefront for Spring Boot repository](https://github.com/wavefrontHQ/wavefront-spring-boot) -* [Function](https://cloud.spring.io/spring-cloud-function/) -* [Vault Client Quick Start](https://docs.spring.io/spring-cloud-vault/docs/current/reference/html/#client-side-usage) -* [Spring for RabbitMQ](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-amqp) -* [Spring Integration](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-integration) -* [Apache Kafka Streams Support](https://docs.spring.io/spring-kafka/docs/current/reference/html/_reference.html#kafka-streams) -* [Apache Kafka Streams Binding Capabilities of Spring Cloud Stream](https://docs.spring.io/spring-cloud-stream/docs/current/reference/htmlsingle/#_kafka_streams_binding_capabilities_of_spring_cloud_stream) -* [Spring for Apache Kafka](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-kafka) -* [WebSocket](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-websockets) -* [Mustache](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) -* [Thymeleaf](https://docs.spring.io/spring-boot/docs/2.6.1/reference/htmlsingle/#boot-features-spring-mvc-template-engines) - -### Guides -The following guides illustrate how to use some features concretely: - -* [Building a RESTful Web Service](https://spring.io/guides/gs/rest-service/) -* [Serving Web Content with Spring MVC](https://spring.io/guides/gs/serving-web-content/) -* [Building REST services with Spring](https://spring.io/guides/tutorials/bookmarks/) -* [Accessing JPA Data with REST](https://spring.io/guides/gs/accessing-data-rest/) -* [Accessing Neo4j Data with REST](https://spring.io/guides/gs/accessing-neo4j-data-rest/) -* [Accessing MongoDB Data with REST](https://spring.io/guides/gs/accessing-mongodb-data-rest/) -* [Creating CRUD UI with Vaadin](https://spring.io/guides/gs/crud-with-vaadin/) -* [Securing a Web Application](https://spring.io/guides/gs/securing-web/) -* [Spring Boot and OAuth2](https://spring.io/guides/tutorials/spring-boot-oauth2/) -* [Authenticating a User with LDAP](https://spring.io/guides/gs/authenticating-ldap/) -* [Accessing Relational Data using JDBC with Spring](https://spring.io/guides/gs/relational-data-access/) -* [Managing Transactions](https://spring.io/guides/gs/managing-transactions/) -* [Accessing Data with 
JPA](https://spring.io/guides/gs/accessing-data-jpa/) -* [Using Spring Data JDBC](https://github.com/spring-projects/spring-data-examples/tree/master/jdbc/basics) -* [MyBatis Quick Start](https://github.com/mybatis/spring-boot-starter/wiki/Quick-Start) -* [Accessing data with MySQL](https://spring.io/guides/gs/accessing-data-mysql/) -* [Messaging with Redis](https://spring.io/guides/gs/messaging-redis/) -* [Building a RESTful Web Service with Spring Boot Actuator](https://spring.io/guides/gs/actuator-service/) -* [Caching Data with Spring](https://spring.io/guides/gs/caching/) -* [Messaging with RabbitMQ](https://spring.io/guides/gs/messaging-rabbitmq/) -* [Integrating Data](https://spring.io/guides/gs/integration/) -* [Samples for using Apache Kafka Streams with Spring Cloud stream](https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/kafka-streams-samples) -* [Using WebSocket to build an interactive web application](https://spring.io/guides/gs/messaging-stomp-websocket/) -* [Handling Form Submission](https://spring.io/guides/gs/handling-form-submission/) - -### Additional Links -These additional references should also help you: - -* [Various sample apps using Spring Cloud Function](https://github.com/spring-cloud/spring-cloud-function/tree/master/spring-cloud-function-samples) - -## Observability with Wavefront - -If you don't have a Wavefront account, the starter will create a freemium account for you. -The URL to access the Wavefront Service dashboard is logged on startup. - -You can also access your dashboard using the `/actuator/wavefront` endpoint. - -Finally, you can opt-in for distributed tracing by adding the Spring Cloud Sleuth starter. diff --git a/docs/raw-materials/components/postman/postman-parser.md b/docs/raw-materials/components/postman/postman-parser.md deleted file mode 100644 index 7b6aaa8..0000000 --- a/docs/raw-materials/components/postman/postman-parser.md +++ /dev/null @@ -1,18 +0,0 @@ -## Postman 解析 - -一下代码可以进行postman解析: - -```java -public class PostmanParserTest { - PostmanParser parser = new PostmanParser(); - - @Test - void toPostmanCollectionFromFile() { - String jsonString = FileUtil.readString("openproject-postman.json", Charset.defaultCharset()); - PostmanCollection pc = parser.toPostmanCollection(jsonString); - Assertions.assertThat(pc.getItem()).isNotNull(); - } -} -``` - -文件读取后,会转化为PostmanCollection类.后续需要如何处理直接通过Postman转换进行. \ No newline at end of file diff --git a/docs/raw-materials/frameworks/overview.md b/docs/raw-materials/frameworks/overview.md deleted file mode 100644 index d9ec028..0000000 --- a/docs/raw-materials/frameworks/overview.md +++ /dev/null @@ -1,35 +0,0 @@ -# Overall - -Total Free Solution for Software Testing. 
- -## Overall - -- Project Management - - Issue Management - - Planning - - Tasks - - Notification - - BI -- Test Requirement - - Requirement - - Test Planning - - Test Cases - - Test Executions - - BI -- Automation - - Framework - - Reporting - - Status/Coverage - - CI/CD - - -## Typical Use Case - -- API Testing Use Case - - Requirement - - Test Case/Test Planning/Test Execution - - Test Automation - - Test Collaboration - - Issue Management - - Mock Tools - - Coverage \ No newline at end of file diff --git a/docs/raw-materials/frameworks/todo/congnizat-intelligent-testagent.md b/docs/raw-materials/frameworks/todo/congnizat-intelligent-testagent.md deleted file mode 100644 index 1eef0d0..0000000 --- a/docs/raw-materials/frameworks/todo/congnizat-intelligent-testagent.md +++ /dev/null @@ -1,25 +0,0 @@ -# CognizantIntelligentTestScripter - -- [site](https://github.com/ghoshasish99/CognizantIntelligentTestScripter.git) - -The code is quite complicated. But overall, it is a -- Runner to run test cases -- Different Processors -- Different Steps to compose -- Assertions -- Most valuable part is intergation with other system - - -## Overview - -- Runner -- Reporter -- Integration Hooks -- Test Context - - Test Case ... - - Custom Methods - -- Method Refs - - Use Reflection to load all methods - - Under Some Test Context - - All Action and Commands \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/README.md b/docs/raw-materials/qa-java-toolkits/README.md deleted file mode 100644 index 9e71c62..0000000 --- a/docs/raw-materials/qa-java-toolkits/README.md +++ /dev/null @@ -1,24 +0,0 @@ -# README - -Features to be Completed: - -- application-layer-middleware: - - sql as api - - entity as crud api - - service as api - -- Integration: - - HTTP - - Database - - Redis - - ..... 
- -- Utility Layer - - File - - OS - - Structure Data Transformer - - JSON/CSV/Bean/YAML/Excel - - Worker - - thread worker - - multiple threads worker - - async worker \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/api-handlers/postman-parser.md b/docs/raw-materials/qa-java-toolkits/api-handlers/postman-parser.md deleted file mode 100644 index 28b788d..0000000 --- a/docs/raw-materials/qa-java-toolkits/api-handlers/postman-parser.md +++ /dev/null @@ -1,15 +0,0 @@ -## Postman Parser - -- [fluentqa-openapi](fluent-components%2Ffluentqa-apispec%2Ffluentqa-openapi) Postman parser -```java -public class PostmanParserTest { - PostmanParser parser = new PostmanParser(); - - @Test - void toPostmanCollectionFromFile() { - String jsonString = FileUtil.readString("openproject-postman.json", Charset.defaultCharset()); - PostmanCollection pc = parser.toPostmanCollection(jsonString); - Assertions.assertThat(pc.getItem()).isNotNull(); - } -} -``` \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/configuration.md b/docs/raw-materials/qa-java-toolkits/configuration.md deleted file mode 100644 index 88f6c4a..0000000 --- a/docs/raw-materials/qa-java-toolkits/configuration.md +++ /dev/null @@ -1,34 +0,0 @@ -# Configuration - -## Load Configuration Property - -- Setting file: - -```shell -[demo] -driver = com.mysql.jdbc.Driver -url = jdbc:mysql://fedora.vmware:3306/extractor -user = root${demo.driver} -pass = 123456 -``` - -- How to load and Get Property -```java - @Test - @Story( "Get Configuration Property") - public void test_LoadSetting(){ - var appConfig = AppConfig.create("config/app2.setting"); - assertNotNull(appConfig); - assertEquals(appConfig.getConfigSet("demo").get("user"),"root${demo.driver}"); - } -``` - - -## Get Property Bean - -- getConfigSetBean -```java -var bean = appConfig.getConfigSetBean(DemoSetting.class); - assertEquals(bean.getUser(),"root${demo.driver}"); -``` - diff --git a/docs/raw-materials/qa-java-toolkits/data/quick-dao.md b/docs/raw-materials/qa-java-toolkits/data/quick-dao.md deleted file mode 100644 index bb1fe26..0000000 --- a/docs/raw-materials/qa-java-toolkits/data/quick-dao.md +++ /dev/null @@ -1,125 +0,0 @@ -# Quick Dao - 给测试快速操作的JAVA库 - -The initial thoughts: -> Simple DAO - Simple database access methods -> A simple DAO implementation to QA Daily Usage: -> 1. [X] Create DAO by database connection settings -> 2. [X] DAO access database: -> 1. [X] Query to Bean -> 2. [X] Execute SQL Statement -> 3. [X] Insert/Save/SaveOrUpdate -> 3. [X] Multiple Data Source -> 1. [X] Multiple Database Holder -> 2. [X] Multiple SQL execution in different Database - These are most frequent used functionalities for QA. - -快速操作数据库: - -好处: 免去了mybatis/jpa等库的各种学习成本,就是SQL+变量 -坏处: 可能只适合测试使用,不适合生产环境使用 - -1. 创建连接Dao -2. 直接执行SQL -3. 直接可以绑定变量 -4. 更具外部给定的输入选择不同的数据源进行操作数据库 - -## 创建连接Dao - -```java - DataSourceSetting setting=DataSourceSetting.builder(). 
- url("jdbc:postgresql://127.0.0.1:7432/test_hub?currentSchema=demo") - .driver("org.postgresql.Driver") - .username("postgres").password("changeit").build(); - QuickDao dao=QuickDao.createDao("config/db.setting"); -``` - -## 直接执行SQL - -- 直接SQL -- 占位符方式传递参数 -- 绑定变量传递参数 - -```java -@Test -public void testQuery(){ - var queryResult=dao.query("select * from hero"); - for(Entity entity:queryResult){ - System.out.println(entity.keySet()); - System.out.println(entity.values()); - } - } - -@Test -public void testQueryWithParameters(){ - var queryResult=dao.query("select * from hero where name=?","test"); - for(Entity entity:queryResult){ - System.out.println(entity.keySet()); - System.out.println(entity.values()); - } - } - -@Test -public void testQueryWithBindParams(){ - Map params=MapUtil.builder("name",(Object)"test2").build(); - var queryResult=dao.query("select * from hero where name=@name", - params); - for(Entity entity:queryResult){ - System.out.println(entity.keySet()); - System.out.println(entity.values()); - } - } - -``` - -## 执行SQL insert/delete - -- 直接输入SQL可以直接执行 -- 使用JAVA 类也可以直接操作 - -```shell - @Test - public void testSaveOrInsertEntity() { - Entity e = Entity.create("hero").set("name", "test4") - .set("secret_name", "100").set("age", 9000); - - HeroDemoEntity demoEntity = new HeroDemoEntity(); - demoEntity.setName("hero"); - demoEntity.setAge(10); - demoEntity.setSecret_name("secret_name"); - dao.saveOrUpdate(demoEntity,"hero"); - dao.saveOrUpdate(e, "hero", "name", "secret_name"); //for upsert error - } - - @Test - public void testExecute() { - int count = dao.execute("delete from hero where name='test2'"); - System.out.println(count); - } -``` - -## 根据外部给定的输入选择不同的数据源进行操作数据库 - -- 数据库例连接配置 -- SQL输入+变量都是配置 - -```java - -@Test -public void testDaoService(){ - String connectConfig="{\n"+ - " \"url\" :\"jdbc:postgresql://127.0.0.1:7432/test_hub\",\n"+ - " \"user\":\"postgres\",\n"+ - " \"password\": \"changeit\"\n"+ - "}"; - SqlRequest request=SqlRequest.createRequest("select * from data_sources where name=@name"); - String bindvalue="{\n"+ - "\"name\":\"本地测试环境\"\n"+ - " }"; - request.bindParameterValues(bindvalue); - request.dsConfig(connectConfig); - SqlService service=new SqlService(); - SqlQueryResponse response=service.query(request); - System.out.println(JSONUtil.toJsonPrettyStr(response)); - } -``` - diff --git a/docs/raw-materials/qa-java-toolkits/data/sql-as-api.md b/docs/raw-materials/qa-java-toolkits/data/sql-as-api.md deleted file mode 100644 index 71895ee..0000000 --- a/docs/raw-materials/qa-java-toolkits/data/sql-as-api.md +++ /dev/null @@ -1,5 +0,0 @@ -# SQL as API - -- config in database -- call the configuration different to get api result -- like call database view by api \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/img.png b/docs/raw-materials/qa-java-toolkits/img.png deleted file mode 100644 index b624a06..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/img.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/patterns/async-request-response.md b/docs/raw-materials/qa-java-toolkits/patterns/async-request-response.md deleted file mode 100644 index d83bde8..0000000 --- a/docs/raw-materials/qa-java-toolkits/patterns/async-request-response.md +++ /dev/null @@ -1,11 +0,0 @@ - - -![img.png](img.png) - -![img_1.png](img_1.png) - -![img_2.png](img_2.png) -![img_3.png](img_3.png) - - -https://dev.to/ragrag/asynchronous-request-response-pattern-2pbj \ No newline at end of file diff --git 
a/docs/raw-materials/qa-java-toolkits/patterns/img.png b/docs/raw-materials/qa-java-toolkits/patterns/img.png deleted file mode 100644 index b5cd913..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/patterns/img.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/patterns/img_1.png b/docs/raw-materials/qa-java-toolkits/patterns/img_1.png deleted file mode 100644 index abca4ab..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/patterns/img_1.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/patterns/img_2.png b/docs/raw-materials/qa-java-toolkits/patterns/img_2.png deleted file mode 100644 index 81085f2..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/patterns/img_2.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/patterns/img_3.png b/docs/raw-materials/qa-java-toolkits/patterns/img_3.png deleted file mode 100644 index f433a79..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/patterns/img_3.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/product/img.png b/docs/raw-materials/qa-java-toolkits/product/img.png deleted file mode 100644 index 58a41f3..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/product/img.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/product/kiwicms-testcase.md b/docs/raw-materials/qa-java-toolkits/product/kiwicms-testcase.md deleted file mode 100644 index 2890ad2..0000000 --- a/docs/raw-materials/qa-java-toolkits/product/kiwicms-testcase.md +++ /dev/null @@ -1,45 +0,0 @@ -# KiwiTCMS: open source test management - -首先这个项目太老了,可能不适合使用了,主要了解思路的管理方式. - -## 业务分层 - -- [docs](https://kiwitcms.readthedocs.io/en/latest/guide/usecase.html) - -核心概念: -![img.png](img.png) - -- Use Cases - Manual Testing - Writing a Test Plan - Manager Assigns Testing Priorities - Cloning a Test Plan - -- Test Plans -Searching for Test Plans -Creating a Test Plan -Cloning a Test Plan -Editing a Test Plan -Change History -Bulk update -Disabling a Test Plan -Re-enable a Test Plan -Exporting Test Cases from a Test Plan - -- Test Cases -Add an existing test plan to test case -Create test case from a test plan -Add an existing test case to test plan -Cloning Test Cases -Reviewing a Test Case -Changing the order of Test Cases in a Test Plan - -- Test Runs -Searching for Test Runs -Creating a Test Run -Add Test Cases to an existing Test Run -Cloning a Test Run -Editing a Test Run -Changing the status of a Test Run -Deleting a Test Run -Executing a Test Run \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/product/seeds.yaml b/docs/raw-materials/qa-java-toolkits/product/seeds.yaml deleted file mode 100644 index b2ea390..0000000 --- a/docs/raw-materials/qa-java-toolkits/product/seeds.yaml +++ /dev/null @@ -1,5 +0,0 @@ -seeds: - - https://zebrunner.com/blog-posts - -docs: - - https://zebrunner.com/documentation/ \ No newline at end of file diff --git a/docs/raw-materials/qa-java-toolkits/product/zebrunner.md b/docs/raw-materials/qa-java-toolkits/product/zebrunner.md deleted file mode 100644 index d980338..0000000 --- a/docs/raw-materials/qa-java-toolkits/product/zebrunner.md +++ /dev/null @@ -1,14 +0,0 @@ -# zebrunner - -[zebrunner](https://zebrunner.com/) -> Zebrunner -Automation Reporting -Make your test automation process easier than ever and take debugging to the next level. 
- -## Key Ideas - -Testing Runner to collect daily tools: -![img.png](zebrunner.png) - -- https://github.com/zebrunner -- https://zebrunner.com/documentation/ diff --git a/docs/raw-materials/qa-java-toolkits/product/zebrunner.png b/docs/raw-materials/qa-java-toolkits/product/zebrunner.png deleted file mode 100644 index bba667f..0000000 Binary files a/docs/raw-materials/qa-java-toolkits/product/zebrunner.png and /dev/null differ diff --git a/docs/raw-materials/qa-java-toolkits/references.yaml b/docs/raw-materials/qa-java-toolkits/references.yaml deleted file mode 100644 index b2203c2..0000000 --- a/docs/raw-materials/qa-java-toolkits/references.yaml +++ /dev/null @@ -1,152 +0,0 @@ -openapi: - - https://github.com/OpenAPITools/openapi-generator.git -sql: - - https://gitee.com/drinkjava2/jsqlbox.git - - https://github.com/future-architect/uroborosql.git - - https://github.com/bes2008/sqlhelper.git - - https://github.com/jhannes/fluent-jdbc.git - - https://github.com/jdbi/jdbi.git - - https://github.com/line/kotlin-jdsl.git -proxy: - - https://github.com/FgForrest/Proxycian.git -orm: - - https://github.com/deng-hb/eorm.git - - https://gitee.com/xixifeng.com/fastquery.git - - https://github.com/yangziwen/quick-dao.git - - https://gitee.com/troyzhxu/bean-searcher.git - -bpm: - - https://github.com/camunda/camunda-bpm-platform.git -postman-openapi: - - https://github.com/Loadium/postman2jmx.git -eventbus: - - https://github.com/eventuate-foundation/eventuate-common.git -reflections: - - https://github.com/xvik/generics-resolver.git - - https://github.com/jOOQ/jOOR.git - - https://github.com/mstrobel/procyon - - https://github.com/FgForrest/Proxycian.git -frameworks: - - https://github.com/line/armeria.git -concurrency: - - https://github.com/trivago/fastutil-concurrent-wrapper.git - - https://github.com/LMAX-Exchange/disruptor.git - -cloud-storage: - - https://github.com/sine-io/cosbench-sineio.git -datakit: - - https://gitee.com/troyzhxu/data.git - - https://github.com/instancio/instancio.git -awesome: - - https://github.com/humayun-ashik -api-mgt: - - https://github.com/apioak/apioak.git - - https://gitee.com/mirrors/any-rule.git -testing: - reporting: - - https://docs.qameta.io/allure/ - tcms: - - https://github.com/kiwitcms/Kiwi.git -document: - - https://github.com/tstanislawek/awesome-document-understanding -xml: - - https://github.com/jhannes/eaxy.git -poi: - - https://github.com/Sayi/poi-tl -products: - - https://www.getxray.app/ - - https://testrigor.com/ - - https://www.testing-whiz.com/ - - https://directus.cloud/fluentqa/projects?status=success - - https://strapi.io/ - - https://docs.strapi.io/developer-docs/latest/getting-started/quick-start.html - - https://www.directual.com/ - - https://theqalead.com/ - - https://softwareengineeringdaily.com/ - - https://manage.testmo.com/trial - - https://www.pixtastock.com/illustration/58163085 - - https://reqtest.com/testing-blog/software-quality-assurance/ - - https://intland.com/codebeamer/quality-assurance-software-testing/ - - https://www.scnsoft.com/software-testing/quality-management-optimization - - https://www.bmc.com/blogs/quality-assurance-software-testing/ - - https://content.intland.com/blog/modern-software-qa-the-importance-of-requirements-based-testing - - https://mobidev.biz/blog/what_is_the_value_brought_by_qa_to_your_software_product - - https://www.projectmanager.com/ - - https://www.altexsoft.com/blog/engineering/software-testing-qa-best-practices/ - - http://flammen.bg/software-qa-and-testing-services/ - - 
https://radixweb.com/blog/signs-you-need-a-software-qa-process-audit - - https://monday.com/blog/project-management/software-quality-assurance/ - - https://intland.com/codebeamer/quality-assurance-software-testing/ - - https://www.testing-whiz.com/ - - https://www.spec-qa.com/ - - https://www.gurock.com/testrail/qa-software/?utm_campaign=gg_dg_isr_can_search_generic_medium_intent&utm_source=google&utm_medium=cpc&utm_content=qa_software&utm_term=qa%20software&gclid=CjwKCAjw0dKXBhBPEiwA2bmObQFJVlBblQfYHFxAL3mKHYhZTdcahRnm6lyijUsOj8mK5NYHVYwSKRoCBCoQAvD_BwE - - https://www.altexsoft.com/whitepapers/quality-assurance-quality-control-and-testing-the-basics-of-software-quality-management/ - - https://www.thoughtco.com/ - - https://fluentqa.testmo.net/projects/view/1 - - https://docs.testmo.com/docs/ - - https://www.baeldung.com/ - - https://gitee.com/wkeyuan/DWSurvey.git - - https://github.com/liaochong/database-connector.git -repos: - - https://github.com/NREL/api-umbrella.git - - https://github.com/apache/apisix - - https://github.com/DXY-F2E/api-mocker.git - - https://github.com/jormaechea/open-api-mocker - - https://github.com/api-evangelist/virtualization - - https://github.com/apimastery/APISimulator.git - - https://gravitee.io/ - - http://Tyk.io - - https://github.com/apioo/fusio.git - - https://www.fusio-project.org/ - - https://zhuanlan.zhihu.com/p/259737867 - - https://www.dreamfactory.com/developers/scripts - - https://docs.jmix.io/jmix/ui/screens.html - - https://github.com/EMResearch/EvoMaster.git - - https://github.com/ambertests/explore-with-postman.git - - https://github.com/monkeyWie/proxyee.git -low-code: - backend: - - https://www.jhipster.tech/tech-board/ - frontend: - - https://github.com/ReactBricks -code-composition: - - https://developer.entando.com/v7.0/docs/ - -mindmapping: - - https://github.com/diduweiwu/xminder.git - - https://github.com/cloudphr/mindmap.git -excel: - - https://github.com/liaochong/myexcel.git - - https://github.com/houbb/iexcel.git -bi: - - https://github.com/shzlw/poli.git -server: - - https://github.com/noboomu/proteus.git -db: - - https://github.com/d3adspace/phoenix.git -format-converter: - - https://github.com/felixklauke/kira.git -pdf-docs: - - https://github.com/wing328/docspring-java -integration: - - https://github.com/yydzxz/ByteDanceOpen.git - -framework: - - https://github.com/Tencent/APIJSON.git - -api-tools: - - https://github.com/typicode/json-server.git - -mock: - - https://gitee.com/ForteScarlet/Mock.java.git - -service-visualization: - - https://github.com/SpectoLabs/hoverfly.git -java: - - category: project-template - repos: - - https://github.com/jshaptic/java-project-template.git - - - category: integration - repos: - - https://github.com/rh-messaging/jira-git-report.git \ No newline at end of file diff --git a/docs/raw-materials/references.yaml b/docs/raw-materials/references.yaml deleted file mode 100644 index e30ebec..0000000 --- a/docs/raw-materials/references.yaml +++ /dev/null @@ -1,3 +0,0 @@ -repos: - - https://github.com/darrachequesne/spring-data-jpa-datatables.git - - https://gitee.com/troyzhxu/okhttps.gitgit \ No newline at end of file diff --git a/docs/raw-materials/thirdparty/smook.md b/docs/raw-materials/thirdparty/smook.md deleted file mode 100644 index dfca14e..0000000 --- a/docs/raw-materials/thirdparty/smook.md +++ /dev/null @@ -1,19 +0,0 @@ -# smooks intro - -While Smooks can be used as a lightweight platform on which to build your own custom processing logic (for a wide range of data formats 
``out-of-the-box''), it comes with some very useful features that can be used individually, or seamlessly combined together. -[docs](https://www.smooks.org/v2/documentation/) -## Java Binding - -![img](https://www.smooks.org/v2/assets/images/Binding.png) - -## Transformation -![img](https://www.smooks.org/v2/assets/images/Transform.png) - -## Huge Message Processing - -![img](https://www.smooks.org/v2/assets/images/Hugetrans.png) - -## Message Enrichment - -![img](https://www.smooks.org/v2/assets/images/Enrich.png) - diff --git a/docs/raw-materials/tips/ide/gitignore-template.md b/docs/raw-materials/tips/ide/gitignore-template.md deleted file mode 100644 index 049beda..0000000 --- a/docs/raw-materials/tips/ide/gitignore-template.md +++ /dev/null @@ -1,4 +0,0 @@ -![img.png](gitignore-template.png) - - -![img.png](java-ignore.png) \ No newline at end of file diff --git a/docs/raw-materials/tips/ide/gitignore-template.png b/docs/raw-materials/tips/ide/gitignore-template.png deleted file mode 100644 index db6c740..0000000 Binary files a/docs/raw-materials/tips/ide/gitignore-template.png and /dev/null differ diff --git a/docs/raw-materials/tips/ide/ide-shortcut.md b/docs/raw-materials/tips/ide/ide-shortcut.md deleted file mode 100644 index e69de29..0000000 diff --git a/docs/raw-materials/tips/ide/java-ignore.png b/docs/raw-materials/tips/ide/java-ignore.png deleted file mode 100644 index 02333e4..0000000 Binary files a/docs/raw-materials/tips/ide/java-ignore.png and /dev/null differ diff --git a/docs/raw-materials/tips/ide/security-check.md b/docs/raw-materials/tips/ide/security-check.md deleted file mode 100644 index 3059baa..0000000 --- a/docs/raw-materials/tips/ide/security-check.md +++ /dev/null @@ -1 +0,0 @@ -![img.png](security-check.png) \ No newline at end of file diff --git a/docs/raw-materials/tips/ide/security-check.png b/docs/raw-materials/tips/ide/security-check.png deleted file mode 100644 index 62ead8a..0000000 Binary files a/docs/raw-materials/tips/ide/security-check.png and /dev/null differ diff --git a/docs/raw-materials/tips/tree-explain.md b/docs/raw-materials/tips/tree-explain.md deleted file mode 100644 index a87e248..0000000 --- a/docs/raw-materials/tips/tree-explain.md +++ /dev/null @@ -1,24 +0,0 @@ -## README - -Tree: -Node - a single point of a tree -Edge - line, which connects two distinct nodes -Root - top node of the tree, which has no parent -Parent - a node, other than the root, which is connected to other successor nodes -Child - a node, other than the root, which is connected to predecessor -Leaf - a node without children -Path - a sequence of nodes and edges connecting a node with a descendant -Path Length - number of nodes in the path - 1 -Ancestor - the top parent node of the path -Descendant - the bottom child node of the path -Siblings - nodes, which have the same parent -Subtree - a node in a tree with all of its proper descendants, if any -Node Height - the number of edges on the longest downward path between that node and a leaf -Tree Height - the number of edges on the longest downward path between the root and a leaf (root height) -Depth (Level) - the path length between the root and the current node -Ordered Tree - tree in which nodes has the children ordered -Labeled Tree - tree in which a label or value is associated with each node of the tree -Expression Tree - tree which specifies the association of an expression's operands and its operators in a uniform way, regardless of whether the association is required by the placement of parentheses in the 
expression or by the precedence and associativity rules for the operators involved -Branching Factor - maximum number of children a node can have -Pre order - a form of tree traversal, where the action is called firstly on the current node, and then the pre order function is called again recursively on each of the subtree from left to right -Post order - a form of tree traversal, where the post order function is called recursively on each subtree from left to right and then the action is called \ No newline at end of file diff --git a/docs/references.yaml b/docs/references.yaml deleted file mode 100644 index d922654..0000000 --- a/docs/references.yaml +++ /dev/null @@ -1,10 +0,0 @@ -blogs: - - https://www.softkraft.co/web-application-architecture/ - - https://github.com/ovh/cds.git - - https://github.com/camunda/camunda-bpm-platform.git - -resources: - - https://itnext.io/ - -build-in: - - https://github.com/admin4j/admin4j-framework.git \ No newline at end of file diff --git a/docs/simplify-qa/low-code/bi/api-dashboard.png b/docs/simplify-qa/low-code/bi/api-dashboard.png deleted file mode 100644 index 4863c49..0000000 Binary files a/docs/simplify-qa/low-code/bi/api-dashboard.png and /dev/null differ diff --git a/docs/simplify-qa/low-code/bi/metabase-add-chart.png b/docs/simplify-qa/low-code/bi/metabase-add-chart.png deleted file mode 100644 index 4737954..0000000 Binary files a/docs/simplify-qa/low-code/bi/metabase-add-chart.png and /dev/null differ diff --git a/docs/simplify-qa/low-code/bi/metabase-charts.png b/docs/simplify-qa/low-code/bi/metabase-charts.png deleted file mode 100644 index 883e41d..0000000 Binary files a/docs/simplify-qa/low-code/bi/metabase-charts.png and /dev/null differ diff --git a/docs/simplify-qa/low-code/bi/metabase-pie.png b/docs/simplify-qa/low-code/bi/metabase-pie.png deleted file mode 100644 index 4b83063..0000000 Binary files a/docs/simplify-qa/low-code/bi/metabase-pie.png and /dev/null differ diff --git a/docs/simplify-qa/low-code/bi/metabase-share.png b/docs/simplify-qa/low-code/bi/metabase-share.png deleted file mode 100644 index eed6ced..0000000 Binary files a/docs/simplify-qa/low-code/bi/metabase-share.png and /dev/null differ diff --git "a/docs/simplify-qa/low-code/bi/\346\265\213\350\257\225\344\275\216\344\273\243\347\240\201\345\256\236\350\267\265\357\274\232METABASE,5\345\210\206\351\222\237\350\256\251\346\265\213\350\257\225\344\273\216\346\255\244\346\212\245\350\241\250\344\270\215\346\261\202\344\272\272.md" "b/docs/simplify-qa/low-code/bi/\346\265\213\350\257\225\344\275\216\344\273\243\347\240\201\345\256\236\350\267\265\357\274\232METABASE,5\345\210\206\351\222\237\350\256\251\346\265\213\350\257\225\344\273\216\346\255\244\346\212\245\350\241\250\344\270\215\346\261\202\344\272\272.md" deleted file mode 100644 index ee11239..0000000 --- "a/docs/simplify-qa/low-code/bi/\346\265\213\350\257\225\344\275\216\344\273\243\347\240\201\345\256\236\350\267\265\357\274\232METABASE,5\345\210\206\351\222\237\350\256\251\346\265\213\350\257\225\344\273\216\346\255\244\346\212\245\350\241\250\344\270\215\346\261\202\344\272\272.md" +++ /dev/null @@ -1,64 +0,0 @@ - ->5分钟让你报表不求人 - -2023年尝试了不少低代码工具,我自己的结论是,组合使用这些工具会让你的效率大大提升,比如说这款metabase,或许之花个5分钟-30分钟就可以让你从此表报不求人,前后端通吃了. 不说夸张的话,感兴趣自己试试就可以. - -[metabase](https://www.metabase.com/docs/latest/) - ->Don't be a bottleneck -Fast analytics with the friendly UX and integrated tooling to let your company explore data on their own. - -测试同学日常如果需要统计一些数据和做一些DASHBOARD,来做一些数据分析,Metabase 可以满足大部分的需求. 
当然前提是数据已经落到数据库,那么日常统计如: - -1. Bug数量 -2. 测试进度 -3. 接口测试覆盖率 -4. 测试用例完成度 -等等事情就是小菜一碟,异常轻松. 如果想要在内部系统中集成这个Dashboard通过iframe嵌入就可以. - -用这个方法的最大好处是: -1. 足够简单,有SQL基础,1个小时入门使用吧 -2. 不需要开发任何代码,不需要前端后端开发,一个人都能搞定 -3. 随时随地修改,随时随地集成到现有系统 - -下面用几个例子来说明一下Metabase可以做什么,使用这个希望老板再叫你做一个报表展示系统,不用太担心什么前后端开发的事情了. - -## 1. 安装 - -一条命令即可: - -```sh -docker run -d -p 3000:3000 --name metabase metabase/metabase -v ~/metabase-data:/metabase-data -``` - -## 2. 第一个图表-饼图 - -假设目前你已经有所以API信息,你想统计一下API分模块的数量,用饼图展示,需要怎么做呢? - -假设API信息都在表```apis``` 这个表里面,那么只需要在Metabase中创建SQL查询,同时可视化视图中选择饼图就可以: - -![[metabase-pie.png]] -## 3. 第一个Dashboard - -在创建了第一个饼图之后,可以将此饼图添加到一个Dashboard中, -![[metabase-add-chart.png]] -这样第一个Dashboard也已经做好,如果想要往这个Dashboard中添加更多的内容,可以先去常见一个查询/问题,选择可视化之后,在加入到Dashboard中就可以 -加入后的效果如下: -![[api-dashboard.png]] -## 4. 进阶 - -以上两个例子是最简单的Metabase使用小结,还有一些高级的用法: -1. 交互式图表,就是可以有参数选择进行筛选 -2. 不同图表进行跳转 -3. 各种不同的图表类型使用 - -![[metabase-charts.png]] - -## 5. 集成 - -如何将这些报表集成到已有系统或者老板想任何时候都能看到的话,Metabase有分享功能,可以通过iframe直接嵌入到已有系统: -![[metabase-share.png]] -这样一个报表系统就这么完成了,完全可以满足日常QA使用,只要去解决数据落库的问题,一个日常测试的指标Dashboard就可以非常方便的构建,同时有很多灵活度,因为Dashboard可以随时构建,缺的只是合适的数据. - -![微信公众号发布](https://mp.weixin.qq.com/s?__biz=MzIxMzgzNjA3NA==&mid=2247484211&idx=1&sn=dbbfc1e4c4ee1fcd9dc4fe7212c04da4&chksm=97b18b76a0c6026057fd4589ca5e045d9aac5187aff4057afafa4bd2de970fae919f9d11c39a&token=1054280431&lang=zh_CN#rd) - diff --git "a/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/github-internal.md" "b/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/github-internal.md" deleted file mode 100644 index 711e7c6..0000000 --- "a/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/github-internal.md" +++ /dev/null @@ -1,89 +0,0 @@ -# GITHUB - -## Another FasterRunner - -Python,Django - -总体评价: -1. UI: 4 -2. Features: 4 -3. Codes: 6 - -- [Another Faster Runner](https://github.com/lihuacai168/AnotherFasterRunner.git) - -## arextest 携程开源 -JAVA,typescript前后端分离 - -1. UI: 8 -2. Features: -- [arextest](https://github.com/arextest/arex.git) - -## botcity-framework-core: Opensource RPA - -- [botcity-framework-core](https://github.com/botcity-dev/botcity-framework-core.git -) - -## boyka-framework automation framwork - -- [boyka-framework](https://github.com/BoykaFramework/boyka-framework.git) -- 概念: - A. Session - B. Request - C. Response - D. 
Action - -## Allure2 - -- [website](https://allurereport.org/) -- [github](https://github.com/allure-framework/allure2) - -## Pity - -- [pity-backend](https://github.com/wuranxu/pity.git) -- [pity-web](https://github.com/wuranxu/pityWeb.git) - - -## test-hub - -- [testhub](https://github.com/testhub-io/testhub) - -## FeatureProbe - -- [FeatureProbe](https://github.com/FeatureProbe/FeatureProbe.git) - -## httptoolkit-ui - -- [httptoolkit-ui](https://github.com/httptoolkit/httptoolkit-ui) - -## pastme - -Infrastructures & Hosting Services -PythonAnywhere -PlanetScale -- [pastme]( https://github.com/collove/pasteme.git) - -## alex - -- [alex-backend](https://github.com/Biexei/alex-backend) -- [alex-frontend](https://github.com/Biexei/alex-frontend.git) - -## insomnia - -- [insomnia](https://github.com/Kong/insomnia) - - -## Collections - -- [proboCI](https://github.com/ProboCI/probo) - - -## Trigger.dev - -- [trigger.dev](https://github.com/triggerdotdev/trigger.dev) - -## Restfox - 不错 -- https://github.com/flawiddsouza/Restfox - -## karate - -- https://github.com/karatelabs/karate.git \ No newline at end of file diff --git "a/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/integrations.md" "b/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/integrations.md" deleted file mode 100644 index e8950f4..0000000 --- "a/docs/simplify-qa/testing/\344\272\247\345\223\201\345\210\206\346\236\220/integrations.md" +++ /dev/null @@ -1,11 +0,0 @@ -# Integrations - -## JIRA - -## TestRails - -## 禅道 - -## Feishu - -## Vika \ No newline at end of file diff --git "a/docs/simplify-qa/testing/\346\216\245\345\217\243\346\265\213\350\257\225\345\267\245\345\205\267.md" "b/docs/simplify-qa/testing/\346\216\245\345\217\243\346\265\213\350\257\225\345\267\245\345\205\267.md" deleted file mode 100644 index 146a4ac..0000000 --- "a/docs/simplify-qa/testing/\346\216\245\345\217\243\346\265\213\350\257\225\345\267\245\345\205\267.md" +++ /dev/null @@ -1,33 +0,0 @@ -接口测试场景: -1. 通过前台UI对接口进行测试 -2. 专门对接口进行测试 - -每个公司情况不一样,有些通过UI对接口进行测试可以了; -而有一些需要专门对接口进行测试. - -对于场景1, 接口参数的校验逻辑可能有有缺失?为什么通过UI会测试不到这块?不过这些问题就看需不需要严格测试 -对于场景2,一般可能接口对外开发,需要进行比较严格的参数检查,接口就是一个公司产品 - -大部分的公司不会明确说是否需要测试接口,原因主要是: -1. 业务测试的时候已经有部分覆盖,单独做接口会有额外的成本产生 -2. 很多是后台业务系统只给公司用,UI上面进行限制就可以了 - -那么单独做接口测试有什么好处,不好的地方以及难点呢? - -1. 好处1: 可以覆盖部分UI测试时候不能覆盖的场景 -2. 好处2: 接口测试脚本可以快速构建测试数据,在特定场景下可以帮助测试提供效率,比如某个冗长的工作流,你本身只要测试某一个环节,但是不需要重头把所有内容都做了,如果直接通过脚本可以直接造需要的数据,快速进行重点测试 -3. 坏处是有额外工作量投入,同时还要额外投入维护成本 -4. 难点主要也是持续的工作量和人力维护成本投入 - -总结好处,坏处以及难点,其实接口自动化测试如果能够顺利推动,其实就一条如何降低开发维护成本,降低到足够低,可能推动就方便了. - -## 1. 问题出发 - -- 额外工作量 -- 日常额外维护 -- 检查点 -- 复用 - - - - diff --git "a/docs/simplify-qa/testing/\346\265\213\350\257\225\345\271\263\345\217\260.md" "b/docs/simplify-qa/testing/\346\265\213\350\257\225\345\271\263\345\217\260.md" deleted file mode 100644 index b55de38..0000000 --- "a/docs/simplify-qa/testing/\346\265\213\350\257\225\345\271\263\345\217\260.md" +++ /dev/null @@ -1,73 +0,0 @@ -今年其实日子并不好过,但是因为前两年被裁员过两回,所以也就无所谓了,积累点东西,做点事情其他其实没有任何太多想法,不管是测试管理也好,测试开发也好,业务测试也好,自己基本都做,名义上是管理者,但是基本上都参与不少,先从测试平台这块开始总结吧。 - -这篇先总结一下主要做的目的和想法,以及实际用到的工具和代码积累. 如果用自己的土话说就是: -1. 平台最重要的是怎么帮助协作,共享和积累 -2. 有什么方便的工具就用什么方便的工具,怎么简单怎么来 -3. 积累一些小工具,慢慢都会用到的 - -## 1.为什么要做? - -已经有自动化测试框架了,为什么还需要做所为的测试平台?其实叫平台还是叫其他都不重要,重要的是解决什么问题? - -一个所为的平台比较和自动化测试框架或者自动化测试脚本最大的好处,个人认为是: -1. 一个团队可以通过一个系统共享成果 -2. 一个团队可以通过一个系统积累内容 -3. 上游团队做的事情可以直接或者很低转化成本的方式被下游团队使用,比如产品的需求文档怎么快速变成测试用例 - -在现实中常见的场景是: -1. 你的自动化脚本写好了,只能在本地跑,其他人如果想用还不那么好用 -2. 业务A的自动化脚本写好了,业务B想要利用业务A的脚本造点数据还是不太方便做 -3. 
写自动化脚本的时候,有一些通用工具,每个业务都自己做一些而不能抽象/复用,造成重复工作 -4. 测试用例散落得到处都是,不好维护和复用 - -综合以上,测试平台也好,测试辅助系统也好,主要是提供: -1. 可以共享,复用成果的工具 -2. 提高协作能力,比如说你的测试脚本我能拿来造数,比如说产品需求,可以直接部分转化成测试用例,有些内容本身是重复的,只是不同使用人需要把内容转化成不同的格式/表达,这实际上可以通过协作来满足双方需求 -3. 让好的方法和手段变成系统积累下来 - -所以测试平台/系统的要点是帮助***共享***,提高***协作***,否则没有必要做一个平台,都是一个人做,不需要什么平台. - -提供这些,主要其实是能够提高个人工作效率.比如以前有人做过的,如果有积累就不需要重新从零开始做,能复用一些就复用一些. - -因此都是从实际需求出发,一点点积累,改进,好用的东西是磨出来的. 同时平台和个人工程师是一个合作关系,平台是为了提高工程师效率的,而不是去阻止工程师日常行为,个人怎么做一定是灵活的,平台是辅助功能,消除一些日常繁琐重复的事情. - -那么从这些出发,个人一年的实践中主要是将下面内容放到测试管理系统中: -1. 接口测试生命周期管理,从接口的新建/弃用到接口自动化测试都进行平台化,打通接口定义代码变更到接口自动化测试完成 -2. 自动化测试管理,提供录制/生成代码/测试平台交互工具,不约束写代码,又可以自动收集测试进度/测试报告,在提供完全的灵活性的前提下,尽可能减少以前需要额外处理的事情,比如整理报告,一个一个去抓包写接口测试用例 -3. 测试用例管理: 测试用例的统一管理,可以拆解/复用满足不同场景下需求,比如需要多人分工,就可以拆解测试用例分别追踪进度; 比如回归测试用例就可以复用以前写过的测试用例 -4. 测试需求: 规范化测试需求,同产品达成某种写法的共识,就可以通过AI直接生成部分最常见的测试用例,节省测试熟练工搬运需求到测试用例的时间,这本身就是协作的一部分 - -从一个端到端的场景来说,就是测试人员配备了工具箱(类似各种IDE插件),可以直接在本地使用(可以在本地写代码处理特殊场景),完成任务之后可以快速告诉平台(测试报告/测试结果),而需要其他人帮助的时候,通过平台(控制中心)获取已存在的工具支持 - -## 2.做的成本是什么? - -做一个能用的平台,个人认为成本是不高的,这个平台包括前后端,包括一些报表工具,直接使用现有的低代码(low-code)工具,会让你非常快速地验证想法. 以自己实践的例子来说: - -1. 后端低代码[supabase](https://supabase.com/): 数据库表可以直接变成API接口不需要任何额外代码,存储过程可以直接变成接口不需要任何额外代码 -2. 前后端系统低代码工具[erupts](https://www.erupt.xyz/#!/)快速开发业务系统验证想法,一个Entity一个页面,开发非常快,基本可以满足日常需求 -3. [metabase](https://www.metabase.com/),通过SQL直接变成统计报表 - -当然low-code的工具非常多,但是以上三个是自己实操用的最多的,也是觉得比较好用的. - -至于开发语言,JAVA/PYTHON/Typescript/Go,都可以,现在语言层面其实个人认为已经越来越统一,用我的话说,抛开基本语法不谈,思想都是通的,核心几个点: -1. 结构化->结构体/entity -2. 依赖注入/AOP -> Dependency Injection -3. Adaptor -> 调用方式转换 -4. 几个合适的框架 -> springboot/gin/jpa,这里想说下spring boot data jpa绝对比mybatis方便用,如果你文档真的仔细看的话 - -当然如果你需要做交互很好的页面,那你就需要更多了, -***REACT/VUE/NEXTJS/ANTD*** 等等,当然最麻烦的其实可能是CSS,不过CSS慢慢原子化之后,类似于tailwind这种工具的出现再结合AI的能力,做前端一定会越来越简单一些,这些是看得见的。甚至有一些screen-to-code的工具现在也可以简单用用了,比如: -1. [v0](https://v0.dev/) -2. [screenshot-to-code](https://github.com/abi/screenshot-to-code) - -## 3. 积累的一些代码 - -- [java常用库](https://github.com/fluent-qa/fluentqa-workspace)数据库,EXCEL等等测试常用 -- [python常用库](https://github.com/fluent-qa/fluentqa-python)数据库,EXCEL等等测试常用 - - - - - - - diff --git a/docs/tools/1-akita.md b/docs/tools/1-akita.md deleted file mode 100644 index cb97f8b..0000000 --- a/docs/tools/1-akita.md +++ /dev/null @@ -1,42 +0,0 @@ -# Akita - -[akita](https://docs.akita.software/docs/how-akita-works) -> Akita watches your API traffic for automatic discovery and dashboards - -根据提供的文档《How Akita Works》,以下是Akita软件的要点总结: - -1. **API流量监控**:Akita通过使用eBPF技术自动发现API流量,并为监控和仪表板生成数据。 - -2. **低摩擦部署**:Akita代理(Agent)可以无摩擦地部署,不需要通过SDK提供访问权限,也不需要更改代码或代理。 - -3. **低风险部署**:Akita代理在服务器上运行,仅将请求/响应的元数据发送回云端,确保敏感数据不会暴露给Akita服务器。 - -4. **自动API行为建模**:Akita自动对API流量进行建模,推断API路径,为每个端点提供监控和警报功能。 - -5. **流量分析技术**:Akita代理使用salted hash对有效载荷数据进行加密,并且永远不会解密(unhash),确保敏感数据不会被Akita云服务看到。 - -6. **高级流量分析算法**:Akita云使用先进的流量分析算法自动推断端点结构(包括路径参数)、数据类型、认证等,减少了编写API规范或制作仪表板的工作。 - -7. **数据访问和处理政策**:Akita有明确的数据访问和处理政策,确保用户数据的安全性和隐私性。 - -8. 
**支持的技术栈**:Akita支持多种技术栈,以适应不同的部署环境和需求。 - -## How to Run - -```shell -docker pull public.ecr.aws/akitasoftware/akita-cli:latest -``` - -start agent -```shell -docker run --rm --network host - -e AKITA_API_KEY_ID= API KEY ID \ - -e AKITA_API_KEY_SECRET= API KEY SECRET \ - akitasoftware/cli:latest apidump \ - --project PROJECT NAME -``` - -## codes - -- [akita-cli](https://github.com/akitasoftware/akita-cli) -- [akita-libs](https://github.com/akitasoftware/akita-libs.git) diff --git a/docs/tools/demo.json b/docs/tools/demo.json deleted file mode 100644 index a3e273c..0000000 --- a/docs/tools/demo.json +++ /dev/null @@ -1,121 +0,0 @@ -{ - "id": 88701696, - "node_id": "MDEwOlJlcG9zaXRvcnk4ODcwMTY5Ng==", - "name": "knife4j", - "full_name": "xiaoymin/knife4j", - "private": false, - "owner": { - "login": "xiaoymin", - "id": 7894406, - "node_id": "MDQ6VXNlcjc4OTQ0MDY=", - "avatar_url": "https://avatars.githubusercontent.com/u/7894406?v=4", - "gravatar_id": "", - "url": "https://api.github.com/users/xiaoymin", - "html_url": "https://github.com/xiaoymin", - "followers_url": "https://api.github.com/users/xiaoymin/followers", - "following_url": "https://api.github.com/users/xiaoymin/following{/other_user}", - "gists_url": "https://api.github.com/users/xiaoymin/gists{/gist_id}", - "starred_url": "https://api.github.com/users/xiaoymin/starred{/owner}{/repo}", - "subscriptions_url": "https://api.github.com/users/xiaoymin/subscriptions", - "organizations_url": "https://api.github.com/users/xiaoymin/orgs", - "repos_url": "https://api.github.com/users/xiaoymin/repos", - "events_url": "https://api.github.com/users/xiaoymin/events{/privacy}", - "received_events_url": "https://api.github.com/users/xiaoymin/received_events", - "type": "User", - "site_admin": false - }, - "html_url": "https://github.com/xiaoymin/knife4j", - "description": "Knife4j is a set of Swagger2 and OpenAPI3 All-in-one enhancement solution", - "fork": false, - "url": "https://api.github.com/repos/xiaoymin/knife4j", - "forks_url": "https://api.github.com/repos/xiaoymin/knife4j/forks", - "keys_url": "https://api.github.com/repos/xiaoymin/knife4j/keys{/key_id}", - "collaborators_url": "https://api.github.com/repos/xiaoymin/knife4j/collaborators{/collaborator}", - "teams_url": "https://api.github.com/repos/xiaoymin/knife4j/teams", - "hooks_url": "https://api.github.com/repos/xiaoymin/knife4j/hooks", - "issue_events_url": "https://api.github.com/repos/xiaoymin/knife4j/issues/events{/number}", - "events_url": "https://api.github.com/repos/xiaoymin/knife4j/events", - "assignees_url": "https://api.github.com/repos/xiaoymin/knife4j/assignees{/user}", - "branches_url": "https://api.github.com/repos/xiaoymin/knife4j/branches{/branch}", - "tags_url": "https://api.github.com/repos/xiaoymin/knife4j/tags", - "blobs_url": "https://api.github.com/repos/xiaoymin/knife4j/git/blobs{/sha}", - "git_tags_url": "https://api.github.com/repos/xiaoymin/knife4j/git/tags{/sha}", - "git_refs_url": "https://api.github.com/repos/xiaoymin/knife4j/git/refs{/sha}", - "trees_url": "https://api.github.com/repos/xiaoymin/knife4j/git/trees{/sha}", - "statuses_url": "https://api.github.com/repos/xiaoymin/knife4j/statuses/{sha}", - "languages_url": "https://api.github.com/repos/xiaoymin/knife4j/languages", - "stargazers_url": "https://api.github.com/repos/xiaoymin/knife4j/stargazers", - "contributors_url": "https://api.github.com/repos/xiaoymin/knife4j/contributors", - "subscribers_url": "https://api.github.com/repos/xiaoymin/knife4j/subscribers", - "subscription_url": 
"https://api.github.com/repos/xiaoymin/knife4j/subscription", - "commits_url": "https://api.github.com/repos/xiaoymin/knife4j/commits{/sha}", - "git_commits_url": "https://api.github.com/repos/xiaoymin/knife4j/git/commits{/sha}", - "comments_url": "https://api.github.com/repos/xiaoymin/knife4j/comments{/number}", - "issue_comment_url": "https://api.github.com/repos/xiaoymin/knife4j/issues/comments{/number}", - "contents_url": "https://api.github.com/repos/xiaoymin/knife4j/contents/{+path}", - "compare_url": "https://api.github.com/repos/xiaoymin/knife4j/compare/{base}...{head}", - "merges_url": "https://api.github.com/repos/xiaoymin/knife4j/merges", - "archive_url": "https://api.github.com/repos/xiaoymin/knife4j/{archive_format}{/ref}", - "downloads_url": "https://api.github.com/repos/xiaoymin/knife4j/downloads", - "issues_url": "https://api.github.com/repos/xiaoymin/knife4j/issues{/number}", - "pulls_url": "https://api.github.com/repos/xiaoymin/knife4j/pulls{/number}", - "milestones_url": "https://api.github.com/repos/xiaoymin/knife4j/milestones{/number}", - "notifications_url": "https://api.github.com/repos/xiaoymin/knife4j/notifications{?since,all,participating}", - "labels_url": "https://api.github.com/repos/xiaoymin/knife4j/labels{/name}", - "releases_url": "https://api.github.com/repos/xiaoymin/knife4j/releases{/id}", - "deployments_url": "https://api.github.com/repos/xiaoymin/knife4j/deployments", - "created_at": "2017-04-19T04:44:28Z", - "updated_at": "2023-10-16T05:59:45Z", - "pushed_at": "2023-10-13T03:11:00Z", - "git_url": "git://github.com/xiaoymin/knife4j.git", - "ssh_url": "git@github.com:xiaoymin/knife4j.git", - "clone_url": "https://github.com/xiaoymin/knife4j.git", - "svn_url": "https://github.com/xiaoymin/knife4j", - "homepage": "https://doc.xiaominfo.com", - "size": 108145, - "stargazers_count": 3634, - "watchers_count": 3634, - "language": "HTML", - "has_issues": true, - "has_projects": true, - "has_downloads": true, - "has_wiki": true, - "has_pages": true, - "has_discussions": true, - "forks_count": 577, - "mirror_url": null, - "archived": false, - "disabled": false, - "open_issues_count": 41, - "license": { - "key": "apache-2.0", - "name": "Apache License 2.0", - "spdx_id": "Apache-2.0", - "url": "https://api.github.com/licenses/apache-2.0", - "node_id": "MDc6TGljZW5zZTI=" - }, - "allow_forking": true, - "is_template": false, - "web_commit_signoff_required": false, - "topics": [ - "knife4j", - "openapi2", - "openapi3", - "springdoc-openapi", - "springfox-swagger2", - "swagger", - "swagger-ui" - ], - "visibility": "public", - "forks": 577, - "open_issues": 41, - "watchers": 3634, - "default_branch": "dev", - "permissions": { - "admin": false, - "maintain": false, - "push": false, - "triage": false, - "pull": true - } -} \ No newline at end of file diff --git a/docs/tools/json-to-pojo.md b/docs/tools/json-to-pojo.md deleted file mode 100644 index b01330c..0000000 --- a/docs/tools/json-to-pojo.md +++ /dev/null @@ -1,13 +0,0 @@ -# JSON to Pojo - -## Cli - -```shell - -``` - -## Examples - -```json - -``` \ No newline at end of file diff --git a/docs/tools/quick-type.md b/docs/tools/quick-type.md deleted file mode 100644 index 89a4d73..0000000 --- a/docs/tools/quick-type.md +++ /dev/null @@ -1,10 +0,0 @@ -# Quick Type - -- [quick type](https://github.com/glideapps/quicktype) -- [quick type app](https://app.quicktype.io/) - -## installation - -```shell -npm install -g quicktype -``` diff --git a/fluent-apps/pom.xml b/fluent-apps/pom.xml deleted file mode 100644 index 
726705b..0000000 --- a/fluent-apps/pom.xml +++ /dev/null @@ -1,21 +0,0 @@ - - - 4.0.0 - - io.fluentqa - qworkspace - 1.0-SNAPSHOT - - - fluent-apps - pom - - - 17 - 17 - UTF-8 - - - \ No newline at end of file diff --git a/fluent-apps/qaserver/.gitignore b/fluent-apps/qaserver/.gitignore deleted file mode 100644 index 5ff6309..0000000 --- a/fluent-apps/qaserver/.gitignore +++ /dev/null @@ -1,38 +0,0 @@ -target/ -!.mvn/wrapper/maven-wrapper.jar -!**/src/main/**/target/ -!**/src/test/**/target/ - -### IntelliJ IDEA ### -.idea/modules.xml -.idea/jarRepositories.xml -.idea/compiler.xml -.idea/libraries/ -*.iws -*.iml -*.ipr - -### Eclipse ### -.apt_generated -.classpath -.factorypath -.project -.settings -.springBeans -.sts4-cache - -### NetBeans ### -/nbproject/private/ -/nbbuild/ -/dist/ -/nbdist/ -/.nb-gradle/ -build/ -!**/src/main/**/build/ -!**/src/test/**/build/ - -### VS Code ### -.vscode/ - -### Mac OS ### -.DS_Store \ No newline at end of file diff --git a/fluent-apps/qaserver/pom.xml b/fluent-apps/qaserver/pom.xml deleted file mode 100644 index 863f8ac..0000000 --- a/fluent-apps/qaserver/pom.xml +++ /dev/null @@ -1,150 +0,0 @@ - - - 4.0.0 - - io.fluentqa - fluent-apps - 1.0-SNAPSHOT - - - qaserver - - - - - - xyz.erupt - erupt-upms - ${erupt.version} - - - - xyz.erupt - erupt-security - ${erupt.version} - - - xyz.erupt - erupt-job - ${erupt.version} - - - io.fluent - fluent-generator - ${fluent.version} - - - - - xyz.erupt - erupt-web - ${erupt.version} - - - org.springframework.boot - spring-boot-starter-tomcat - - - - - org.springframework.boot - spring-boot-starter-undertow - 2.7.12 - - - org.postgresql - postgresql - ${postgresql.version} - - - javax.xml.bind - jaxb-api - 2.3.1 - - - com.github.xiaoymin - knife4j-openapi2-spring-boot-starter - 4.4.0 - - - - io.fluent - fluent-excel - ${fluent.version} - - - io.fluent - fluent-mindmap - ${fluent.version} - - - io.fluent - fluent-erupts-base - ${fluent.version} - - - - io.fluent - fluent-generator - ${fluent.version} - - - io.fluent - fluent-git - ${fluent.version} - - - - cn.hutool - hutool-all - - - io.fluent - fluent-quickdao - 1.0-SNAPSHOT - - - io.fluent - fluent-openapi - 1.0-SNAPSHOT - - - - - - - - - - - - - - - - - maven-compiler-plugin - org.apache.maven.plugins - 3.11.0 - - 17 - 17 - - - - org.springframework.boot - spring-boot-maven-plugin - 2.7.2 - - - - repackage - - - - - - - \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/QAWorkspaceApp.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/QAWorkspaceApp.java deleted file mode 100644 index c6516b4..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/QAWorkspaceApp.java +++ /dev/null @@ -1,17 +0,0 @@ -package io.fluentqa; - -import org.springframework.boot.SpringApplication; -import org.springframework.boot.autoconfigure.SpringBootApplication; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.scheduling.annotation.EnableAsync; -import xyz.erupt.core.annotation.EruptScan; - -@SpringBootApplication -@EnableAsync -@EruptScan -@EntityScan -public class QAWorkspaceApp { - public static void main(String[] args) { - SpringApplication.run(QAWorkspaceApp.class); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentProductConfigModule.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentProductConfigModule.java deleted file mode 100644 index 3dd958c..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentProductConfigModule.java +++ 
/dev/null @@ -1,67 +0,0 @@ -package io.fluentqa.base; - -import io.fluentqa.base.masterdata.model.MasterData; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.base.project.model.ProjectModel; -import io.fluentqa.base.upload.model.UploadFileModel; -import org.apache.commons.math3.stat.descriptive.summary.Product; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.boot.context.properties.EnableConfigurationProperties; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.context.annotation.Configuration; -import xyz.erupt.core.annotation.EruptScan; -import xyz.erupt.core.constant.MenuTypeEnum; -import xyz.erupt.core.module.EruptModule; -import xyz.erupt.core.module.EruptModuleInvoke; -import xyz.erupt.core.module.MetaMenu; -import xyz.erupt.core.module.ModuleInfo; - -import java.util.ArrayList; -import java.util.List; - -@Configuration -@ComponentScan -@EntityScan -@EruptScan -@EnableConfigurationProperties -public class FluentProductConfigModule implements EruptModule { - - public FluentProductConfigModule() { - } - - @Override - public ModuleInfo info() { - return ModuleInfo.builder().name("fluent-product").build(); - } - - @Override - public void run() { - EruptModule.super.run(); - } - - @Override - public List initMenus() { - List menus = new ArrayList<>(); - menus.add(MetaMenu.createRootMenu("$fluent-master", "产品配置", "fa fa-product-hunt", 90)); - MetaMenu productMetaMenu = MetaMenu.createEruptClassMenu(ProductModuleModel.class, menus.get(0), 0, MenuTypeEnum.TABLE); - productMetaMenu.setIcon("fa fa-group"); - productMetaMenu.setName("产品元数据"); - productMetaMenu.setCode("$product-meta"); - menus.add(productMetaMenu); - MetaMenu masterDataMenu = MetaMenu.createEruptClassMenu(MasterData.class, menus.get(0), 1, MenuTypeEnum.TABLE); - masterDataMenu.setIcon("fa fa-times"); - masterDataMenu.setName("产品字典表配置"); - masterDataMenu.setCode("$master-data"); - menus.add(masterDataMenu); - MetaMenu projectMenu = MetaMenu.createEruptClassMenu(ProjectModel.class, menus.get(0), 2, MenuTypeEnum.TABLE); - projectMenu.setIcon("fa fa-linode"); - projectMenu.setName("项目配置"); - projectMenu.setCode("$project-meta"); - menus.add(projectMenu); - return menus; - } - - static { - EruptModuleInvoke.addEruptModule(FluentProductConfigModule.class); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentUploadTCModule.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentUploadTCModule.java deleted file mode 100644 index 9e44381..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/FluentUploadTCModule.java +++ /dev/null @@ -1,53 +0,0 @@ -package io.fluentqa.base; - -import io.fluentqa.base.upload.model.UploadFileModel; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.boot.context.properties.EnableConfigurationProperties; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.context.annotation.Configuration; -import xyz.erupt.core.annotation.EruptScan; -import xyz.erupt.core.constant.MenuTypeEnum; -import xyz.erupt.core.module.EruptModule; -import xyz.erupt.core.module.EruptModuleInvoke; -import xyz.erupt.core.module.MetaMenu; -import xyz.erupt.core.module.ModuleInfo; - -import java.util.ArrayList; -import java.util.List; - -@Configuration -@ComponentScan -@EntityScan -@EruptScan -@EnableConfigurationProperties -public class FluentUploadTCModule implements EruptModule { - - public 
FluentUploadTCModule() { - } - - @Override - public ModuleInfo info() { - return ModuleInfo.builder().name("fluent-tc-sync").build(); - } - - @Override - public void run() { - EruptModule.super.run(); - } - - @Override - public List initMenus() { - List menus = new ArrayList<>(); - menus.add(MetaMenu.createRootMenu("$tc-upload", "测试文件管理", "fa fa-file", 100)); - MetaMenu tfUploadSyncMenu = MetaMenu.createEruptClassMenu(UploadFileModel.class, menus.get(0), 0, MenuTypeEnum.TABLE); - tfUploadSyncMenu.setIcon("fa fa-folder-open"); - tfUploadSyncMenu.setName("测试文件同步"); - tfUploadSyncMenu.setCode("$tc-upload-sync"); - menus.add(tfUploadSyncMenu); - return menus; - } - - static { - EruptModuleInvoke.addEruptModule(FluentUploadTCModule.class); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/README.md b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/README.md deleted file mode 100644 index 9fb066c..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/README.md +++ /dev/null @@ -1,8 +0,0 @@ -# README - -shared component: - -- master data: shared configurations -- upload data: upload component -- product: product module -- project: project module \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/model/MasterData.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/model/MasterData.java deleted file mode 100644 index fab03e9..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/model/MasterData.java +++ /dev/null @@ -1,76 +0,0 @@ -package io.fluentqa.base.masterdata.model; - -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import io.fluentqa.base.model.ModelWithValidFlagVo; -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; - -import javax.persistence.Entity; -import javax.persistence.Table; - - -@Erupt(name = "产品字典值配置", power = @Power(importable = true, export = true)) -@Table(name = "master_data") -@Entity -@Data -public class MasterData extends ModelWithValidFlagVo { - - @EruptField( - views = @View(title = "分类"), - edit = @Edit( - search = @Search(vague = true), - title = "获取可选种类", - type = EditType.TAGS, - desc = "动态获取可选种类", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct category from master_data where valid=true" - )) - ) - private String category; - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, - inputType = @InputType - ) - ) - private String detail; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String code; - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/repo/MasterDataRepo.java 
b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/repo/MasterDataRepo.java deleted file mode 100644 index db4adae..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/masterdata/repo/MasterDataRepo.java +++ /dev/null @@ -1,15 +0,0 @@ -package io.fluentqa.base.masterdata.repo; - -import io.fluentqa.base.masterdata.model.MasterData; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -import java.util.Optional; - -@Repository -public interface MasterDataRepo extends JpaRepository, JpaSpecificationExecutor { - - Optional findMasterDataByCode(String code); - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/package-info.java deleted file mode 100644 index 3130485..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.base; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleModel.java deleted file mode 100644 index 4a63a5e..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleModel.java +++ /dev/null @@ -1,140 +0,0 @@ -package io.fluentqa.base.product.model; - -import io.fluentqa.base.model.ModelWithValidFlagVo; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.Tree; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.ChoiceType; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.toolkit.handler.SqlChoiceFetchHandler; - -import javax.persistence.*; -import java.util.UUID; - -@Erupt(name = "产品模块配置", - power = @Power(importable = true, export = true), - tree = @Tree(pid = "parent.id")) -@Entity -@Table(name = "products") -public class ProductModuleModel extends ModelWithValidFlagVo { - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String code; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String details; - - @EruptField( - views = @View(title = "类型"), - edit = @Edit( - search = @Search, - title = "获取可选类型", - type = EditType.CHOICE, - desc = "动态获取可选类型", - choiceType = @ChoiceType( - fetchHandler = SqlChoiceFetchHandler.class, - fetchHandlerParams = "select id,name from master_data where category='PRODUCT'" - )) - ) - private String metaType; - - @ManyToOne - @EruptField( - edit = @Edit( - title = "上级树节点", - type = EditType.REFERENCE_TREE, - referenceTreeType = 
@ReferenceTreeType(pid = "parent.id") - ) - ) - private ProductModuleModel parent; - - - @Column(length = 36, nullable = true, updatable = false) - private String uuid = UUID.randomUUID().toString(); - - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public String getCode() { - return code; - } - - public void setCode(String code) { - this.code = code; - } - - public String getDetails() { - return details; - } - - public void setDetails(String details) { - this.details = details; - } - - public String getMetaType() { - return metaType; - } - - public void setMetaType(String metaType) { - this.metaType = metaType; - } - - public ProductModuleModel getParent() { - return parent; - } - - public void setParent(ProductModuleModel parent) { - this.parent = parent; - } - - public String getUuid() { - return uuid; - } - - public void setUuid(String uuid) { - this.uuid = uuid; - } -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleValidFlagVo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleValidFlagVo.java deleted file mode 100644 index 4aa8bee..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/model/ProductModuleValidFlagVo.java +++ /dev/null @@ -1,47 +0,0 @@ -package io.fluentqa.base.product.model; - -import io.fluentqa.base.model.ModelWithValidFlagVo; -import lombok.Getter; -import lombok.Setter; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; - -import javax.persistence.JoinColumn; -import javax.persistence.ManyToOne; -import javax.persistence.MappedSuperclass; - -@MappedSuperclass -@Getter -@Setter -public class ProductModuleValidFlagVo extends ModelWithValidFlagVo { - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "产品名称",column = "details"), - edit = @Edit( - search = @Search, - title = "产品选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel product; - - @ManyToOne - @JoinColumn(name = "module_id") - @EruptField( - views = @View(title = "模块名称",column = "details"), - edit = @Edit(title = "模块选择", search = @Search, type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - dependField = "product", - dependColumn = "parent.id" - )) - ) - private ProductModuleModel module; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/package-info.java deleted file mode 100644 index 98160a8..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.base.product; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/repo/ProductModuleRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/repo/ProductModuleRepo.java deleted file mode 100644 index ac9d87f..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/repo/ProductModuleRepo.java +++ /dev/null @@ -1,18 +0,0 @@ -package 
io.fluentqa.base.product.repo; - - -import io.fluentqa.base.product.model.ProductModuleModel; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -import java.util.Optional; - -@Repository -public interface ProductModuleRepo extends JpaRepository, JpaSpecificationExecutor { - - Optional findProductByNameAndValid(String name, boolean valid); - - Optional findProductByCodeAndValid(String codeName, boolean valid); - Optional findProductByParentIdAndNameAndValid(Long parentId, String name, boolean valid); -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/service/ProductModuleService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/service/ProductModuleService.java deleted file mode 100644 index 72afcdf..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/product/service/ProductModuleService.java +++ /dev/null @@ -1,57 +0,0 @@ -package io.fluentqa.base.product.service; - -import io.fluent.builtin.PingYinUtils; -import io.fluentqa.base.product.repo.ProductModuleRepo; -import io.fluentqa.base.proxies.AuditDataEnhancerProxy; -import io.fluentqa.base.masterdata.repo.MasterDataRepo; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.base.masterdata.model.MasterData; -import org.springframework.stereotype.Service; - -import javax.annotation.Resource; -import java.util.Optional; - -@Service -public class ProductModuleService { - @Resource - private ProductModuleRepo metaRepo; - - @Resource - private MasterDataRepo masterDataRepo; - - @Resource - AuditDataEnhancerProxy dataEnhancerProxy; - - public ProductModuleModel createModuleIfNotExist(Long productId, String moduleName, String updater) { - Optional meta = metaRepo.findProductByParentIdAndNameAndValid(productId, - moduleName, true); - if (meta.isPresent()) return meta.get(); - ProductModuleModel parent = new ProductModuleModel(); - parent.setId(productId); - ProductModuleModel module = new ProductModuleModel(); - module.setName(moduleName); - module.setDetails(moduleName); - module.setParent(parent); - module.setCode(PingYinUtils.convertToPinyinAbbreviation(moduleName)); - MasterData data = masterDataRepo.findMasterDataByCode("MODULE").get(); - module.setMetaType(data.getId().toString()); - dataEnhancerProxy.enhanceTimeAndUserAuditData(module,updater); - return metaRepo.save(module); - } - - public ProductModuleModel findApiServiceProduct() { - String API_SERVICE = "API"; - Optional meta = metaRepo.findProductByCodeAndValid(API_SERVICE, true); - if (meta.isPresent()) return meta.get(); - throw new RuntimeException("Please config API Service as a Product in Product Meta"); - } - - public ProductModuleModel createApiModuleIfNotExist(String moduleName,String updater) { - ProductModuleModel parent = findApiServiceProduct(); - return createModuleIfNotExist(parent.getId(), moduleName,updater); - } - - public ProductModuleModel findByName(String productName) { - return metaRepo.findProductByNameAndValid(productName, true).orElse(null); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/model/ProjectModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/model/ProjectModel.java deleted file mode 100644 index 8109405..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/model/ProjectModel.java +++ /dev/null @@ -1,89 +0,0 @@ -package io.fluentqa.base.project.model; - - 
-import io.fluentqa.base.model.ModelWithValidFlagVo; -import io.fluentqa.base.product.model.ProductModuleModel; -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.Tree; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; - -import javax.persistence.*; -import java.util.UUID; - -@Erupt(name = "项目", - power = @Power(importable = true, export = true), - tree = @Tree(pid = "parent.id")) -@Entity -@Table(name = "projects") -@Data -public class ProjectModel extends ModelWithValidFlagVo { - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "代号" - ), - edit = @Edit( - title = "代号", - type = EditType.INPUT, search = @Search, - notNull = true, - inputType = @InputType - ) - ) - private String code; - - @EruptField( - views = @View( - title = "详细描述" - ), - edit = @Edit( - title = "详细描述", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String details; - - @ManyToOne - @EruptField( - edit = @Edit( - title = "产品列表", - type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(pid = "parent.id") - ) - ) - private ProductModuleModel productId; - - @ManyToOne - @EruptField( - edit = @Edit( - title = "上级树节点", - type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(pid = "parent.id") - ) - ) - private ProjectModel parent; - - @Column(length = 36, nullable = false, updatable = false) - private String uuid = UUID.randomUUID().toString(); -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/repo/ProjectModelRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/repo/ProjectModelRepo.java deleted file mode 100644 index 5115fd8..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/project/repo/ProjectModelRepo.java +++ /dev/null @@ -1,18 +0,0 @@ -package io.fluentqa.base.project.repo; - - -import io.fluentqa.base.product.model.ProductModuleModel; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -import java.util.Optional; - -@Repository -public interface ProjectModelRepo extends JpaRepository, JpaSpecificationExecutor { - - Optional findProductByNameAndValid(String name, boolean valid); - - Optional findProductByCodeAndValid(String codeName, boolean valid); - Optional findProductByParentIdAndNameAndValid(Long parentId, String name, boolean valid); -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/model/UploadFileModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/model/UploadFileModel.java deleted file mode 100644 index 484acb0..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/model/UploadFileModel.java +++ /dev/null @@ -1,62 +0,0 @@ -package io.fluentqa.base.upload.model; - -import io.fluentqa.base.upload.proxy.UploadFileDataProxy; -import 
io.fluentqa.base.product.model.ProductModuleModel; -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.PreDataProxy; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.AttachmentType; -import xyz.erupt.annotation.sub_field.sub_edit.ChoiceType; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.toolkit.handler.SqlChoiceFetchHandler; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@Erupt(name = "测试相关文件上传同步", orderBy = "UploadFileModel.createTime desc") -@Table(name = "uploaded_files") -@Entity -@Data -@PreDataProxy(value = UploadFileDataProxy.class) -public class UploadFileModel extends ProductModuleModel { - - @EruptField( - views = @View(title = "用途"), - edit = @Edit( - search = @Search, - title = "获取可选类型", - type = EditType.CHOICE, - desc = "动态获取可选类型", - notNull = true, - choiceType = @ChoiceType( - fetchHandler = SqlChoiceFetchHandler.class, - fetchHandlerParams = "select distinct code,name from master_data where category='UPLOAD_FILE_USAGE' and valid=true" - )) - ) - private String usage; - - - @EruptField( - views = @View(title = "文件上传"), - edit = @Edit(title = "文件上传", type = EditType.ATTACHMENT, - attachmentType = @AttachmentType(size = 100000)) - ) - private String attachment; - - @EruptField( - views = @View( - title = "用途描述" - ), - edit = @Edit( - title = "用途描述", - type = EditType.TEXTAREA, notNull = true, - inputType = @InputType - ) - ) - private String comments; -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/package-info.java deleted file mode 100644 index 0998acf..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.base.upload; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileDataProxy.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileDataProxy.java deleted file mode 100644 index ccd7116..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileDataProxy.java +++ /dev/null @@ -1,64 +0,0 @@ -package io.fluentqa.base.upload.proxy; - - - -import cn.hutool.core.bean.BeanUtil; -import io.fluentqa.excel.ExcelReadWriter; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.qtm.tc.dto.TestCaseDTO; -import io.fluentqa.qtm.tc.service.TestCaseService; -import io.fluentqa.qtm.tc.service.impl.MindMappingService; -import lombok.extern.slf4j.Slf4j; -import xyz.erupt.core.prop.EruptProp; -import xyz.erupt.core.util.EruptSpringUtil; -import xyz.erupt.jpa.model.MetaDataProxy; -import xyz.erupt.jpa.model.MetaModel; - -import java.util.List; - -@Slf4j -public class UploadFileDataProxy extends MetaDataProxy { - private final MindMappingService mindMappingService; - private final TestCaseService testCaseService; - private final EruptProp eruptProp; - private final ExcelReadWriter excelReadWriter; - - public UploadFileDataProxy() { - mindMappingService = EruptSpringUtil.getBean(MindMappingService.class); - testCaseService = EruptSpringUtil.getBean(TestCaseService.class); - eruptProp = 
EruptSpringUtil.getBean(EruptProp.class); - excelReadWriter = new ExcelReadWriter(); - } - - @Override - public void beforeAdd(MetaModel metaModel) { - //before add, add some check here - super.beforeAdd(metaModel); - } - - @Override - public void afterAdd(MetaModel metaModel) { - //after add, then doing business process - log.info("start handler uploaded file"); - String filePath = getUploaderFilePath(metaModel); - String uploadType = BeanUtil.getProperty(metaModel, "usage"); - - if(UploadFileTypeEnum.parseType(uploadType).equals(UploadFileTypeEnum.EXCEL_TC)){ - ProductModuleModel product = BeanUtil.getProperty(metaModel, "product"); - ProductModuleModel module = BeanUtil.getProperty(metaModel, "module"); - testCaseService.saveTestCases(getExcelTestCases(filePath),product,module,metaModel.getUpdateBy()); - } - if(UploadFileTypeEnum.parseType(uploadType).equals(UploadFileTypeEnum.FREEMIND)){ - mindMappingService.saveTestCases(filePath,metaModel); - } - } - - private List getExcelTestCases(String filePath){ - return excelReadWriter.readExcel(filePath, TestCaseDTO.class); - } - - private String getUploaderFilePath(MetaModel metaModel) { - return eruptProp.getUploadPath() + BeanUtil.getProperty(metaModel, "attachment"); - } - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileTypeEnum.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileTypeEnum.java deleted file mode 100644 index f9b14ca..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/base/upload/proxy/UploadFileTypeEnum.java +++ /dev/null @@ -1,14 +0,0 @@ -package io.fluentqa.base.upload.proxy; - -public enum UploadFileTypeEnum { - EXCEL_TC,FREEMIND,PM,MINDMAP; - - public static UploadFileTypeEnum parseType(String uploadFileType) { - for (UploadFileTypeEnum uploadFileTypeEnum : UploadFileTypeEnum.values()) { - if (uploadFileTypeEnum.name().equals(uploadFileType)) { - return uploadFileTypeEnum; - } - } - throw new RuntimeException("UploadFileTypeEnum is not found for " + uploadFileType); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/model/ConvertibleBondInfo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/model/ConvertibleBondInfo.java deleted file mode 100644 index 78561c2..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/model/ConvertibleBondInfo.java +++ /dev/null @@ -1,115 +0,0 @@ -package io.fluentqa.cb.model; - -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.jpa.model.MetaModelVo; - -import javax.persistence.*; -import java.math.BigDecimal; - -@Entity -@Table(name = "convertible_bond_info") -@Data -@Erupt(name = "CB数据", power = @Power(importable = true, export = true)) - -public class ConvertibleBondInfo extends MetaModelVo { - - @Column(name = "code", nullable = false) - private String code; - - @Column(name = "name") - private String name; - - @Column(name = "trade_date") - private String tradeDate; - - @Column(name = "pre_close_price") - private BigDecimal preClosePrice; - - @Column(name = "open_price") - private BigDecimal openPrice; - - @Column(name = "high_price") - private BigDecimal highPrice; - - @Column(name = "low_price") - private BigDecimal lowPrice; - - @Column(name = "close_price") - private BigDecimal closePrice; - - @Column(name = "change") - private BigDecimal change; - - @Column(name = "change_rate") - private BigDecimal changeRate; - - @Column(name = "accrued_days") - private Integer accruedDays; - - @Column(name = 
"accrued_interest") - private BigDecimal accruedInterest; - - @Column(name = "remaining_term") - private BigDecimal remainingTerm; - - @Column(name = "current_yield") - private BigDecimal currentYield; - - @Column(name = "pure_bond_yield") - private BigDecimal pureBondYield; - - @Column(name = "pure_bond_value") - private BigDecimal pureBondValue; - - @Column(name = "pure_bond_premium") - private BigDecimal pureBondPremium; - - @Column(name = "pure_bond_premium_rate") - private BigDecimal pureBondPremiumRate; - - @Column(name = "conversion_price") - private BigDecimal conversionPrice; - - @Column(name = "conversion_ratio") - private BigDecimal conversionRatio; - - @Column(name = "conversion_value") - private BigDecimal conversionValue; - - @Column(name = "conversion_premium") - private BigDecimal conversionPremium; - - @Column(name = "conversion_premium_rate") - private BigDecimal conversionPremiumRate; - - @Column(name = "conversion_pe_ratio") - private BigDecimal conversionPeRatio; - - @Column(name = "conversion_pb_ratio") - private BigDecimal conversionPbRatio; - - @Column(name = "arbitrage_space") - private BigDecimal arbitrageSpace; - - @Column(name = "parity_bottom_price") - private BigDecimal parityBottomPrice; - - @Column(name = "term") - private BigDecimal term; - - @Column(name = "issue_date") - private String issueDate; - - @Column(name = "coupon_rate_issue_reference_rate") - private BigDecimal couponRateIssueReferenceRate; - - @Column(name = "trading_market") - private String tradingMarket; - - @Column(name = "bond_type") - private String bondType; - - // Constructors, getters, setters -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/package-info.java deleted file mode 100644 index 03c2c91..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/cb/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.cb; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/FluentGithubModule.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/github/FluentGithubModule.java deleted file mode 100644 index ed5d62f..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/FluentGithubModule.java +++ /dev/null @@ -1,60 +0,0 @@ -package io.fluentqa.github; - -import io.fluentqa.github.model.AwesomeResource; -import io.fluentqa.github.model.GithubStarredRepo; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.boot.context.properties.EnableConfigurationProperties; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.context.annotation.Configuration; -import xyz.erupt.core.annotation.EruptScan; -import xyz.erupt.core.constant.MenuTypeEnum; -import xyz.erupt.core.module.EruptModule; -import xyz.erupt.core.module.EruptModuleInvoke; -import xyz.erupt.core.module.MetaMenu; -import xyz.erupt.core.module.ModuleInfo; - -import java.util.ArrayList; -import java.util.List; - -@Configuration -@ComponentScan -@EntityScan -@EruptScan -@EnableConfigurationProperties -public class FluentGithubModule implements EruptModule { - - public FluentGithubModule() { - } - - @Override - public ModuleInfo info() { - return ModuleInfo.builder().name("fluent-github").build(); - } - - @Override - public void run() { - EruptModule.super.run(); - } - - @Override - public List initMenus() { - List menus = new ArrayList<>(); - menus.add(MetaMenu.createRootMenu("$github", "github管理", "fa fa-github", 
1000)); - MetaMenu starredMenu = MetaMenu.createEruptClassMenu(GithubStarredRepo.class, menus.get(0), 0, MenuTypeEnum.TABLE); - starredMenu.setIcon("fa fa-star"); - starredMenu.setName("github收藏"); - starredMenu.setCode("$github-starred"); - menus.add(starredMenu); - - MetaMenu awesomeModule = MetaMenu.createEruptClassMenu(AwesomeResource.class, menus.get(0), 1, MenuTypeEnum.TABLE); - awesomeModule.setIcon("fa fa-font-awesome"); - awesomeModule.setName("awesomes"); - awesomeModule.setCode("$awesomes"); - menus.add(awesomeModule); - return menus; - } - - static { - EruptModuleInvoke.addEruptModule(FluentGithubModule.class); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/AwesomeResource.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/AwesomeResource.java deleted file mode 100644 index c9205b0..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/AwesomeResource.java +++ /dev/null @@ -1,29 +0,0 @@ -package io.fluentqa.github.model; - -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; - -@Data -@Entity -@Erupt(name = "Awesome Resource", power = @Power(importable = true, export = true)) -public class AwesomeResource extends MetaModel { - @EruptField(views = @View(title = "名称"), edit = @Edit(title = "名称", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String name; - @EruptField(views = @View(title = "URL"), edit = @Edit(title = "URL", type = EditType.TEXTAREA, - search = @Search, notNull = true, inputType = @InputType)) - private String url; - - @EruptField(views = @View(title = "content"), edit = @Edit(title = "content", type = EditType.HTML_EDITOR, - search = @Search, notNull = true, inputType = @InputType)) - private String content; -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/GithubStarredRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/GithubStarredRepo.java deleted file mode 100644 index df1a82f..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/model/GithubStarredRepo.java +++ /dev/null @@ -1,45 +0,0 @@ -package io.fluentqa.github.model; - -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.jpa.model.MetaModelVo; - -import javax.persistence.Entity; - -@Data -@Entity -@Erupt(name = "starred Github Repos", - power = @Power(importable = true, export = true), layout = @Layout( - tableLeftFixed = 3, - pageSize = 30)) -public class GithubStarredRepo extends MetaModelVo { - @EruptField(views = @View(title = "名称"), edit = @Edit(title = "名称", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String name; - @EruptField(views = @View(title = "URL"), edit 
= @Edit(title = "URL", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String url; - @EruptField(views = @View(title = "全名"), edit = @Edit(title = "全名", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String fullName; - - private String nodeId; - @EruptField(views = @View(title = "描述"), edit = @Edit(title = "描述", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String description; - @EruptField(views = @View(title = "fork数量"), edit = @Edit(title = "fork数量", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private int forksCount; - @EruptField(views = @View(title = "star数量"), edit = @Edit(title = "star数量", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private int stargazersCount; - - - @EruptField(views = @View(title = "主题"), edit = @Edit(title = "主题", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String topics; - - @EruptField(views = @View(title = "语言"), edit = @Edit(title = "语言", type = EditType.INPUT, search = @Search, notNull = true, inputType = @InputType)) - private String language; -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/github/package-info.java deleted file mode 100644 index 92c9212..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.github; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/service/GithubService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/github/service/GithubService.java deleted file mode 100644 index c80b6ae..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/github/service/GithubService.java +++ /dev/null @@ -1,17 +0,0 @@ -package io.fluentqa.github.service; - -import io.fluent.git.github.GithubUserService; -import lombok.extern.slf4j.Slf4j; -import org.springframework.stereotype.Service; - -@Service -@Slf4j -public class GithubService { - GithubUserService userService = new GithubUserService(); - - public void saveUserStarredRepo(String userName,int page){ - log.info("start save all starred repos"); - userService.saveUserStarredRepo(userName,page); - log.info("complete saved user starred repo"); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/GithubStarredCollectorJob.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/GithubStarredCollectorJob.java deleted file mode 100644 index 12d132b..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/GithubStarredCollectorJob.java +++ /dev/null @@ -1,56 +0,0 @@ -package io.fluentqa.jobs; - -import cn.hutool.json.JSONUtil; -import io.fluent.builtin.StringUtils; -import io.fluentqa.github.service.GithubService; -import io.fluentqa.jobs.github.GithubJobFetchParameters; -import lombok.extern.slf4j.Slf4j; -import org.springframework.scheduling.annotation.Async; -import org.springframework.stereotype.Service; -import xyz.erupt.core.annotation.EruptHandlerNaming; -import xyz.erupt.job.handler.EruptJobHandler; - -import javax.annotation.Resource; -import java.util.ArrayList; -import java.util.List; - -@Service -@Slf4j -@EruptHandlerNaming("Github Starred Repo Job") // 如果不添加此配置,类名会作为前端展示依据 -public class GithubStarredCollectorJob implements EruptJobHandler { - @Resource - private GithubService 
githubService; - /** - * @param code 任务编码 - * @param param 任务参数 - */ - - @Override - public String exec(String code, String param) { - log.info("start job %s with parameter %s".formatted(code,param)); - GithubJobFetchParameters parameters ; - if(StringUtils.isAllEmpty(param)){ - parameters = new GithubJobFetchParameters(); - }else { - parameters = JSONUtil.toBean(param, GithubJobFetchParameters.class); - } - List userNames=StringUtils.split(parameters.getUserNames(),","); - for (String userName : userNames) { - githubService.saveUserStarredRepo(userName,parameters.getFromPage()); - } - return "success"; - } - - - @Override - public void success(String result, String param) { - log.info(String.format("success result %s", result)); - EruptJobHandler.super.success(result, param); - } - - @Override - public void error(Throwable throwable, String param) { - EruptJobHandler.super.error(throwable, param); - } - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/github/GithubJobFetchParameters.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/github/GithubJobFetchParameters.java deleted file mode 100644 index 0cfd142..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/jobs/github/GithubJobFetchParameters.java +++ /dev/null @@ -1,11 +0,0 @@ -package io.fluentqa.jobs.github; - -import lombok.Data; -//https://olakit.cn/box/quartz_cron_build_check -//0 0 23 ? * * * -@Data -public class GithubJobFetchParameters { - private String userNames="qdriven"; - private int fromPage=0; - private int pageSize=50; -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/FluentQAApiModule.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/FluentQAApiModule.java deleted file mode 100644 index 58a13d4..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/FluentQAApiModule.java +++ /dev/null @@ -1,125 +0,0 @@ -package io.fluentqa.qtm; - - -import io.fluentqa.qtm.api.model.*; -import org.springframework.boot.autoconfigure.domain.EntityScan; -import org.springframework.boot.context.properties.EnableConfigurationProperties; -import org.springframework.context.annotation.ComponentScan; -import org.springframework.context.annotation.Configuration; -import xyz.erupt.core.annotation.EruptScan; -import xyz.erupt.core.constant.MenuTypeEnum; -import xyz.erupt.core.module.EruptModule; -import xyz.erupt.core.module.EruptModuleInvoke; -import xyz.erupt.core.module.MetaMenu; -import xyz.erupt.core.module.ModuleInfo; - -import java.util.ArrayList; -import java.util.List; - - -@Configuration -@ComponentScan -@EntityScan -@EruptScan -@EnableConfigurationProperties -public class FluentQAApiModule implements EruptModule { - public FluentQAApiModule() { - } - - @Override - public ModuleInfo info() { - return ModuleInfo.builder().name("fluent-api").build(); - } - - @Override - public void run() { - EruptModule.super.run(); - } - - /** - * API管理: - *

- * 1. API 仓库管理 - * 2. API 接口定义 - * 3. API 接口录制记录 - * 4. API 接口测试 - * - * @return - */ - @Override - public List initMenus() { - List menus = new ArrayList<>(); - menus.add(MetaMenu.createRootMenu("$APIMgr", "接口管理", "fa fa-exchange", 1)); - - MetaMenu menuForAdded = MetaMenu.createEruptClassMenu(RemoteApi.class, - menus.get(0), 1, MenuTypeEnum.TABLE); - menuForAdded.setIcon("fa fa-scissors"); - menuForAdded.setName("API清单"); - menuForAdded.setCode("$API-List"); - menus.add(menuForAdded); - - MetaMenu rawApiTestCaseMenu = MetaMenu.createEruptClassMenu(RawApiTestCase.class, - menus.get(0), 1, MenuTypeEnum.TABLE); - rawApiTestCaseMenu.setIcon("fa fa-scissors"); - rawApiTestCaseMenu.setName("API生成原始测试用例"); - rawApiTestCaseMenu.setCode("$API-TC-GEN"); - menus.add(rawApiTestCaseMenu); - - MetaMenu apiMonitorRecordMenu = MetaMenu.createEruptClassMenu(ApiMonitorRecord.class, - menus.get(0), 2, MenuTypeEnum.TABLE); - apiMonitorRecordMenu.setIcon("fa fa-repeat"); - apiMonitorRecordMenu.setName("API录制记录"); - apiMonitorRecordMenu.setCode("$API-Record"); - menus.add(apiMonitorRecordMenu); - - MetaMenu apiTestRecord = MetaMenu.createEruptClassMenu( - ApiTestRecord - .class, - menus.get(0), 3, MenuTypeEnum.TABLE); - apiTestRecord.setIcon("fa fa-thumbs-up"); - apiTestRecord.setName("API测试结果记录"); - apiTestRecord.setCode("$API-TestResult"); - menus.add(apiTestRecord); - - MetaMenu apiTestScenarioMenu = MetaMenu.createEruptClassMenu( - ApiTestScenario - .class, - menus.get(0), 4, MenuTypeEnum.TABLE); - apiTestScenarioMenu.setIcon("fa fa-folder"); - apiTestScenarioMenu.setName("API测试场景"); - apiTestScenarioMenu.setCode("$API-TestScenario"); - menus.add(apiTestScenarioMenu); - - MetaMenu apiStepMenu = MetaMenu.createEruptClassMenu( - ApiStep - .class, - menus.get(0), 5, MenuTypeEnum.TABLE); - apiStepMenu.setIcon("fa fa-folder"); - apiStepMenu.setName("API用例步骤"); - apiStepMenu.setCode("$API-Step"); - menus.add(apiStepMenu); -// MetaMenu apiDefMenu = MetaMenu.createSimpleMenu("$API-def", "接口定义", "fa fa-check-square-o", -// menus.get(0), 1, ""); -// menus.add(apiDefMenu); -// addNewMenu( -// menus,"$API-Spec-Git","API定义仓库", "fa fa-meetup", ApiSpecGitRepoModel.class, -// MenuTypeEnum.TABLE,1,0 -// ); -// addNewMenu( -// menus,"$API-Spec","API最新版本", "fa fa-gitlab", ApiSpecVersionModel.class, -// MenuTypeEnum.TABLE,1,1 -// ); - - return menus; - } - - - static { - EruptModuleInvoke.addEruptModule(FluentQAApiModule.class); - } - - @Override - public void initFun() { - EruptModule.super.initFun(); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiCaseByCaptureDataHandler.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiCaseByCaptureDataHandler.java deleted file mode 100644 index 3ad4fc3..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiCaseByCaptureDataHandler.java +++ /dev/null @@ -1,24 +0,0 @@ -package io.fluentqa.qtm.api.handler; - -import io.fluentqa.qtm.api.model.ApiMonitorRecord; -import io.fluentqa.qtm.api.service.ApiTestCaseService; -import lombok.extern.slf4j.Slf4j; -import org.springframework.stereotype.Service; -import xyz.erupt.annotation.fun.OperationHandler; - -import javax.annotation.Resource; -import java.util.List; - -@Service -@Slf4j -public class GenerateApiCaseByCaptureDataHandler implements OperationHandler { - - @Resource - private ApiTestCaseService apiService; - @Override - public String exec(List data, Void unused, String[] param) { - log.info("start convert api capture data"); - 
apiService.convertApiMonitorRecordToTestCase(data); - return null; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiTestStepByApiTestRecord.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiTestStepByApiTestRecord.java deleted file mode 100644 index 2ccb338..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateApiTestStepByApiTestRecord.java +++ /dev/null @@ -1,24 +0,0 @@ -package io.fluentqa.qtm.api.handler; - -import io.fluentqa.qtm.api.model.ApiTestRecord; -import io.fluentqa.qtm.api.service.ApiTestCaseService; -import lombok.extern.slf4j.Slf4j; -import org.springframework.stereotype.Service; -import xyz.erupt.annotation.fun.OperationHandler; - -import javax.annotation.Resource; -import java.util.List; - -@Service -@Slf4j -public class GenerateApiTestStepByApiTestRecord implements OperationHandler { - - @Resource - private ApiTestCaseService apiService; - @Override - public String exec(List data, Void unused, String[] param) { - log.info("start convert api capture data"); - apiService.convertApiTestResultToApiTestStep(data); - return null; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateRawApiCaseHandler.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateRawApiCaseHandler.java deleted file mode 100644 index c64a72e..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/handler/GenerateRawApiCaseHandler.java +++ /dev/null @@ -1,22 +0,0 @@ -package io.fluentqa.qtm.api.handler; - -import io.fluentqa.qtm.api.model.RemoteApi; -import io.fluentqa.qtm.api.service.ApiTestCaseService; -import org.springframework.stereotype.Service; -import xyz.erupt.annotation.fun.OperationHandler; - -import javax.annotation.Resource; -import java.util.List; - -@Service -public class GenerateRawApiCaseHandler implements OperationHandler { - - @Resource - private ApiTestCaseService apiService; - @Override - public String exec(List data, Void unused, String[] param) { - System.out.println("this is tests"); - apiService.convertToRawTestCase(data); - return null; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiMonitorRecord.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiMonitorRecord.java deleted file mode 100644 index ba9ced6..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiMonitorRecord.java +++ /dev/null @@ -1,131 +0,0 @@ -package io.fluentqa.qtm.api.model; - - -import io.fluentqa.qtm.api.handler.GenerateApiCaseByCaptureDataHandler; -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.RowOperation; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.ViewType; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@DynamicUpdate -@DynamicInsert -@Entity 
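
The three row-operation handlers above share one shape: inject `ApiTestCaseService`, log, then delegate the selected rows to a convert method. A slightly more defensive variant of the capture-data handler's `exec` could look like the sketch below; whether Erupt ever invokes the handler with an empty selection is an assumption.

```java
// Defensive sketch of GenerateApiCaseByCaptureDataHandler.exec: skip the
// conversion when nothing is selected and log how many records are handled.
@Override
public String exec(List<ApiMonitorRecord> data, Void unused, String[] param) {
    if (data == null || data.isEmpty()) {
        log.warn("no captured records selected, nothing to convert");
        return null;
    }
    log.info("converting {} captured records to raw test cases", data.size());
    apiService.convertApiMonitorRecordToTestCase(data);
    return null;
}
```
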
-@Table(name = "api_monitor_record") -@Erupt( - name = "接口访问记录", - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), - power = @Power(importable = true, export = true), - rowOperation = {@RowOperation( - title = "生成接口用例数据", - operationHandler = GenerateApiCaseByCaptureDataHandler.class)}, - orderBy = "ApiMonitorRecord.id desc" - -) -@Data -public class ApiMonitorRecord extends MetaModel { - - @EruptField( - views = @View(title = "app"), - edit = @Edit( - title = "app应用名", - type = EditType.TAGS, search = @Search(vague = true), notNull = true, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct app from api_monitor_record" - ) - )) - private String app; - @EruptField( - views = @View(title = "录制名称"), - edit = @Edit(title = "录制名称", notNull = true, search = @Search) - ) - private String recordName; - @EruptField( - views = @View(title = "请求地址"), - edit = @Edit(title = "请求地址", notNull = true, search = @Search) - ) - private String requestUrl; - - @EruptField( - views = @View(title = "服务"), - edit = @Edit( - title = "服务", - type = EditType.TAGS, search = @Search(vague = true), notNull = true, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct service from api_monitor_record" - ) - ) - ) - - private String service; - @EruptField( - views = @View(title = "接口名称"), - edit = @Edit(title = "接口名称", notNull = true, search = @Search) - ) - private String api; - - @EruptField( - views = @View(title = "服务URL"), - edit = @Edit(title = "服务URL", notNull = true, search = @Search) - ) - private String path; - - @EruptField( - views = @View(title = "请求头"), - edit = @Edit(title = "请求报文", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String requestHeaders; - - @EruptField( - views = @View(title = "HTTP方法"), - edit = @Edit(title = "HTTP方法", notNull = true, search = @Search) - ) - private String method; - - @EruptField( - views = @View(title = "请求报文", type = ViewType.CODE), - edit = @Edit(title = "请求报文", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String requestBody; - - - @EruptField( - views = @View(title = "response_headers"), - edit = @Edit(title = "responseHeaders", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String responseHeaders; - - @EruptField( - views = @View(title = "status_code"), - edit = @Edit(title = "status_code", notNull = true, search = @Search) - ) - private int statusCode; - - @EruptField( - views = @View(title = "返回报文", type = ViewType.CODE), - edit = @Edit(title = "返回报文", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String responseBody; - - - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecChangeModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecChangeModel.java deleted file mode 100644 index bbb757b..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecChangeModel.java +++ /dev/null @@ -1,52 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.View; -import 
xyz.erupt.jpa.model.BaseModel; - -import javax.persistence.Entity; -import javax.persistence.Table; -import java.time.LocalDateTime; - -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "api_spec_change") -@Erupt( - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), - name = "api spec 变化记录", power = @Power(export = true) -) -@Data -public class ApiSpecChangeModel extends BaseModel { - @EruptField( - views = @View(title = "应用名-appName") - ) - private String name; - @EruptField( - views = @View(title = "GIT URL") - ) - private String gitUrl; - @EruptField( - views = @View(title = "GIT分支") - ) - private String branch; - - @EruptField( - views = @View(title = "创建时间") - ) - private LocalDateTime createdTime; - - @EruptField( - views = @View(title = "appVersion") - ) - private String appVersion; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecGitRepoModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecGitRepoModel.java deleted file mode 100644 index 36d69e8..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecGitRepoModel.java +++ /dev/null @@ -1,47 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import lombok.Data; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@Entity -@Table(name = "apispec_git_repo") -@Data -@Erupt(name = "skel仓库设置", layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), - power = @Power(importable = true, export = true)) -public class ApiSpecGitRepoModel extends MetaModel { - @EruptField( - views = @View(title = "应用名-appName"), - edit = @Edit(title = "应用名-App Name", notNull = true, search = @Search) - ) - private String name; - @EruptField( - views = @View(title = "gitUrl"), - edit = @Edit(title = "gitUrl", notNull = true) - ) - private String gitUrl; - - @EruptField( - views = @View(title = "gitlabId"), - edit = @Edit(title = "gitlabId", notNull = true) - ) - private Integer gitlabId; - - @EruptField( - views = @View(title = "webUrl"), - edit = @Edit(title = "webUrl", notNull = true) - ) - private String webUrl; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecVersionModel.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecVersionModel.java deleted file mode 100644 index 45d1154..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiSpecVersionModel.java +++ /dev/null @@ -1,102 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import io.fluentqa.base.model.ModelWithValidFlagVo; -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import org.hibernate.annotations.SQLDelete; -import org.hibernate.annotations.Where; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.ViewType; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import 
xyz.erupt.annotation.sub_field.sub_edit.InputType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "api_spec_version") -@Erupt( - name = "远程服务原始文件", - power = @Power(export = true), - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30) -) -@Data -@SQLDelete(sql = "update api_spec_version set valid=false where id=?") -@Where(clause = "valid = true") -public class ApiSpecVersionModel extends ModelWithValidFlagVo { - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String name; - - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String type="POSTMAN"; - - @EruptField( - views = @View( - title = "服务类型" - ), - edit = @Edit( - title = "服务类型", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String serviceType; //API or RPC - - @EruptField( - views = @View( - title = "版本" - ), - edit = @Edit( - title = "版本", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String appVersion; - - @EruptField( - views = @View(title = "GIT URL") - ) - private String gitUrl; - @EruptField( - views = @View(title = "GIT分支") - ) - private String branch; - - @EruptField( - views = @View(title = "接口定义", type = ViewType.CODE), - edit = @Edit(title = "接口定义", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String spec; -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiStep.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiStep.java deleted file mode 100644 index 7d72936..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiStep.java +++ /dev/null @@ -1,156 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "api_steps") -@Erupt( - name = "接口测试用例", layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), power = @Power(importable = true, export = true), - orderBy = "ApiTestStep.updateTime desc" -) -public class ApiStep extends MetaModel{ - - @EruptField( - views = @View(title = "场景"), - edit = @Edit(title = "场景", notNull = true, search = @Search) - ) - private String scenario; - - @EruptField( - views = @View(title = "用例名称"), - edit = @Edit(title = "用例名称", notNull = true, search = @Search) - ) - private String caseName; - - @EruptField( - views = @View(title = "服务"), - edit = @Edit( - 
search = @Search, - title = "获取可选服务", - type = EditType.TAGS, - desc = "获取可选服务", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct service_name from remote_services where type='API' and valid=true" - )) - ) - private String serviceName; - @EruptField( - views = @View(title = "服务方法"), - edit = @Edit(title = "服务方法", notNull = true, search = @Search) - ) - private String serviceMethod; - - @EruptField( - views = @View(title = "接口路径"), - edit = @Edit(title = "接口路径", notNull = true, search = @Search) - ) - private String path; - - @EruptField( -// views = @View(title = "测试请求", type = ViewType.CODE), - edit = @Edit(title = "测试请求", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String request; - @EruptField( -// views = @View(title = "接口请求结果"), - edit = @Edit(title = "接口请求结果", - type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String result; - - - @EruptField( -// views = @View(title = "预期结果"), - edit = @Edit(title = "预期结果", notNull = true, type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String expect; - - - public String getScenario() { - return scenario; - } - - public void setScenario(String scenario) { - this.scenario = scenario; - } - - public String getServiceName() { - return serviceName; - } - - public void setServiceName(String serviceName) { - this.serviceName = serviceName; - } - - public String getServiceMethod() { - return serviceMethod; - } - - public void setServiceMethod(String serviceMethod) { - this.serviceMethod = serviceMethod; - } - - public String getCaseName() { - return caseName; - } - - public void setCaseName(String caseName) { - this.caseName = caseName; - } - - public String getPath() { - return path; - } - - public void setPath(String path) { - this.path = path; - } - - public String getRequest() { - return request; - } - - public void setRequest(String request) { - this.request = request; - } - - public String getResult() { - return result; - } - - public void setResult(String result) { - this.result = result; - } - - - - public String getExpect() { - return expect; - } - - public void setExpect(String expect) { - this.expect = expect; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestRecord.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestRecord.java deleted file mode 100644 index d901d45..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestRecord.java +++ /dev/null @@ -1,131 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import io.fluentqa.qtm.api.handler.GenerateApiTestStepByApiTestRecord; -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.RowOperation; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; -import 
javax.persistence.Table; - -/** - * 1. 原先的表结构设计几个问题: - * - 没有办法区分那次测试运行记录 - * - 查找不太方便 - * - name 和service name 重复,没有必要同时使用 - * - serviceName 从remote service里面取不过滤API,不够精确 - * 修改方式: - * - name 修改为测试用例运行名称 - * - serviceName 取tags 从remote service 的API 中取 - */ -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "api_test_record") -@Erupt( - name = "接口测试结果", layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), power = @Power(importable = true, export = true), - orderBy = "ApiTestRecord.id desc", - rowOperation = {@RowOperation( - title = "生成可用测试步骤", - operationHandler = GenerateApiTestStepByApiTestRecord.class)} -) -@Data -public class ApiTestRecord extends MetaModel { - @EruptField( - views = @View(title = "测试运行名称"), - edit = @Edit(title = "name", search = @Search) - ) - private String name; - - @EruptField( - views = @View(title = "场景"), - edit = @Edit(title = "场景", notNull = true, search = @Search) - ) - private String scenario; - - @EruptField( - views = @View(title = "用例名称"), - edit = @Edit(title = "用例名称", notNull = true, search = @Search) - ) - private String caseName; - - @EruptField( - views = @View(title = "服务"), - edit = @Edit( - search = @Search, - title = "获取可选服务", - type = EditType.TAGS, - desc = "获取可选服务", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct service_name from remote_services where type='API' and valid=true" - )) - ) - private String serviceName; - @EruptField( - views = @View(title = "服务方法"), - edit = @Edit(title = "服务方法", notNull = true, search = @Search) - ) - private String serviceMethod; - - @EruptField( - views = @View(title = "接口路径"), - edit = @Edit(title = "接口路径", notNull = true, search = @Search) - ) - private String path; - - @EruptField( -// views = @View(title = "测试请求", type = ViewType.CODE), - edit = @Edit(title = "测试请求", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String request; - @EruptField( -// views = @View(title = "接口请求结果"), - edit = @Edit(title = "接口请求结果", - type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String result; - - @EruptField( - views = @View(title = "状态码"), - edit = @Edit(title = "状态码", notNull = true, search = @Search) - ) - private String statusCode; - - @EruptField( - views = @View(title = "错误日志"), - edit = @Edit(title = "错误日志", - type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String errorLog; - - @EruptField( -// views = @View(title = "预期结果"), - edit = @Edit(title = "预期结果", notNull = true, type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String expect; - - //TODO: 通过或者失败 - @EruptField( - views = @View(title = "用例执行结果"), - edit = @Edit(title = "用例执行结果", search = @Search) - ) - private boolean isSuccess; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestScenario.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestScenario.java deleted file mode 100644 index 4055930..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/ApiTestScenario.java +++ /dev/null @@ -1,157 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import io.fluentqa.base.model.ModelWithValidFlag; -import io.fluentqa.base.product.model.ProductModuleModel; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import 
xyz.erupt.annotation.sub_erupt.LinkTree; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; - -import javax.persistence.*; -import java.util.Set; - -/** - * - */ -@Entity -@Erupt(name = "接口测试用例", - power = @Power(export = true), - orderBy = "ApiTestScenario.updateTime desc", - linkTree = @LinkTree(field = "module"), - layout = @Layout( - tableLeftFixed = 3, - pageSize = 30)) -@Table(name = "api_test_scenario") -public class ApiTestScenario extends ModelWithValidFlag { - - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "所属模块", column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel module; - - - @EruptField( - views = @View( - title = "测试场景" - ), - edit = @Edit( - title = "测试场景", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String testScenario; - - @EruptField( - views = @View( - title = "测试场景详细描述" - ), - edit = @Edit( - title = "测试场景详细描述", - type = EditType.TEXTAREA, search = @Search, notNull = true - ) - ) - private String details; - - @EruptField( - views = @View( - title = "优先级" - ), - edit = @Edit( - title = "优先级", - type = EditType.TAGS, - search = @Search, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct key,detail from master_data where category_code = 'PRIORITY' order by 1 " - ) - ) - ) - private String priority = "P2"; - - @EruptField( - views = @View(title = "场景参数"), - edit = @Edit(title = "场景参数", - type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String scenarioParameters; - - @JoinTable(name = "api_test_scenario_steps", - joinColumns = @JoinColumn(name = "api_test_scenario_id", referencedColumnName = "id"), - inverseJoinColumns = @JoinColumn(name = "api_test_step_id", referencedColumnName = "id")) - @ManyToMany(fetch = FetchType.EAGER) - @EruptField( - views = @View(title = "包含用例"), - edit = @Edit( - title = "包含用例", - type = EditType.TAB_TABLE_REFER - ) - ) - private Set testSteps; - public String getPriority() { - return priority; - } - - public void setPriority(String priority) { - this.priority = priority; - } - - - public ProductModuleModel getModule() { - return module; - } - - public void setModule(ProductModuleModel module) { - this.module = module; - } - - public String getTestScenario() { - return testScenario; - } - - public void setTestScenario(String testScenario) { - this.testScenario = testScenario; - } - - public String getDetails() { - return details; - } - - public void setDetails(String details) { - this.details = details; - } - - public String getScenarioParameters() { - return scenarioParameters; - } - - public void setScenarioParameters(String scenarioParameters) { - this.scenarioParameters = scenarioParameters; - } - - public Set getTestSteps() { - return testSteps; - } - - public void setTestSteps(Set testSteps) { - this.testSteps = testSteps; - } -} \ No newline at end of file diff --git 
a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RawApiTestCase.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RawApiTestCase.java deleted file mode 100644 index 96441b0..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RawApiTestCase.java +++ /dev/null @@ -1,96 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import org.hibernate.annotations.DynamicUpdate; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.ViewType; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.Entity; -import javax.persistence.Table; - - -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "raw_api_cases") -@Erupt( - name = "接口测试用例生成", layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), power = @Power(importable = true, export = true), - orderBy = "RawApiTestCase.createTime " -) -@Data -public class RawApiTestCase extends MetaModel{ - @EruptField( - views = @View(title = "测试场景"), - edit = @Edit(title = "测试场景", search = @Search) - ) - private String scenario; - - @EruptField( - views = @View(title = "服务名称"), - edit = @Edit(title = "服务名称", notNull = true, search = @Search) - ) - private String serviceName; - - @EruptField( - views = @View(title = "API接口"), - edit = @Edit(title = "API接口", notNull = true, search = @Search) - ) - private String serviceMethod; - - @EruptField( - views = @View(title = "用例名称"), - edit = @Edit(title = "用例名称", notNull = true, search = @Search) - ) - private String name; - - @EruptField( - views = @View(title = "请求路径"), - edit = @Edit(title = "请求路径", notNull = true, search = @Search) - ) - private String uri; - - @EruptField( - views = @View(title = "请求方法"), - edit = @Edit(title = "请求方法", notNull = true) - ) - private String method = "POST"; - - @EruptField( - views = @View(title = "输入", type = ViewType.CODE), - edit = @Edit(title = "输入", - type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String input; - - @EruptField( - views = @View(title = "期望结果", type = ViewType.CODE), - edit = @Edit(title = "期望结果", type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "json")) - ) - private String expected; - - @EruptField( - views = @View(title = "优先级"), - edit = @Edit(title = "优先级") - ) - private String priority = "P1"; - - @EruptField( - views = @View(title = "是否运行"), - edit = @Edit(title = "是否运行") - ) - private boolean isRun = true; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApi.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApi.java deleted file mode 100644 index 239ca7a..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApi.java +++ /dev/null @@ -1,275 +0,0 @@ -package io.fluentqa.qtm.api.model; - -import cn.hutool.core.lang.UUID; -import io.fluentqa.qtm.api.handler.GenerateRawApiCaseHandler; -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import io.fluentqa.base.model.ModelWithValidFlag; -import lombok.Data; -import org.hibernate.annotations.DynamicInsert; -import 
org.hibernate.annotations.DynamicUpdate; -import org.hibernate.annotations.SQLDelete; -import org.hibernate.annotations.Where; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.RowOperation; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.ViewType; -import xyz.erupt.annotation.sub_field.sub_edit.*; -import xyz.erupt.toolkit.handler.SqlChoiceFetchHandler; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@DynamicUpdate -@DynamicInsert -@Entity -@Table(name = "remote_services") -@Erupt( - name = "远程服务清单", layout = @Layout( - tableLeftFixed = 3, - pageSize = 30), - power = @Power(export = true), - rowOperation = {@RowOperation( - title = "生成原始接口用例", - operationHandler = GenerateRawApiCaseHandler.class)} -) -@Data -@SQLDelete(sql = "update remote_services set valid=false where id=?") -@Where(clause = "valid = true") -public class RemoteApi extends ModelWithValidFlag { - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String name; - - @EruptField( - views = @View( - title = "产品" - ), - edit = @Edit( - title = "产品", - type = EditType.CHOICE, - desc = "获取产品", - choiceType = @ChoiceType( - fetchHandler = SqlChoiceFetchHandler.class, - fetchHandlerParams = "select id,name,details from products where valid =true and parent_id is NULL" - )) - ) - private Long productId; - - @EruptField( - views = @View(title = "模块名"), - edit = @Edit( - search = @Search, - title = "获取可选模块", - type = EditType.TAGS, - desc = "动态获取可选模块", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct module_name from remote_services where valid=true" - )) - ) - private String moduleName; - - @EruptField( - views = @View(title = "服务"), - edit = @Edit( - search = @Search, - title = "获取可选服务", - type = EditType.TAGS, - desc = "获取可选服务", - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct service_name from remote_services where valid=true" - )) - ) - private String serviceName; - - @EruptField( - views = @View(title = "地址"), - edit = @Edit(title = "地址", notNull = true) - ) - private String endpoint; - - @EruptField( - views = @View(title = "方法"), - edit = @Edit(title = "方法", notNull = true, search = @Search) - ) - private String serviceMethod; - - @EruptField( - views = @View(title = "http请求方法"), - edit = @Edit(title = "http请求方法", notNull = true) - ) - private String httpMethod; - - @EruptField( - views = @View(title = "请求报文", type = ViewType.CODE), - edit = @Edit(title = "请求报文", type = EditType.CODE_EDITOR, codeEditType = @CodeEditorType(language = "json")) - ) - private String body; - - @EruptField( - views = @View(title = "接口类型"), - edit = @Edit(title = "接口类型", search = @Search, - type = EditType.CHOICE, choiceType = @ChoiceType( - vl = {@VL(value = "API", label = "API"), @VL(value = "RPC", label = "RPC")} - )) - ) - private String type; - - @EruptField( - views = @View(title = "服务使用状态"), - edit = @Edit(title = "服务使用状态", - search = @Search, notNull = true, - type = EditType.CHOICE, choiceType = @ChoiceType( - vl = {@VL(value = "NEW", label = "新增"), - @VL(value = "IN_USE", label = "使用中"), - 
@VL(value = "UPDATE", label = "更新"), - @VL(value = "END_OF_LIFE", label = "已作废"),}))) - private String status; - - @EruptField( - views = @View( - title = "引入时版本" - ), - edit = @Edit( - title = "引入时版本", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String addedVersion; - - @EruptField( - views = @View( - title = "最新版本" - ), - edit = @Edit( - title = "最新版本", - type = EditType.INPUT, search = @Search, notNull = true, - inputType = @InputType - ) - ) - private String latestVersion; - - @EruptField( - views = @View(show = false, title = "uid") - ) - private String uId = UUID.fastUUID().toString(); - - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public Long getProductId() { - return productId; - } - - public void setProductId(Long productId) { - this.productId = productId; - } - - public String getModuleName() { - return moduleName; - } - - public void setModuleName(String moduleName) { - this.moduleName = moduleName; - } - - public String getServiceName() { - return serviceName; - } - - public void setServiceName(String serviceName) { - this.serviceName = serviceName; - } - - public String getEndpoint() { - return endpoint; - } - - public void setEndpoint(String endpoint) { - this.endpoint = endpoint; - } - - public String getServiceMethod() { - return serviceMethod; - } - - public void setServiceMethod(String serviceMethod) { - this.serviceMethod = serviceMethod; - } - - public String getHttpMethod() { - return httpMethod; - } - - public void setHttpMethod(String httpMethod) { - this.httpMethod = httpMethod; - } - - public String getBody() { - return body; - } - - public void setBody(String body) { - this.body = body; - } - - public String getType() { - return type; - } - - public void setType(String type) { - this.type = type; - } - - public String getStatus() { - return status; - } - - public void setStatus(String status) { - this.status = status; - } - - public String getAddedVersion() { - return addedVersion; - } - - public void setAddedVersion(String addedVersion) { - this.addedVersion = addedVersion; - } - - public String getLatestVersion() { - return latestVersion; - } - - public void setLatestVersion(String latestVersion) { - this.latestVersion = latestVersion; - } - - public String getuId() { - return uId; - } - - public void setuId(String uId) { - this.uId = uId; - } -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiStatus.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiStatus.java deleted file mode 100644 index 7803e71..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiStatus.java +++ /dev/null @@ -1,5 +0,0 @@ -package io.fluentqa.qtm.api.model; - -public enum RemoteApiStatus { - NEW,IN_USE,UPDATED,END_OF_LIFE -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiType.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiType.java deleted file mode 100644 index 77bcba5..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/model/RemoteApiType.java +++ /dev/null @@ -1,5 +0,0 @@ -package io.fluentqa.qtm.api.model; - -public enum RemoteApiType { - API,RPC -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/package-info.java deleted file mode 100644 
index 60e98c6..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.qtm.api; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiMonitorRecordRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiMonitorRecordRepo.java deleted file mode 100644 index fc3cf8a..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiMonitorRecordRepo.java +++ /dev/null @@ -1,14 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.ApiMonitorRecord; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -import java.util.List; - -@Repository -public interface ApiMonitorRecordRepo extends JpaRepository, JpaSpecificationExecutor { - - List findApiMonitorRecordByPath(String path); -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecChangeRepository.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecChangeRepository.java deleted file mode 100644 index f186062..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecChangeRepository.java +++ /dev/null @@ -1,11 +0,0 @@ -package io.fluentqa.qtm.api.repo; - - -import io.fluentqa.qtm.api.model.ApiSpecChangeModel; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository -public interface ApiSpecChangeRepository extends JpaRepository { - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecGitRepoRepository.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecGitRepoRepository.java deleted file mode 100644 index 64af875..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecGitRepoRepository.java +++ /dev/null @@ -1,10 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.ApiSpecGitRepoModel; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository -public interface ApiSpecGitRepoRepository extends JpaRepository { - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecVersionRepository.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecVersionRepository.java deleted file mode 100644 index 5f8c1bd..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiSpecVersionRepository.java +++ /dev/null @@ -1,10 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.ApiSpecVersionModel; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository -public interface ApiSpecVersionRepository extends JpaRepository { - -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestScenarioRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestScenarioRepo.java deleted file mode 100644 index d0c358a..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestScenarioRepo.java +++ /dev/null @@ -1,10 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.ApiTestScenario; -import 
org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -@Repository -public interface ApiTestScenarioRepo extends JpaRepository, JpaSpecificationExecutor { -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestStepRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestStepRepo.java deleted file mode 100644 index 82cac02..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/ApiTestStepRepo.java +++ /dev/null @@ -1,10 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.ApiStep; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -@Repository -public interface ApiTestStepRepo extends JpaRepository, JpaSpecificationExecutor { -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RawApiTestCaseRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RawApiTestCaseRepo.java deleted file mode 100644 index cd2e625..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RawApiTestCaseRepo.java +++ /dev/null @@ -1,10 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.RawApiTestCase; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -@Repository -public interface RawApiTestCaseRepo extends JpaRepository, JpaSpecificationExecutor { -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RemoteServiceRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RemoteServiceRepo.java deleted file mode 100644 index 5213fd1..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/repo/RemoteServiceRepo.java +++ /dev/null @@ -1,23 +0,0 @@ -package io.fluentqa.qtm.api.repo; - -import io.fluentqa.qtm.api.model.RemoteApi; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.data.jpa.repository.JpaSpecificationExecutor; -import org.springframework.stereotype.Repository; - -import java.util.List; -import java.util.Optional; - -@Repository -public interface RemoteServiceRepo extends JpaRepository, JpaSpecificationExecutor { - - Optional findRemoteApiByEndpointAndServiceNameAndServiceMethod( - String endpoint,String serviceName,String serviceMethod - ); - - Optional> findRemoteApiByModuleNameAndServiceNameAndLatestVersionNot( - String moduleName,String serviceName,String latestVersion - ); - - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/ApiTestCaseService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/ApiTestCaseService.java deleted file mode 100644 index 018a7e4..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/ApiTestCaseService.java +++ /dev/null @@ -1,107 +0,0 @@ -package io.fluentqa.qtm.api.service; - -import cn.hutool.core.bean.BeanUtil; -import cn.hutool.core.util.StrUtil; -import io.fluent.builtin.CollectionsUtils; -import io.fluentqa.qtm.api.model.*; -import io.fluentqa.qtm.api.repo.ApiMonitorRecordRepo; -import io.fluentqa.qtm.api.repo.ApiTestStepRepo; -import io.fluentqa.qtm.api.repo.RawApiTestCaseRepo; -import org.springframework.beans.BeanUtils; -import 
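
`RemoteServiceRepo` relies on Spring Data's derived-query naming. For readability, its first finder is roughly equivalent to the explicit `@Query` below; this is a sketch, and the JPQL and parameter binding are assumptions based on the entity fields shown above:

```java
// Approximate JPQL equivalent of
// findRemoteApiByEndpointAndServiceNameAndServiceMethod, written out for clarity.
@Query("select r from RemoteApi r where r.endpoint = :endpoint "
        + "and r.serviceName = :serviceName and r.serviceMethod = :serviceMethod")
Optional<RemoteApi> findByEndpointServiceNameAndMethod(@Param("endpoint") String endpoint,
                                                       @Param("serviceName") String serviceName,
                                                       @Param("serviceMethod") String serviceMethod);
```
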
org.springframework.stereotype.Service; - -import javax.annotation.Resource; -import java.util.ArrayList; -import java.util.List; -import java.util.function.Function; - -@Service -public class ApiTestCaseService { - private final String DEFAULT_EXPECTED = "{ \"status_code\":200,\"values\":{}\n" + - "}"; - @Resource - RawApiTestCaseRepo rawApiTestCaseRepo; - - @Resource - ApiMonitorRecordRepo apiMonitorRecordRepo; - - @Resource - ApiTestStepRepo apiTestStepRepo; - - - /** - * @param services: HTTP API Services - * 1. Merge All HttpAPI Services if method,service and request-body/input is same - */ - public void convertToRawTestCase(List services) { - List cases = new ArrayList<>(); - services.forEach(service -> { - RawApiTestCase apiCase = new RawApiTestCase(); - BeanUtils.copyProperties(service, apiCase); - apiCase.setUri(service.getEndpoint()); - apiCase.setExpected(DEFAULT_EXPECTED); - //get body - String path = service.getEndpoint().replaceAll("https://\\{\\{base_url\\}\\}", ""); - List result = apiMonitorRecordRepo.findApiMonitorRecordByPath(path); - if (!result.isEmpty()) { - apiCase.setInput(result.get(0).getRequestBody()); - } else { - apiCase.setInput(service.getBody()); - } - cases.add(apiCase); - }); - rawApiTestCaseRepo.saveAll(cases); - } - - /** - * @param records Http traffic - * 1. Merge All HttpAPI Services if method,service and request-body/input is same - */ - public void convertApiMonitorRecordToTestCase(List records) { - List cases = new ArrayList<>(); - List result = CollectionsUtils.filterToReduceRedundant( - records, new Function() { - @Override - public String apply(ApiMonitorRecord apiMonitorRecord) { - return StrUtil.join( - "-", apiMonitorRecord.getApi(), apiMonitorRecord.getApp(), - apiMonitorRecord.getService(), apiMonitorRecord.getPath(), - apiMonitorRecord.getMethod(), apiMonitorRecord.getRequestBody() - ); - } - } - ); - result.forEach(service -> { - RawApiTestCase apiCase = new RawApiTestCase(); - BeanUtils.copyProperties(service, apiCase); - apiCase.setUri(service.getPath()); - apiCase.setExpected(DEFAULT_EXPECTED); - apiCase.setServiceMethod(service.getMethod()); - apiCase.setServiceName(service.getService()); - apiCase.setName(service.getService()); - apiCase.setServiceMethod(service.getApi()); - //get body - apiCase.setInput(service.getRequestBody()); - apiCase.setScenario(service.getRecordName()); - cases.add(apiCase); - }); - rawApiTestCaseRepo.saveAll(cases); - } - - /** - * Only Passed Test Scenario could be converted into test steps - * - * @param data - */ - public void convertApiTestResultToApiTestStep(List data) { - List apiTestSteps = new ArrayList<>(); - data.forEach((apiTestRecord) -> { - if (apiTestRecord.isSuccess()) { - apiTestSteps.add(BeanUtil.copyProperties(apiTestRecord, ApiStep.class)); - } else { - throw new RuntimeException("some cases are failed, can't convert to api test step"); - } - }); - apiTestStepRepo.saveAll(apiTestSteps); - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/RemoteApiService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/RemoteApiService.java deleted file mode 100644 index 91fcd73..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/api/service/RemoteApiService.java +++ /dev/null @@ -1,142 +0,0 @@ -package io.fluentqa.qtm.api.service; - -import cn.hutool.core.bean.BeanUtil; -import io.fluent.postman.PostmanParser; -import io.fluent.postman.model.PostmanCollection; -import io.fluent.postman.model.PostmanItem; -import 
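
`convertApiMonitorRecordToTestCase` above collapses duplicate captured traffic with `CollectionsUtils.filterToReduceRedundant`, keyed on api, app, service, path, method and request body. A self-contained sketch of that kind of "first record per key" filter; that the `io.fluent.builtin` helper behaves exactly like this is an assumption:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public final class DistinctByKey {

    // Keep the first element seen for each key, preserving input order —
    // a sketch of what filterToReduceRedundant is assumed to do.
    public static <T, K> List<T> distinctBy(List<T> items, Function<T, K> keyFn) {
        Map<K, T> firstPerKey = new LinkedHashMap<>();
        for (T item : items) {
            firstPerKey.putIfAbsent(keyFn.apply(item), item);
        }
        return new ArrayList<>(firstPerKey.values());
    }
}
```

With the key function used in the service (api, app, service, path, method and request body joined with "-"), two captures of the same call with the same payload yield a single raw test case.
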
io.fluentqa.qtm.api.model.ApiSpecVersionModel; -import io.fluentqa.qtm.api.model.RemoteApi; -import io.fluentqa.qtm.api.model.RemoteApiStatus; -import io.fluentqa.qtm.api.repo.RemoteServiceRepo; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.base.product.service.ProductModuleService; -import lombok.extern.slf4j.Slf4j; -import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.stereotype.Service; - -import javax.transaction.Transactional; -import java.util.List; -import java.util.Optional; - -@Service -@Slf4j -public class RemoteApiService { - - @Autowired - private ProductModuleService productMetaService; - - @Autowired - private RemoteServiceRepo remoteServiceRepo; - - - /** - * 1. 解析postman 文件 - * 2. 确认接口是 - * - 新增:NEW: 当前记录无记录,则更新 - * - 更新:UPDATED: 当前记录中有记录,但是接口定义发生变化,比如请求内容 - * - 使用中,IN_USE: - * - 移除:END_OF_LIFE: 接口不在最新清单中 - * 3. - * - * @param apiSpec - */ - @Transactional - public void apiSpecToApiList(ApiSpecVersionModel apiSpec, String updater) { - ProductModuleModel productMeta = productMetaService.createApiModuleIfNotExist(apiSpec.getName(), updater); - if (apiSpec.getSpec().isEmpty()) return; - PostmanCollection collection = PostmanParser.create().toPostmanCollection(apiSpec.getSpec()); - for (PostmanItem postmanItem : collection.getItem()) { - for (PostmanItem item : postmanItem.getItem()) { - RemoteApi rs = toRemoteApi(apiSpec, productMeta, postmanItem, item); - createOrUpdateRemoteApi(rs, apiSpec); - } - updateStatusToEndOfLife(apiSpec, postmanItem); - } - } - - @Transactional - public void apiSpecsToApiList(List apiSpecs, String updater) { - for (ApiSpecVersionModel apiSpec : apiSpecs) { - try { - this.apiSpecToApiList(apiSpec, updater); - } catch (Exception e) { - log.error("%s-api-failed,error=%s".formatted( - apiSpec.getName(), e.getMessage() - )); - } - } - } - - private RemoteApi toRemoteApi(ApiSpecVersionModel apiSpec, ProductModuleModel productMeta, - PostmanItem postmanItem, PostmanItem item) { - RemoteApi rs = new RemoteApi(); - rs.setName(item.getName()); - rs.setServiceName(postmanItem.getName()); - rs.setServiceMethod(item.getName()); - rs.setBody(item.getRequest().getBody().get("raw").toString()); - rs.setHttpMethod(item.getRequest().getMethod()); - rs.setEndpoint(item.getRequest().getUrl().getRaw()); - rs.setType(apiSpec.getServiceType()); - rs.setModuleName(apiSpec.getName()); - rs.setProductId(productMeta.getParent().getId()); - return rs; - } - - /** - * create or update 基本可以使用同一种方式处理 - * 1. 输入实体 - * 2. 字段检验规则 - * 3. 判断重复确认条件 - * 4. 
更新字段处理,保存记录 - * TODO: try to integrate with Feishu - * - * @param newApi - */ - public void createOrUpdateRemoteApi(RemoteApi newApi, ApiSpecVersionModel apiSpec) { - Optional api = remoteServiceRepo.findRemoteApiByEndpointAndServiceNameAndServiceMethod( - newApi.getEndpoint(), - newApi.getServiceName(), - newApi.getServiceMethod()); - if (api.isEmpty()) { - newApi.setStatus(RemoteApiStatus.NEW.toString()); - newApi.setAddedVersion(apiSpec.getAppVersion()); - newApi.setLatestVersion(apiSpec.getAppVersion()); - remoteServiceRepo.save(newApi); - } else { - RemoteApi existApi = api.get(); - BeanUtil.copyProperties(newApi, existApi, "id"); - //TODO: 如何确认接口变更-暂时不确认 - if (existApi.getBody().equalsIgnoreCase(newApi.getBody())) { - existApi.setStatus(RemoteApiStatus.IN_USE.toString()); - existApi.setLatestVersion(apiSpec.getAppVersion()); - } else { - existApi.setStatus(RemoteApiStatus.UPDATED.toString()); - existApi.setLatestVersion(apiSpec.getAppVersion()); - } - remoteServiceRepo.save(existApi); - } - } - - public void updateStatusToEndOfLife(ApiSpecVersionModel apiSpec, PostmanItem postmanItem) { - Optional> apiListOptional = remoteServiceRepo.findRemoteApiByModuleNameAndServiceNameAndLatestVersionNot( - apiSpec.getName(), - postmanItem.getName(), - apiSpec.getAppVersion() - ); - if (apiListOptional.isEmpty()) { - log.info("没有需要删除的数据"); - } else { - List apiList = apiListOptional.get(); - for (RemoteApi api : apiList) { - api.setStatus(RemoteApiStatus.END_OF_LIFE.toString()); - remoteServiceRepo.save(api); - log.info("有需要删除的数据"); - } - - } - } - - -} - - diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/package-info.java deleted file mode 100644 index 65ca3a8..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.qtm.tc.pm; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOption.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOption.java deleted file mode 100644 index 8a0f6ee..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOption.java +++ /dev/null @@ -1,158 +0,0 @@ -package io.fluentqa.qtm.pm.requirement; - - -import io.fluentqa.base.model.ModelWithValidFlag; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.fun.ChoiceFetchHandler; -import xyz.erupt.annotation.fun.VLModel; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.ChoiceType; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; - -import javax.persistence.Entity; -import javax.persistence.Table; -import java.util.ArrayList; -import java.util.List; - -@Entity -@Erupt(name = "字段选项", - power = @Power(export = true, importable = true), - orderBy = "FieldOption.updateTime desc") -@Table(name = "field_option") -public class FieldOption extends ModelWithValidFlag implements ChoiceFetchHandler { - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称",notNull = true - ) - ) - private String name; - @EruptField( - views = @View( - title = "字段code" - ), - edit = @Edit( - title = "字段code" - ) - ) - private String code; - - @EruptField( - views = @View(title = "编辑类型"), - edit = 
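
The Javadoc on `createOrUpdateRemoteApi` notes that create-or-update can always follow the same four steps: take the entity, validate it, check the duplicate-detection condition, then update the tracked fields and save. A generic sketch of that shape; the helper and its functional parameters are hypothetical and not something the deleted service actually defines:

```java
import java.util.Optional;
import java.util.function.BiConsumer;
import java.util.function.Consumer;
import java.util.function.Supplier;

public final class Upsert {

    // Generic create-or-update following the four steps from the comment:
    // input entity, validation, duplicate lookup, field update + save.
    public static <T> void upsert(T incoming,
                                  Consumer<T> validate,
                                  Supplier<Optional<T>> findExisting,
                                  BiConsumer<T, T> copyOnto,
                                  Consumer<T> save) {
        validate.accept(incoming);
        Optional<T> existing = findExisting.get();
        if (existing.isEmpty()) {
            save.accept(incoming);             // no duplicate: insert the new record
        } else {
            T target = existing.get();
            copyOnto.accept(incoming, target); // duplicate: refresh the tracked fields
            save.accept(target);
        }
    }
}
```

`createOrUpdateRemoteApi` is this pattern specialised to `RemoteApi`: the duplicate lookup is keyed on endpoint, service name and service method, and the status and version fields are refreshed before saving.
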
@Edit(title = "编辑类型", - notNull = true, type = EditType.CHOICE, - choiceType = @ChoiceType(type = ChoiceType.Type.RADIO, fetchHandler = FieldOption.class)) - ) - private String type; - - @EruptField( - views = @View( - title = "是否可以为空" - ), - edit = @Edit( - title = "是否可以为空", - type = EditType.BOOLEAN - ) - ) - private boolean notNull; - @EruptField( - views = @View( - title = "字段约束条件" - ), - edit = @Edit( - title = "字段约束条件", - type = EditType.CODE_EDITOR, notNull = true, - codeEditType = @CodeEditorType(language = "text"), - desc = "字段约束条件" - ) - ) - private String constrains; - - @EruptField( - views = @View( - title = "和其他业务关系" - ), - edit = @Edit( - title = "和其他业务关系", - type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String relatedTo; - - @EruptField( - views = @View(title = "显示顺序", sortable = true), - edit = @Edit(title = "显示顺序", notNull = true) - ) - private Integer sort; - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public String getCode() { - return code; - } - - public void setCode(String code) { - this.code = code; - } - - public String getType() { - return type; - } - - public void setType(String type) { - this.type = type; - } - - public boolean isNotNull() { - return notNull; - } - - public void setNotNull(boolean notNull) { - this.notNull = notNull; - } - - public String getConstrains() { - return constrains; - } - - public void setConstrains(String constrains) { - this.constrains = constrains; - } - - public String getRelatedTo() { - return relatedTo; - } - - public void setRelatedTo(String relatedTo) { - this.relatedTo = relatedTo; - } - - public Integer getSort() { - return sort; - } - - public void setSort(Integer sort) { - this.sort = sort; - } - - @Override - public List fetch(String[] params) { - List list = new ArrayList<>(); - for (FieldOptionType value : FieldOptionType.values()) { - list.add(new VLModel(value.name(), value.getDesc())); - } - return list; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOptionType.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOptionType.java deleted file mode 100644 index 44c921c..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/FieldOptionType.java +++ /dev/null @@ -1,13 +0,0 @@ -package io.fluentqa.qtm.pm.requirement; - -import lombok.Getter; - -@Getter -public enum FieldOptionType { - NUMBER("数值"),STRING("字符串"),RELATION("关联"),DATE("日期"),ENUM("枚举"),BOOLEAN("布尔"); - - private String desc; - FieldOptionType(String desc) { - this.desc = desc; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementFeature.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementFeature.java deleted file mode 100644 index a9432f7..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementFeature.java +++ /dev/null @@ -1,34 +0,0 @@ -package io.fluentqa.qtm.pm.requirement; - -import io.fluentqa.base.model.ModelWithValidFlag; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; - -import javax.persistence.Entity; -import javax.persistence.Table; - -@Entity 
-@Erupt(name = "需求功能点", - power = @Power(export = true, importable = true), - orderBy = "RequirementFeature.updateTime desc") -@Table(name = "requirement_features") -public class RequirementFeature extends ModelWithValidFlag { - - @EruptField( - views = @View( - title = "业务功能相关说明" - ), - edit = @Edit( - title = "业务功能相关说明", - type = EditType.CODE_EDITOR, notNull = true, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String feature; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementType.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementType.java deleted file mode 100644 index d960f62..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/RequirementType.java +++ /dev/null @@ -1,15 +0,0 @@ -package io.fluentqa.qtm.pm.requirement; - -import lombok.Getter; - -@Getter -public enum RequirementType { - CREATE("创建"),UPDATE("更新"),DELETE("删除/归档"),SEARCH("查询"), COMPLEX("复杂业务"), - WORKFLOW("工作流"), - REPORT("报表"),OTHER("其他"); - - private String desc; - RequirementType(String desc) { - this.desc = desc; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/TestRequirement.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/TestRequirement.java deleted file mode 100644 index 6f8bd08..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/pm/requirement/TestRequirement.java +++ /dev/null @@ -1,192 +0,0 @@ -package io.fluentqa.qtm.pm.requirement; - - -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import io.fluentqa.base.model.ModelWithValidFlag; -import io.fluentqa.base.product.model.ProductModuleModel; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.fun.ChoiceFetchHandler; -import xyz.erupt.annotation.fun.VLModel; -import xyz.erupt.annotation.sub_erupt.LinkTree; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.ChoiceType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.annotation.sub_field.sub_edit.TagsType; - -import javax.persistence.*; -import java.util.ArrayList; -import java.util.List; -import java.util.Set; - - -@Entity -@Erupt(name = "测试需求管理", - power = @Power(export = true, importable = true), - orderBy = "TestRequirement.updateTime desc", - linkTree = @LinkTree(field = "module")) -@Table(name = "test_requirements") -public class TestRequirement extends ModelWithValidFlag implements ChoiceFetchHandler { - - @EruptField( - views = @View(title = "需求概述"), - edit = @Edit(title = "需求概述") - ) - private String summary; - - @EruptField( - views = @View(title = "需求类型"), - edit = @Edit(title = "需求类型", - notNull = true, type = EditType.CHOICE, - choiceType = @ChoiceType(type = ChoiceType.Type.RADIO, - fetchHandler = TestRequirement.class)) - ) - private String type; - - @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true) - @JoinColumn(name = "test_req_id") - @EruptField( - edit = @Edit(title = "功能点", type = EditType.TAB_TABLE_ADD) - ) - private Set features; - - - @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true) - @JoinColumn(name = "test_req_id") - @OrderBy("sort") - @EruptField( - edit = @Edit(title = "字段管理", type = EditType.TAB_TABLE_ADD) - ) - private Set 
fieldOptions; - - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "所属模块", column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel module; - - @EruptField( - views = @View( - title = "提示词" - ), - edit = @Edit( - title = "提示词" - ) - ) - private String prompts; - - @EruptField( - views = @View( - title = "优先级" - ), - edit = @Edit( - search = @Search, - title = "优先级", - type = EditType.TAGS, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct key,detail from master_data where category_code = 'PRIORITY' order by 1 " - ) - ) - ) - private String priority = "P2"; - - @EruptField( - views = @View( - title = "需求状态" - ), - edit = @Edit( - title = "需求状态" - ) - ) - private String status; - - @Override - public List fetch(String[] params) { - List list = new ArrayList<>(); - for (RequirementType value : RequirementType.values()) { - list.add(new VLModel(value.name(), value.getDesc())); - } - return list; - } - - public String getPriority() { - return priority; - } - - public void setPriority(String priority) { - this.priority = priority; - } - - - public String getPrompts() { - return prompts; - } - - public void setPrompts(String prompts) { - this.prompts = prompts; - } - - - public Set getFieldOptions() { - return fieldOptions; - } - - public void setFieldOptions(Set fieldOptions) { - this.fieldOptions = fieldOptions; - } - - - public String getType() { - return type; - } - - public void setType(String type) { - this.type = type; - } - - public Set getFeatures() { - return features; - } - - public void setFeatures(Set features) { - this.features = features; - } - - public String getSummary() { - return summary; - } - - public void setSummary(String summary) { - this.summary = summary; - } - - public String getStatus() { - return status; - } - - public void setStatus(String status) { - this.status = status; - } - - public ProductModuleModel getModule() { - return module; - } - - public void setModule(ProductModuleModel module) { - this.module = module; - } -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/dto/TestCaseDTO.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/dto/TestCaseDTO.java deleted file mode 100644 index bc1c15f..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/dto/TestCaseDTO.java +++ /dev/null @@ -1,28 +0,0 @@ -package io.fluentqa.qtm.tc.dto; - -import com.github.crab2died.annotation.ExcelField; -import lombok.Data; - -@Data -public class TestCaseDTO { - @ExcelField(title = "产品名称") - private String productName; - @ExcelField(title = "模块名称") - private String moduleName; - @ExcelField(title = "功能点") - private String feature; - @ExcelField(title = "用例描述") - private String summary; - @ExcelField(title = "优先级") - private String priority = "P2"; //check it - @ExcelField(title = "用例前提条件") - private String precondition; - @ExcelField(title = "测试步骤") - private String steps; - @ExcelField(title = "期望结果") - private String expectedResult; - - @ExcelField(title = "用例ID") - private String uuid; - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/handlers/GenerateTestRecordHandler.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/handlers/GenerateTestRecordHandler.java deleted file mode 
100644 index f5b2fe9..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/handlers/GenerateTestRecordHandler.java +++ /dev/null @@ -1,49 +0,0 @@ -package io.fluentqa.qtm.tc.handlers; - - -import cn.hutool.core.bean.BeanUtil; -import io.fluentqa.qtm.tc.model.TestCase; -import io.fluentqa.qtm.tc.model.TestResult; -import io.fluentqa.qtm.tc.model.TestRun; -import io.fluentqa.qtm.tc.model.TestScenario; -import io.fluentqa.qtm.tc.repo.TestResultRepo; - -import org.springframework.stereotype.Service; -import xyz.erupt.annotation.fun.OperationHandler; - - -import javax.annotation.Resource; -import java.util.List; - -@Service -public class GenerateTestRecordHandler implements OperationHandler { - @Resource - private TestResultRepo testResultRepo; - - @Override - public String exec(List data, Void unused, String[] param) { - for (TestRun testRun : data) { - //get all test cases - for (TestCase testCase : testRun.getTestCases()) { - TestResult result = BeanUtil.copyProperties(testCase, TestResult.class); - result.setTestCaseUUID(testCase.getUuid()); - result.setTestRun(testRun); - result.setQaOwner(testRun.getTestOwner()); - testResultRepo.save(result); - } - for (TestScenario tc : testRun.getTestScenarios()) { - for (TestCase testCase : tc.getTestCases()) { - TestResult result = BeanUtil.copyProperties(testCase, TestResult.class); - result.setTestCaseUUID(testCase.getUuid()); - result.setTestRun(testRun); - result.setQaOwner(testRun.getTestOwner()); - result.setTestScenario(tc.getName()); - testResultRepo.save(result); - } - } - } - return null; - } - - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestCase.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestCase.java deleted file mode 100644 index 4b8a23f..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestCase.java +++ /dev/null @@ -1,238 +0,0 @@ -package io.fluentqa.qtm.tc.model; - -import io.fluentqa.base.handlers.SqlTagFetchHandler; -import io.fluentqa.base.model.ModelWithValidFlagVo; -import io.fluentqa.base.product.model.ProductModuleModel; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Layout; -import xyz.erupt.annotation.sub_erupt.LinkTree; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.*; - -import javax.persistence.Entity; -import javax.persistence.JoinColumn; -import javax.persistence.ManyToOne; -import javax.persistence.Table; - -/** - * - */ -@Entity -@Erupt(name = "测试用例", - power = @Power(export = true), - orderBy = "TestCase.updateTime desc", - linkTree = @LinkTree(field = "module"),layout = @Layout( - tableLeftFixed = 3, - pageSize = 30)) -@Table(name = "test_cases") -public class TestCase extends ModelWithValidFlagVo { - - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "所属模块",column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel module; - - @ManyToOne - @JoinColumn(name = "parent_product_id") - @EruptField( - views = @View(title = "父模块",column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = 
EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel parent; - - @ManyToOne - @JoinColumn(name = "root_product_id") - @EruptField( - views = @View(title = "所属产品",column = "details"), - edit = @Edit( - notNull = true, - search = @Search, - title = "产品模块选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType(id = "id", label = "name", - pid = "parent.id")) - ) - private ProductModuleModel product; - - @EruptField( - views = @View( - title = "功能点" - ), - edit = @Edit( - title = "功能点", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String feature; - @EruptField( - views = @View( - title = "用例描述" - ), - edit = @Edit( - title = "用例描述", - type = EditType.INPUT, notNull = true - ) - ) - private String summary; - - @EruptField( - views = @View( - title = "优先级" - ), - edit = @Edit( - title = "优先级", - type = EditType.TAGS, - search = @Search, - tagsType = @TagsType( - fetchHandler = SqlTagFetchHandler.class, - fetchHandlerParams = "select distinct key,detail from master_data where category_code = 'PRIORITY' order by 1 " - ) - ) - ) - private String priority = "P2"; - - - @EruptField( - views = @View( - title = "测试步骤" - ), - edit = @Edit( - title = "测试步骤", - type = EditType.CODE_EDITOR, notNull = true, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String steps; - @EruptField( - views = @View( - title = "期望结果" - ), - edit = @Edit( - title = "期望结果", - type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String expectedResult; - - @EruptField( - views = @View( - title = "用例ID" - ) - ) - private String uuid; - @EruptField( - views = @View( - title = "用例前提条件" - ), - edit = @Edit( - title = "用例前提条件", - type = EditType.CODE_EDITOR, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String precondition; - - public String getFeature() { - return feature; - } - - public void setFeature(String feature) { - this.feature = feature; - } - - public String getSummary() { - return summary; - } - - public void setSummary(String summary) { - this.summary = summary; - } - - public String getPriority() { - return priority; - } - - public void setPriority(String priority) { - this.priority = priority; - } - - public String getSteps() { - return steps; - } - - public void setSteps(String steps) { - this.steps = steps; - } - - public String getExpectedResult() { - return expectedResult; - } - - public void setExpectedResult(String expectedResult) { - this.expectedResult = expectedResult; - } - - public String getUuid() { - return uuid; - } - - public void setUuid(String uuid) { - this.uuid = uuid; - } - - public String getPrecondition() { - return precondition; - } - - public void setPrecondition(String precondition) { - this.precondition = precondition; - } - - - public ProductModuleModel getModule() { - return module; - } - - public void setModule(ProductModuleModel module) { - this.module = module; - } - - public ProductModuleModel getParent() { - return parent; - } - - public void setParent(ProductModuleModel parent) { - this.parent = parent; - } - - public ProductModuleModel getProduct() { - return product; - } - - public void setProduct(ProductModuleModel product) { - this.product = product; - } -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestResult.java 
b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestResult.java deleted file mode 100644 index bdb52e0..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestResult.java +++ /dev/null @@ -1,206 +0,0 @@ -package io.fluentqa.qtm.tc.model; - - -import io.fluentqa.base.model.ModelWithValidFlagVo; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.LinkTree; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.CodeEditorType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; - -import javax.persistence.Entity; -import javax.persistence.ManyToOne; -import javax.persistence.Table; - - -@Table(name = "test_results") -@Entity -@Erupt(name = "测试执行结果", - power = @Power(importable = true, export = true) - , linkTree = @LinkTree(field = "testRun") -) -public class TestResult extends ModelWithValidFlagVo { - @ManyToOne - @EruptField( - views = @View(title = "所属测试安排", column = "name"), - edit = @Edit(title = "所属测试安排", type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(pid = "parent.id", expandLevel = 2)) - ) - private TestRun testRun; - - @EruptField( - views = @View( - title = "测试场景" - ), - edit = @Edit( - title = "测试场景", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String testScenario; - @EruptField( - views = @View( - title = "功能点" - ), - edit = @Edit( - title = "功能点", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String feature; - @EruptField( - views = @View( - title = "用例描述" - ), - edit = @Edit( - title = "用例描述", - type = EditType.INPUT, notNull = true - ) - ) - private String summary; - @EruptField( - views = @View( - title = "用例优先级" - ), - edit = @Edit( - title = "用例优先级", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String priority = "P2"; //check it - @EruptField( - views = @View( - title = "测试步骤" - ), - edit = @Edit( - title = "测试步骤", - type = EditType.CODE_EDITOR, notNull = true, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String steps; - @EruptField( - views = @View( - title = "用例期望结果" - ), - edit = @Edit( - title = "用例期望结果", - type = EditType.CODE_EDITOR, notNull = true, - codeEditType = @CodeEditorType(language = "text") - ) - ) - private String expectedResult; - - - @EruptField( - views = @View( - title = "测试测试结果" - ), - edit = @Edit( - title = "测试测试结果", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String qaTestResult; - - - @EruptField( - views = @View( - title = "测试负责人" - ), - edit = @Edit( - title = "测试负责人", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String qaOwner; - - private String testCaseUUID; - - - public String getTestCaseUUID() { - return testCaseUUID; - } - - public void setTestCaseUUID(String testCaseUUID) { - this.testCaseUUID = testCaseUUID; - } - - public String getFeature() { - return feature; - } - - public void setFeature(String feature) { - this.feature = feature; - } - - public String getSummary() { - return summary; - } - - public void setSummary(String summary) { - this.summary = summary; - } - - public String getPriority() { - return priority; - } - - public void setPriority(String priority) { - this.priority = 
priority; - } - - public String getSteps() { - return steps; - } - - public void setSteps(String steps) { - this.steps = steps; - } - - public String getExpectedResult() { - return expectedResult; - } - - public void setExpectedResult(String expectedResult) { - this.expectedResult = expectedResult; - } - - - public String getQaTestResult() { - return qaTestResult; - } - - public void setQaTestResult(String qaTestResult) { - this.qaTestResult = qaTestResult; - } - - public String getQaOwner() { - return qaOwner; - } - - public void setQaOwner(String qaOwner) { - this.qaOwner = qaOwner; - } - - public TestRun getTestRun() { - return testRun; - } - - public void setTestRun(TestRun testRun) { - this.testRun = testRun; - } - - public String getTestScenario() { - return testScenario; - } - - public void setTestScenario(String testScenario) { - this.testScenario = testScenario; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestRun.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestRun.java deleted file mode 100644 index e054be6..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestRun.java +++ /dev/null @@ -1,251 +0,0 @@ -package io.fluentqa.qtm.tc.model; - - -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.qtm.tc.handlers.GenerateTestRecordHandler; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_erupt.RowOperation; -import xyz.erupt.annotation.sub_erupt.Tree; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; -import xyz.erupt.annotation.sub_field.sub_edit.BoolType; -import xyz.erupt.annotation.sub_field.sub_edit.ReferenceTreeType; -import xyz.erupt.annotation.sub_field.sub_edit.Search; -import xyz.erupt.jpa.model.MetaModel; - -import javax.persistence.*; -import java.time.LocalDate; -import java.util.Set; - -//TODO: Filter By Status -//TODO: input by Uploaded File or File Sync -@Entity -@Table(name = "test_runs") -@Erupt(name = "测试执行计划", - power = @Power(importable = true, export = true), - tree = @Tree(id = "id", label = "name", pid = "parent.id") - ,rowOperation = {@RowOperation( - title = "生成执行测试用例", - operationHandler = GenerateTestRecordHandler.class)} -) - -public class TestRun extends MetaModel { - - @ManyToOne - @JoinColumn(name = "product_id") - @EruptField( - views = @View(title = "产品名称", column = "name"), - edit = @Edit( - search = @Search, - title = "产品选择", - type = EditType.REFERENCE_TREE, - desc = "动态获取产品", - referenceTreeType = @ReferenceTreeType( - pid = "parent.id")) - ) - private ProductModuleModel product; - - @ManyToOne - @EruptField( - edit = @Edit( - title = "父级测试安排", - type = EditType.REFERENCE_TREE, - referenceTreeType = @ReferenceTreeType(pid = "parent.id") - ) - ) - private TestRun parent; - - - @EruptField( - views = @View( - title = "测试负责人" - ), - edit = @Edit( - title = "测试负责人", - type = EditType.INPUT, search = @Search - ) - ) - private String testOwner; - - @JoinTable(name = "test_run_cases", - joinColumns = @JoinColumn(name = "test_run_id", referencedColumnName = "id"), - inverseJoinColumns = @JoinColumn(name = "test_case_id", referencedColumnName = "id")) - @ManyToMany(fetch = FetchType.EAGER) - @EruptField( - views = @View(title = "包含用例"), - edit = @Edit( - title = "包含用例", - type = EditType.TAB_TABLE_REFER - ) - ) - private Set testCases; - - @JoinTable(name = 
"test_run_scenario", - joinColumns = @JoinColumn(name = "test_run_id", referencedColumnName = "id"), - inverseJoinColumns = @JoinColumn(name = "test_scenario_id", referencedColumnName = "id")) - @ManyToMany(fetch = FetchType.EAGER) - @EruptField( - views = @View(title = "包含测试场景"), - edit = @Edit( - title = "包含测试场景", - type = EditType.TAB_TABLE_REFER - ) - ) - private Set testScenarios; - - public ProductModuleModel getProduct() { - return product; - } - - public void setProduct(ProductModuleModel product) { - this.product = product; - } - - public String getTestOwner() { - return testOwner; - } - - public void setTestOwner(String testOwner) { - this.testOwner = testOwner; - } - - public Set getTestCases() { - return testCases; - } - - public void setTestCases(Set testCases) { - this.testCases = testCases; - } - - public Set getTestScenarios() { - return testScenarios; - } - - public void setTestScenarios(Set testScenarios) { - this.testScenarios = testScenarios; - } - - public TestRun getParent() { - return parent; - } - - public void setParent(TestRun parent) { - this.parent = parent; - } - - @EruptField( - views = @View( - title = "名称" - ), - edit = @Edit( - title = "名称", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String name; - @EruptField( - views = @View( - title = "详细" - ), - edit = @Edit( - title = "详细", - type = EditType.INPUT, search = @Search, notNull = true - ) - ) - private String detail; - @EruptField( - views = @View( - title = "开始时间" - ), - edit = @Edit( - title = "开始时间", - type = EditType.DATE, search = @Search, - boolType = @BoolType - ) - ) - private LocalDate startDate; - @EruptField( - views = @View( - title = "预计完成时间" - ), - edit = @Edit( - title = "预计完成时间", - type = EditType.DATE, search = @Search, - boolType = @BoolType - ) - ) - private LocalDate estimatedCompletedDate; - @EruptField( - views = @View( - title = "完成时间" - ), - edit = @Edit( - title = "完成时间", - type = EditType.DATE, search = @Search, - boolType = @BoolType - ) - ) - private LocalDate completedDate; - @EruptField( - views = @View( - title = "当前状态" - ), - edit = @Edit( - title = "当前状态", - type = EditType.INPUT, search = @Search, notNull = true, - boolType = @BoolType - ) - ) - private String status; - - public String getName() { - return name; - } - - public void setName(String name) { - this.name = name; - } - - public String getDetail() { - return detail; - } - - public void setDetail(String detail) { - this.detail = detail; - } - - public LocalDate getStartDate() { - return startDate; - } - - public void setStartDate(LocalDate startDate) { - this.startDate = startDate; - } - - public LocalDate getEstimatedCompletedDate() { - return estimatedCompletedDate; - } - - public void setEstimatedCompletedDate(LocalDate estimatedCompletedDate) { - this.estimatedCompletedDate = estimatedCompletedDate; - } - - public LocalDate getCompletedDate() { - return completedDate; - } - - public void setCompletedDate(LocalDate completedDate) { - this.completedDate = completedDate; - } - - public String getStatus() { - return status; - } - - public void setStatus(String status) { - this.status = status; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestScenario.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestScenario.java deleted file mode 100644 index eb2efa8..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/model/TestScenario.java +++ /dev/null @@ -1,43 +0,0 @@ -package io.fluentqa.qtm.tc.model; - -import 
io.fluentqa.base.model.NamedModelVO; -import xyz.erupt.annotation.Erupt; -import xyz.erupt.annotation.EruptField; -import xyz.erupt.annotation.sub_erupt.Power; -import xyz.erupt.annotation.sub_field.Edit; -import xyz.erupt.annotation.sub_field.EditType; -import xyz.erupt.annotation.sub_field.View; - -import javax.persistence.*; -import java.util.Set; - -@Entity -@Table(name = "test_scenarios") -@Erupt(name = "测试场景管理", - power = @Power(importable = true, export = true) -) -//@PreDataProxy(value= TestScenarioCaseProxy.class) - -public class TestScenario extends NamedModelVO { - - @JoinTable(name = "test_scenario_cases", - joinColumns = @JoinColumn(name = "test_scenario_id", referencedColumnName = "id"), - inverseJoinColumns = @JoinColumn(name = "test_case_id", referencedColumnName = "id")) - @ManyToMany(fetch = FetchType.EAGER) - @EruptField( - views = @View(title = "包含用例"), - edit = @Edit( - title = "包含用例", - type = EditType.TAB_TABLE_REFER - ) - ) - private Set testCases; - - public Set getTestCases() { - return testCases; - } - - public void setTestCases(Set testCases) { - this.testCases = testCases; - } -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/package-info.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/package-info.java deleted file mode 100644 index 263ec49..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/package-info.java +++ /dev/null @@ -1 +0,0 @@ -package io.fluentqa.qtm.tc; \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestCaseRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestCaseRepo.java deleted file mode 100644 index dbc1f7d..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestCaseRepo.java +++ /dev/null @@ -1,15 +0,0 @@ -package io.fluentqa.qtm.tc.repo; - -import io.fluentqa.qtm.tc.model.TestCase; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository -public interface TestCaseRepo extends JpaRepository { - public TestCase findByUuid(String uuid); - - -// @Query(nativeQuery = true,value = "delete from test_cases where test_plan=:testPlan") -// @Modifying -// public void deleteByTestPlan(@Param("testPlan") String testPlan); -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestResultRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestResultRepo.java deleted file mode 100644 index 6af93c7..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestResultRepo.java +++ /dev/null @@ -1,11 +0,0 @@ -package io.fluentqa.qtm.tc.repo; - - -import io.fluentqa.qtm.tc.model.TestResult; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository(value = "testResultRepo") -public interface TestResultRepo extends JpaRepository { - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestRunRepo.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestRunRepo.java deleted file mode 100644 index 19d7da7..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/repo/TestRunRepo.java +++ /dev/null @@ -1,11 +0,0 @@ -package io.fluentqa.qtm.tc.repo; - - -import io.fluentqa.qtm.tc.model.TestRun; -import org.springframework.data.jpa.repository.JpaRepository; -import org.springframework.stereotype.Repository; - -@Repository -public interface TestRunRepo extends JpaRepository { - -} diff 
--git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/TestCaseService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/TestCaseService.java deleted file mode 100644 index 45ffd97..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/TestCaseService.java +++ /dev/null @@ -1,14 +0,0 @@ -package io.fluentqa.qtm.tc.service; - -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.qtm.tc.dto.TestCaseDTO; -import org.springframework.stereotype.Service; - -import java.util.List; - -@Service -public interface TestCaseService { - - public void saveTestCases(List cases, - ProductModuleModel product, ProductModuleModel module,String updater); -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/MindMappingService.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/MindMappingService.java deleted file mode 100644 index 04f6154..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/MindMappingService.java +++ /dev/null @@ -1,41 +0,0 @@ -package io.fluentqa.qtm.tc.service.impl; - - - -import cn.hutool.core.bean.BeanUtil; -import io.fluentqa.mindmap.api.MindMapAccessor; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.qtm.tc.dto.TestCaseDTO; -import io.fluentqa.qtm.tc.service.TestCaseService; -import org.springframework.stereotype.Service; -import xyz.erupt.jpa.model.MetaModel; - -import javax.annotation.Resource; -import javax.transaction.Transactional; -import java.util.List; - -/** - * 1.Import MindMapping file to test case database - * 2.export selected test cases as mindmapping file - */ -//TODO: convert to same TestCase Converter -@Service("mindMappingService") -public class MindMappingService { - - @Resource - private TestCaseService testCaseService; - - public List toTestCaseModel(String xmlFilePath) { - MindMapAccessor accessor = new MindMapAccessor(); - return accessor.readMindMapToBean(xmlFilePath, TestCaseDTO.class); - } - - @Transactional - public void saveTestCases(String xmlFilePath, MetaModel model) { - List testCaseModels = toTestCaseModel(xmlFilePath); - ProductModuleModel product = BeanUtil.getProperty(model, "product"); - ProductModuleModel module = BeanUtil.getProperty(model, "module"); - testCaseService.saveTestCases(testCaseModels,product,module,model.getUpdateBy()); - } - -} diff --git a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/TestCaseServiceImpl.java b/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/TestCaseServiceImpl.java deleted file mode 100644 index bab5b68..0000000 --- a/fluent-apps/qaserver/src/main/java/io/fluentqa/qtm/tc/service/impl/TestCaseServiceImpl.java +++ /dev/null @@ -1,94 +0,0 @@ -package io.fluentqa.qtm.tc.service.impl; - - -import cn.hutool.core.bean.BeanUtil; -import cn.hutool.core.lang.UUID; -import cn.hutool.core.util.StrUtil; -import io.fluentqa.base.proxies.AuditDataEnhancerProxy; -import io.fluentqa.base.product.model.ProductModuleModel; -import io.fluentqa.base.product.service.ProductModuleService; -import io.fluentqa.qtm.tc.dto.TestCaseDTO; -import io.fluentqa.qtm.tc.model.TestCase; -import io.fluentqa.qtm.tc.repo.TestCaseRepo; -import io.fluentqa.qtm.tc.service.TestCaseService; -import org.springframework.scheduling.annotation.Async; -import org.springframework.stereotype.Service; - -import javax.annotation.Resource; -import javax.transaction.Transactional; -import java.util.List; - -@Service -public class 
TestCaseServiceImpl implements TestCaseService { - @Resource - private TestCaseRepo testCaseRepo; - @Resource - private ProductModuleService productMetaService; - - @Resource - private AuditDataEnhancerProxy dataEnhancerProxy; - @Override - @Transactional - @Async - /** - * notice:parent Product can't be created,parent product must be configured - * 1. 如果UUID没有或者找不到,则新增测试用例 - * 2. 新增测试用例中, - */ - public void saveTestCases(List cases, - ProductModuleModel parentProduct, - ProductModuleModel module,String updater) { - for (TestCaseDTO aCase : cases) { - TestCase tcEntity = createOrUseExistingTestCase(aCase); - ProductModuleModel rootProduct = getRootProductMeta(aCase); - ProductModuleModel parentModule = productMetaService.createModuleIfNotExist(rootProduct.getId(), - aCase.getModuleName(),updater); - ProductModuleModel subModule = whichSubModule(parentModule, aCase,updater); - tcEntity.setModule(subModule); - tcEntity.setProduct(rootProduct); - tcEntity.setParent(parentModule); - if (StrUtil.isBlankIfStr(aCase.getPriority())) { - tcEntity.setPriority("P2"); - } - tcEntity.setSteps(StrUtil.join(":\n", aCase.getFeature(), - aCase.getSummary(), aCase.getSteps())); - - testCaseRepo.save(tcEntity); - } - } - - private TestCase createOrUseExistingTestCase(TestCaseDTO aCase) { - TestCase tcEntity; - if (StrUtil.isBlank(aCase.getUuid())) { - tcEntity = BeanUtil.copyProperties(aCase, TestCase.class); - tcEntity.setUuid(UUID.fastUUID().toString(true)); - - } else { - tcEntity = testCaseRepo.findByUuid(aCase.getUuid()); - if (tcEntity == null) { - tcEntity = BeanUtil.copyProperties(aCase, TestCase.class); //生成新的的UUID - } else { - BeanUtil.copyProperties(aCase, tcEntity, "id"); //更新数据库数据 - } - } - return tcEntity; - } - - - private ProductModuleModel getRootProductMeta(TestCaseDTO aCase) { - ProductModuleModel rootProductMeta = productMetaService.findByName(aCase.getProductName()); - if (rootProductMeta == null) { - throw new RuntimeException("找不到产品"); - } - return rootProductMeta; - } - - - private ProductModuleModel whichSubModule(ProductModuleModel parentProduct, TestCaseDTO aCase,String updater) { - if (parentProduct.getName().equalsIgnoreCase(aCase.getModuleName())) return parentProduct; - return productMetaService.createModuleIfNotExist(parentProduct.getId(), aCase.getModuleName(),updater); - - } - - -} diff --git a/fluent-apps/qaserver/src/main/resources/application-dev.yaml b/fluent-apps/qaserver/src/main/resources/application-dev.yaml deleted file mode 100644 index 6777e4b..0000000 --- a/fluent-apps/qaserver/src/main/resources/application-dev.yaml +++ /dev/null @@ -1,101 +0,0 @@ -erupt-app: - # 是否开启水印,1.12.0 及以上版本支持 - waterMark: false - # 登录失败几次出现验证码,值为0时表示一直需要登录验证码 - verifyCodeCount: 2 - # 登录密码是否加密传输,特殊场景如:LDAP登录可关闭该功能获取密码明文 - pwdTransferEncrypt: true - # 多语言配置,默认支持:简体中文、繁体中文、英文、日文;具体配置详见erupt-i18n模块 - locales: [ "zh-CN","zh-TW","en-US","ja-JP" ] -erupt: - # 是否开启csrf防御 - csrfInspect: true - # 是否开启redis方式存储session,默认false,开启后需在配置文件中添加redis配置(同 spring boot) - redisSession: false - # 附件上传存储路径, 默认路径为:/opt/erupt-attachment - uploadPath: /Users/patrick/data/temp - # 是否保留上传文件原始名称 - keepUploadFileName: false - # 登录session时长(redisSession为true时有效) - upms.expireTimeByLogin: 120 - # 是否记录操作日志,默认true,该功能开启后可在【系统管理 → 操作日志】中查看操作日志 - security.recordOperateLog: false - -spring: - datasource: - url: jdbc:postgresql://db.supabase.orb.local:5432/postgres?currentSchema=workspace - username: postgres - password: postgres - jpa: - show-sql: true - generate-ddl: true - database-platform: 
org.hibernate.dialect.PostgreSQLDialect - database: postgresql - -# mail: -# username: xxxx@qq.com -# password: xxxxxxx -# host: smtp.qq.com -# properties: -# mail.smtp.ssl.auth: true -# mail.smtp.ssl.enable: true -# mail.smtp.ssl.required: true - servlet: - multipart: - max-file-size: 100MB - max-request-size: 100MB - -# springdoc-openapi项目配置 -#springdoc: -# swagger-ui: -# path: /swagger-ui.html -# tags-sorter: alpha -# operations-sorter: alpha -# api-docs: -# path: /v3/api-docs -# group-configs: -# - group: 'default' -# paths-to-match: '/**' -# packages-to-scan: io.fluentqa -# knife4j的增强配置,不需要增强可以不配 -knife4j: - enable: true - openapi: - title: QA Workspace API - description: "`QA Workspace API - # workspace" - email: fluentqa@163.com - concat: fluent-qa -# url: https://docs.xiaominfo.com -# version: v4.0 -# license: Apache 2.0 -# license-url: https://stackoverflow.com/ -# terms-of-service-url: https://stackoverflow.com/ - group: - test1: - group-name: qa workspace api - api-rule: package -# api-rule-resources: -# - com.knife4j.demo.new3 - -server: - # 启用 gzip 压缩 - compression: - mime-types: application/javascript,text/css,application/json,application/xml,text/html,text/xml,text/plain - enabled: true - error: - includeException: true - includeStacktrace: ALWAYS - includeMessage: ALWAYS - port: 9090 -logging: - level: - root: TRACE - io.fluentqa: DEBUG - org.hibernate: DEBUG - io.fluent: DEBUG - xyz.erupt: DEBUG - -magic-api: - web: /fluentapi/v1 - resource.location: ./magic-script \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/resources/application.yaml b/fluent-apps/qaserver/src/main/resources/application.yaml deleted file mode 100644 index caf4dfc..0000000 --- a/fluent-apps/qaserver/src/main/resources/application.yaml +++ /dev/null @@ -1,3 +0,0 @@ -spring: - profiles: - active: dev \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/resources/public/app.css b/fluent-apps/qaserver/src/main/resources/public/app.css deleted file mode 100644 index 44c63ae..0000000 --- a/fluent-apps/qaserver/src/main/resources/public/app.css +++ /dev/null @@ -1,24 +0,0 @@ -layout-header { - background: #3f51b5 !important; -} - -/* 例:修改登录页样式 */ -layout-passport > .container { - background-position: center !important; - background-repeat: repeat !important; - background-size: cover !important; - background-color: #fff !important; - background-image: url(https://www.erupt.xyz/demo/login-bg.svg) !important; -} - -layout-passport .title { - font-family: Courier New, Menlo, Monaco, Consolas, monospace !important; -} - -layout-passport form { - padding: 26px !important; - margin: 8px !important; - background: rgba(255, 255, 255, 0.9); - border-radius: 3px; - box-shadow: 1px 1px 10px rgba(190, 184, 184, 0.3); -} \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/resources/public/app.js b/fluent-apps/qaserver/src/main/resources/public/app.js deleted file mode 100644 index c4435ce..0000000 --- a/fluent-apps/qaserver/src/main/resources/public/app.js +++ /dev/null @@ -1,71 +0,0 @@ -window.eruptSiteConfig = { - //erupt接口地址,在前后端分离时指定 - domain: "", - //附件地址,一般情况下不需要指定,如果自定义对象存储空间,则需在此指定附件资源访问地址 - fileDomain: "", - //标题 - title: "QA Workspace", - //描述 - desc: "QA Base", - //是否展示版权信息 - copyright: true, - //高德地图 api key,使用地图组件须指定此属性,amapKey获取地址:https://lbs.amap.com (服务平台为:Web端(JS API)) - amapKey: "xxxx", - //高德地图 SecurityJsCode - amapSecurityJsCode: "xxxxx", - //logo路径 - logoPath: "erupt.svg", - //logo文字 - logoText: "erupt", - //注册页地址(仅是一个链接,需要自定义实际样式) - 
registerPage: "", - //自定义导航栏按钮,配置后将会出现在页面右上角 - r_tools: [{ - text: "自定义功能按钮", - icon: "fa-eercast", - mobileHidden: true, - click: function (event) { - alert("Function button"); - } - }], - // //登录成功事件 - // login: function(user){ - // - // }, - // //注销事件 - // logout: function(user){ - // - // } -}; - -// //路由回调函数 -// window.eruptRouterEvent = { -// //key表示要监听的路由切换地址,为url hash地址最后一段 -// //例如:http://www.erupt.xyz:9999/#/build/table/demo中demo为回调key -// demo: { -// //路由载入事件 -// load: function (e) { -// -// }, -// //路由退出事件 -// unload: function (e) { -// -// } -// }, -// //$ 为全路径通配符,在任何路由切换时都会执行load与unload事件 -// $: { -// load: function (e) { -// -// }, -// unload: function (e) { -// } -// } -// }; -// -// //erupt生命周期函数 -// window.eruptEvent = { -// //页面加载完成后回调 -// startup: function () { -// -// } -// } \ No newline at end of file diff --git a/fluent-apps/qaserver/src/main/resources/public/home.html b/fluent-apps/qaserver/src/main/resources/public/home.html deleted file mode 100644 index 35600cf..0000000 --- a/fluent-apps/qaserver/src/main/resources/public/home.html +++ /dev/null @@ -1,12 +0,0 @@ - - - - home - - - - - -

测试管理工具箱

-
-
\ No newline at end of file
diff --git a/fluent-apps/qaserver/src/main/resources/tpl/operation.tpl b/fluent-apps/qaserver/src/main/resources/tpl/operation.tpl
deleted file mode 100644
index d77f7e5..0000000
--- a/fluent-apps/qaserver/src/main/resources/tpl/operation.tpl
+++ /dev/null
@@ -1,17 +0,0 @@
-
-
-
- <#list rows as row>
-
-
-
-
-
-
-
- ${row.id}${row.choice!''}${row.code!''}
-
\ No newline at end of file diff --git a/libs-concept.png b/libs-concept.png deleted file mode 100644 index b624a06..0000000 Binary files a/libs-concept.png and /dev/null differ diff --git a/overall.png b/overall.png deleted file mode 100644 index b624a06..0000000 Binary files a/overall.png and /dev/null differ diff --git a/pom.xml b/pom.xml index eb3ee7e..e3e41cb 100644 --- a/pom.xml +++ b/pom.xml @@ -18,13 +18,11 @@ fluent-erupts components modules - fluent-wrappers - fluent-apps - 1.12.14 + 1.12.15 1.0-SNAPSHOT 42.5.1 UTF-8 diff --git a/qa-automation.png b/qa-automation.png deleted file mode 100644 index feec58d..0000000 Binary files a/qa-automation.png and /dev/null differ diff --git a/references.yaml b/references.yaml deleted file mode 100644 index f987426..0000000 --- a/references.yaml +++ /dev/null @@ -1,223 +0,0 @@ -stacks: - - https://www.stackradar.co/ -agent: - - https://www.agentcrew.co/ - -frontend: - - https://easyfrontend.com/ - -web-editor: - - https://deco.cx/ - - https://github.com/toeverything - - https://liveblocks.io/ - - https://shuffle.dev/ -diagrams: - - https://www.eraser.io/ - -ai-framework: - - https://www.edenai.co/ - - https://dashboard.cohere.com/ -website: - - https://explorer.globe.engineer/ -api: - - https://github.com/pb33f/libopenapi.git -backend-api: - - https://gitee.com/ssssssss-team/magic-api -cases: - - https://github.com/ugorsahin/TalkingHeads.git - -devops: - - https://github.com/merico-dev/lake.git - - https://github.com/YaoApp/xgen.git - - https://gitee.com/jianmu-dev/jianmu.git - - https://gitee.com/ketr/jecloud - -database-tools: - - https://github.com/azimuttapp/azimutt.git -qa-tools: - - https://gitee.com/cat2bug/cat2bug-platform.git -qa-ai-tools: - - https://gitee.com/panday94/chat-master.git -admin: - - https://github.com/build-admin/build-admin-nuxt.git -springboot: - - https://github.com/spring-projects-experimental/spring-modulith.git -data-orm: - - https://github.com/ebean-orm/ebean.git -api-mgt: - - https://github.com/apiaryio/dredd.git - - https://github.com/Kong/insomnia.git - - https://github.com/xyyxhcj/vpi.git -low-code: - - https://gitee.com/yabushan/low-code-data-center.git - - https://gitee.com/zj1983/zz.git - - https://gitee.com/y_project/RuoYi-Vue.git - - https://gitee.com/newcorenet/elcube-backend.git - - https://github.com/sallamy2580/goframe2.0-fullstack.git - - https://github.com/structr/structr - - https://gitee.com/jmix/jmix - - https://github.com/jet-admin/jet-bridge - - https://netease.github.io/tango-site/ -bi: - - https://github.com/pinterest/querybook.git -openapi: - - https://github.com/getkin/kin-openapi.git -code-gen: - - https://github.com/cmeza20/spring-data-generator.git - - https://github.com/fescobar/allure-docker-service.git -todo: - - https://github.com/omkarcloud/bose.git - - https://github.com/emaiannone/surface - - https://github.com/Tencent/QTAF - - https://github.com/BalamiRR/Testinium-QA.git - - https://github.com/abstracta/jmeter-java-dsl.git - - https://github.com/graalvm/graalvm-demos.git -testing-framework: - - https://github.com/ghoshasish99/CognizantIntelligentTestScripter.git -libs: - - https://github.com/blinkfox/fenix.git - - https://github.com/opengoofy/hippo4j.git - - https://github.com/pf4j/pf4j.git - - https://github.com/xvik/guice-validator.git - - https://github.com/yqhp/yqhp.git - - https://github.com/test-instructor/yangfan.git - - https://github.com/sartography/SpiffWorkflow.git - - https://github.com/baloise/test-automation-framework.git - - 
https://github.com/kubeshop/tracetest.git - - https://github.com/trytouca/trytouca.git - - https://github.com/aws/universal-test-runner.git - - https://github.com/EsperoTech/yaade.git - - https://github.com/taverntesting/tavern.git - - https://github.com/shulieTech/Takin-web.git - - https://github.com/QualitySphere/qsphere-svc.git - - https://gitee.com/hsth/qingyun.git - - https://github.com/teemtee/tmt.git - - https://github.com/ae86sen/aomaker.git - - https://github.com/dongfanger/tep.git - - https://gitee.com/season-fan/autometer-api.git - - https://github.com/kiwicom/contessa.git - - https://github.com/testsigmahq/testsigma.git - - https://github.com/httprunner/httprunner.git - - https://github.com/wu-clan/httpfpt.git - - https://github.com/cobrateam/splinter.git - - https://github.com/bcpeinhardt/schnauzerUI.git - - https://github.com/hamibot/hamibot.git - - https://github.com/Blazemeter/taurus.git - - https://github.com/bmarsh9/spate.git - - https://github.com/SeldomQA/seldom.git - - https://github.com/keploy/keploy.git - - https://github.com/Blazemeter/apiritif.git - - https://github.com/spotify/github-java-client.git - - https://github.com/cdimascio/dotenv-java.git - - https://github.com/jcabi/jcabi-github.git - - https://github.com/joelittlejohn/jsonschema2pojo.git - - https://github.com/pf4j/pf4j.git - - https://github.com/domaframework/doma.git - - https://github.com/jcabi/jcabi-github.git - - https://github.com/javaxcel/javaxcel-core.git - - https://github.com/joelittlejohn/jsonschema2pojo.git - - https://github.com/exactpro/clearth - - https://github.com/FeatureProbe/FeatureProbe.git - - https://github.com/dipjyotimetia/HybridTestFramework.git - - https://github.com/microsoft/HydraLab.git - - https://github.com/qu-niao/LimApiTest.git - - https://github.com/Chras-fu/Liuma-engine.git - - https://github.com/Chras-fu/Liuma-platform.git - - https://github.com/easysoft/zendata.git - - https://github.com/bmw-software-engineering/trlc.git - - https://github.com/flawiddsouza/Restfox.git - - https://github.com/easysoft/zentaoatf.git - - https://github.com/authorjapps/zerocode.git - - https://github.com/baizunxian/zerorunner.git - - https://github.com/testingisdocumenting/znai.git - - https://github.com/ShaftHQ/SHAFT_ENGINE - - https://github.com/oneops/oneops.git - - https://github.com/SeldomQA/poium.git - - https://github.com/testingisdocumenting/webtau.git - - https://github.com/Tinkoff/overhave.git - - https://github.com/ryandem1/oar.git - - https://github.com/tag1consulting/goose - - https://github.com/stepci/stepci.git - - https://github.com/ctripcorp/flybirds.git - - https://github.com/Meituan-Dianping/lyrebird.git - - https://github.com/zebrunner/carina.git - - https://github.com/os7blue/dobby.git - - https://github.com/SciCrunch/scibot.git - - https://gitee.com/troyzhxu/bean-searcher.git -data-plugins: - - https://github.com/data-integrations/database-plugins.git -data-factory: - - https://github.com/houbb/data-factory.git - -guides: - - https://github.com/Snailclimb/JavaGuide.git - - https://github.com/vladmihalcea/high-performance-java-persistence.git - - https://github.com/Linkshegelianer/java-labs.git - - https://github.com/jvm-bloggers/jvm-bloggers.git -test-libs: - - https://github.com/quick-perf/quickperf -tutorials: - - https://www.baeldung.com/ - - https://www.dariawan.com/ - - https://reflectoring.io/ - - https://javarevisited.blogspot.com/ - - https://medium.com/javarevisited/9-things-java-programmers-should-learn-in-2018-3f0b2207dfc4 - - https://www.designgurus.io/ - 
- https://www.amitph.com/ - -product: - - https://app.qase.io/apps - - https://testsigma.com/ - - https://www.kualitee.com/ - - https://www.techtarget.com/searchsoftwarequality/definition/test-case - -testing-tutorial: - - https://www.coursera.org/articles/how-to-write-test-cases - - https://www.guru99.com/ - - https://strongqa.com/ - - https://www.geeksforgeeks.org/difference-between-test-case-and-scenarios/ - - https://www.leapwork.com/blog/test-case-vs-test-scenario - - https://www.lambdatest.com/blog/test-scenario-vs-test-case/ - - https://smartbear.com/test-management/testing-scripts-cases-scenarios/ - - https://www.tatvasoft.com/outsourcing/2023/07/test-case-vs-test-scenario.html - - https://www.qamadness.com/knowledge-base/test-scenario-definition-purpose-and-how-to-create/ - - https://www.wetest.net/blog/understanding-the-difference-test-scenario-test-case-707.html - - https://blog.testlodge.com/whats-the-difference-between-test-case-and-test-scenario/ - - https://toolsqa.com/software-testing/test-scenario/ - - https://www.botplayautomation.com/ - - https://artoftesting.com/ - - https://www.browserstack.com/guide/how-to-write-test-cases - - https://www.applause.com/blog/what-is-a-test-case-examples-types-format -robot-framework: - - https://github.com/QualitySphere/hRobot.git -repo-migration: - - https://github.com/ctco/cukes.git - - https://github.com/jdi-testing/jdi-dark.git -products: - - JIRA - - JunoOne - - Klaros-Testmanagement - - QACoverage - - Qase - - SPIRATEST by Inflectra - - TestFLO for JIRA - - Testpad - - XQual - - Xray - - Zephyr Scale - - Zephyr Squad - -self-hosts: - - https://github.com/dronahq/self-hosted.git - - https://github.com/LetTTGACO/elog.git - - https://zeon.studio/ -agile/NPM: - - -solutions: - - name: cal - repos: - - https://github.com/calcom/cal.com.git - - name: payment - repos: - - https://gitee.com/dromara/payment-spring-boot.git \ No newline at end of file diff --git a/to-do.yaml b/to-do.yaml deleted file mode 100644 index 87ede38..0000000 --- a/to-do.yaml +++ /dev/null @@ -1,24 +0,0 @@ -data: - - https://gitee.com/opengoofy/crane4j - - https://github.com/datageartech/datagear.git - - https://github.com/apicat/datagen.git -low-code: - - https://github.com/pingapi/crabc-api - -toolkits: - - https://github.com/sharat87/littletools.git - - https://github.com/QAInsights/Streamlit-JMeter.git - - https://github.com/mnotgod96/AppAgent.git - - https://github.com/shulieTech/Takin-web.git -openapi: - - https://github.com/wu-clan/httpfpt.git - - https://openapi.tools/ - -integrations: - - https://gitee.com/durcframework/torna.git - - metersphere - - https://github.com/crusher-dev/crusher.git - - http://github.com/hoppscotch/hoppscotch.git -report: - - https://github.com/SeldomQA/XTestRunner.git - - https://github.com/extent-framework/extentreports-java.git \ No newline at end of file